From Eric.Chamberland at giref.ulaval.ca Sun Jan 1 06:45:52 2017 From: Eric.Chamberland at giref.ulaval.ca (Eric Chamberland) Date: Sun, 1 Jan 2017 07:45:52 -0500 Subject: [petsc-users] Error with SuperLU_DIST (mkl related?) In-Reply-To: References: Message-ID: Hi, ok, using petsc-3.7.4 but with SuperLU_DIST 5.1.3 (--download-superlu_dist-commit=v5.1.3) fixed the issue! Thanks to both of you and happy new year! :) Eric On 2016-12-31 at 11:51, Matthew Knepley wrote: > On Sat, Dec 31, 2016 at 9:53 AM, Eric Chamberland > > wrote: > > Hi, > > I am just starting to debug a bug encountered with and only with > SuperLU_Dist combined with MKL on a 2 processes validation test. > > (the same test works fine with MUMPS on 2 processes). > > I just noticed that the SuperLU_Dist version installed by PETSc > configure script is 5.1.0 and the latest SuperLU_DIST is 5.1.3. > > Before going further, I just want to ask: > > Is there any specific reason to stick to 5.1.0? > > > Can you debug in 'master' which does have 5.1.3, including an > important bug fix? > > Matt > > > Here is some more information: > > On process 2 I have this printed in stdout: > > Intel MKL ERROR: Parameter 6 was incorrect on entry to DTRSM . > > and in stderr: > > Test.ProblemeEFGen.opt: malloc.c:2369: sysmalloc: Assertion > `(old_top == (((mbinptr) (((char *) &((av)->bins[((1) - 1) * 2])) > - __builtin_offsetof (struct malloc_chunk, fd)))) && old_size == > 0) || ((unsigned long) (old_size) >= (unsigned > long)((((__builtin_offsetof (struct malloc_chunk, > fd_nextsize))+((2 *(sizeof(size_t))) - 1)) & ~((2 > *(sizeof(size_t))) - 1))) && ((old_top)->size & 0x1) && ((unsigned > long) old_end & pagemask) == 0)' failed. > [saruman:15771] *** Process received signal *** > > This is the 7th call to KSPSolve in the same execution.
Here is > the last KSPView: > > KSP Object:(o_slin) 2 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using NONE norm type for convergence test > PC Object:(o_slin) 2 MPI processes > type: lu > LU: out-of-place factorization > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 0., needed 0. > Factored matrix follows: > Mat Object: 2 MPI processes > type: mpiaij > rows=382, cols=382 > package used to perform factorization: superlu_dist > total: nonzeros=0, allocated nonzeros=0 > total number of mallocs used during MatSetValues calls =0 > SuperLU_DIST run parameters: > Process grid nprow 2 x npcol 1 > Equilibrate matrix TRUE > Matrix input mode 1 > Replace tiny pivots FALSE > Use iterative refinement FALSE > Processors in row 2 col partition 1 > Row permutation LargeDiag > Column permutation METIS_AT_PLUS_A > Parallel symbolic factorization FALSE > Repeated factorization SamePattern > linear system matrix = precond matrix: > Mat Object: (o_slin) 2 MPI processes > type: mpiaij > rows=382, cols=382 > total: nonzeros=4458, allocated nonzeros=4458 > total number of mallocs used during MatSetValues calls =0 > using I-node (on process 0) routines: found 109 nodes, limit > used is 5 > > I know this information is not enough to help debug, but I would > like to know if PETSc guys will upgrade to 5.1.3 before trying to > debug anything. > > Thanks, > Eric > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which > their experiments lead. > -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From Eric.Chamberland at giref.ulaval.ca Sun Jan 1 07:04:39 2017 From: Eric.Chamberland at giref.ulaval.ca (Eric Chamberland) Date: Sun, 1 Jan 2017 08:04:39 -0500 Subject: [petsc-users] Error with SuperLU_DIST (mkl related?) In-Reply-To: References: <2ef21b36-fa1b-c4c9-a2c9-00ada423a0c7@giref.ulaval.ca> Message-ID: <61a8ad3b-dae4-6b6d-e60b-543d64efb126@giref.ulaval.ca> Thanks! Both the filename and the #defines are OK now. Eric On 2016-12-31 at 16:18, Xiaoye S. Li wrote: > I just updated the version string in the git repo and tarball. > > Sherry > > On Sat, Dec 31, 2016 at 10:39 AM, Satish Balay > wrote: > > Ok - one more place superlu_dist stores the version number - that > needs updating with every release. > > cc:ing Sherry > > Satish > > On Sat, 31 Dec 2016, Eric Chamberland wrote: > > > I think there is definitely a problem. > > > > After looking at the files installed either from the petsc-master > tarball or the > > manual configure I just did with > --download-superlu_dist-commit=v5.1.3, the > > file include/superlu_defs.h has these values: > > > > #define SUPERLU_DIST_MAJOR_VERSION 5 > > #define SUPERLU_DIST_MINOR_VERSION 1 > > #define SUPERLU_DIST_PATCH_VERSION 0 > > > > What's wrong? > > > > Eric > > > > > > On 2016-12-31 at 13:26, Eric Chamberland wrote: > > > Ah ok, I see! Here, look at the file name in the configure.log: > > > > > > Install the project... > > > /usr/bin/cmake -P cmake_install.cmake > > > -- Install configuration: "DEBUG" > > > -- Installing: > /opt/petsc-master_debug/lib/libsuperlu_dist.so.5.1.0 > > > -- Installing: /opt/petsc-master_debug/lib/libsuperlu_dist.so.5 > > > > > > It is saying 5.1.0, but in fact you are right: it is 5.1.3 that is > > > downloaded!!! :) > > > > > > And FWIW, the nightly automatic compilation of PETSc starts > within a brand > > > new and empty directory each night... > > > > > > Thanks to both of you again! :) > > > > > > Eric > > > > > > > > > On 2016-12-31 at
13:17, Satish Balay wrote: > > > > > =============================================================================== > > > > Trying to download > > > > git://https://github.com/xiaoyeli/superlu_dist > for SUPERLU_DIST > > > > > =============================================================================== > > > > Executing: git clone > > > > https://github.com/xiaoyeli/superlu_dist > > > > > > /pmi/cmpbib/compilation_BIB_gcc_redhat_petsc-master_debug/COMPILE_AUTO/petsc-master-debug/arch-linux2-c-debug/externalpackages/git.superlu_dist > > > > stdout: Cloning into > > > > > '/pmi/cmpbib/compilation_BIB_gcc_redhat_petsc-master_debug/COMPILE_AUTO/petsc-master-debug/arch-linux2-c-debug/externalpackages/git.superlu_dist'... > > > > Looking for SUPERLU_DIST at > git.superlu_dist, > > > > hg.superlu_dist or a directory starting with ['superlu_dist'] > > > > Found a copy of SUPERLU_DIST in > git.superlu_dist > > > > Executing: ['git', 'rev-parse', '--git-dir'] > > > > stdout: .git > > > > Executing: ['git', 'cat-file', '-e', 'v5.1.3^{commit}'] > > > > Executing: ['git', 'rev-parse', 'v5.1.3'] > > > > stdout: 7306f704c6c8d5113def649b76def3c8eb607690 > > > > Executing: ['git', 'stash'] > > > > stdout: No local changes to save > > > > Executing: ['git', 'clean', '-f', '-d', '-x'] > > > > Executing: ['git', 'checkout', '-f', > > > > '7306f704c6c8d5113def649b76def3c8eb607690'] > > > > <<<<<<<< > > > > > > > > Per the log below - it's using 5.1.3. Why did you think you got > 5.1.0? > > > > > > > > Satish > > > > > > > > On Sat, 31 Dec 2016, Eric Chamberland wrote: > > > > > > > > > Hi, > > > > > > > > > > ok I will test with 5.1.3 with the option you gave me > > > > > (--download-superlu_dist-commit=v5.1.3).
> > > > > > > > > > But from what you and Matthew said, I should have 5.1.3 with > > > > > petsc-master, but > > > > > last night's log shows me the library file name 5.1.0: > > > > > > > > > > > http://www.giref.ulaval.ca/~cmpgiref/petsc-master-debug/2016.12.31.02h00m01s_configure.log > > > > > > > > > > > > > > > > So I am a bit confused: Why did I get 5.1.0 last night? (I > use the > > > > > petsc-master tarball, is it the reason?) > > > > > > > > > > Thanks, > > > > > > > > > > Eric > > > > > > > > > > > > > > > On 2016-12-31 at 11:52, Satish Balay wrote: > > > > > > On Sat, 31 Dec 2016, Eric Chamberland wrote: > > > > > > > > > > > > > Hi, > > > > > > > > > > > > > > I am just starting to debug a bug encountered with and > only with > > > > > > > SuperLU_Dist > > > > > > > combined with MKL on a 2 processes validation test. > > > > > > > > > > > > > > (the same test works fine with MUMPS on 2 processes). > > > > > > > > > > > > > > I just noticed that the SuperLU_Dist version installed > by PETSc > > > > > > > configure > > > > > > > script is 5.1.0 and the latest SuperLU_DIST is 5.1.3. > > > > > > If you use petsc-master - it will install 5.1.3 by default. > > > > > > > Before going further, I just want to ask: > > > > > > > > > > > > > > Is there any specific reason to stick to 5.1.0? > > > > > > We don't usually upgrade externalpackage version in > PETSc releases > > > > > > [unless it's tested to work and fixes known bugs]. There > could be API > > > > > > changes - or build changes that can potentially conflict. > > > > > > > > > > > > From what I know - 5.1.3 should work with petsc-3.7 [it > fixes a > > > > > > couple of > > > > > > bugs].
> > > > > > > > > > > > You might be able to do the following with petsc-3.7 > [with git > > > > > > externalpackage repos] > > > > > > > > > > > > --download-superlu_dist --download-superlu_dit-commit=v5.1.3 > > > > > > > > > > > > Satish > > > > > > > > > > > > > Here is some more information: > > > > > > > > > > > > > > On process 2 I have this printed in stdout: > > > > > > > > > > > > > > Intel MKL ERROR: Parameter 6 was incorrect on entry to > DTRSM . > > > > > > > > > > > > > > and in stderr: > > > > > > > > > > > > > > Test.ProblemeEFGen.opt: malloc.c:2369: sysmalloc: > Assertion > > > > > > > `(old_top == > > > > > > > (((mbinptr) (((char *) &((av)->bins[((1) - 1) * 2])) - > > > > > > > __builtin_offsetof > > > > > > > (struct malloc_chunk, fd)))) && old_size == 0) || > ((unsigned long) > > > > > > > (old_size) > > > > > > > > = (unsigned long)((((__builtin_offsetof (struct > malloc_chunk, > > > > > > > fd_nextsize))+((2 *(sizeof(size_t))) - 1)) & ~((2 > *(sizeof(size_t))) > > > > > > > - > > > > > > > 1))) && > > > > > > > ((old_top)->size & 0x1) && ((unsigned long) old_end & > pagemask) == > > > > > > > 0)' > > > > > > > failed. > > > > > > > [saruman:15771] *** Process received signal *** > > > > > > > > > > > > > > This is the 7th call to KSPSolve in the same > execution. Here is the > > > > > > > last > > > > > > > KSPView: > > > > > > > > > > > > > > KSP Object:(o_slin) 2 MPI processes > > > > > > > type: preonly > > > > > > > maximum iterations=10000, initial guess is zero > > > > > > > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000. > > > > > > > left preconditioning > > > > > > > using NONE norm type for convergence test > > > > > > > PC Object:(o_slin) 2 MPI processes > > > > > > > type: lu > > > > > > > LU: out-of-place factorization > > > > > > > tolerance for zero pivot 2.22045e-14 > > > > > > > matrix ordering: natural > > > > > > > factor fill ratio given 0., needed 0. 
> > > > > > > Factored matrix follows: > > > > > > > Mat Object: 2 MPI processes > > > > > > > type: mpiaij > > > > > > > rows=382, cols=382 > > > > > > > package used to perform factorization: > superlu_dist > > > > > > > total: nonzeros=0, allocated nonzeros=0 > > > > > > > total number of mallocs used during > MatSetValues calls > > > > > > > =0 > > > > > > > SuperLU_DIST run parameters: > > > > > > > Process grid nprow 2 x npcol 1 > > > > > > > Equilibrate matrix TRUE > > > > > > > Matrix input mode 1 > > > > > > > Replace tiny pivots FALSE > > > > > > > Use iterative refinement FALSE > > > > > > > Processors in row 2 col partition 1 > > > > > > > Row permutation LargeDiag > > > > > > > Column permutation METIS_AT_PLUS_A > > > > > > > Parallel symbolic factorization FALSE > > > > > > > Repeated factorization SamePattern > > > > > > > linear system matrix = precond matrix: > > > > > > > Mat Object: (o_slin) 2 MPI processes > > > > > > > type: mpiaij > > > > > > > rows=382, cols=382 > > > > > > > total: nonzeros=4458, allocated nonzeros=4458 > > > > > > > total number of mallocs used during MatSetValues > calls =0 > > > > > > > using I-node (on process 0) routines: found > 109 nodes, limit > > > > > > > used > > > > > > > is 5 > > > > > > > > > > > > > > I know this information is not enough to help debug, > but I would > > > > > > > like to > > > > > > > know > > > > > > > if PETSc guys will upgrade to 5.1.3 before trying to > debug anything. > > > > > > > > > > > > > > Thanks, > > > > > > > Eric > > > > > > > > > > > > > > > > > > > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... 
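The resolution of the thread above — pinning the SuperLU_DIST download to the v5.1.3 tag — amounts to a configure invocation along these lines (a sketch: the prefix and any other options are illustrative; only the two --download-superlu_dist flags come from the thread):

```shell
# Pin the SuperLU_DIST git tag that PETSc downloads (petsc-3.7.x with git
# external-package repos). The install prefix below is illustrative.
./configure --prefix=/opt/petsc-3.7.4 \
    --download-superlu_dist \
    --download-superlu_dist-commit=v5.1.3
```

As noted later in the thread, the installed version can be confirmed in include/superlu_defs.h via the SUPERLU_DIST_*_VERSION macros rather than by trusting the shared-library file name.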
URL: From jeremy at seamplex.com Mon Jan 2 06:13:06 2017 From: jeremy at seamplex.com (Jeremy Theler) Date: Mon, 02 Jan 2017 09:13:06 -0300 Subject: [petsc-users] getting the near nullspace from PCSetCoordinates Message-ID: <1483359186.2320.7.camel@seamplex.com> Hi all I want to check that the near nullspace I provide to GAMG gives "almost null vectors" when multiplying each vector in the near nullspace against the matrix problem. This way I can check that the unknown ordering I am using is consistent, for example by using MatNullSpaceCreateRigidBody() or by computing the nullspace myself. The thing is, I do not know how I can get the nullspace object after calling PCSetCoordinates(). It takes a pointer to the PC object, but MatGetNearNullSpace() needs the matrix object. I assume at some point the matrix and the PC get linked, but when I ask MatGetNearNullSpace(matrix) passing the problem matrix after setting PCSetCoordinates(pc) I get: error: PETSc error 85-0 'Null Object: Parameter # 1' in /home/gtheler/libs/petsc-3.7.4/src/mat/interface/matnull.c MatNullSpaceGetVecs:64 thanks -- Jeremy Theler www.seamplex.com From jed at jedbrown.org Mon Jan 2 10:47:18 2017 From: jed at jedbrown.org (Jed Brown) Date: Mon, 02 Jan 2017 09:47:18 -0700 Subject: [petsc-users] getting the near nullspace from PCSetCoordinates In-Reply-To: <1483359186.2320.7.camel@seamplex.com> References: <1483359186.2320.7.camel@seamplex.com> Message-ID: <87wpedbe95.fsf@jedbrown.org> Jeremy Theler writes: > Hi all > > I want to check that the near nullspace I provide to GAMG gives "almost > null vectors" when multiplying each vector in the near nullspace against > the matrix problem. > > This way I can check that the unknown ordering I am using is consistent, > for example using by MatNullSpaceCreateRigidBody() or by computing the > nullspace by myself. Please use that and MatSetNearNullSpace(). It composes properly and you can check everything.
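A toy illustration of that check — multiply each candidate near-null vector through the operator and inspect the residual — using a free-free 1-D spring chain in numpy, whose rigid-body mode is the constant vector (this sketches the math only, not PETSc code; a wrongly ordered candidate shows up immediately as a large residual):

```python
import numpy as np

n = 10
# Free-free 1-D spring chain: K v = 0 for the rigid translation v = const.
K = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
K[0, 0] = K[-1, -1] = 1.0  # natural (free) boundary conditions

v_rigid = np.ones(n)        # correctly ordered near-null vector
v_scrambled = np.ones(n)
v_scrambled[::2] = -1.0     # a wrongly ordered/constructed candidate

print(np.linalg.norm(K @ v_rigid))      # ~0: v_rigid really is (near-)null
print(np.linalg.norm(K @ v_scrambled))  # large: the candidate is not null
```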
PCSetCoordinates() happens to do double-duty for aggregation-based methods, but outside of semi-geometric methods, it is just ugly code duplication and makes assumptions that may be inappropriate (like elasticity with an interpolatory basis). I would recommend not using PCSetCoordinates(). > The thing is I do not know how I can get the nullspace object after > calling PCSetCoordinates(). It gets a pointer to the PC object, but > MatGetNearNullSpace() needs the matrix object. I assume at some point > the matrix and the PC get linked, but when I ask > MatGetNearNullSpace(matrix) passing the problem matrix after setting > PCSetCoordinates(pc) I get: > > error: PETSc error 85-0 'Null Object: Parameter # 1' > in /home/gtheler/libs/petsc-3.7.4/src/mat/interface/matnull.c > MatNullSpaceGetVecs:64 > > > thanks > > -- > Jeremy Theler > www.seamplex.com -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 800 bytes Desc: not available URL: From mfadams at lbl.gov Mon Jan 2 16:58:42 2017 From: mfadams at lbl.gov (Mark Adams) Date: Mon, 2 Jan 2017 17:58:42 -0500 Subject: [petsc-users] getting the near nullspace from PCSetCoordinates In-Reply-To: <87wpedbe95.fsf@jedbrown.org> References: <1483359186.2320.7.camel@seamplex.com> <87wpedbe95.fsf@jedbrown.org> Message-ID: On Mon, Jan 2, 2017 at 11:47 AM, Jed Brown wrote: > Jeremy Theler writes: > > > Hi all > > > > I want to check that the near nullspace I provide to GAMG gives "almost > > null vectors" when multiplying each vector in the near nullspace against > > the matrix problem. > > > > This way I can check that the unknown ordering I am using is consistent, > > for example using by MatNullSpaceCreateRigidBody() or by computing the > > nullspace by myself. > > Please use that and MatSetNearNullSpace(). It composes properly and you > can check everything. 
> > PCSetCoordinates() happens to do double-duty for aggregation-based > methods, but outside of semi-geometric methods, it is just ugly code > duplication and makes assumptions that may be inappropriate (like > elasticity with an interpolatory basis). Yes, PCSetCoordinates is an old interface that is essentially deprecated. Maybe we should officially deprecate it. > I would recommend not using > PCSetCoordinates(). > > > The thing is I do not know how I can get the nullspace object after > > calling PCSetCoordinates(). It gets a pointer to the PC object, but > > MatGetNearNullSpace() needs the matrix object. I assume at some point > > the matrix and the PC get linked, but when I ask > > MatGetNearNullSpace(matrix) passing the problem matrix after setting > > PCSetCoordinates(pc) I get: > > > > error: PETSc error 85-0 'Null Object: Parameter # 1' > > in /home/gtheler/libs/petsc-3.7.4/src/mat/interface/matnull.c > > MatNullSpaceGetVecs:64 > > > > > > thanks > > > > -- > > Jeremy Theler > > www.seamplex.com > -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 800 bytes Desc: not available URL: From jed at jedbrown.org Mon Jan 2 17:23:48 2017 From: jed at jedbrown.org (Jed Brown) Date: Mon, 02 Jan 2017 16:23:48 -0700 Subject: [petsc-users] getting the near nullspace from PCSetCoordinates In-Reply-To: References: <1483359186.2320.7.camel@seamplex.com> <87wpedbe95.fsf@jedbrown.org> Message-ID: <87inpxavwb.fsf@jedbrown.org> Mark Adams writes: > On Mon, Jan 2, 2017 at 11:47 AM, Jed Brown wrote: > >> Jeremy Theler writes: >> >> > Hi all >> > >> > I want to check that the near nullspace I provide to GAMG gives "almost >> > null vectors" when multiplying each vector in the near nullspace against >> > the matrix problem. >> > >> > This way I can check that the unknown ordering I am using is consistent, >> > for example using by MatNullSpaceCreateRigidBody() or by computing the >> > nullspace by myself. >> >> Please use that and MatSetNearNullSpace(). It composes properly and you >> can check everything.
It composes properly and you >> can check everything. >> >> PCSetCoordinates() happens to do double-duty for aggregation-based >> methods, but outside of semi-geometric methods, it is just ugly code >> duplication and makes assumptions that may be inappropriate (like >> elasticity with an interpolatory basis). > > > Yes, PCSetCoordinates is an old interface that is essentially deprecated. > Maybe we should officially deprecated this. I think we should officially deprecate it, but perhaps make something more general available as a Mat function (since some algorithms may use coordinates directly). (Needing to dig up a PC to provide problem (as opposed to configuration) information is bad style.) -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 800 bytes Desc: not available URL: From bsmith at mcs.anl.gov Mon Jan 2 19:03:46 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 2 Jan 2017 19:03:46 -0600 Subject: [petsc-users] getting the near nullspace from PCSetCoordinates In-Reply-To: <87inpxavwb.fsf@jedbrown.org> References: <1483359186.2320.7.camel@seamplex.com> <87wpedbe95.fsf@jedbrown.org> <87inpxavwb.fsf@jedbrown.org> Message-ID: How about MatSetCoordinates(Mat, Vec)? Then MatNullSpaceCreateRigidBody(Mat, MatNullSpace *); Then presumably GAMG can pass the appropriate coordinates down to the smaller matrices it creates internally and create the rigid body null spaces it wants as it moves to the smaller matrices? Barry You could have a MatGetCoordinates(Mat, Vec) and not change the calling sequence of MatNullSpaceCreateRigidBody() but I like the first alternative I suggested.
> On Jan 2, 2017, at 5:23 PM, Jed Brown wrote: > > Mark Adams writes: > >> On Mon, Jan 2, 2017 at 11:47 AM, Jed Brown wrote: >> >>> Jeremy Theler writes: >>> >>>> Hi all >>>> >>>> I want to check that the near nullspace I provide to GAMG gives "almost >>>> null vectors" when multiplying each vector in the near nullspace against >>>> the matrix problem. >>>> >>>> This way I can check that the unknown ordering I am using is consistent, >>>> for example using by MatNullSpaceCreateRigidBody() or by computing the >>>> nullspace by myself. >>> >>> Please use that and MatSetNearNullSpace(). It composes properly and you >>> can check everything. >>> >>> PCSetCoordinates() happens to do double-duty for aggregation-based >>> methods, but outside of semi-geometric methods, it is just ugly code >>> duplication and makes assumptions that may be inappropriate (like >>> elasticity with an interpolatory basis). >> >> >> Yes, PCSetCoordinates is an old interface that is essentially deprecated. >> Maybe we should officially deprecated this. > > I think we should officially deprecate it, but perhaps make something > more general available as a Mat function (since some algorithms may use > coordinates directly). (Needing to dig up a PC to provide problem (as > opposed to configuration) information is bad style.) From jed at jedbrown.org Mon Jan 2 19:29:22 2017 From: jed at jedbrown.org (Jed Brown) Date: Mon, 02 Jan 2017 18:29:22 -0700 Subject: [petsc-users] getting the near nullspace from PCSetCoordinates In-Reply-To: References: <1483359186.2320.7.camel@seamplex.com> <87wpedbe95.fsf@jedbrown.org> <87inpxavwb.fsf@jedbrown.org> Message-ID: <87zij99bil.fsf@jedbrown.org> Barry Smith writes: > How about MatSetCoordindates(Mat, Vec). This would assume an interpolatory basis for which each Mat (row?)bs corresponds to one Vec bs. Should we consider mixed spaces? 
> Then MatNullSpaceCreateRigidBody(Mat, MatNullSpace *); Perhaps something like this as a convenience, but I think it can be useful to call the current function without first creating a Mat to attach the coordinate Vec to. > Then presumable GAMG can pass the appropriated coordinates down to the smaller matrices it creates internally and create the rigid body null spaces it wants as it moves to the smaller matrices? > > Barry > > > You could have a MatGetCoordindates(Mat, Vec) and not change the calling sequence of MatNullSpaceCreateRigidBody() but I like the first alternative I suggested. > > >> On Jan 2, 2017, at 5:23 PM, Jed Brown wrote: >> >> Mark Adams writes: >> >>> On Mon, Jan 2, 2017 at 11:47 AM, Jed Brown wrote: >>> >>>> Jeremy Theler writes: >>>> >>>>> Hi all >>>>> >>>>> I want to check that the near nullspace I provide to GAMG gives "almost >>>>> null vectors" when multiplying each vector in the near nullspace against >>>>> the matrix problem. >>>>> >>>>> This way I can check that the unknown ordering I am using is consistent, >>>>> for example using by MatNullSpaceCreateRigidBody() or by computing the >>>>> nullspace by myself. >>>> >>>> Please use that and MatSetNearNullSpace(). It composes properly and you >>>> can check everything. >>>> >>>> PCSetCoordinates() happens to do double-duty for aggregation-based >>>> methods, but outside of semi-geometric methods, it is just ugly code >>>> duplication and makes assumptions that may be inappropriate (like >>>> elasticity with an interpolatory basis). >>> >>> >>> Yes, PCSetCoordinates is an old interface that is essentially deprecated. >>> Maybe we should officially deprecated this. >> >> I think we should officially deprecate it, but perhaps make something >> more general available as a Mat function (since some algorithms may use >> coordinates directly). (Needing to dig up a PC to provide problem (as >> opposed to configuration) information is bad style.) 
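For reference, the rigid-body modes that MatNullSpaceCreateRigidBody derives from nodal coordinates are, in 3-D, three translations plus three linearized rotations about the centroid; a numpy sketch of that construction (illustrative of the underlying math only, not of PETSc internals — the function name and layout here are mine):

```python
import numpy as np

def rigid_body_modes(coords):
    """Build an orthonormal basis of the 6 rigid-body modes for 3-D nodal
    coordinates of shape (n, 3), with interlaced (x0,y0,z0, x1,...) dofs."""
    n = coords.shape[0]
    c = coords - coords.mean(axis=0)   # center to improve conditioning
    modes = np.zeros((3 * n, 6))
    for d in range(3):                 # translations in x, y, z
        modes[d::3, d] = 1.0
    # linearized rotations about z, y, x:
    modes[0::3, 3], modes[1::3, 3] = -c[:, 1], c[:, 0]
    modes[0::3, 4], modes[2::3, 4] = c[:, 2], -c[:, 0]
    modes[1::3, 5], modes[2::3, 5] = -c[:, 2], c[:, 1]
    q, _ = np.linalg.qr(modes)         # orthonormalize the basis
    return q

coords = np.random.default_rng(0).random((8, 3))
B = rigid_body_modes(coords)
print(B.shape)                          # (24, 6)
print(np.allclose(B.T @ B, np.eye(6)))  # True: orthonormal basis
```

The Vec-of-coordinates interface being discussed would essentially hand a routine like this its input, leaving the blocking/ordering convention to the Mat.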
-------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 800 bytes Desc: not available URL: From knepley at gmail.com Mon Jan 2 19:36:34 2017 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 2 Jan 2017 19:36:34 -0600 Subject: [petsc-users] getting the near nullspace from PCSetCoordinates In-Reply-To: <87zij99bil.fsf@jedbrown.org> References: <1483359186.2320.7.camel@seamplex.com> <87wpedbe95.fsf@jedbrown.org> <87inpxavwb.fsf@jedbrown.org> <87zij99bil.fsf@jedbrown.org> Message-ID: On Mon, Jan 2, 2017 at 7:29 PM, Jed Brown wrote: > Barry Smith writes: > > > How about MatSetCoordindates(Mat, Vec). > > This would assume an interpolatory basis for which each Mat (row?)bs > corresponds to one Vec bs. Should we consider mixed spaces? > I think none of this helps enough. You should have an easy object at hand to provide the information, and fall back to something easy like interpolatory when that is absent. Preferably, the object we use should only know about sets of dofs, just like we have for FieldSplit. Matt > > Then MatNullSpaceCreateRigidBody(Mat, MatNullSpace *); > > Perhaps something like this as a convenience, but I think it can be > useful to call the current function without first creating a Mat to > attach the coordinate Vec to. > > > Then presumable GAMG can pass the appropriated coordinates down to > the smaller matrices it creates internally and create the rigid body null > spaces it wants as it moves to the smaller matrices? > > > > Barry > > > > > > You could have a MatGetCoordindates(Mat, Vec) and not change the calling > sequence of MatNullSpaceCreateRigidBody() but I like the first alternative > I suggested.
> > > > > >> On Jan 2, 2017, at 5:23 PM, Jed Brown wrote: > >> > >> Mark Adams writes: > >> > >>> On Mon, Jan 2, 2017 at 11:47 AM, Jed Brown wrote: > >>> > >>>> Jeremy Theler writes: > >>>> > >>>>> Hi all > >>>>> > >>>>> I want to check that the near nullspace I provide to GAMG gives > "almost > >>>>> null vectors" when multiplying each vector in the near nullspace > against > >>>>> the matrix problem. > >>>>> > >>>>> This way I can check that the unknown ordering I am using is > consistent, > >>>>> for example using by MatNullSpaceCreateRigidBody() or by computing > the > >>>>> nullspace by myself. > >>>> > >>>> Please use that and MatSetNearNullSpace(). It composes properly and > you > >>>> can check everything. > >>>> > >>>> PCSetCoordinates() happens to do double-duty for aggregation-based > >>>> methods, but outside of semi-geometric methods, it is just ugly code > >>>> duplication and makes assumptions that may be inappropriate (like > >>>> elasticity with an interpolatory basis). > >>> > >>> > >>> Yes, PCSetCoordinates is an old interface that is essentially > deprecated. > >>> Maybe we should officially deprecated this. > >> > >> I think we should officially deprecate it, but perhaps make something > >> more general available as a Mat function (since some algorithms may use > >> coordinates directly). (Needing to dig up a PC to provide problem (as > >> opposed to configuration) information is bad style.) > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From bsmith at mcs.anl.gov Mon Jan 2 20:38:58 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 2 Jan 2017 20:38:58 -0600 Subject: [petsc-users] getting the near nullspace from PCSetCoordinates In-Reply-To: <87zij99bil.fsf@jedbrown.org> References: <1483359186.2320.7.camel@seamplex.com> <87wpedbe95.fsf@jedbrown.org> <87inpxavwb.fsf@jedbrown.org> <87zij99bil.fsf@jedbrown.org> Message-ID: > On Jan 2, 2017, at 7:29 PM, Jed Brown wrote: > > Barry Smith writes: > >> How about MatSetCoordindates(Mat, Vec). > > This would assume an interpolatory basis for which each Mat (row?)bs > corresponds to one Vec bs. Should we consider mixed spaces? Of course. > >> Then MatNullSpaceCreateRigidBody(Mat, MatNullSpace *); > > Perhaps something like this as a convenience, but I think it can be > useful to call the current function without first creating a Mat to > attach the coordinate Vec to. > >> Then presumable GAMG can pass the appropriated coordinates down to the smaller matrices it creates internally and create the rigid body null spaces it wants as it moves to the smaller matrices? >> >> Barry >> >> >> You could have a MatGetCoordindates(Mat, Vec) and not change the calling sequence of MatNullSpaceCreateRigidBody() but I like the first alternative I suggested. >> >> >>> On Jan 2, 2017, at 5:23 PM, Jed Brown wrote: >>> >>> Mark Adams writes: >>> >>>> On Mon, Jan 2, 2017 at 11:47 AM, Jed Brown wrote: >>>> >>>>> Jeremy Theler writes: >>>>> >>>>>> Hi all >>>>>> >>>>>> I want to check that the near nullspace I provide to GAMG gives "almost >>>>>> null vectors" when multiplying each vector in the near nullspace against >>>>>> the matrix problem. >>>>>> >>>>>> This way I can check that the unknown ordering I am using is consistent, >>>>>> for example using by MatNullSpaceCreateRigidBody() or by computing the >>>>>> nullspace by myself. >>>>> >>>>> Please use that and MatSetNearNullSpace(). It composes properly and you >>>>> can check everything. 
>>>>> >>>>> PCSetCoordinates() happens to do double-duty for aggregation-based >>>>> methods, but outside of semi-geometric methods, it is just ugly code >>>>> duplication and makes assumptions that may be inappropriate (like >>>>> elasticity with an interpolatory basis). >>>> >>>> >>>> Yes, PCSetCoordinates is an old interface that is essentially deprecated. >>>> Maybe we should officially deprecated this. >>> >>> I think we should officially deprecate it, but perhaps make something >>> more general available as a Mat function (since some algorithms may use >>> coordinates directly). (Needing to dig up a PC to provide problem (as >>> opposed to configuration) information is bad style.) From mfadams at lbl.gov Tue Jan 3 07:44:18 2017 From: mfadams at lbl.gov (Mark Adams) Date: Tue, 3 Jan 2017 08:44:18 -0500 Subject: [petsc-users] getting the near nullspace from PCSetCoordinates In-Reply-To: <87zij99bil.fsf@jedbrown.org> References: <1483359186.2320.7.camel@seamplex.com> <87wpedbe95.fsf@jedbrown.org> <87inpxavwb.fsf@jedbrown.org> <87zij99bil.fsf@jedbrown.org> Message-ID: On Mon, Jan 2, 2017 at 8:29 PM, Jed Brown wrote: > Barry Smith writes: > > > How about MatSetCoordindates(Mat, Vec). > > This would assume an interpolatory basis for which each Mat (row?)bs > corresponds to one Vec bs. Should we consider mixed spaces? > I don't see how you do that without getting into discretizations, which is not what we are addressing here and opens a big can of worms. Do we have any users that write their own null space vectors because they are using a non-interpolatory basis? (whatever the heck that is) The setCoordinates thing was put in because this is convenient for a lot of users. And the new matnullspace interface is great but it is just a minor tweak to that interface IMO. 
Asking users to move over to the MatNullSpace interface is only a few lines, which they can get from ksp/ex56, so it is not asking much if we are happy with this interface (and it has been stable for years now). > > Then MatNullSpaceCreateRigidBody(Mat, MatNullSpace *); > > Perhaps something like this as a convenience, but I think it can be > useful to call the current function without first creating a Mat to > attach the coordinate Vec to. > Yea, maybe that makes more sense because it is a space and Vec is the space(s). > > > Then presumable GAMG can pass the appropriated coordinates down to > the smaller matrices it creates internally and create the rigid body null > spaces it wants as it moves to the smaller matrices? > > > > Barry > > > > > > You could have a MatGetCoordindates(Mat, Vec) and not change the calling > sequence of MatNullSpaceCreateRigidBody() but I like the first alternative > I suggested. > > > > > >> On Jan 2, 2017, at 5:23 PM, Jed Brown wrote: > >> > >> Mark Adams writes: > >> > >>> On Mon, Jan 2, 2017 at 11:47 AM, Jed Brown wrote: > >>> > >>>> Jeremy Theler writes: > >>>> > >>>>> Hi all > >>>>> > >>>>> I want to check that the near nullspace I provide to GAMG gives > "almost > >>>>> null vectors" when multiplying each vector in the near nullspace > against > >>>>> the matrix problem. > >>>>> > >>>>> This way I can check that the unknown ordering I am using is > consistent, > >>>>> for example using by MatNullSpaceCreateRigidBody() or by computing > the > >>>>> nullspace by myself. > >>>> > >>>> Please use that and MatSetNearNullSpace(). It composes properly and > you > >>>> can check everything. > >>>> > >>>> PCSetCoordinates() happens to do double-duty for aggregation-based > >>>> methods, but outside of semi-geometric methods, it is just ugly code > >>>> duplication and makes assumptions that may be inappropriate (like > >>>> elasticity with an interpolatory basis).
> >>> > >>> > >>> Yes, PCSetCoordinates is an old interface that is essentially > deprecated. > >>> Maybe we should officially deprecated this. > >> > >> I think we should officially deprecate it, but perhaps make something > >> more general available as a Mat function (since some algorithms may use > >> coordinates directly). (Needing to dig up a PC to provide problem (as > >> opposed to configuration) information is bad style.) > -------------- next part -------------- An HTML attachment was scrubbed... URL: From C.Klaij at marin.nl Tue Jan 3 09:04:59 2017 From: C.Klaij at marin.nl (Klaij, Christiaan) Date: Tue, 3 Jan 2017 15:04:59 +0000 Subject: [petsc-users] problems after glibc upgrade to 2.17-157 Message-ID: <1483455899357.86673@marin.nl> I've been using petsc-3.7.4 with intel mpi and compilers, superlu_dist, metis and parmetis on a cluster running SL7. Everything was working fine until SL7 got an update where glibc was upgraded from 2.17-106 to 2.17-157. This update seemed to have broken (at least) parmetis: the standalone binary gpmetis started to give a segmentation fault. The core dump shows this: Core was generated by `gpmetis'. Program terminated with signal 11, Segmentation fault. #0 0x00002aaaac6b865e in memmove () from /lib64/libc.so.6 That's when I decided to recompile, but to my surprise I cannot even get past the configure stage (log attached)! ******************************************************************************* UNABLE to EXECUTE BINARIES for ./configure ------------------------------------------------------------------------------- Cannot run executables created with FC. If this machine uses a batch system to submit jobs you will need to configure using ./configure with the additional option --with-batch. Otherwise there is problem with the compilers. Can you compile and run code with your compiler 'mpif90'? 
See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf ******************************************************************************* Note the following: 1) Configure was done with the exact same options that worked fine before the update of SL7. 2) The intel mpi and compilers are exactly the same as before the update of SL7. 3) The cluster does not require a batch system to run code. 4) I can compile and run code with mpif90 on this cluster. 5) The problem also occurs on a workstation running SL7. Any clues on how to proceed? Chris dr. ir. Christiaan Klaij | CFD Researcher | Research & Development MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of-uRANS-and-BEMBEM-for-propeller-pressure-pulse-prediction.htm -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.log Type: text/x-log Size: 1526164 bytes Desc: configure.log URL: From knepley at gmail.com Tue Jan 3 09:36:44 2017 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 3 Jan 2017 09:36:44 -0600 Subject: [petsc-users] problems after glibc upgrade to 2.17-157 In-Reply-To: <1483455899357.86673@marin.nl> References: <1483455899357.86673@marin.nl> Message-ID: On Tue, Jan 3, 2017 at 9:04 AM, Klaij, Christiaan wrote: > > I've been using petsc-3.7.4 with intel mpi and compilers, > superlu_dist, metis and parmetis on a cluster running > SL7. Everything was working fine until SL7 got an update where > glibc was upgraded from 2.17-106 to 2.17-157. > I cannot see the error in your log. We previously fixed a bug with this error reporting: https://bitbucket.org/petsc/petsc/commits/32cc76960ddbb48660f8e7c667e293c0ccd0e7d7 in August. Is it possible that your PETSc is older than this? Could you apply that patch, or run the configure with 'master'? My guess is this is a dynamic library path problem, as it always is after upgrades. 
Thanks, Matt > This update seemed to have broken (at least) parmetis: the > standalone binary gpmetis started to give a segmentation > fault. The core dump shows this: > > Core was generated by `gpmetis'. > Program terminated with signal 11, Segmentation fault. > #0 0x00002aaaac6b865e in memmove () from /lib64/libc.so.6 > > That's when I decided to recompile, but to my surprise I cannot > even get past the configure stage (log attached)! > > ************************************************************ > ******************* > UNABLE to EXECUTE BINARIES for ./configure > ------------------------------------------------------------ > ------------------- > Cannot run executables created with FC. If this machine uses a batch system > to submit jobs you will need to configure using ./configure with the > additional option --with-batch. > Otherwise there is problem with the compilers. Can you compile and run > code with your compiler 'mpif90'? > See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > ************************************************************ > ******************* > > Note the following: > > 1) Configure was done with the exact same options that worked > fine before the update of SL7. > > 2) The intel mpi and compilers are exactly the same as before the > update of SL7. > > 3) The cluster does not require a batch system to run code. > > 4) I can compile and run code with mpif90 on this cluster. > > 5) The problem also occurs on a workstation running SL7. > > Any clues on how to proceed? > Chris > > > dr. ir. Christiaan Klaij | CFD Researcher | Research & Development > MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl > > MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of- > uRANS-and-BEMBEM-for-propeller-pressure-pulse-prediction.htm > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
-- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Tue Jan 3 09:37:11 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 3 Jan 2017 09:37:11 -0600 Subject: [petsc-users] problems after glibc upgrade to 2.17-157 In-Reply-To: <1483455899357.86673@marin.nl> References: <1483455899357.86673@marin.nl> Message-ID: Do you have similar issues with gnu compilers? It must be some incompatibility with intel compilers with this glibc change. >>>>>>>>> compilers: Check that C libraries can be used from Fortran Pushing language FC Popping language FC Pushing language FC Popping language FC Pushing language FC Popping language FC **** Configure header /tmp/petsc-rOjdnN/confdefs.h **** <<<<<<<<<< There is a bug in configure [Matt?] that eats away some of the log - so I don't see the exact error you are getting. If standalone mpicc/mpif90 etc work - then you can try the following additional options: --with-clib-autodetect=0 --with-fortranlib-autodetect=0 --with-cxxlib-autodetect=0 LIBS=LIBS=/path_to/libifcore.a [replace "path_to" with the correct path to the ifort libifcore.a library] Note: I have a RHEL7 box with this glibc - and I don't see this issue. >>>> -bash-4.2$ cat /etc/redhat-release Red Hat Enterprise Linux Server release 7.3 (Maipo) -bash-4.2$ rpm -q glibc glibc-2.17-157.el7_3.1.x86_64 glibc-2.17-157.el7_3.1.i686 -bash-4.2$ mpiicc --version icc (ICC) 17.0.0 20160721 Copyright (C) 1985-2016 Intel Corporation. All rights reserved. -bash-4.2$ <<<< Satish On Tue, 3 Jan 2017, Klaij, Christiaan wrote: > > I've been using petsc-3.7.4 with intel mpi and compilers, > superlu_dist, metis and parmetis on a cluster running > SL7. Everything was working fine until SL7 got an update where > glibc was upgraded from 2.17-106 to 2.17-157. > > This update seemed to have broken (at least) parmetis: the > standalone binary gpmetis started to give a segmentation > fault.
The core dump shows this: > > Core was generated by `gpmetis'. > Program terminated with signal 11, Segmentation fault. > #0 0x00002aaaac6b865e in memmove () from /lib64/libc.so.6 > > That's when I decided to recompile, but to my surprise I cannot > even get past the configure stage (log attached)! > > ******************************************************************************* > UNABLE to EXECUTE BINARIES for ./configure > ------------------------------------------------------------------------------- > Cannot run executables created with FC. If this machine uses a batch system > to submit jobs you will need to configure using ./configure with the additional option --with-batch. > Otherwise there is problem with the compilers. Can you compile and run code with your compiler 'mpif90'? > See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > ******************************************************************************* > > Note the following: > > 1) Configure was done with the exact same options that worked > fine before the update of SL7. > > 2) The intel mpi and compilers are exactly the same as before the > update of SL7. > > 3) The cluster does not require a batch system to run code. > > 4) I can compile and run code with mpif90 on this cluster. > > 5) The problem also occurs on a workstation running SL7. > > Any clues on how to proceed? > Chris > > > dr. ir. 
Christiaan Klaij | CFD Researcher | Research & Development > MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl > > MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of-uRANS-and-BEMBEM-for-propeller-pressure-pulse-prediction.htm > > From balay at mcs.anl.gov Tue Jan 3 09:41:29 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 3 Jan 2017 09:41:29 -0600 Subject: [petsc-users] problems after glibc upgrade to 2.17-157 In-Reply-To: References: <1483455899357.86673@marin.nl> Message-ID: On Tue, 3 Jan 2017, Matthew Knepley wrote: > We previously fixed a bug with this error reporting: > > https://bitbucket.org/petsc/petsc/commits/32cc76960ddbb48660f8e7c667e293c0ccd0e7d7 > > in August. Is it possible that your PETSc is older than this? Could > you apply that patch, or run the configure with 'master'? Ok - I've added this patch to maint. Satish From C.Klaij at marin.nl Tue Jan 3 09:46:39 2017 From: C.Klaij at marin.nl (Klaij, Christiaan) Date: Tue, 3 Jan 2017 15:46:39 +0000 Subject: [petsc-users] problems after glibc upgrade to 2.17-157 In-Reply-To: References: <1483455899357.86673@marin.nl>, Message-ID: <1483458399121.70375@marin.nl> I've downloaded the tarball on October 24th: $ ls -lh petsc-lite-3.7.4.tar.gz -rw-r--r-- 1 cklaij domain users 8.4M Oct 24 11:07 petsc-lite-3.7.4.tar.gz (no direct internet access on cluster) dr. ir. 
Christiaan Klaij | CFD Researcher | Research & Development MARIN | T +31 317 49 33 44 | C.Klaij at marin.nl | www.marin.nl [LinkedIn] [YouTube] [Twitter] [Facebook] MARIN news: Modelling natural transition on hydrofoils for application in underwater gliders ________________________________ From: Matthew Knepley Sent: Tuesday, January 03, 2017 4:36 PM To: Klaij, Christiaan Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 On Tue, Jan 3, 2017 at 9:04 AM, Klaij, Christiaan > wrote: I've been using petsc-3.7.4 with intel mpi and compilers, superlu_dist, metis and parmetis on a cluster running SL7. Everything was working fine until SL7 got an update where glibc was upgraded from 2.17-106 to 2.17-157. I cannot see the error in your log. We previously fixed a bug with this error reporting: https://bitbucket.org/petsc/petsc/commits/32cc76960ddbb48660f8e7c667e293c0ccd0e7d7 in August. Is it possible that your PETSc is older than this? Could you apply that patch, or run the configure with 'master'? My guess is this is a dynamic library path problem, as it always is after upgrades. Thanks, Matt This update seemed to have broken (at least) parmetis: the standalone binary gpmetis started to give a segmentation fault. The core dump shows this: Core was generated by `gpmetis'. Program terminated with signal 11, Segmentation fault. #0 0x00002aaaac6b865e in memmove () from /lib64/libc.so.6 That's when I decided to recompile, but to my surprise I cannot even get past the configure stage (log attached)! ******************************************************************************* UNABLE to EXECUTE BINARIES for ./configure ------------------------------------------------------------------------------- Cannot run executables created with FC. If this machine uses a batch system to submit jobs you will need to configure using ./configure with the additional option --with-batch. Otherwise there is problem with the compilers. 
Can you compile and run code with your compiler 'mpif90'? See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf ******************************************************************************* Note the following: 1) Configure was done with the exact same options that worked fine before the update of SL7. 2) The intel mpi and compilers are exactly the same as before the update of SL7. 3) The cluster does not require a batch system to run code. 4) I can compile and run code with mpif90 on this cluster. 5) The problem also occurs on a workstation running SL7. Any clues on how to proceed? Chris dr. ir. Christiaan Klaij | CFD Researcher | Research & Development MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of-uRANS-and-BEMBEM-for-propeller-pressure-pulse-prediction.htm -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL:
From knepley at gmail.com Tue Jan 3 09:50:53 2017 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 3 Jan 2017 09:50:53 -0600 Subject: [petsc-users] problems after glibc upgrade to 2.17-157 In-Reply-To: <1483458399121.70375@marin.nl> References: <1483455899357.86673@marin.nl> <1483458399121.70375@marin.nl> Message-ID: On Tue, Jan 3, 2017 at 9:46 AM, Klaij, Christiaan wrote: > I've downloaded the tarball on October 24th: > > $ ls -lh petsc-lite-3.7.4.tar.gz > -rw-r--r-- 1 cklaij domain users 8.4M Oct 24 11:07 petsc-lite-3.7.4.tar.gz > > (no direct internet access on cluster) > That is missing the fix. You can insert those few lines: https://bitbucket.org/petsc/petsc/commits/32cc76960ddbb48660f8e7c667e293c0ccd0e7d7 Or get the new tarball when it spins tonight, since Satish has just added the fix to maint. Thanks, Matt > dr. ir. Christiaan Klaij | CFD Researcher | Research & Development > MARIN | T +31 317 49 33 44 <+31%20317%20493%20344> | C.Klaij at marin.nl | > www.marin.nl > > [image: LinkedIn] [image: > YouTube] [image: Twitter] > [image: Facebook] > > MARIN news: Modelling natural transition on hydrofoils for application in > underwater gliders > > > ------------------------------ > *From:* Matthew Knepley > *Sent:* Tuesday, January 03, 2017 4:36 PM > *To:* Klaij, Christiaan > *Cc:* petsc-users at mcs.anl.gov > *Subject:* Re: [petsc-users] problems after glibc upgrade to 2.17-157 > > On Tue, Jan 3, 2017 at 9:04 AM, Klaij, Christiaan > wrote: > >> >> I've been using petsc-3.7.4 with intel mpi and compilers, >> superlu_dist, metis and parmetis on a cluster running >> SL7. Everything was working fine until SL7 got an update where >> glibc was upgraded from 2.17-106 to 2.17-157. >> > > I cannot see the error in your log. We previously fixed a bug with this > error reporting: > > https://bitbucket.org/petsc/petsc/commits/32cc76960ddbb48660f8e7c667e293c0ccd0e7d7 > > in August.
Is it possible that your PETSc is older than this? Could you > apply that patch, or > run the configure with 'master'? > > My guess is this is a dynamic library path problem, as it always is after > upgrades. > > Thanks, > > Matt > > >> This update seemed to have broken (at least) parmetis: the >> standalone binary gpmetis started to give a segmentation >> fault. The core dump shows this: >> >> Core was generated by `gpmetis'. >> Program terminated with signal 11, Segmentation fault. >> #0 0x00002aaaac6b865e in memmove () from /lib64/libc.so.6 >> >> That's when I decided to recompile, but to my surprise I cannot >> even get past the configure stage (log attached)! >> >> ************************************************************ >> ******************* >> UNABLE to EXECUTE BINARIES for ./configure >> ------------------------------------------------------------ >> ------------------- >> Cannot run executables created with FC. If this machine uses a batch >> system >> to submit jobs you will need to configure using ./configure with the >> additional option --with-batch. >> Otherwise there is problem with the compilers. Can you compile and run >> code with your compiler 'mpif90'? >> See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf >> ************************************************************ >> ******************* >> >> Note the following: >> >> 1) Configure was done with the exact same options that worked >> fine before the update of SL7. >> >> 2) The intel mpi and compilers are exactly the same as before the >> update of SL7. >> >> 3) The cluster does not require a batch system to run code. >> >> 4) I can compile and run code with mpif90 on this cluster. >> >> 5) The problem also occurs on a workstation running SL7. >> >> Any clues on how to proceed? >> Chris >> >> >> dr. ir. 
Christiaan Klaij | CFD Researcher | Research & Development >> MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | >> http://www.marin.nl >> >> MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of-uRANS- >> and-BEMBEM-for-propeller-pressure-pulse-prediction.htm >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Tue Jan 3 09:53:22 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 3 Jan 2017 09:53:22 -0600 Subject: [petsc-users] problems after glibc upgrade to 2.17-157 In-Reply-To: <1483458399121.70375@marin.nl> References: <1483455899357.86673@marin.nl>, <1483458399121.70375@marin.nl> Message-ID: The patch that Matt mentioned is in 'master' branch - so it's not in 3.7.4. I've now added it to 'maint' branch - so it should be in the next petsc patch tarball [i.e. 3.7.6] - whenever it's released.
You can apply the patch to your current sources - that should tell us the exact error you get with the intel compiler. To workaround - you would still have to follow my suggestion [i.e. disable autodetect..] Satish On Tue, 3 Jan 2017, Klaij, Christiaan wrote: > I've downloaded the tarball on October 24th: > > > $ ls -lh petsc-lite-3.7.4.tar.gz > -rw-r--r-- 1 cklaij domain users 8.4M Oct 24 11:07 petsc-lite-3.7.4.tar.gz > > > (no direct internet access on cluster) > > dr. ir. Christiaan Klaij | CFD Researcher | Research & Development > MARIN | T +31 317 49 33 44 | C.Klaij at marin.nl | www.marin.nl > > [LinkedIn] [YouTube] [Twitter] [Facebook] > MARIN news: Modelling natural transition on hydrofoils for application in underwater gliders > > ________________________________ > From: Matthew Knepley > Sent: Tuesday, January 03, 2017 4:36 PM > To: Klaij, Christiaan > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 > > On Tue, Jan 3, 2017 at 9:04 AM, Klaij, Christiaan > wrote: > > I've been using petsc-3.7.4 with intel mpi and compilers, > superlu_dist, metis and parmetis on a cluster running > SL7. Everything was working fine until SL7 got an update where > glibc was upgraded from 2.17-106 to 2.17-157. > > I cannot see the error in your log. We previously fixed a bug with this error reporting: > > https://bitbucket.org/petsc/petsc/commits/32cc76960ddbb48660f8e7c667e293c0ccd0e7d7 > > in August. Is it possible that your PETSc is older than this? Could you apply that patch, or > run the configure with 'master'? > > My guess is this is a dynamic library path problem, as it always is after upgrades. > > Thanks, > > Matt > > This update seemed to have broken (at least) parmetis: the > standalone binary gpmetis started to give a segmentation > fault. The core dump shows this: > > Core was generated by `gpmetis'. > Program terminated with signal 11, Segmentation fault.
> #0 0x00002aaaac6b865e in memmove () from /lib64/libc.so.6 > > That's when I decided to recompile, but to my surprise I cannot > even get past the configure stage (log attached)! > > ******************************************************************************* > UNABLE to EXECUTE BINARIES for ./configure > ------------------------------------------------------------------------------- > Cannot run executables created with FC. If this machine uses a batch system > to submit jobs you will need to configure using ./configure with the additional option --with-batch. > Otherwise there is problem with the compilers. Can you compile and run code with your compiler 'mpif90'? > See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > ******************************************************************************* > > Note the following: > > 1) Configure was done with the exact same options that worked > fine before the update of SL7. > > 2) The intel mpi and compilers are exactly the same as before the > update of SL7. > > 3) The cluster does not require a batch system to run code. > > 4) I can compile and run code with mpif90 on this cluster. > > 5) The problem also occurs on a workstation running SL7. > > Any clues on how to proceed? > Chris > > > dr. ir. Christiaan Klaij | CFD Researcher | Research & Development > MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl > > MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of-uRANS-and-BEMBEM-for-propeller-pressure-pulse-prediction.htm > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
> -- Norbert Wiener > > > From balay at mcs.anl.gov Tue Jan 3 10:00:58 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 3 Jan 2017 10:00:58 -0600 Subject: [petsc-users] problems after glibc upgrade to 2.17-157 In-Reply-To: References: <1483455899357.86673@marin.nl> <1483458399121.70375@marin.nl> Message-ID: On Tue, 3 Jan 2017, Matthew Knepley wrote: > Or get the new tarball when it spins tonight, since Satish has just > added the fix to maint. We don't spin 'maint/patch-release' tarballs every night. It's every 1-3 months - [partly depending upon the number of outstanding patches - or their severity] -rw-r--r-- 1 petsc pdetools 23194357 Jan 1 10:41 petsc-3.7.5.tar.gz -rw-r--r-- 1 petsc pdetools 23189526 Oct 2 22:06 petsc-3.7.4.tar.gz -rw-r--r-- 1 petsc pdetools 23172670 Jul 24 12:22 petsc-3.7.3.tar.gz -rw-r--r-- 1 petsc pdetools 23111802 Jun 5 2016 petsc-3.7.2.tar.gz -rw-r--r-- 1 petsc pdetools 23113397 May 15 2016 petsc-3.7.1.tar.gz -rw-r--r-- 1 petsc pdetools 22083999 Apr 25 2016 petsc-3.7.0.tar.gz Satish From u.rochan at gmail.com Tue Jan 3 16:02:55 2017 From: u.rochan at gmail.com (Rochan Upadhyay) Date: Tue, 3 Jan 2017 16:02:55 -0600 Subject: [petsc-users] a question on DMPlexSetAnchors Message-ID: I think I sent my previous question (on Dec 28th) to the wrong place (petsc-users-request at mcs.anl.gov). To repeat, I am having a bit of difficulty in understanding the introduction of constraints in DMPlex. From a quick study of the User Manual I gather that it is easiest done using DMPlexSetAnchors ? The description of this routine says that there is an anchorIS that specifies the anchor points (rows in the matrix). This is okay and easily understood. There is also an anchorSection which is described as a map from constraint points (columns ?) to the anchor points listed in the anchorIS. Should this not be a map between solution indices (i.e. indices appearing in the vectors and matrices) ?
For example I am completely unable to set up a simple constraint matrix for the following (say): Point 1, Field A, B Point 2-10 Field A At point 1, Field B depends on Field A at points 1-10 When I set it up it appears to create a matrix where field A depends on field A values at points 1-10. How does the mapping work in this case ? Will the DMPlexSetAnchors() routine work for this simple scenario ? If not, is the only recourse to create the constraint matrix oneself using DMSetDefaultConstraints ? Also, the documentation for DMSetDefaultConstraints is incomplete. The function accepts three arguments (dm, section and Mat) but what the section is is not described at all. I don't know if my question makes any sense. If it does not then it is only a reflection of my utter confusion regarding the routine DMPlexSetAnchors :-( Regards, Rochan -------------- next part -------------- An HTML attachment was scrubbed... URL: From C.Klaij at marin.nl Wed Jan 4 02:26:03 2017 From: C.Klaij at marin.nl (Klaij, Christiaan) Date: Wed, 4 Jan 2017 08:26:03 +0000 Subject: [petsc-users] problems after glibc upgrade to 2.17-157 In-Reply-To: References: <1483455899357.86673@marin.nl> <1483458399121.70375@marin.nl> , Message-ID: <1483518363543.12284@marin.nl> So I've applied the patch to my current 3.7.4 source, the new configure.log is attached. It's slightly larger but not much clearer to me... Chris dr. ir.
Christiaan Klaij | CFD Researcher | Research & Development MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl MARIN news: http://www.marin.nl/web/News/News-items/Software-seminar-in-Shanghai-for-the-first-time-March-28.htm ________________________________________ From: Satish Balay Sent: Tuesday, January 03, 2017 5:00 PM To: Matthew Knepley Cc: Klaij, Christiaan; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 On Tue, 3 Jan 2017, Matthew Knepley wrote: > Or get the new tarball when it spins tonight, since Satish has just > added the fix to maint. We don't spin 'maint/patch-release' tarballs everynight. Its every 1-3 months - [partly depending upon the number of outstanding patches - or their severity] -rw-r--r-- 1 petsc pdetools 23194357 Jan 1 10:41 petsc-3.7.5.tar.gz -rw-r--r-- 1 petsc pdetools 23189526 Oct 2 22:06 petsc-3.7.4.tar.gz -rw-r--r-- 1 petsc pdetools 23172670 Jul 24 12:22 petsc-3.7.3.tar.gz -rw-r--r-- 1 petsc pdetools 23111802 Jun 5 2016 petsc-3.7.2.tar.gz -rw-r--r-- 1 petsc pdetools 23113397 May 15 2016 petsc-3.7.1.tar.gz -rw-r--r-- 1 petsc pdetools 22083999 Apr 25 2016 petsc-3.7.0.tar.gz Satish -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.log Type: text/x-log Size: 1613624 bytes Desc: configure.log URL: From C.Klaij at marin.nl Wed Jan 4 03:16:18 2017 From: C.Klaij at marin.nl (Klaij, Christiaan) Date: Wed, 4 Jan 2017 09:16:18 +0000 Subject: [petsc-users] problems after glibc upgrade to 2.17-157 In-Reply-To: <1483518363543.12284@marin.nl> References: <1483455899357.86673@marin.nl> <1483458399121.70375@marin.nl> , , <1483518363543.12284@marin.nl> Message-ID: <1483521378917.89651@marin.nl> Well, a bit clearer perhaps. It seems the relevant ERROR is on line 31039. 
So I did this case by hand using the compile and link lines from the log, then run it in gdb: $ pwd /tmp/petsc-Q0URwQ/config.setCompilers $ ls confdefs.h conffix.h conftest conftest.F conftest.o $ gdb GNU gdb (GDB) Red Hat Enterprise Linux 7.6.1-80.el7 Copyright (C) 2013 Free Software Foundation, Inc. License GPLv3+: GNU GPL version 3 or later This is free software: you are free to change and redistribute it. There is NO WARRANTY, to the extent permitted by law. Type "show copying" and "show warranty" for details. This GDB was configured as "x86_64-redhat-linux-gnu". For bug reporting instructions, please see: . (gdb) file conftest Reading symbols from /tmp/petsc-Q0URwQ/config.setCompilers/conftest...done. (gdb) run Starting program: /tmp/petsc-Q0URwQ/config.setCompilers/conftest Program received signal SIGSEGV, Segmentation fault. 0x00002aaaae32f65e in ?? () Missing separate debuginfos, use: debuginfo-install glibc-2.17-157.el7.x86_64 (gdb) bt #0 0x00002aaaae32f65e in ?? () #1 0x00002aaaaaab7675 in _dl_relocate_object () from /lib64/ld-linux-x86-64.so.2 #2 0x00002aaaaaaae792 in dl_main () from /lib64/ld-linux-x86-64.so.2 #3 0x00002aaaaaac1e36 in _dl_sysdep_start () from /lib64/ld-linux-x86-64.so.2 #4 0x00002aaaaaaafa31 in _dl_start () from /lib64/ld-linux-x86-64.so.2 #5 0x00002aaaaaaac1e8 in _start () from /lib64/ld-linux-x86-64.so.2 #6 0x0000000000000001 in ?? () #7 0x00007fffffffd4e2 in ?? () #8 0x0000000000000000 in ?? () (gdb) Does this make any sense to you? dr. ir. 
Christiaan Klaij | CFD Researcher | Research & Development MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl MARIN news: http://www.marin.nl/web/News/News-items/Software-seminar-in-Shanghai-for-the-first-time-March-28.htm ________________________________________ From: Klaij, Christiaan Sent: Wednesday, January 04, 2017 9:26 AM To: Matthew Knepley; petsc-users; Satish Balay Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 So I've applied the patch to my current 3.7.4 source, the new configure.log is attached. It's slightly larger but not much clearer too me... Chris ________________________________________ From: Satish Balay Sent: Tuesday, January 03, 2017 5:00 PM To: Matthew Knepley Cc: Klaij, Christiaan; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 On Tue, 3 Jan 2017, Matthew Knepley wrote: > Or get the new tarball when it spins tonight, since Satish has just > added the fix to maint. We don't spin 'maint/patch-release' tarballs everynight. 
Its every 1-3 months - [partly depending upon the number of outstanding patches - or their severity] -rw-r--r-- 1 petsc pdetools 23194357 Jan 1 10:41 petsc-3.7.5.tar.gz -rw-r--r-- 1 petsc pdetools 23189526 Oct 2 22:06 petsc-3.7.4.tar.gz -rw-r--r-- 1 petsc pdetools 23172670 Jul 24 12:22 petsc-3.7.3.tar.gz -rw-r--r-- 1 petsc pdetools 23111802 Jun 5 2016 petsc-3.7.2.tar.gz -rw-r--r-- 1 petsc pdetools 23113397 May 15 2016 petsc-3.7.1.tar.gz -rw-r--r-- 1 petsc pdetools 22083999 Apr 25 2016 petsc-3.7.0.tar.gz Satish From C.Klaij at marin.nl Wed Jan 4 04:32:51 2017 From: C.Klaij at marin.nl (Klaij, Christiaan) Date: Wed, 4 Jan 2017 10:32:51 +0000 Subject: [petsc-users] problems after glibc upgrade to 2.17-157 In-Reply-To: References: <1483455899357.86673@marin.nl>, Message-ID: <1483525971870.90369@marin.nl> Satish, I tried your suggestion: --with-clib-autodetect=0 --with-fortranlib-autodetect=0 --with-cxxlib-autodetect=0 LIBS=LIBS=/path_to/libifcore.a I guess I don't really need "LIBS= " twice (?) so I've used this line: LIBS=/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a Unfortunately, this approach also fails (attached log): ******************************************************************************* UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): ------------------------------------------------------------------------------- Fortran could not successfully link C++ objects ******************************************************************************* There are multiple libifcore.a in the intel compiler lib: one in intel64_lin and one in intel64_lin_mic. Tried both, got same error. Chris dr. ir. 
Christiaan Klaij | CFD Researcher | Research & Development MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of-uRANS-and-BEMBEM-for-propeller-pressure-pulse-prediction.htm ________________________________________ From: Satish Balay Sent: Tuesday, January 03, 2017 4:37 PM To: Klaij, Christiaan Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 Do you have similar issues with gnu compilers? It must be some incompatibility with intel compilers with this glibc change. >>>>>>>>> compilers: Check that C libraries can be used from Fortran Pushing language FC Popping language FC Pushing language FC Popping language FC Pushing language FC Popping language FC **** Configure header /tmp/petsc-rOjdnN/confdefs.h **** <<<<<<<<<< There is a bug in configure [Matt?] that eats away some of the log - so I don't see the exact error you are getting. If standalone mpiicc/mpif90 etc work - then you can try the following additional options: --with-clib-autodetect=0 --with-fortranlib-autodetect=0 --with-cxxlib-autodetect=0 LIBS=LIBS=/path_to/libifcore.a [replace "path_to" with the correct path to the ifort libifcore.a library] Note: I have a RHEL7 box with this glibc - and I don't see this issue. >>>> -bash-4.2$ cat /etc/redhat-release Red Hat Enterprise Linux Server release 7.3 (Maipo) -bash-4.2$ rpm -q glibc glibc-2.17-157.el7_3.1.x86_64 glibc-2.17-157.el7_3.1.i686 -bash-4.2$ mpiicc --version icc (ICC) 17.0.0 20160721 Copyright (C) 1985-2016 Intel Corporation. All rights reserved. -bash-4.2$ <<<< Satish On Tue, 3 Jan 2017, Klaij, Christiaan wrote: > > I've been using petsc-3.7.4 with intel mpi and compilers, > superlu_dist, metis and parmetis on a cluster running > SL7. Everything was working fine until SL7 got an update where > glibc was upgraded from 2.17-106 to 2.17-157. 
> > This update seemed to have broken (at least) parmetis: the > standalone binary gpmetis started to give a segmentation > fault. The core dump shows this: > > Core was generated by `gpmetis'. > Program terminated with signal 11, Segmentation fault. > #0 0x00002aaaac6b865e in memmove () from /lib64/libc.so.6 > > That's when I decided to recompile, but to my surprise I cannot > even get past the configure stage (log attached)! > > ******************************************************************************* > UNABLE to EXECUTE BINARIES for ./configure > ------------------------------------------------------------------------------- > Cannot run executables created with FC. If this machine uses a batch system > to submit jobs you will need to configure using ./configure with the additional option --with-batch. > Otherwise there is problem with the compilers. Can you compile and run code with your compiler 'mpif90'? > See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > ******************************************************************************* > > Note the following: > > 1) Configure was done with the exact same options that worked > fine before the update of SL7. > > 2) The intel mpi and compilers are exactly the same as before the > update of SL7. > > 3) The cluster does not require a batch system to run code. > > 4) I can compile and run code with mpif90 on this cluster. > > 5) The problem also occurs on a workstation running SL7. > > Any clues on how to proceed? > Chris > > > dr. ir. Christiaan Klaij | CFD Researcher | Research & Development > MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl > > MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of-uRANS-and-BEMBEM-for-propeller-pressure-pulse-prediction.htm > > -------------- next part -------------- A non-text attachment was scrubbed... 
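When configure aborts like this, the compiler or linker error that actually caused it is buried deep inside configure.log. A quick way to surface it is a grep with context; the sketch below builds a tiny illustrative sample log (the real file is the multi-megabyte configure.log that configure writes), and assumes GNU-style grep options:

```shell
# Sketch: surface the buried linker error in a (sample) configure.log.
# sample-configure.log is illustrative only; point grep at the real log.
printf '%s\n' \
  'Executing: mpif90 -o conftest conftest.o' \
  'Possible ERROR while running linker: exit code 256' \
  'stderr:' \
  'undefined reference to __gxx_personality_v0' > sample-configure.log

# -n prints line numbers, -A 3 keeps the stderr lines after each match
grep -n -A 3 'ERROR while running linker' sample-configure.log
```

On a real log the same pipeline jumps straight to the failing Executing/stderr block without paging through the whole file.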
Name: configure.log Type: text/x-log Size: 1511592 bytes Desc: configure.log URL: From info at akselos.com Wed Jan 4 06:39:50 2017 From: info at akselos.com (=?utf-8?Q?Thomas=20Leurent?=) Date: Wed, 4 Jan 2017 12:39:50 +0000 Subject: [petsc-users] =?utf-8?q?Leveraging_IIoT_=26_Moving_to_Zero_Unplan?= =?utf-8?q?ned_Downtime?= Message-ID: Akselos News Dear All, I invite you to read my new post on how coupling the Industrial Internet of Things (IIoT) with physics-based digital twins will be the path to materialise immense savings from the industry's current IIoT investments. Upcoming touch points: if you're interested in discussing more this month, I'll be attending the WEF Annual Meeting in Davos, January 17-20 and the MIT-ILP event in Japan, January 27th. If you are attending, I hope to see you there. Thomas. If you work in the Oil & Gas industry, you've probably noticed that its future is being shaped around the notion of Zero Unplanned Downtime. The offshore sector, in particular, is plagued with unplanned downtime that is costing it billions in unseen value. And the problem is only going to get worse: as operating expenses are scaled down, capital projects are cut or deferred, and aging assets are operated beyond their original design life. Read more of this post. 
Copyright © 2016 Akselos, All rights reserved. You are on this list because you are affiliated to Akselos. Our mailing address is: Akselos, EPFL Innovation Park, Building D, Lausanne 1015, Switzerland -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Wed Jan 4 06:40:44 2017 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 4 Jan 2017 06:40:44 -0600 Subject: [petsc-users] problems after glibc upgrade to 2.17-157 In-Reply-To: <1483521378917.89651@marin.nl> References: <1483455899357.86673@marin.nl> <1483458399121.70375@marin.nl> <1483518363543.12284@marin.nl> <1483521378917.89651@marin.nl> Message-ID: On Wed, Jan 4, 2017 at 3:16 AM, Klaij, Christiaan wrote: > Well, a bit clearer perhaps. It seems the relevant ERROR is on > line 31039. So I did this case by hand using the compile and link > lines from the log, then run it in gdb: > > $ pwd > /tmp/petsc-Q0URwQ/config.setCompilers > $ ls > confdefs.h conffix.h conftest conftest.F conftest.o > $ gdb > GNU gdb (GDB) Red Hat Enterprise Linux 7.6.1-80.el7 > Copyright (C) 2013 Free Software Foundation, Inc. > License GPLv3+: GNU GPL version 3 or later html> > This is free software: you are free to change and redistribute it. > There is NO WARRANTY, to the extent permitted by law. Type "show copying" > and "show warranty" for details. > This GDB was configured as "x86_64-redhat-linux-gnu". > For bug reporting instructions, please see: > . > (gdb) file conftest > Reading symbols from /tmp/petsc-Q0URwQ/config. > setCompilers/conftest...done. > (gdb) run > Starting program: /tmp/petsc-Q0URwQ/config.setCompilers/conftest > > Program received signal SIGSEGV, Segmentation fault. > 0x00002aaaae32f65e in ?? () > Missing separate debuginfos, use: debuginfo-install > glibc-2.17-157.el7.x86_64 > (gdb) bt > #0 0x00002aaaae32f65e in ?? () > #1 0x00002aaaaaab7675 in _dl_relocate_object () > from /lib64/ld-linux-x86-64.so.2 > #2 0x00002aaaaaaae792 in dl_main () from /lib64/ld-linux-x86-64.so.2 > #3 0x00002aaaaaac1e36 in _dl_sysdep_start () from > /lib64/ld-linux-x86-64.so.2 > #4 0x00002aaaaaaafa31 in _dl_start () from /lib64/ld-linux-x86-64.so.2 > #5 0x00002aaaaaaac1e8 in _start () from /lib64/ld-linux-x86-64.so.2 > #6 0x0000000000000001 in ?? 
() > #7 0x00007fffffffd4e2 in ?? () > #8 0x0000000000000000 in ?? () > (gdb) > > Does this make any sense to you? No. It looks like there is something deeply wrong with the dynamic loader. You might try debuginfo-install glibc-2.17-157.el7.x86_64 as it says so that we can see the stack trace. Considering that the error happens inside of _dl_sysdep_start () from /lib64/ld-linux-x86-64.so.2 I am guessing that it is indeed connected to your upgrade of glibc. Since it only happens when you are not using compiler libraries, I think your compiler has pointers back to old things in the OS. I would recommend either a) using GNU as Satish says, or b) reinstalling the whole compiler suite. I will look at the new problem when not using compiler libraries. Thanks, Matt > > > dr. ir. Christiaan Klaij | CFD Researcher | Research & Development > MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl > > MARIN news: http://www.marin.nl/web/News/News-items/Software-seminar- > in-Shanghai-for-the-first-time-March-28.htm > > ________________________________________ > From: Klaij, Christiaan > Sent: Wednesday, January 04, 2017 9:26 AM > To: Matthew Knepley; petsc-users; Satish Balay > Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 > > So I've applied the patch to my current 3.7.4 source, the new > configure.log is attached. It's slightly larger but not much > clearer too me... > > Chris > ________________________________________ > From: Satish Balay > Sent: Tuesday, January 03, 2017 5:00 PM > To: Matthew Knepley > Cc: Klaij, Christiaan; petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 > > On Tue, 3 Jan 2017, Matthew Knepley wrote: > > > Or get the new tarball when it spins tonight, since Satish has just > > added the fix to maint. > > We don't spin 'maint/patch-release' tarballs everynight. 
Its every 1-3 > months - [partly depending upon the number of outstanding patches - or > their severity] > > -rw-r--r-- 1 petsc pdetools 23194357 Jan 1 10:41 petsc-3.7.5.tar.gz > -rw-r--r-- 1 petsc pdetools 23189526 Oct 2 22:06 petsc-3.7.4.tar.gz > -rw-r--r-- 1 petsc pdetools 23172670 Jul 24 12:22 petsc-3.7.3.tar.gz > -rw-r--r-- 1 petsc pdetools 23111802 Jun 5 2016 petsc-3.7.2.tar.gz > -rw-r--r-- 1 petsc pdetools 23113397 May 15 2016 petsc-3.7.1.tar.gz > -rw-r--r-- 1 petsc pdetools 22083999 Apr 25 2016 petsc-3.7.0.tar.gz > > Satish > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Jan 4 06:43:10 2017 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 4 Jan 2017 06:43:10 -0600 Subject: [petsc-users] problems after glibc upgrade to 2.17-157 In-Reply-To: <1483525971870.90369@marin.nl> References: <1483455899357.86673@marin.nl> <1483525971870.90369@marin.nl> Message-ID: On Wed, Jan 4, 2017 at 4:32 AM, Klaij, Christiaan wrote: > Satish, > > I tried your suggestion: > > --with-clib-autodetect=0 --with-fortranlib-autodetect=0 > --with-cxxlib-autodetect=0 LIBS=LIBS=/path_to/libifcore.a > > I guess I don't really need "LIBS= " twice (?) so I've used this line: > > LIBS=/cm/shared/apps/intel/compilers_and_libraries_2016. 
> 3.210/linux/compiler/lib/intel64_lin/libifcore.a > > Unfortunately, this approach also fails (attached log): > Ah, this error is much easier: Executing: mpif90 -o /tmp/petsc-3GfeyZ/config.compilers/conftest -fPIC -g -O3 /tmp/petsc-3GfeyZ/config.compilers/conftest.o /tmp/petsc-3GfeyZ/config.compilers/cxxobj.o /tmp/petsc-3GfeyZ/config.compilers/confc.o -ldl /cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-3GfeyZ/config.compilers/cxxobj.o:(.gnu.linkonce.d.DW.ref.__gxx_personality_v0+0x0): undefined reference to `__gxx_personality_v0' Intel was lazy writing its C++ compiler, so it uses some of g++. If you want to use C++, you will need to add -lstdc++ to your LIBS variable (I think). Otherwise, please turn it off using --with-cxx=0. Thanks, Matt > ************************************************************ > ******************* > UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for > details): > ------------------------------------------------------------ > ------------------- > Fortran could not successfully link C++ objects > ************************************************************ > ******************* > > There are multiple libifcore.a in the intel compiler lib: one in > intel64_lin and one in intel64_lin_mic. Tried both, got same error. > > Chris > > > > dr. ir. Christiaan Klaij | CFD Researcher | Research & Development > MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl > > MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of- > uRANS-and-BEMBEM-for-propeller-pressure-pulse-prediction.htm > > ________________________________________ > From: Satish Balay > Sent: Tuesday, January 03, 2017 4:37 PM > To: Klaij, Christiaan > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 > > Do you have similar issues with gnu compilers? 
> > It must be some incompatibility with intel compilers with this glibc > change. > > >>>>>>>>> > compilers: Check that C libraries can be used from Fortran > Pushing language FC > Popping language FC > Pushing language FC > Popping language FC > Pushing language FC > Popping language FC > **** Configure header /tmp/petsc-rOjdnN/confdefs.h **** > <<<<<<<<<< > > Thre is a bug in configure [Matt?] that eats away some of the log - so > I don't see the exact error you are getting. > > If standalone micc/mpif90 etc work - then you can try the following > additional options: > > --with-clib-autodetect=0 --with-fortranlib-autodetect=0 > --with-cxxlib-autodetect=0 LIBS=LIBS=/path_to/libifcore.a > > [replace "path_to" with the correct path to the ifort lubifcore.a library] > > Note: I have a RHEL7 box with this glibc - and I don't see this issue. > > >>>> > -bash-4.2$ cat /etc/redhat-release > Red Hat Enterprise Linux Server release 7.3 (Maipo) > -bash-4.2$ rpm -q glibc > glibc-2.17-157.el7_3.1.x86_64 > glibc-2.17-157.el7_3.1.i686 > -bash-4.2$ mpiicc --version > icc (ICC) 17.0.0 20160721 > Copyright (C) 1985-2016 Intel Corporation. All rights reserved. > > -bash-4.2$ > <<<< > > Satish > > On Tue, 3 Jan 2017, Klaij, Christiaan wrote: > > > > > I've been using petsc-3.7.4 with intel mpi and compilers, > > superlu_dist, metis and parmetis on a cluster running > > SL7. Everything was working fine until SL7 got an update where > > glibc was upgraded from 2.17-106 to 2.17-157. > > > > This update seemed to have broken (at least) parmetis: the > > standalone binary gpmetis started to give a segmentation > > fault. The core dump shows this: > > > > Core was generated by `gpmetis'. > > Program terminated with signal 11, Segmentation fault. > > #0 0x00002aaaac6b865e in memmove () from /lib64/libc.so.6 > > > > That's when I decided to recompile, but to my surprise I cannot > > even get past the configure stage (log attached)! 
> > > > ************************************************************ > ******************* > > UNABLE to EXECUTE BINARIES for ./configure > > ------------------------------------------------------------ > ------------------- > > Cannot run executables created with FC. If this machine uses a batch > system > > to submit jobs you will need to configure using ./configure with the > additional option --with-batch. > > Otherwise there is problem with the compilers. Can you compile and run > code with your compiler 'mpif90'? > > See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > > ************************************************************ > ******************* > > > > Note the following: > > > > 1) Configure was done with the exact same options that worked > > fine before the update of SL7. > > > > 2) The intel mpi and compilers are exactly the same as before the > > update of SL7. > > > > 3) The cluster does not require a batch system to run code. > > > > 4) I can compile and run code with mpif90 on this cluster. > > > > 5) The problem also occurs on a workstation running SL7. > > > > Any clues on how to proceed? > > Chris > > > > > > dr. ir. Christiaan Klaij | CFD Researcher | Research & Development > > MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | > http://www.marin.nl > > > > MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of- > uRANS-and-BEMBEM-for-propeller-pressure-pulse-prediction.htm > > > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
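To make Matt's two remedies concrete, the configure invocations would look roughly like the sketch below. The /path_to placeholder is carried over from Satish's message, and the exact option spellings are a sketch against PETSc 3.7-era configure, not a verified recipe:

```shell
# (a) Keep C++ support: supply the GNU C++ runtime next to libifcore,
#     since Intel's C++ relies on pieces of g++ (hence __gxx_personality_v0).
./configure --with-clib-autodetect=0 --with-fortranlib-autodetect=0 \
    --with-cxxlib-autodetect=0 \
    LIBS="/path_to/libifcore.a -lstdc++"

# (b) Or drop C++ entirely if nothing in the build needs it:
./configure --with-clib-autodetect=0 --with-fortranlib-autodetect=0 \
    --with-cxxlib-autodetect=0 --with-cxx=0 \
    LIBS="/path_to/libifcore.a"
```

As the thread later shows, variant (b) gets past configure but still has to survive the link stage.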
URL: From david.knezevic at akselos.com Wed Jan 4 07:06:15 2017 From: david.knezevic at akselos.com (David Knezevic) Date: Wed, 4 Jan 2017 08:06:15 -0500 Subject: [petsc-users] Leveraging IIoT & Moving to Zero Unplanned Downtime In-Reply-To: References: Message-ID: Apologies, the email below was not meant to be sent to this list. David -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From C.Klaij at marin.nl Wed Jan 4 07:21:25 2017 From: C.Klaij at marin.nl (Klaij, Christiaan) Date: Wed, 4 Jan 2017 13:21:25 +0000 Subject: [petsc-users] problems after glibc upgrade to 2.17-157 In-Reply-To: References: <1483455899357.86673@marin.nl> <1483458399121.70375@marin.nl> <1483518363543.12284@marin.nl> <1483521378917.89651@marin.nl>, Message-ID: <1483536085578.39194@marin.nl> Our sysadmin says that SL7 does not provide the debug version of glibc. Chris dr. ir. Christiaan Klaij | CFD Researcher | Research & Development MARIN | T +31 317 49 33 44 | C.Klaij at marin.nl | www.marin.nl [LinkedIn] [YouTube] [Twitter] [Facebook] MARIN news: Modelling natural transition on hydrofoils for application in underwater gliders ________________________________ From: Matthew Knepley Sent: Wednesday, January 04, 2017 1:40 PM To: Klaij, Christiaan Cc: petsc-users; Satish Balay Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 On Wed, Jan 4, 2017 at 3:16 AM, Klaij, Christiaan > wrote: Well, a bit clearer perhaps. It seems the relevant ERROR is on line 31039. So I did this case by hand using the compile and link lines from the log, then run it in gdb: $ pwd /tmp/petsc-Q0URwQ/config.setCompilers $ ls confdefs.h conffix.h conftest conftest.F conftest.o $ gdb GNU gdb (GDB) Red Hat Enterprise Linux 7.6.1-80.el7 Copyright (C) 2013 Free Software Foundation, Inc. License GPLv3+: GNU GPL version 3 or later This is free software: you are free to change and redistribute it. There is NO WARRANTY, to the extent permitted by law. Type "show copying" and "show warranty" for details. This GDB was configured as "x86_64-redhat-linux-gnu". For bug reporting instructions, please see: . (gdb) file conftest Reading symbols from /tmp/petsc-Q0URwQ/config.setCompilers/conftest...done. (gdb) run Starting program: /tmp/petsc-Q0URwQ/config.setCompilers/conftest Program received signal SIGSEGV, Segmentation fault. 0x00002aaaae32f65e in ?? 
() Missing separate debuginfos, use: debuginfo-install glibc-2.17-157.el7.x86_64 (gdb) bt #0 0x00002aaaae32f65e in ?? () #1 0x00002aaaaaab7675 in _dl_relocate_object () from /lib64/ld-linux-x86-64.so.2 #2 0x00002aaaaaaae792 in dl_main () from /lib64/ld-linux-x86-64.so.2 #3 0x00002aaaaaac1e36 in _dl_sysdep_start () from /lib64/ld-linux-x86-64.so.2 #4 0x00002aaaaaaafa31 in _dl_start () from /lib64/ld-linux-x86-64.so.2 #5 0x00002aaaaaaac1e8 in _start () from /lib64/ld-linux-x86-64.so.2 #6 0x0000000000000001 in ?? () #7 0x00007fffffffd4e2 in ?? () #8 0x0000000000000000 in ?? () (gdb) Does this make any sense to you? No. It looks like there is something deeply wrong with the dynamic loader. You might try debuginfo-install glibc-2.17-157.el7.x86_64 as it says so that we can see the stack trace. Considering that the error happens inside of _dl_sysdep_start () from /lib64/ld-linux-x86-64.so.2 I am guessing that it is indeed connected to your upgrade of glibc. Since it only happens when you are not using compiler libraries, I think your compiler has pointers back to old things in the OS. I would recommend either a) using GNU as Satish says, or b) reinstalling the whole compiler suite. I will look at the new problem when not using compiler libraries. Thanks, Matt dr. ir. Christiaan Klaij | CFD Researcher | Research & Development MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl MARIN news: http://www.marin.nl/web/News/News-items/Software-seminar-in-Shanghai-for-the-first-time-March-28.htm ________________________________________ From: Klaij, Christiaan Sent: Wednesday, January 04, 2017 9:26 AM To: Matthew Knepley; petsc-users; Satish Balay Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 So I've applied the patch to my current 3.7.4 source, the new configure.log is attached. It's slightly larger but not much clearer too me... 
Chris ________________________________________ From: Satish Balay > Sent: Tuesday, January 03, 2017 5:00 PM To: Matthew Knepley Cc: Klaij, Christiaan; petsc-users at mcs.anl.gov Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 On Tue, 3 Jan 2017, Matthew Knepley wrote: > Or get the new tarball when it spins tonight, since Satish has just > added the fix to maint. We don't spin 'maint/patch-release' tarballs everynight. Its every 1-3 months - [partly depending upon the number of outstanding patches - or their severity] -rw-r--r-- 1 petsc pdetools 23194357 Jan 1 10:41 petsc-3.7.5.tar.gz -rw-r--r-- 1 petsc pdetools 23189526 Oct 2 22:06 petsc-3.7.4.tar.gz -rw-r--r-- 1 petsc pdetools 23172670 Jul 24 12:22 petsc-3.7.3.tar.gz -rw-r--r-- 1 petsc pdetools 23111802 Jun 5 2016 petsc-3.7.2.tar.gz -rw-r--r-- 1 petsc pdetools 23113397 May 15 2016 petsc-3.7.1.tar.gz -rw-r--r-- 1 petsc pdetools 22083999 Apr 25 2016 petsc-3.7.0.tar.gz Satish -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image6cbf98.PNG Type: image/png Size: 293 bytes Desc: image6cbf98.PNG URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image60adc7.PNG Type: image/png Size: 331 bytes Desc: image60adc7.PNG URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image32b324.PNG Type: image/png Size: 333 bytes Desc: image32b324.PNG URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: imagec8920a.PNG Type: image/png Size: 253 bytes Desc: imagec8920a.PNG URL: From C.Klaij at marin.nl Wed Jan 4 07:37:02 2017 From: C.Klaij at marin.nl (Klaij, Christiaan) Date: Wed, 4 Jan 2017 13:37:02 +0000 Subject: [petsc-users] problems after glibc upgrade to 2.17-157 In-Reply-To: References: <1483455899357.86673@marin.nl> <1483525971870.90369@marin.nl>, Message-ID: <1483537021972.79609@marin.nl> I've tried with: --LIBS=/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a -lstdc++ but that doesn't seem to make a difference. With the option --with-cxx=0 the configure part does work(!), but then I get **************************ERROR************************************* Error during compile, check Linux-x86_64-Intel/lib/petsc/conf/make.log Send it and Linux-x86_64-Intel/lib/petsc/conf/configure.log to petsc-maint at mcs.anl.gov ******************************************************************* See the attached log files. Chris dr. ir. Christiaan Klaij | CFD Researcher | Research & Development MARIN | T +31 317 49 33 44 | C.Klaij at marin.nl | www.marin.nl [LinkedIn] [YouTube] [Twitter] [Facebook] MARIN news: MARIN Report 119: highlighting naval research projects ________________________________ From: Matthew Knepley Sent: Wednesday, January 04, 2017 1:43 PM To: Klaij, Christiaan Cc: petsc-users; Satish Balay Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 On Wed, Jan 4, 2017 at 4:32 AM, Klaij, Christiaan > wrote: Satish, I tried your suggestion: --with-clib-autodetect=0 --with-fortranlib-autodetect=0 --with-cxxlib-autodetect=0 LIBS=LIBS=/path_to/libifcore.a I guess I don't really need "LIBS= " twice (?) 
so I've used this line: LIBS=/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a Unfortunately, this approach also fails (attached log): Ah, this error is much easier: Executing: mpif90 -o /tmp/petsc-3GfeyZ/config.compilers/conftest -fPIC -g -O3 /tmp/petsc-3GfeyZ/config.compilers/conftest.o /tmp/petsc-3GfeyZ/config.compilers/cxxobj.o /tmp/petsc-3GfeyZ/config.compilers/confc.o -ldl /cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-3GfeyZ/config.compilers/cxxobj.o:(.gnu.linkonce.d.DW.ref.__gxx_personality_v0+0x0): undefined reference to `__gxx_personality_v0' Intel as lazy writing its C++ compiler, so it uses some of g++. If you want to use C++, you will need to add -lstdc++ to your LIBS variable (I think). Otherwise, please turn it off using --with-cxx=0. Thanks, Matt ******************************************************************************* UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): ------------------------------------------------------------------------------- Fortran could not successfully link C++ objects ******************************************************************************* There are multiple libifcore.a in the intel compiler lib: one in intel64_lin and one in intel64_lin_mic. Tried both, got same error. Chris dr. ir. Christiaan Klaij | CFD Researcher | Research & Development MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of-uRANS-and-BEMBEM-for-propeller-pressure-pulse-prediction.htm ________________________________________ From: Satish Balay > Sent: Tuesday, January 03, 2017 4:37 PM To: Klaij, Christiaan Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 Do you have similar issues with gnu compilers? 
It must be some incompatibility with intel compilers with this glibc change. >>>>>>>>> compilers: Check that C libraries can be used from Fortran Pushing language FC Popping language FC Pushing language FC Popping language FC Pushing language FC Popping language FC **** Configure header /tmp/petsc-rOjdnN/confdefs.h **** <<<<<<<<<< Thre is a bug in configure [Matt?] that eats away some of the log - so I don't see the exact error you are getting. If standalone micc/mpif90 etc work - then you can try the following additional options: --with-clib-autodetect=0 --with-fortranlib-autodetect=0 --with-cxxlib-autodetect=0 LIBS=LIBS=/path_to/libifcore.a [replace "path_to" with the correct path to the ifort lubifcore.a library] Note: I have a RHEL7 box with this glibc - and I don't see this issue. >>>> -bash-4.2$ cat /etc/redhat-release Red Hat Enterprise Linux Server release 7.3 (Maipo) -bash-4.2$ rpm -q glibc glibc-2.17-157.el7_3.1.x86_64 glibc-2.17-157.el7_3.1.i686 -bash-4.2$ mpiicc --version icc (ICC) 17.0.0 20160721 Copyright (C) 1985-2016 Intel Corporation. All rights reserved. -bash-4.2$ <<<< Satish On Tue, 3 Jan 2017, Klaij, Christiaan wrote: > > I've been using petsc-3.7.4 with intel mpi and compilers, > superlu_dist, metis and parmetis on a cluster running > SL7. Everything was working fine until SL7 got an update where > glibc was upgraded from 2.17-106 to 2.17-157. > > This update seemed to have broken (at least) parmetis: the > standalone binary gpmetis started to give a segmentation > fault. The core dump shows this: > > Core was generated by `gpmetis'. > Program terminated with signal 11, Segmentation fault. > #0 0x00002aaaac6b865e in memmove () from /lib64/libc.so.6 > > That's when I decided to recompile, but to my surprise I cannot > even get past the configure stage (log attached)! 
> *******************************************************************************
> UNABLE to EXECUTE BINARIES for ./configure
> -------------------------------------------------------------------------------
> Cannot run executables created with FC. If this machine uses a batch system
> to submit jobs you will need to configure using ./configure with the additional option --with-batch.
> Otherwise there is problem with the compilers. Can you compile and run code with your compiler 'mpif90'?
> See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf
> *******************************************************************************
>
> Note the following:
>
> 1) Configure was done with the exact same options that worked fine before the update of SL7.
> 2) The intel mpi and compilers are exactly the same as before the update of SL7.
> 3) The cluster does not require a batch system to run code.
> 4) I can compile and run code with mpif90 on this cluster.
> 5) The problem also occurs on a workstation running SL7.
>
> Any clues on how to proceed?
> Chris
>
> dr. ir. Christiaan Klaij | CFD Researcher | Research & Development
> MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl
>
> MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of-uRANS-and-BEMBEM-for-propeller-pressure-pulse-prediction.htm

-- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener

-------------- next part --------------
A non-text attachment was scrubbed...
Name: configure.log Type: text/x-log Size: 3712148 bytes Desc: configure.log URL:
-------------- next part --------------
A non-text attachment was scrubbed...
Name: make.log Type: text/x-log Size: 98311 bytes Desc: make.log URL:

From knepley at gmail.com Wed Jan 4 08:13:16 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Wed, 4 Jan 2017 08:13:16 -0600
Subject: [petsc-users] problems after glibc upgrade to 2.17-157
In-Reply-To: <1483537021972.79609@marin.nl>
References: <1483455899357.86673@marin.nl> <1483525971870.90369@marin.nl> <1483537021972.79609@marin.nl>
Message-ID:

On Wed, Jan 4, 2017 at 7:37 AM, Klaij, Christiaan wrote:

> I've tried with:
>
> --LIBS=/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a -lstdc++\\

This is likely connected to the problem below, but I would have to see the log.

> but that doesn't seem to make a difference.
> With the option --with-cxx=0 the configure part does work(!), but then I get
>
> **************************ERROR*************************************
> Error during compile, check Linux-x86_64-Intel/lib/petsc/conf/make.log
> Send it and Linux-x86_64-Intel/lib/petsc/conf/configure.log to
> petsc-maint at mcs.anl.gov
> *******************************************************************

Here is the problem:

CLINKER /projects/developers/cklaij/ReFRESCO/Dev/trunk/Libs/build/petsc-3.7.4/Linux-x86_64-Intel/lib/libpetsc.so.3.7.4
ld: /cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a(for_init.o): relocation R_X86_64_32 against `.rodata.str1.4' can not be used when making a shared object; recompile with -fPIC
/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a: could not read symbols: Bad value

Clearly there is something wrong with the compiler install. However, can you put a libifcore.so in LIBS instead?

Matt

> See the attached log files.
>
> Chris
>
> dr. ir. Christiaan Klaij | CFD Researcher | Research & Development
> MARIN | T +31 317 49 33 44 | C.Klaij at marin.nl | www.marin.nl
>
> MARIN news: MARIN Report 119: highlighting naval research projects
>
> ------------------------------
> *From:* Matthew Knepley
> *Sent:* Wednesday, January 04, 2017 1:43 PM
> *To:* Klaij, Christiaan
> *Cc:* petsc-users; Satish Balay
> *Subject:* Re: [petsc-users] problems after glibc upgrade to 2.17-157
>
> On Wed, Jan 4, 2017 at 4:32 AM, Klaij, Christiaan wrote:
>
>> Satish,
>>
>> I tried your suggestion:
>>
>> --with-clib-autodetect=0 --with-fortranlib-autodetect=0
>> --with-cxxlib-autodetect=0 LIBS=LIBS=/path_to/libifcore.a
>>
>> I guess I don't really need "LIBS= " twice (?) so I've used this line:
>>
>> LIBS=/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a
>>
>> Unfortunately, this approach also fails (attached log):
>
> Ah, this error is much easier:
>
> Executing: mpif90 -o /tmp/petsc-3GfeyZ/config.compilers/conftest -fPIC -g -O3 /tmp/petsc-3GfeyZ/config.compilers/conftest.o /tmp/petsc-3GfeyZ/config.compilers/cxxobj.o /tmp/petsc-3GfeyZ/config.compilers/confc.o -ldl /cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a
> Possible ERROR while running linker: exit code 256
> stderr:
> /tmp/petsc-3GfeyZ/config.compilers/cxxobj.o:(.gnu.linkonce.d.DW.ref.__gxx_personality_v0+0x0): undefined reference to `__gxx_personality_v0'
>
> Intel was lazy writing its C++ compiler, so it uses some of g++. If you want to use C++, you will need to add -lstdc++ to your LIBS variable (I think).
> Otherwise, please turn it off using --with-cxx=0.
>
> Thanks,
>
> Matt
>
>> *******************************************************************************
>> UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
>> -------------------------------------------------------------------------------
>> Fortran could not successfully link C++ objects
>> *******************************************************************************
>>
>> There are multiple libifcore.a in the intel compiler lib: one in intel64_lin and one in intel64_lin_mic. Tried both, got same error.
>>
>> Chris
>>
>> dr. ir.
Christiaan Klaij | CFD Researcher | Research & Development
>> MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl
>>
>> MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of-uRANS-and-BEMBEM-for-propeller-pressure-pulse-prediction.htm
>>
>> ________________________________________
>> From: Satish Balay
>> Sent: Tuesday, January 03, 2017 4:37 PM
>> To: Klaij, Christiaan
>> Cc: petsc-users at mcs.anl.gov
>> Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157
>>
>> [...]

> --
> What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
> -- Norbert Wiener

From C.Klaij at marin.nl Wed Jan 4 08:53:30 2017
From: C.Klaij at marin.nl (Klaij, Christiaan)
Date: Wed, 4 Jan 2017 14:53:30 +0000
Subject: [petsc-users] problems after glibc upgrade to 2.17-157
In-Reply-To:
References: <1483455899357.86673@marin.nl> <1483525971870.90369@marin.nl> <1483537021972.79609@marin.nl>,
Message-ID: <1483541610596.23482@marin.nl>

So how would I do that? Does LIBS= accept spaces in the string?
Something like this perhaps:

LIBS="-L/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin -lifcore"

But I'm starting to believe that my intel install is somehow broken. I'm getting these intel compilers from rpm's provided by our cluster vendor. On a workstation I can try yum remove and install of the intel packages. Not so easy on a production cluster. Is this worth a try? Or will it just copy/paste the same broken (?) stuff in the same place?

Chris

dr. ir. Christiaan Klaij | CFD Researcher | Research & Development
MARIN | T +31 317 49 33 44 | C.Klaij at marin.nl | www.marin.nl

MARIN news: Comparison of uRANS and BEM-BEM for propeller pressure pulse prediction

________________________________
From: Matthew Knepley
Sent: Wednesday, January 04, 2017 3:13 PM
To: Klaij, Christiaan
Cc: petsc-users; Satish Balay
Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157

[...]

From C.Klaij at marin.nl Wed Jan 4 09:05:19 2017
From: C.Klaij at marin.nl (Klaij, Christiaan)
Date: Wed, 4 Jan 2017 15:05:19 +0000
Subject: [petsc-users] problems after glibc upgrade to 2.17-157
In-Reply-To: <1483541610596.23482@marin.nl>
References: <1483455899357.86673@marin.nl> <1483525971870.90369@marin.nl> <1483537021972.79609@marin.nl>, , <1483541610596.23482@marin.nl>
Message-ID: <1483542319358.30478@marin.nl>

By the way, petsc did compile and install metis and parmetis successfully before the make error. However, running the newly compiled gpmetis program gives the same segmentation fault! So the original problem was not solved by recompiling, unfortunately.
Chris ________________________________ From: Klaij, Christiaan Sent: Wednesday, January 04, 2017 3:53 PM To: Matthew Knepley Cc: petsc-users; Satish Balay Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 So how would I do that? Does LIBS= accept spaces in the string? Something like this perhaps: LIBS="-L/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin -lifcore" But I'm starting to believe that my intel install is somehow broken. I'm getting these intel compilers from rpm's provided by our cluster vendor. On a workstation I can try yum remove and install of the intel packages. Not so easy on a production cluster. Is this worth a try? Or will it just copy/paste the same broken (?) stuff in the same place? Chris ________________________________ From: Matthew Knepley Sent: Wednesday, January 04, 2017 3:13 PM To: Klaij, Christiaan Cc: petsc-users; Satish Balay Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 On Wed, Jan 4, 2017 at 7:37 AM, Klaij, Christiaan > wrote: I've tried with: --LIBS=/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a -lstdc++\\ This is likely connected to the problem below, but I would have to see the log. but that doesn't seem to make a difference. 
With the option --with-cxx=0 the configure part does work(!), but then I get **************************ERROR************************************* Error during compile, check Linux-x86_64-Intel/lib/petsc/conf/make.log Send it and Linux-x86_64-Intel/lib/petsc/conf/configure.log to petsc-maint at mcs.anl.gov ******************************************************************* Here is the problem: CLINKER /projects/developers/cklaij/ReFRESCO/Dev/trunk/Libs/build/petsc-3.7.4/Linux-x86_64-Intel/lib/libpetsc.so.3.7.4 ld: /cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a(for_init.o): relocation R_X86_64_32 against `.rodata.str1.4' can not be used when making a shared object; recompile with -fPIC /cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a: could not read symbols: Bad value Clearly there is something wrong with the compiler install. However, can you put a libifcore.so in LIBS instead? Matt See the attached log files. Chris dr. ir. Christiaan Klaij | CFD Researcher | Research & Development MARIN | T +31 317 49 33 44 | C.Klaij at marin.nl | www.marin.nl [LinkedIn] [YouTube] [Twitter] [Facebook] MARIN news: MARIN Report 119: highlighting naval research projects ________________________________ From: Matthew Knepley > Sent: Wednesday, January 04, 2017 1:43 PM To: Klaij, Christiaan Cc: petsc-users; Satish Balay Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 On Wed, Jan 4, 2017 at 4:32 AM, Klaij, Christiaan > wrote: Satish, I tried your suggestion: --with-clib-autodetect=0 --with-fortranlib-autodetect=0 --with-cxxlib-autodetect=0 LIBS=LIBS=/path_to/libifcore.a I guess I don't really need "LIBS= " twice (?) 
so I've used this line: LIBS=/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a Unfortunately, this approach also fails (attached log): Ah, this error is much easier: Executing: mpif90 -o /tmp/petsc-3GfeyZ/config.compilers/conftest -fPIC -g -O3 /tmp/petsc-3GfeyZ/config.compilers/conftest.o /tmp/petsc-3GfeyZ/config.compilers/cxxobj.o /tmp/petsc-3GfeyZ/config.compilers/confc.o -ldl /cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-3GfeyZ/config.compilers/cxxobj.o:(.gnu.linkonce.d.DW.ref.__gxx_personality_v0+0x0): undefined reference to `__gxx_personality_v0' Intel as lazy writing its C++ compiler, so it uses some of g++. If you want to use C++, you will need to add -lstdc++ to your LIBS variable (I think). Otherwise, please turn it off using --with-cxx=0. Thanks, Matt ******************************************************************************* UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): ------------------------------------------------------------------------------- Fortran could not successfully link C++ objects ******************************************************************************* There are multiple libifcore.a in the intel compiler lib: one in intel64_lin and one in intel64_lin_mic. Tried both, got same error. Chris dr. ir. Christiaan Klaij | CFD Researcher | Research & Development MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of-uRANS-and-BEMBEM-for-propeller-pressure-pulse-prediction.htm ________________________________________ From: Satish Balay > Sent: Tuesday, January 03, 2017 4:37 PM To: Klaij, Christiaan Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 Do you have similar issues with gnu compilers? 
It must be some incompatibility with intel compilers with this glibc change. >>>>>>>>> compilers: Check that C libraries can be used from Fortran Pushing language FC Popping language FC Pushing language FC Popping language FC Pushing language FC Popping language FC **** Configure header /tmp/petsc-rOjdnN/confdefs.h **** <<<<<<<<<< Thre is a bug in configure [Matt?] that eats away some of the log - so I don't see the exact error you are getting. If standalone micc/mpif90 etc work - then you can try the following additional options: --with-clib-autodetect=0 --with-fortranlib-autodetect=0 --with-cxxlib-autodetect=0 LIBS=LIBS=/path_to/libifcore.a [replace "path_to" with the correct path to the ifort lubifcore.a library] Note: I have a RHEL7 box with this glibc - and I don't see this issue. >>>> -bash-4.2$ cat /etc/redhat-release Red Hat Enterprise Linux Server release 7.3 (Maipo) -bash-4.2$ rpm -q glibc glibc-2.17-157.el7_3.1.x86_64 glibc-2.17-157.el7_3.1.i686 -bash-4.2$ mpiicc --version icc (ICC) 17.0.0 20160721 Copyright (C) 1985-2016 Intel Corporation. All rights reserved. -bash-4.2$ <<<< Satish On Tue, 3 Jan 2017, Klaij, Christiaan wrote: > > I've been using petsc-3.7.4 with intel mpi and compilers, > superlu_dist, metis and parmetis on a cluster running > SL7. Everything was working fine until SL7 got an update where > glibc was upgraded from 2.17-106 to 2.17-157. > > This update seemed to have broken (at least) parmetis: the > standalone binary gpmetis started to give a segmentation > fault. The core dump shows this: > > Core was generated by `gpmetis'. > Program terminated with signal 11, Segmentation fault. > #0 0x00002aaaac6b865e in memmove () from /lib64/libc.so.6 > > That's when I decided to recompile, but to my surprise I cannot > even get past the configure stage (log attached)! 
> > ******************************************************************************* > UNABLE to EXECUTE BINARIES for ./configure > ------------------------------------------------------------------------------- > Cannot run executables created with FC. If this machine uses a batch system > to submit jobs you will need to configure using ./configure with the additional option --with-batch. > Otherwise there is problem with the compilers. Can you compile and run code with your compiler 'mpif90'? > See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > ******************************************************************************* > > Note the following: > > 1) Configure was done with the exact same options that worked > fine before the update of SL7. > > 2) The intel mpi and compilers are exactly the same as before the > update of SL7. > > 3) The cluster does not require a batch system to run code. > > 4) I can compile and run code with mpif90 on this cluster. > > 5) The problem also occurs on a workstation running SL7. > > Any clues on how to proceed? > Chris > > > dr. ir. Christiaan Klaij | CFD Researcher | Research & Development > MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl > > MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of-uRANS-and-BEMBEM-for-propeller-pressure-pulse-prediction.htm > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: image17dbf5.PNG Type: image/png Size: 253 bytes Desc: image17dbf5.PNG URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image215e33.PNG Type: image/png Size: 293 bytes Desc: image215e33.PNG URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image3607b7.PNG Type: image/png Size: 331 bytes Desc: image3607b7.PNG URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: image1b081f.PNG Type: image/png Size: 333 bytes Desc: image1b081f.PNG URL: From C.Klaij at marin.nl Wed Jan 4 09:19:37 2017 From: C.Klaij at marin.nl (Klaij, Christiaan) Date: Wed, 4 Jan 2017 15:19:37 +0000 Subject: [petsc-users] problems after glibc upgrade to 2.17-157 In-Reply-To: <1483542319358.30478@marin.nl> References: <1483455899357.86673@marin.nl> <1483525971870.90369@marin.nl> <1483537021972.79609@marin.nl>, , <1483541610596.23482@marin.nl>, <1483542319358.30478@marin.nl> Message-ID: <1483543177535.33715@marin.nl> Attached is the log for LIBS="-L/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin -lifcore" No luck there. Chris dr. ir. Christiaan Klaij | CFD Researcher | Research & Development MARIN | T +31 317 49 33 44 | C.Klaij at marin.nl | www.marin.nl [LinkedIn] [YouTube] [Twitter] [Facebook] MARIN news: MARIN Report 119: highlighting naval research projects ________________________________ From: Klaij, Christiaan Sent: Wednesday, January 04, 2017 4:05 PM To: Matthew Knepley Cc: petsc-users; Satish Balay Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 By the way, petsc did compile and install metis and parmetis succesfully before the make error. However, running the newly compiled gpmetis program gives the same segmentation fault! So the original problem was not solved by recompiling, unfortunately. 
Chris ________________________________ From: Klaij, Christiaan Sent: Wednesday, January 04, 2017 3:53 PM To: Matthew Knepley Cc: petsc-users; Satish Balay Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 So how would I do that? Does LIBS= accept spaces in the string? Something like this perhaps: LIBS="-L/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin -lifcore" But I'm starting to believe that my intel install is somehow broken. I'm getting these intel compilers from rpm's provided by our cluster vendor. On a workstation I can try yum remove and install of the intel packages. Not so easy on a production cluster. Is this worth a try? Or will it just copy/paste the same broken (?) stuff in the same place? Chris ________________________________ From: Matthew Knepley Sent: Wednesday, January 04, 2017 3:13 PM To: Klaij, Christiaan Cc: petsc-users; Satish Balay Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 On Wed, Jan 4, 2017 at 7:37 AM, Klaij, Christiaan > wrote: I've tried with: --LIBS=/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a -lstdc++\\ This is likely connected to the problem below, but I would have to see the log. but that doesn't seem to make a difference. 
With the option --with-cxx=0 the configure part does work(!), but then I get **************************ERROR************************************* Error during compile, check Linux-x86_64-Intel/lib/petsc/conf/make.log Send it and Linux-x86_64-Intel/lib/petsc/conf/configure.log to petsc-maint at mcs.anl.gov ******************************************************************* Here is the problem: CLINKER /projects/developers/cklaij/ReFRESCO/Dev/trunk/Libs/build/petsc-3.7.4/Linux-x86_64-Intel/lib/libpetsc.so.3.7.4 ld: /cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a(for_init.o): relocation R_X86_64_32 against `.rodata.str1.4' can not be used when making a shared object; recompile with -fPIC /cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a: could not read symbols: Bad value Clearly there is something wrong with the compiler install. However, can you put a libifcore.so in LIBS instead? Matt See the attached log files. Chris dr. ir. Christiaan Klaij | CFD Researcher | Research & Development MARIN | T +31 317 49 33 44 | C.Klaij at marin.nl | www.marin.nl [LinkedIn] [YouTube] [Twitter] [Facebook] MARIN news: MARIN Report 119: highlighting naval research projects ________________________________ From: Matthew Knepley > Sent: Wednesday, January 04, 2017 1:43 PM To: Klaij, Christiaan Cc: petsc-users; Satish Balay Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 On Wed, Jan 4, 2017 at 4:32 AM, Klaij, Christiaan > wrote: Satish, I tried your suggestion: --with-clib-autodetect=0 --with-fortranlib-autodetect=0 --with-cxxlib-autodetect=0 LIBS=LIBS=/path_to/libifcore.a I guess I don't really need "LIBS= " twice (?) 
so I've used this line:

LIBS=/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a

Unfortunately, this approach also fails (attached log):

Ah, this error is much easier:

Executing: mpif90 -o /tmp/petsc-3GfeyZ/config.compilers/conftest -fPIC -g -O3 /tmp/petsc-3GfeyZ/config.compilers/conftest.o /tmp/petsc-3GfeyZ/config.compilers/cxxobj.o /tmp/petsc-3GfeyZ/config.compilers/confc.o -ldl /cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a
Possible ERROR while running linker: exit code 256
stderr:
/tmp/petsc-3GfeyZ/config.compilers/cxxobj.o:(.gnu.linkonce.d.DW.ref.__gxx_personality_v0+0x0): undefined reference to `__gxx_personality_v0'

Intel was lazy writing its C++ compiler, so it uses some of g++. If you want to use C++, you will need to add -lstdc++ to your LIBS variable (I think). Otherwise, please turn it off using --with-cxx=0.

Thanks,

Matt

*******************************************************************************
UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
-------------------------------------------------------------------------------
Fortran could not successfully link C++ objects
*******************************************************************************

There are multiple libifcore.a in the intel compiler lib: one in intel64_lin and one in intel64_lin_mic. Tried both, got same error.

Chris

dr. ir. Christiaan Klaij | CFD Researcher | Research & Development
MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl
MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of-uRANS-and-BEMBEM-for-propeller-pressure-pulse-prediction.htm

________________________________________
From: Satish Balay
Sent: Tuesday, January 03, 2017 4:37 PM
To: Klaij, Christiaan
Cc: petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157

Do you have similar issues with gnu compilers?
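Matt's advice above boils down to one of two configure invocations. A sketch (the Intel path is site-specific, and whether -lstdc++ alone suffices is, as he says, uncertain):

```shell
# (a) keep C++ support and add the GNU C++ runtime alongside the static ifcore:
./configure \
  'LIBS=/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin/libifcore.a -lstdc++' \
  ...

# (b) or disable C++ entirely, sidestepping the __gxx_personality_v0 reference:
./configure --with-cxx=0 ...
```

Here "..." stands for the remaining site-specific configure options.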
It must be some incompatibility with intel compilers with this glibc change.

>>>>>>>>>
compilers: Check that C libraries can be used from Fortran
Pushing language FC
Popping language FC
Pushing language FC
Popping language FC
Pushing language FC
Popping language FC
**** Configure header /tmp/petsc-rOjdnN/confdefs.h ****
<<<<<<<<<<

There is a bug in configure [Matt?] that eats away some of the log - so I don't see the exact error you are getting.

If standalone mpicc/mpif90 etc work - then you can try the following additional options:

--with-clib-autodetect=0 --with-fortranlib-autodetect=0 --with-cxxlib-autodetect=0 LIBS=LIBS=/path_to/libifcore.a

[replace "path_to" with the correct path to the ifort libifcore.a library]

Note: I have a RHEL7 box with this glibc - and I don't see this issue.

>>>>
-bash-4.2$ cat /etc/redhat-release
Red Hat Enterprise Linux Server release 7.3 (Maipo)
-bash-4.2$ rpm -q glibc
glibc-2.17-157.el7_3.1.x86_64
glibc-2.17-157.el7_3.1.i686
-bash-4.2$ mpiicc --version
icc (ICC) 17.0.0 20160721
Copyright (C) 1985-2016 Intel Corporation. All rights reserved.
-bash-4.2$
<<<<

Satish

On Tue, 3 Jan 2017, Klaij, Christiaan wrote:

> I've been using petsc-3.7.4 with intel mpi and compilers,
> superlu_dist, metis and parmetis on a cluster running
> SL7. Everything was working fine until SL7 got an update where
> glibc was upgraded from 2.17-106 to 2.17-157.
>
> This update seemed to have broken (at least) parmetis: the
> standalone binary gpmetis started to give a segmentation
> fault. The core dump shows this:
>
> Core was generated by `gpmetis'.
> Program terminated with signal 11, Segmentation fault.
> #0 0x00002aaaac6b865e in memmove () from /lib64/libc.so.6
>
> That's when I decided to recompile, but to my surprise I cannot
> even get past the configure stage (log attached)!
> > ******************************************************************************* > UNABLE to EXECUTE BINARIES for ./configure > ------------------------------------------------------------------------------- > Cannot run executables created with FC. If this machine uses a batch system > to submit jobs you will need to configure using ./configure with the additional option --with-batch. > Otherwise there is problem with the compilers. Can you compile and run code with your compiler 'mpif90'? > See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf > ******************************************************************************* > > Note the following: > > 1) Configure was done with the exact same options that worked > fine before the update of SL7. > > 2) The intel mpi and compilers are exactly the same as before the > update of SL7. > > 3) The cluster does not require a batch system to run code. > > 4) I can compile and run code with mpif90 on this cluster. > > 5) The problem also occurs on a workstation running SL7. > > Any clues on how to proceed? > Chris > > > dr. ir. Christiaan Klaij | CFD Researcher | Research & Development > MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl > > MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of-uRANS-and-BEMBEM-for-propeller-pressure-pulse-prediction.htm > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: configure.log Type: text/x-log Size: 155212 bytes Desc: configure.log URL:

From knepley at gmail.com Wed Jan 4 09:37:46 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Wed, 4 Jan 2017 09:37:46 -0600
Subject: [petsc-users] problems after glibc upgrade to 2.17-157
In-Reply-To: <1483543177535.33715@marin.nl>
References: <1483455899357.86673@marin.nl> <1483525971870.90369@marin.nl> <1483537021972.79609@marin.nl> <1483541610596.23482@marin.nl> <1483542319358.30478@marin.nl> <1483543177535.33715@marin.nl>
Message-ID: On Wed, Jan 4, 2017 at 9:19 AM, Klaij, Christiaan wrote:

> Attached is the log for
>
>
> LIBS="-L/cm/shared/apps/intel/compilers_and_libraries_2016.
> 3.210/linux/compiler/lib/intel64_lin -lifcore" > Something is strange with the quotes in this shell. Can you use this instead LIBS=[-L/cm/shared/apps/intel/compilers_and_libraries_2016. 3.210/linux/compiler/lib/intel64_lin,-lifcore] Thanks, Matt > No luck there. > > > Chris > > > > dr. ir. Christiaan Klaij | CFD Researcher | Research & Development > MARIN | T +31 317 49 33 44 <+31%20317%20493%20344> | C.Klaij at marin.nl | > www.marin.nl > > [image: LinkedIn] [image: > YouTube] [image: Twitter] > [image: Facebook] > > MARIN news: MARIN Report 119: highlighting naval research projects > > > ------------------------------ > *From:* Klaij, Christiaan > *Sent:* Wednesday, January 04, 2017 4:05 PM > > *To:* Matthew Knepley > *Cc:* petsc-users; Satish Balay > *Subject:* Re: [petsc-users] problems after glibc upgrade to 2.17-157 > > > By the way, petsc did compile and install metis and parmetis succesfully > before the make error. However, running the newly compiled gpmetis program > gives the same segmentation fault! So the original problem was not solved > by recompiling, unfortunately. > > > Chris > > > ------------------------------ > *From:* Klaij, Christiaan > *Sent:* Wednesday, January 04, 2017 3:53 PM > *To:* Matthew Knepley > *Cc:* petsc-users; Satish Balay > *Subject:* Re: [petsc-users] problems after glibc upgrade to 2.17-157 > > > So how would I do that? Does LIBS= accept spaces in the > string? Something like this perhaps: > > > LIBS="-L/cm/shared/apps/intel/compilers_and_libraries_2016. > 3.210/linux/compiler/lib/intel64_lin -lifcore" > > > But I'm starting to believe that my intel install is somehow broken. I'm > getting these intel compilers from rpm's provided by our cluster vendor. On > a workstation I can try yum remove and install of the intel packages. Not > so easy on a production cluster. Is this worth a try? Or will it just > copy/paste the same broken (?) stuff in the same place? 
> > > Chris > > > ------------------------------ > *From:* Matthew Knepley > *Sent:* Wednesday, January 04, 2017 3:13 PM > *To:* Klaij, Christiaan > *Cc:* petsc-users; Satish Balay > *Subject:* Re: [petsc-users] problems after glibc upgrade to 2.17-157 > > On Wed, Jan 4, 2017 at 7:37 AM, Klaij, Christiaan > wrote: > >> I've tried with: >> >> >> --LIBS=/cm/shared/apps/intel/compilers_and_libraries_2016.3 >> .210/linux/compiler/lib/intel64_lin/libifcore.a -lstdc++\\ >> > This is likely connected to the problem below, but I would have to see the > log. > >> but that doesn't seem to make a difference. >> >> >> With the option --with-cxx=0 the configure part does work(!), but then I >> get >> >> >> **************************ERROR************************************* >> Error during compile, check Linux-x86_64-Intel/lib/petsc/conf/make.log >> Send it and Linux-x86_64-Intel/lib/petsc/conf/configure.log to >> petsc-maint at mcs.anl.gov >> ******************************************************************* >> > Here is the problem: > > CLINKER /projects/developers/cklaij/ReFRESCO/Dev/trunk/Libs/build/ > petsc-3.7.4/Linux-x86_64-Intel/lib/libpetsc.so.3.7.4 > ld: /cm/shared/apps/intel/compilers_and_libraries_2016. > 3.210/linux/compiler/lib/intel64_lin/libifcore.a(for_init.o): relocation > R_X86_64_32 against `.rodata.str1.4' can not be used when making a shared > object; recompile with -fPIC > /cm/shared/apps/intel/compilers_and_libraries_2016. > 3.210/linux/compiler/lib/intel64_lin/libifcore.a: could not read symbols: > Bad value > > Clearly there is something wrong with the compiler install. > > However, can you put a libifcore.so in LIBS instead? > > Matt > >> See the attached log files. >> >> >> Chris >> >> >> >> dr. ir. 
Christiaan Klaij | CFD Researcher | Research & Development >> MARIN | T +31 317 49 33 44 <+31%20317%20493%20344> | C.Klaij at marin.nl | >> www.marin.nl >> >> [image: LinkedIn] [image: >> YouTube] [image: Twitter] >> [image: Facebook] >> >> MARIN news: MARIN Report 119: highlighting naval research projects >> >> >> ------------------------------ >> *From:* Matthew Knepley >> *Sent:* Wednesday, January 04, 2017 1:43 PM >> *To:* Klaij, Christiaan >> *Cc:* petsc-users; Satish Balay >> *Subject:* Re: [petsc-users] problems after glibc upgrade to 2.17-157 >> >> On Wed, Jan 4, 2017 at 4:32 AM, Klaij, Christiaan >> wrote: >> >>> Satish, >>> >>> I tried your suggestion: >>> >>> --with-clib-autodetect=0 --with-fortranlib-autodetect=0 >>> --with-cxxlib-autodetect=0 LIBS=LIBS=/path_to/libifcore.a >>> >>> I guess I don't really need "LIBS= " twice (?) so I've used this line: >>> >>> LIBS=/cm/shared/apps/intel/compilers_and_libraries_2016.3.21 >>> 0/linux/compiler/lib/intel64_lin/libifcore.a >>> >>> Unfortunately, this approach also fails (attached log): >>> >> >> Ah, this error is much easier: >> >> Executing: mpif90 -o /tmp/petsc-3GfeyZ/config.compilers/conftest >> -fPIC -g -O3 /tmp/petsc-3GfeyZ/config.compilers/conftest.o >> /tmp/petsc-3GfeyZ/config.compilers/cxxobj.o >> /tmp/petsc-3GfeyZ/config.compilers/confc.o -ldl >> /cm/shared/apps/intel/compilers_and_libraries_2016.3.210/ >> linux/compiler/lib/intel64_lin/libifcore.a >> Possible ERROR while running linker: exit code 256 >> stderr: >> /tmp/petsc-3GfeyZ/config.compilers/cxxobj.o:(.gnu.linkonce. >> d.DW.ref.__gxx_personality_v0+0x0): undefined reference to >> `__gxx_personality_v0' >> >> Intel as lazy writing its C++ compiler, so it uses some of g++. If you >> want to use C++, you will need to add -lstdc++ to your LIBS variable (I >> think). >> Otherwise, please turn it off using --with-cxx=0. 
>> >> Thanks, >> >> Matt >> >> >>> ************************************************************ >>> ******************* >>> UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log >>> for details): >>> ------------------------------------------------------------ >>> ------------------- >>> Fortran could not successfully link C++ objects >>> ************************************************************ >>> ******************* >>> >>> There are multiple libifcore.a in the intel compiler lib: one in >>> intel64_lin and one in intel64_lin_mic. Tried both, got same error. >>> >>> Chris >>> >>> >>> >>> dr. ir. Christiaan Klaij | CFD Researcher | Research & Development >>> MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | >>> http://www.marin.nl >>> >>> MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of-uRANS- >>> and-BEMBEM-for-propeller-pressure-pulse-prediction.htm >>> >>> ________________________________________ >>> From: Satish Balay >>> Sent: Tuesday, January 03, 2017 4:37 PM >>> To: Klaij, Christiaan >>> Cc: petsc-users at mcs.anl.gov >>> Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 >>> >>> Do you have similar issues with gnu compilers? >>> >>> It must be some incompatibility with intel compilers with this glibc >>> change. >>> >>> >>>>>>>>> >>> compilers: Check that C libraries can be used from Fortran >>> Pushing language FC >>> Popping language FC >>> Pushing language FC >>> Popping language FC >>> Pushing language FC >>> Popping language FC >>> **** Configure header /tmp/petsc-rOjdnN/confdefs.h **** >>> <<<<<<<<<< >>> >>> Thre is a bug in configure [Matt?] that eats away some of the log - so >>> I don't see the exact error you are getting. 
>>> >>> If standalone micc/mpif90 etc work - then you can try the following >>> additional options: >>> >>> --with-clib-autodetect=0 --with-fortranlib-autodetect=0 >>> --with-cxxlib-autodetect=0 LIBS=LIBS=/path_to/libifcore.a >>> >>> [replace "path_to" with the correct path to the ifort lubifcore.a >>> library] >>> >>> Note: I have a RHEL7 box with this glibc - and I don't see this issue. >>> >>> >>>> >>> -bash-4.2$ cat /etc/redhat-release >>> Red Hat Enterprise Linux Server release 7.3 (Maipo) >>> -bash-4.2$ rpm -q glibc >>> glibc-2.17-157.el7_3.1.x86_64 >>> glibc-2.17-157.el7_3.1.i686 >>> -bash-4.2$ mpiicc --version >>> icc (ICC) 17.0.0 20160721 >>> Copyright (C) 1985-2016 Intel Corporation. All rights reserved. >>> >>> -bash-4.2$ >>> <<<< >>> >>> Satish >>> >>> On Tue, 3 Jan 2017, Klaij, Christiaan wrote: >>> >>> > >>> > I've been using petsc-3.7.4 with intel mpi and compilers, >>> > superlu_dist, metis and parmetis on a cluster running >>> > SL7. Everything was working fine until SL7 got an update where >>> > glibc was upgraded from 2.17-106 to 2.17-157. >>> > >>> > This update seemed to have broken (at least) parmetis: the >>> > standalone binary gpmetis started to give a segmentation >>> > fault. The core dump shows this: >>> > >>> > Core was generated by `gpmetis'. >>> > Program terminated with signal 11, Segmentation fault. >>> > #0 0x00002aaaac6b865e in memmove () from /lib64/libc.so.6 >>> > >>> > That's when I decided to recompile, but to my surprise I cannot >>> > even get past the configure stage (log attached)! >>> > >>> > ************************************************************ >>> ******************* >>> > UNABLE to EXECUTE BINARIES for ./configure >>> > ------------------------------------------------------------ >>> ------------------- >>> > Cannot run executables created with FC. If this machine uses a batch >>> system >>> > to submit jobs you will need to configure using ./configure with the >>> additional option --with-batch. 
>>> > Otherwise there is problem with the compilers. Can you compile and >>> run code with your compiler 'mpif90'? >>> > See http://www.mcs.anl.gov/petsc/documentation/faq.html#libimf >>> > ************************************************************ >>> ******************* >>> > >>> > Note the following: >>> > >>> > 1) Configure was done with the exact same options that worked >>> > fine before the update of SL7. >>> > >>> > 2) The intel mpi and compilers are exactly the same as before the >>> > update of SL7. >>> > >>> > 3) The cluster does not require a batch system to run code. >>> > >>> > 4) I can compile and run code with mpif90 on this cluster. >>> > >>> > 5) The problem also occurs on a workstation running SL7. >>> > >>> > Any clues on how to proceed? >>> > Chris >>> > >>> > >>> > dr. ir. Christiaan Klaij | CFD Researcher | Research & Development >>> > MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | >>> http://www.marin.nl >>> > >>> > MARIN news: http://www.marin.nl/web/News/N >>> ews-items/Comparison-of-uRANS-and-BEMBEM-for-propeller-press >>> ure-pulse-prediction.htm >>> > >>> > >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
From balay at mcs.anl.gov Wed Jan 4 10:25:16 2017
From: balay at mcs.anl.gov (Satish Balay)
Date: Wed, 4 Jan 2017 10:25:16 -0600
Subject: [petsc-users] problems after glibc upgrade to 2.17-157
In-Reply-To: References: <1483455899357.86673@marin.nl> <1483525971870.90369@marin.nl> <1483537021972.79609@marin.nl> <1483541610596.23482@marin.nl> <1483542319358.30478@marin.nl> <1483543177535.33715@marin.nl>
Message-ID: On Wed, 4 Jan 2017, Matthew Knepley wrote:

> On Wed, Jan 4, 2017 at 9:19 AM, Klaij, Christiaan wrote:
>
> > Attached is the log for
> >
> >
> > LIBS="-L/cm/shared/apps/intel/compilers_and_libraries_2016.
> > 3.210/linux/compiler/lib/intel64_lin -lifcore"
> >
> Something is strange with the quotes in this shell. Can you use this
> instead
>
> LIBS=[-L/cm/shared/apps/intel/compilers_and_libraries_2016.
> 3.210/linux/compiler/lib/intel64_lin,-lifcore]

I don't think LIBS accepts this notation.. For eg: http://ftp.mcs.anl.gov/pub/petsc/nightlylogs/archive/2017/01/01/configure_maint_arch-linux-gcc-ifc-cmplx_crank.log

The quotes should work. I don't know why it's messed up. Alternative is:

'LIBS=-L/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin -lifcore'

BTW: Is it easier to use --with-shared-libraries=0? [does this work?]

Satish

From balay at mcs.anl.gov Wed Jan 4 10:58:49 2017
From: balay at mcs.anl.gov (Satish Balay)
Date: Wed, 4 Jan 2017 10:58:49 -0600
Subject: [petsc-users] problems after glibc upgrade to 2.17-157
In-Reply-To: References: <1483455899357.86673@marin.nl> <1483525971870.90369@marin.nl> <1483537021972.79609@marin.nl> <1483541610596.23482@marin.nl> <1483542319358.30478@marin.nl> <1483543177535.33715@marin.nl>
Message-ID: BTW - You have:

Machine platform:
('Linux', 'marclus4login3', '3.10.0-327.36.3.el7.x86_64', '#1 SMP Mon Oct 24 09:16:18 CDT 2016', 'x86_64', 'x86_64')

So glibc is updated - but not kernel - so it's a partial update? [or machine not rebooted after a major upgrade (7.2 -> 7.3) ?]

I wonder if that's the cause of problems..
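Written out as full configure fragments, the alternatives discussed above look as follows. A sketch ("..." stands for the rest of the site-specific options):

```shell
# Double quotes around the value (this should normally work):
./configure LIBS="-L/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin -lifcore" ...

# Or single-quote the entire assignment, as Satish suggests:
./configure 'LIBS=-L/cm/shared/apps/intel/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64_lin -lifcore' ...

# Or sidestep the shared-library link step altogether:
./configure --with-shared-libraries=0 ...
```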
On a CentOS7 box [not exactly SL - but] the following build with "icc (ICC) 16.0.3 20160415" goes through fine [I don't have IMPI here - so used --download-mpich] >>> [centos at el7 ~]$ uname -a Linux el7.novalocal 3.10.0-514.2.2.el7.x86_64 #1 SMP Tue Dec 6 23:06:41 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux [centos at el7 ~]$ rpm -q glibc glibc-2.17-157.el7_3.1.x86_64 [balay at el7 petsc]$ ./configure --download-mpich --with-x=0 --with-mpe=0 --with-debugging=0 --download-superlu_dist --with-blas-lapack-dir=/soft/com/packages/intel/16/u3/mkl --download-parmetis --download-metis <<< Satish From balay at mcs.anl.gov Wed Jan 4 11:57:27 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Wed, 4 Jan 2017 11:57:27 -0600 Subject: [petsc-users] problems after glibc upgrade to 2.17-157 In-Reply-To: References: <1483455899357.86673@marin.nl> <1483525971870.90369@marin.nl> <1483537021972.79609@marin.nl> <1483541610596.23482@marin.nl> <1483542319358.30478@marin.nl> <1483543177535.33715@marin.nl> Message-ID: On Wed, 4 Jan 2017, Satish Balay wrote: > BTW - You have: > > Machine platform: > ('Linux', 'marclus4login3', '3.10.0-327.36.3.el7.x86_64', '#1 SMP Mon Oct 24 09:16:18 CDT 2016', 'x86_64', 'x86_64') > > > So glibc is updated - but not kernel - so its a partial update? [or > machine not rebooted after a major upgrade (7.2 -> 7.3) ?] > > I wonder if thats the cause of problems.. 
> > On a CentOS7 box [not exactly SL - but] the following build with > "icc (ICC) 16.0.3 20160415" goes through fine > [I don't have IMPI here - so used --download-mpich] > > >>> > [centos at el7 ~]$ uname -a > Linux el7.novalocal 3.10.0-514.2.2.el7.x86_64 #1 SMP Tue Dec 6 23:06:41 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux > [centos at el7 ~]$ rpm -q glibc > glibc-2.17-157.el7_3.1.x86_64 > [balay at el7 petsc]$ ./configure --download-mpich --with-x=0 --with-mpe=0 --with-debugging=0 --download-superlu_dist --with-blas-lapack-dir=/soft/com/packages/intel/16/u3/mkl --download-parmetis --download-metis > > <<< Ops - I made a mistake here - didn't specify compilers. I can reproduce this issue on CentOS >>>>>>>> [balay at el7 benchmarks]$ icc sizeof.c [balay at el7 benchmarks]$ ./a.out long double : 16 double : 8 int : 4 char : 1 short : 2 long : 8 long long : 8 int * : 8 size_t : 8 [balay at el7 benchmarks]$ icc sizeof.c -lifcore [balay at el7 benchmarks]$ ./a.out Segmentation fault [balay at el7 benchmarks]$ icc sizeof.c -Bstatic -lifcore -Bdynamic [balay at el7 benchmarks]$ ./a.out long double : 16 double : 8 int : 4 char : 1 short : 2 long : 8 long long : 8 int * : 8 size_t : 8 [balay at el7 benchmarks]$ <<<<<< So I guess your best bet is static libraries.. Satish From balay at mcs.anl.gov Wed Jan 4 12:24:06 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Wed, 4 Jan 2017 12:24:06 -0600 Subject: [petsc-users] problems after glibc upgrade to 2.17-157 In-Reply-To: References: <1483455899357.86673@marin.nl> <1483525971870.90369@marin.nl> <1483537021972.79609@marin.nl> <1483541610596.23482@marin.nl> <1483542319358.30478@marin.nl> <1483543177535.33715@marin.nl> Message-ID: On Wed, 4 Jan 2017, Satish Balay wrote: > So I guess your best bet is static libraries.. Or upgrade to intel-17 compilers. Satish ------- [balay at el7 benchmarks]$ icc --version icc (ICC) 17.0.1 20161005 Copyright (C) 1985-2016 Intel Corporation. All rights reserved. 
[balay at el7 benchmarks]$ icc sizeof.c -lifcore
[balay at el7 benchmarks]$ ldd a.out
        linux-vdso.so.1 => (0x00007ffed4b02000)
        libifcore.so.5 => /soft/com/packages/intel/17/u1/compilers_and_libraries_2017.1.132/linux/compiler/lib/intel64/libifcore.so.5 (0x00007f5a65430000)
        libm.so.6 => /lib64/libm.so.6 (0x00007f5a65124000)
        libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00007f5a64f0e000)
        libc.so.6 => /lib64/libc.so.6 (0x00007f5a64b4c000)
        libdl.so.2 => /lib64/libdl.so.2 (0x00007f5a64948000)
        libimf.so => /soft/com/packages/intel/17/u1/compilers_and_libraries_2017.1.132/linux/compiler/lib/intel64/libimf.so (0x00007f5a6445c000)
        libsvml.so => /soft/com/packages/intel/17/u1/compilers_and_libraries_2017.1.132/linux/compiler/lib/intel64/libsvml.so (0x00007f5a63550000)
        libintlc.so.5 => /soft/com/packages/intel/17/u1/compilers_and_libraries_2017.1.132/linux/compiler/lib/intel64/libintlc.so.5 (0x00007f5a632e6000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f5a65793000)
[balay at el7 benchmarks]$ ./a.out
long double : 16
double : 8
int : 4
char : 1
short : 2
long : 8
long long : 8
int * : 8
size_t : 8
[balay at el7 benchmarks]$

From mlohry at princeton.edu Wed Jan 4 13:26:12 2017
From: mlohry at princeton.edu (Mark W. Lohry)
Date: Wed, 4 Jan 2017 19:26:12 +0000
Subject: [petsc-users] TSPseudo overriding SNES iterations
Message-ID:

I have an unsteady problem I'm trying to solve for steady state. The regular time-accurate stepping works fine (uses around 5 Newton iterations with 100 Krylov iterations each per time step) with beuler stepping.

But when changing only TSType to pseudo it looks like SNES max iterations is getting set to 1, and each pseudo time step then only does a single Newton step and then throws SNES CONVERGED_ITS 1 despite setting SNESSetTolerances to allow 50 Newton steps.

I'm trying to use all the same configuration here that works for backward Euler, but just continually increase the step size each time step. What am I missing here?
Thanks,
Mark

-------------- next part --------------
An HTML attachment was scrubbed... URL:

From bsmith at mcs.anl.gov Wed Jan 4 13:50:51 2017
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Wed, 4 Jan 2017 13:50:51 -0600
Subject: [petsc-users] TSPseudo overriding SNES iterations
In-Reply-To: References: Message-ID: <16853413-5D3D-4AD8-AE71-93F0F71C8AF7@mcs.anl.gov>

Mark,

This happens because some distinguished PETSc developer believes that by definition Pseudo transient continuation should not do multiple Newton iterations per "time step". Thus this developer defaulted the nonlinear solver in TSPSEUDO to be KSPONLY which, since it only does a linear solve, does not support multiple Newton steps even if you ask for multiple steps.

To get the effect you want, and what, IMHO, many people may want, you need to run with -snes_type newtonls. Please let us know if this does not work.

Sorry for the confusion,

Barry

> On Jan 4, 2017, at 1:26 PM, Mark W. Lohry wrote: > > I have an unsteady problem I'm trying to solve for steady state. The regular time-accurate stepping works fine (uses around 5 Newton iterations with 100 Krylov iterations each per time step) with beuler stepping. > > > But when changing only TSType to pseudo it looks like SNES max iterations is getting set to 1, and each pseudo time step then only does a single Newton step and then throws SNES CONVERGED_ITS 1 despite setting SNESSetTolerances to allow 50 Newton steps. > > I'm trying to use all the same configuration here that works for backward Euler, but just continually increase the step size each time step. What am I missing here? > > Thanks, > Mark

From mlohry at princeton.edu Wed Jan 4 14:09:47 2017
From: mlohry at princeton.edu (Mark W. Lohry)
Date: Wed, 4 Jan 2017 20:09:47 +0000
Subject: [petsc-users] TSPseudo overriding SNES iterations
Message-ID:

Thanks for quick reply Barry, that did the trick.
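Barry's fix can be applied entirely from the command line. A sketch, assuming a hypothetical application binary ./app that calls TSSetFromOptions():

```shell
mpiexec -n 2 ./app -ts_type pseudo -snes_type newtonls \
  -snes_max_it 50 -ts_view
```

As Barry notes later in the thread, adding -ts_view confirms which SNES type is actually in effect.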
While I respect the belief of the anonymous distinguished developer, IMHO it would probably be better for things in TS to not override things in SNES unless necessary. On Jan 4, 2017 2:50 PM, Barry Smith wrote: Mark, This happens because some distinguished PETSc developer believes that by definition Pseudo transient continuation should not do multiple Newton iterations per "time step". Thus this developer defaulted the nonlinear solver in TSPSEUDO to be KSPONLY which, since it only does a linear solver does not support multiple Newton steps even if you ask for multiple steps. To get the effect you want, and what, IMHO many people may want, you need to run with -snes_type newtonls Please let us know if this does not work. Sorry for the confusion, Barry > On Jan 4, 2017, at 1:26 PM, Mark W. Lohry wrote: > > I have an unsteady problem I'm trying to solve for steady state. The regular time-accurate stepping works fine (uses around 5 Newton iterations with 100 krylov iterations each per time step) with beuler stepping. > > > But when changing only TSType to pseudo it looks like SNES max iterations is getting set to 1, and each pseduo time step then only does a single Newton step and then throws SNES CONVERGED_ITS 1 despite setting snessettolerances to allow 50 Newton steps. > > I'm trying to use all the same configuration here that works for backward Euler, but just continually increase the step size each time step. What am I missing here? > > Thanks, > Mark -------------- next part -------------- An HTML attachment was scrubbed... 
URL:

From bsmith at mcs.anl.gov Wed Jan 4 14:22:16 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 4 Jan 2017 14:22:16 -0600 Subject: [petsc-users] TSPseudo overriding SNES iterations In-Reply-To: References: Message-ID: <30DCF02D-7108-4DEC-A9F3-DE18DB9B0435@mcs.anl.gov>

Here is the code in TSCreate_Pseudo():

  ierr = TSGetSNES(ts,&snes);CHKERRQ(ierr);
  ierr = SNESGetType(snes,&stype);CHKERRQ(ierr);
  if (!stype) {ierr = SNESSetType(snes,SNESKSPONLY);CHKERRQ(ierr);}

It doesn't really "override" things in SNES; it just sets a different default SNES solver. Because of the complexity of the defaults in different circumstances, it is really helpful to run with -ts_view (or -snes_view, etc.) to see what solver is actually being used, as opposed to what solver you think is being used. Like you, I would assume that NEWTONLS is being used with TSPSEUDO, but as the other developer noted, in the literature the "single Newton step" approach seems to be the "community standard", and it could be that using a single Newton step does lead to the fastest solution time. Barry

> On Jan 4, 2017, at 2:09 PM, Mark W. Lohry wrote: > > Thanks for the quick reply, Barry; that did the trick. > > While I respect the belief of the anonymous distinguished developer, IMHO it would probably be better for things in TS not to override things in SNES unless necessary. > > On Jan 4, 2017 2:50 PM, Barry Smith wrote: > > Mark, > > This happens because some distinguished PETSc developer believes that, by definition, pseudo-transient continuation should not do multiple Newton iterations per "time step". Thus this developer defaulted the nonlinear solver in TSPSEUDO to SNESKSPONLY which, since it only does a single linear solve, does not support multiple Newton steps even if you ask for them. > > To get the effect you want, and what, IMHO, many people may want, you need to run with -snes_type newtonls. Please let us know if this does not work.
> > Sorry for the confusion, > > Barry > > > > On Jan 4, 2017, at 1:26 PM, Mark W. Lohry wrote: > > > I have an unsteady problem I'm trying to solve for steady state. The regular time-accurate stepping works fine (uses around 5 Newton iterations with 100 Krylov iterations each per time step) with beuler stepping. > > > > > > But when changing only TSType to pseudo, it looks like SNES max iterations is getting set to 1, and each pseudo time step then does only a single Newton step and then reports SNES CONVERGED_ITS 1, despite setting SNESSetTolerances to allow 50 Newton steps. > > > > I'm trying to use all the same configuration here that works for backward Euler, but just continually increase the step size each time step. What am I missing here? > > > > Thanks, > > Mark > >

From jeremy at seamplex.com Wed Jan 4 15:13:56 2017 From: jeremy at seamplex.com (Jeremy Theler) Date: Wed, 04 Jan 2017 18:13:56 -0300 Subject: [petsc-users] pc_gamg_threshol Message-ID: <1483564436.1134.3.camel@seamplex.com>

Hi! Any reference to what -pc_gamg_threshold means and/or does? -- Jeremy Theler www.seamplex.com

From jeremy at seamplex.com Wed Jan 4 15:16:44 2017 From: jeremy at seamplex.com (Jeremy Theler) Date: Wed, 04 Jan 2017 18:16:44 -0300 Subject: [petsc-users] pc_gamg_threshol In-Reply-To: <1483564436.1134.3.camel@seamplex.com> References: <1483564436.1134.3.camel@seamplex.com> Message-ID: <1483564604.1134.4.camel@seamplex.com>

* Any reference to what pc_gamg_threshold means and/or does? On Wed, 2017-01-04 at 18:13 -0300, Jeremy Theler wrote: > Hi! Any reference to what -pc_gamg_threshold means and/or does?
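[Editorial note for the archive: conceptually, a threshold-based graph filter of the kind -pc_gamg_threshold controls drops couplings a_ij that are small relative to sqrt(|a_ii * a_jj|) before aggregates are formed. The pure-Python sketch below is only an illustration of that idea — the function name, dense-matrix form, and exact criterion are assumptions; the authoritative criterion is in PETSc's PCGAMGFilterGraph source, linked in the reply that follows.]

```python
def filter_graph(A, threshold):
    """Keep entry (i, j) only if it is a 'strong' coupling:
    |a_ij|^2 > threshold^2 * |a_ii * a_jj|.  Diagonal entries are
    always kept.  Toy dense version; the real code operates on the
    sparse-matrix graph."""
    n = len(A)
    filtered = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j or A[i][j] ** 2 > threshold ** 2 * abs(A[i][i] * A[j][j]):
                filtered[i][j] = A[i][j]
    return filtered

# A 1-D Laplacian-like matrix with one weak long-range coupling (0.01):
A = [[2.0, -1.0, 0.01],
     [-1.0, 2.0, -1.0],
     [0.01, -1.0, 2.0]]
B = filter_graph(A, 0.05)  # the 0.01 couplings fall below the threshold
```

With threshold 0, nothing is filtered; larger thresholds prune more of the graph and change how aggressively GAMG coarsens.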
> From bsmith at mcs.anl.gov Wed Jan 4 15:33:53 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 4 Jan 2017 15:33:53 -0600 Subject: [petsc-users] pc_gamg_threshol In-Reply-To: <1483564604.1134.4.camel@seamplex.com> References: <1483564436.1134.3.camel@seamplex.com> <1483564604.1134.4.camel@seamplex.com> Message-ID:

The manual page gives a high-level description: http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCGAMGSetThreshold.html The exact details can be found in the code here: http://www.mcs.anl.gov/petsc/petsc-dev/src/ksp/pc/impls/gamg/util.c.html#PCGAMGFilterGraph I'm adding a link from the former to the latter in the documentation. Barry

> On Jan 4, 2017, at 3:16 PM, Jeremy Theler wrote: > > * Any reference to what pc_gamg_threshold means and/or does? > > > > On Wed, 2017-01-04 at 18:13 -0300, Jeremy Theler wrote: >> Hi! Any reference to what -pc_gamg_threshold means and/or does? >> >

From niko.karin at gmail.com Wed Jan 4 15:39:44 2017 From: niko.karin at gmail.com (Karin&NiKo) Date: Wed, 4 Jan 2017 22:39:44 +0100 Subject: [petsc-users] Fwd: Fieldsplit with sub pc MUMPS in parallel In-Reply-To: References: Message-ID:

Dear PETSc team, I am (still) trying to solve Biot's poroelasticity problem: [image: inline image 1] I am using a mixed P2-P1 finite element discretization. The matrix of the discretized system in binary format is attached to this email. I am using the fieldsplit framework to solve the linear system. Since I am facing some troubles, I have decided to go back to simple things.
Here are the options I am using:

-ksp_rtol 1.0e-5 -ksp_type fgmres
-pc_type fieldsplit -pc_fieldsplit_type schur
-pc_fieldsplit_schur_factorization_type full
-pc_fieldsplit_schur_precondition selfp
-fieldsplit_0_pc_type lu -fieldsplit_0_pc_factor_mat_solver_package mumps
-fieldsplit_0_ksp_type preonly -fieldsplit_0_ksp_converged_reason
-fieldsplit_1_pc_type lu -fieldsplit_1_pc_factor_mat_solver_package mumps
-fieldsplit_1_ksp_type preonly -fieldsplit_1_ksp_converged_reason

On a single proc, everything runs fine: the solver converges in 3 iterations, according to the theory (see Run-1-proc.txt [contains -log_view]). On 2 procs, the solver converges in 28 iterations (see Run-2-proc.txt). On 3 procs, the solver converges in 91 iterations (see Run-3-proc.txt). I do not understand this behavior: since MUMPS is a parallel direct solver, shouldn't the solver converge in at most 3 iterations regardless of the number of procs? Thanks for your valuable help, Nicolas -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed...
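[Editorial note for the archive: the options above request the full Schur factorization, which rests on the standard block identity

```latex
\begin{pmatrix} A_{00} & A_{01} \\ A_{10} & A_{11} \end{pmatrix}
=
\begin{pmatrix} I & 0 \\ A_{10} A_{00}^{-1} & I \end{pmatrix}
\begin{pmatrix} A_{00} & 0 \\ 0 & S \end{pmatrix}
\begin{pmatrix} I & A_{00}^{-1} A_{01} \\ 0 & I \end{pmatrix},
\qquad
S = A_{11} - A_{10} A_{00}^{-1} A_{01}.
```

With -pc_fieldsplit_schur_precondition selfp, the Schur-complement solve is preconditioned not by S itself but by an assembled approximation S_p built from the inverse of A00's diagonal (the KSP view in the attached logs states this explicitly: "formed from Sp, an assembled approximation to S, which uses ... A00's diagonal's inverse"). So even with exact MUMPS inner solves, the outer FGMRES iteration count is governed by how well S_p approximates S.]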
Name: image.png Type: image/png Size: 9086 bytes Desc: not available URL: -------------- next part -------------- 0 KSP unpreconditioned resid norm 2.375592557658e+10 true resid norm 2.375592557658e+10 ||r(i)||/||b|| 1.000000000000e+00 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 1 KSP unpreconditioned resid norm 6.196553125161e+04 true resid norm 6.196553125161e+04 ||r(i)||/||b|| 2.608424203547e-06 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 2 KSP unpreconditioned resid norm 2.675633869591e+03 true resid norm 2.675633869591e+03 ||r(i)||/||b|| 1.126301671962e-07 KSP Object: 1 MPI processes type: fgmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=1000, initial guess is zero tolerances: relative=1e-06, absolute=1e-50, divergence=10000. right preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: 1 MPI processes type: fieldsplit FieldSplit with Schur preconditioner, factorization FULL Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse Split info: Split number 0 Defined by IS Split number 1 Defined by IS KSP solver for A00 block KSP Object: (fieldsplit_0_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_0_) 1 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: nd factor fill ratio given 0., needed 0. Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=624, cols=624 package used to perform factorization: mumps total: nonzeros=140616, allocated nonzeros=140616 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 0 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 1 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 0 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 0 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -32 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. 
RINFO(1) (local estimated flops for the elimination after analysis): [0] 1.99982e+07 RINFO(2) (local estimated flops for the assembly after factorization): [0] 153549. RINFO(3) (local estimated flops for the elimination after factorization): [0] 1.99982e+07 INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 3 INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 3 INFO(23) (num of pivots eliminated on this processor after factorization): [0] 624 RINFOG(1) (global estimated flops for the elimination after analysis): 1.99982e+07 RINFOG(2) (global estimated flops for the assembly after factorization): 153549. RINFOG(3) (global estimated flops for the elimination after factorization): 1.99982e+07 (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) INFOG(3) (estimated real workspace for factors on all processors after analysis): 140616 INFOG(4) (estimated integer workspace for factors on all processors after analysis): 4995 INFOG(5) (estimated maximum front size in the complete tree): 252 INFOG(6) (number of nodes in the complete tree): 23 INFOG(7) (ordering option effectively use after analysis): 2 INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 140616 INFOG(10) (total integer space store the matrix factors after factorization): 4995 INFOG(11) (order of largest frontal matrix after factorization): 252 INFOG(12) (number of off-diagonal pivots): 0 INFOG(13) (number of delayed pivots after factorization): 0 INFOG(14) (number of memory compress after factorization): 0 INFOG(15) (number of steps of iterative refinement after solution): 0 INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 3 INFOG(17) (estimated size of all MUMPS internal data for factorization after 
analysis: sum over all processors): 3 INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 3 INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 3 INFOG(20) (estimated number of entries in the factors): 140616 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 3 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 3 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 140616 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 2, 2 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): 7 INFOG(34) (exponent of the determinant if determinant is requested): 0 linear system matrix = precond matrix: Mat Object: (fieldsplit_0_) 1 MPI processes type: seqaij rows=624, cols=624 total: nonzeros=68940, allocated nonzeros=68940 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 208 nodes, limit used is 5 KSP solver for S = A11 - A10 inv(A00) A01 KSP Object: (fieldsplit_1_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_1_) 1 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: nd factor fill ratio given 0., needed 0. 
Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=64, cols=64 package used to perform factorization: mumps total: nonzeros=3584, allocated nonzeros=3584 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 0 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 1 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 0 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 0 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -32 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. RINFO(1) (local estimated flops for the elimination after analysis): [0] 123808. RINFO(2) (local estimated flops for the assembly after factorization): [0] 1024. RINFO(3) (local estimated flops for the elimination after factorization): [0] 123808. 
INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 1 INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 1 INFO(23) (num of pivots eliminated on this processor after factorization): [0] 64 RINFOG(1) (global estimated flops for the elimination after analysis): 123808. RINFOG(2) (global estimated flops for the assembly after factorization): 1024. RINFOG(3) (global estimated flops for the elimination after factorization): 123808. (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) INFOG(3) (estimated real workspace for factors on all processors after analysis): 3584 INFOG(4) (estimated integer workspace for factors on all processors after analysis): 222 INFOG(5) (estimated maximum front size in the complete tree): 48 INFOG(6) (number of nodes in the complete tree): 2 INFOG(7) (ordering option effectively use after analysis): 2 INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3584 INFOG(10) (total integer space store the matrix factors after factorization): 222 INFOG(11) (order of largest frontal matrix after factorization): 48 INFOG(12) (number of off-diagonal pivots): 0 INFOG(13) (number of delayed pivots after factorization): 0 INFOG(14) (number of memory compress after factorization): 0 INFOG(15) (number of steps of iterative refinement after solution): 0 INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 INFOG(20) 
(estimated number of entries in the factors): 3584 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3584 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): 7 INFOG(34) (exponent of the determinant if determinant is requested): 0 linear system matrix followed by preconditioner matrix: Mat Object: (fieldsplit_1_) 1 MPI processes type: schurcomplement rows=64, cols=64 Schur complement A11 - A10 inv(A00) A01 A11 Mat Object: (fieldsplit_1_) 1 MPI processes type: seqaij rows=64, cols=64 total: nonzeros=1000, allocated nonzeros=1000 total number of mallocs used during MatSetValues calls =0 not using I-node routines A10 Mat Object: 1 MPI processes type: seqaij rows=64, cols=624 total: nonzeros=8400, allocated nonzeros=8400 total number of mallocs used during MatSetValues calls =0 not using I-node routines KSP of A00 KSP Object: (fieldsplit_0_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_0_) 1 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: nd factor fill ratio given 0., needed 0. 
Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=624, cols=624 package used to perform factorization: mumps total: nonzeros=140616, allocated nonzeros=140616 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 0 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 1 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 0 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 0 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -32 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. RINFO(1) (local estimated flops for the elimination after analysis): [0] 1.99982e+07 RINFO(2) (local estimated flops for the assembly after factorization): [0] 153549. 
RINFO(3) (local estimated flops for the elimination after factorization): [0] 1.99982e+07 INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 3 INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 3 INFO(23) (num of pivots eliminated on this processor after factorization): [0] 624 RINFOG(1) (global estimated flops for the elimination after analysis): 1.99982e+07 RINFOG(2) (global estimated flops for the assembly after factorization): 153549. RINFOG(3) (global estimated flops for the elimination after factorization): 1.99982e+07 (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) INFOG(3) (estimated real workspace for factors on all processors after analysis): 140616 INFOG(4) (estimated integer workspace for factors on all processors after analysis): 4995 INFOG(5) (estimated maximum front size in the complete tree): 252 INFOG(6) (number of nodes in the complete tree): 23 INFOG(7) (ordering option effectively use after analysis): 2 INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 140616 INFOG(10) (total integer space store the matrix factors after factorization): 4995 INFOG(11) (order of largest frontal matrix after factorization): 252 INFOG(12) (number of off-diagonal pivots): 0 INFOG(13) (number of delayed pivots after factorization): 0 INFOG(14) (number of memory compress after factorization): 0 INFOG(15) (number of steps of iterative refinement after solution): 0 INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 3 INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 3 INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 3 INFOG(19) 
(size of all MUMPS internal data allocated during factorization: sum over all processors): 3 INFOG(20) (estimated number of entries in the factors): 140616 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 3 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 3 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 140616 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 2, 2 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): 7 INFOG(34) (exponent of the determinant if determinant is requested): 0 linear system matrix = precond matrix: Mat Object: (fieldsplit_0_) 1 MPI processes type: seqaij rows=624, cols=624 total: nonzeros=68940, allocated nonzeros=68940 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 208 nodes, limit used is 5 A01 Mat Object: 1 MPI processes type: seqaij rows=624, cols=64 total: nonzeros=8400, allocated nonzeros=8400 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 208 nodes, limit used is 5 Mat Object: 1 MPI processes type: seqaij rows=64, cols=64 total: nonzeros=2744, allocated nonzeros=2744 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 28 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=688, cols=688 total: nonzeros=86740, allocated nonzeros=86740 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 208 nodes, limit used is 
5 ************************************************************************************************************************ *** WIDEN YOUR WINDOW TO 120 CHARACTERS. Use 'enscript -r -fCourier9' to print this document *** ************************************************************************************************************************ ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- on a arch-linux2-c-debug named dsp0780450 with 1 processor, by B07947 Wed Jan 4 22:21:05 2017 Using Petsc Release Version 3.7.2, Jun, 05, 2016 Max Max/Min Avg Total Time (sec): 1.568e-01 1.00000 1.568e-01 Objects: 7.000e+01 1.00000 7.000e+01 Flops: 1.231e+06 1.00000 1.231e+06 1.231e+06 Flops/sec: 7.847e+06 1.00000 7.847e+06 7.847e+06 Memory: 3.043e+06 1.00000 3.043e+06 MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Reductions: 0.000e+00 0.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flops and VecAXPY() for complex vectors of length N --> 8N flops Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 1.5683e-01 100.0% 1.2308e+06 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flops: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. 
len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). %T - percent time in this phase %F - percent flops in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ ########################################################## # # # WARNING!!! # # # # This code was compiled with a debugging option, # # To get timing results run ./configure # # using --with-debugging=no, the performance will # # be generally two or three times faster. # # # ########################################################## Event Count Time (sec) Flops --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage VecMDot 2 1.0 1.0014e-05 1.0 4.12e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 412 VecNorm 7 1.0 2.7895e-05 1.0 9.63e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 345 VecScale 5 1.0 3.2425e-04 1.0 2.19e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 7 VecCopy 10 1.0 4.2439e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 29 1.0 9.0361e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 5 1.0 1.9193e-04 1.0 6.62e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 35 VecAYPX 3 1.0 1.9312e-05 1.0 2.06e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 107 VecMAXPY 5 1.0 3.5524e-05 1.0 1.10e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 310 VecScatterBegin 9 1.0 5.2929e-05 1.0 0.00e+00 0.0 0.0e+00 
0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMult 9 1.0 1.1399e-03 1.0 9.30e+05 1.0 0.0e+00 0.0e+00 0.0e+00 1 76 0 0 0 1 76 0 0 0 816 MatSolve 6 1.0 1.6625e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 MatLUFactorSym 2 1.0 2.3232e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 MatLUFactorNum 2 1.0 7.0572e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 4 0 0 0 0 4 0 0 0 0 0 MatConvert 1 1.0 1.1921e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatScale 2 1.0 9.2983e-05 1.0 1.11e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 120 MatAssemblyBegin 8 1.0 9.7752e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyEnd 8 1.0 4.3797e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRow 128 1.0 1.3161e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRowIJ 2 1.0 1.7309e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetSubMatrice 4 1.0 6.2459e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 4 0 0 0 0 4 0 0 0 0 0 MatGetOrdering 2 1.0 8.4996e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 MatView 6 1.0 2.7837e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 18 0 0 0 0 18 0 0 0 0 0 MatAXPY 1 1.0 6.2799e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMatMult 1 1.0 1.5202e-03 1.0 2.54e+05 1.0 0.0e+00 0.0e+00 0.0e+00 1 21 0 0 0 1 21 0 0 0 167 MatMatMultSym 1 1.0 8.8406e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 MatMatMultNum 1 1.0 6.2108e-04 1.0 2.54e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 21 0 0 0 0 21 0 0 0 409 KSPGMRESOrthog 2 1.0 5.2691e-05 1.0 8.25e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 157 KSPSetUp 3 1.0 1.8287e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSolve 1 1.0 1.5977e-02 1.0 9.65e+05 1.0 0.0e+00 0.0e+00 0.0e+00 10 78 0 0 0 10 78 0 0 0 60 PCSetUp 3 1.0 2.1162e-02 1.0 2.65e+05 1.0 0.0e+00 0.0e+00 0.0e+00 13 22 0 0 0 13 22 0 0 0 13 
PCApply 2 1.0 1.4143e-02 1.0 6.84e+04 1.0 0.0e+00 0.0e+00 0.0e+00 9 6 0 0 0 9 6 0 0 0 5 KSPSolve_FS_0 2 1.0 1.0471e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 KSPSolve_FS_Schu 2 1.0 1.9639e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 KSPSolve_FS_Low 2 1.0 1.0584e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 7 0 0 0 0 7 0 0 0 0 0 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Vector 26 26 172600 0. Vector Scatter 3 3 1992 0. Matrix 11 11 2316912 0. Krylov Solver 5 5 24280 0. Preconditioner 5 5 4840 0. Index Set 15 13 17584 0. Viewer 1 0 0 0. Distributed Mesh 1 1 4624 0. Star Forest Bipartite Graph 2 2 1616 0. Discrete System 1 1 872 0. ======================================================================================================================== Average time to get PetscTime(): 9.53674e-08 #PETSc Option Table entries: -fieldsplit_0_ksp_converged_reason -fieldsplit_0_ksp_type preonly -fieldsplit_0_pc_factor_mat_solver_package mumps -fieldsplit_0_pc_type lu -fieldsplit_1_ksp_converged_reason -fieldsplit_1_ksp_type preonly -fieldsplit_1_pc_factor_mat_solver_package mumps -fieldsplit_1_pc_type lu -ksp_rtol 1.0e-5 -ksp_type fgmres -ksp_view -log_summary -pc_fieldsplit_schur_factorization_type full -pc_fieldsplit_schur_precondition selfp -pc_fieldsplit_type schur -pc_type fieldsplit #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 Configure options: --prefix=/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/Install --with-mpi=yes --with-x=yes 
--download-ml=/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/ml-6.2-p3.tar.gz --with-mumps-lib="-L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Mumps-502_consortium_aster1/MPI/lib -lzmumps -ldmumps -lmumps_common -lpord -L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Scotch_aster-604_aster6/MPI/lib -lesmumps -lptscotch -lptscotcherr -lptscotcherrexit -lscotch -lscotcherr -lscotcherrexit -L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Parmetis_aster-403_aster/lib -lparmetis -L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Metis_aster-510_aster1/lib -lmetis -L/usr/lib -lscalapack-openmpi -L/usr/lib -lblacs-openmpi -lblacsCinit-openmpi -lblacsF77init-openmpi -L/usr/lib/x86_64-linux-gnu -lgomp " --with-mumps-include=/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Mumps-502_consortium_aster1/MPI/include --with-scalapack-lib="-L/usr/lib -lscalapack-openmpi" --with-blacs-lib="-L/usr/lib -lblacs-openmpi -lblacsCinit-openmpi -lblacsF77init-openmpi" --with-blas-lib="-L/usr/lib -lopenblas -lcblas" --with-lapack-lib="-L/usr/lib -llapack"
-----------------------------------------
Libraries compiled on Wed Nov 30 11:59:58 2016 on dsp0780450
Machine characteristics: Linux-3.16.0-4-amd64-x86_64-with-debian-8.6
Using PETSc directory: /home/B07947/dev/codeaster-prerequisites/petsc-3.7.2
Using PETSc arch: arch-linux2-c-debug
-----------------------------------------
Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 ${COPTFLAGS} ${CFLAGS}
Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g ${FOPTFLAGS} ${FFLAGS}
-----------------------------------------
Using include paths: -I/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/arch-linux2-c-debug/include -I/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/include -I/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/include -I/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/arch-linux2-c-debug/include -I/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Mumps-502_consortium_aster1/MPI/include -I/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/Install/include -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi
-----------------------------------------
Using C linker: mpicc
Using Fortran linker: mpif90
Using libraries: -Wl,-rpath,/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/arch-linux2-c-debug/lib -L/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/arch-linux2-c-debug/lib -lpetsc -L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Mumps-502_consortium_aster1/MPI/lib -lzmumps -ldmumps -lmumps_common -lpord -L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Scotch_aster-604_aster6/MPI/lib -lesmumps -lptscotch -lptscotcherr -lptscotcherrexit -lscotch -lscotcherr -lscotcherrexit -L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Parmetis_aster-403_aster/lib -lparmetis -L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Metis_aster-510_aster1/lib -lmetis -L/usr/lib -lscalapack-openmpi -lblacs-openmpi -lblacsCinit-openmpi -lblacsF77init-openmpi -L/usr/lib/x86_64-linux-gnu -lgomp -Wl,-rpath,/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/Install/lib -L/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/Install/lib -lml -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/4.9 -L/usr/lib/gcc/x86_64-linux-gnu/4.9 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -lscalapack-openmpi -llapack -lopenblas -lcblas -lX11 -lssl -lcrypto -lm -lmpi_f90 -lmpi_f77 -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/4.9 -L/usr/lib/gcc/x86_64-linux-gnu/4.9 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu
-Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl -lmpi -lhwloc -lgcc_s -lpthread -ldl
-----------------------------------------
-------------- next part --------------
0 KSP unpreconditioned resid norm 2.375592557658e+10 true resid norm 2.375592557658e+10 ||r(i)||/||b|| 1.000000000000e+00
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
1 KSP unpreconditioned resid norm 1.303564177055e+10 true resid norm 1.303564177055e+10 ||r(i)||/||b|| 5.487322196112e-01
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
2 KSP unpreconditioned resid norm 1.071183507871e+10 true resid norm 1.071183507871e+10 ||r(i)||/||b|| 4.509121332352e-01
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
3 KSP unpreconditioned resid norm 1.055458147519e+10 true resid norm 1.055458147519e+10 ||r(i)||/||b|| 4.442925804412e-01
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
4 KSP unpreconditioned resid norm 6.525228688268e+09 true resid norm 6.525228688268e+09 ||r(i)||/||b|| 2.746779394991e-01
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
5 KSP unpreconditioned resid norm 3.448826301973e+09 true resid norm 3.448826301973e+09 ||r(i)||/||b|| 1.451775175358e-01
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
6 KSP unpreconditioned resid norm 2.198988275290e+09 true resid norm 2.198988275290e+09 ||r(i)||/||b|| 9.256588501262e-02
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
7 KSP unpreconditioned resid norm 2.004030973880e+09 true resid norm 2.004030973880e+09 ||r(i)||/||b|| 8.435920408237e-02
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
8 KSP unpreconditioned resid norm 1.484494627562e+09 true resid norm 1.484494627562e+09 ||r(i)||/||b|| 6.248944596060e-02
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
9 KSP unpreconditioned resid norm 1.349414519794e+09 true resid norm 1.349414519794e+09 ||r(i)||/||b|| 5.680328116218e-02
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
10 KSP unpreconditioned resid norm 1.261837655142e+09 true resid norm 1.261837655142e+09 ||r(i)||/||b|| 5.311675401045e-02
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
11 KSP unpreconditioned resid norm 9.178806138105e+08 true resid norm 9.178806138106e+08 ||r(i)||/||b|| 3.863796469861e-02
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
12 KSP unpreconditioned resid norm 5.407955744874e+08 true resid norm 5.407955744874e+08 ||r(i)||/||b|| 2.276466024210e-02
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
13 KSP unpreconditioned resid norm 5.265237838393e+08 true resid norm 5.265237838393e+08 ||r(i)||/||b|| 2.216389263142e-02
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
14 KSP unpreconditioned resid norm 3.590262609540e+08 true resid norm 3.590262609540e+08 ||r(i)||/||b|| 1.511312450431e-02
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
15 KSP unpreconditioned resid norm 1.793849911732e+08 true resid norm 1.793849911732e+08 ||r(i)||/||b|| 7.551168258837e-03
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
16 KSP unpreconditioned resid norm 1.434526359423e+08 true resid norm 1.434526359423e+08 ||r(i)||/||b|| 6.038604367567e-03
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
17 KSP unpreconditioned resid norm 1.236301089463e+08 true resid norm 1.236301089464e+08 ||r(i)||/||b|| 5.204179839167e-03
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
18 KSP unpreconditioned resid norm 6.088689145868e+07 true resid norm 6.088689145872e+07 ||r(i)||/||b|| 2.563019119691e-03
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
19 KSP unpreconditioned resid norm 3.931575158104e+07 true resid norm 3.931575158114e+07 ||r(i)||/||b|| 1.654987150654e-03
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
20 KSP unpreconditioned resid norm 3.835858071661e+07 true resid norm 3.835858071665e+07 ||r(i)||/||b|| 1.614695272259e-03
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
21 KSP unpreconditioned resid norm 2.077211508217e+07 true resid norm 2.077211508226e+07 ||r(i)||/||b|| 8.743972115632e-04
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
22 KSP unpreconditioned resid norm 8.335358693348e+06 true resid norm 8.335358693410e+06 ||r(i)||/||b|| 3.508749287221e-04
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
23 KSP unpreconditioned resid norm 2.472991237989e+06 true resid norm 2.472991238011e+06 ||r(i)||/||b|| 1.040999741323e-04
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
24 KSP unpreconditioned resid norm 6.209407637559e+05 true resid norm 6.209407637614e+05 ||r(i)||/||b|| 2.613835279791e-05
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
25 KSP unpreconditioned resid norm 1.765314903856e+05 true resid norm 1.765314903114e+05 ||r(i)||/||b|| 7.431050823187e-06
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
26 KSP unpreconditioned resid norm 4.334232432165e+04 true resid norm 4.334232435112e+04 ||r(i)||/||b|| 1.824484767449e-06
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
  Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
27 KSP unpreconditioned resid norm 1.788850623345e+04 true resid norm 1.788850627330e+04 ||r(i)||/||b|| 7.530123890830e-07
KSP Object: 2 MPI processes
  type: fgmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=1000, initial guess is zero
  tolerances: relative=1e-06, absolute=1e-50, divergence=10000.
  right preconditioning
  using UNPRECONDITIONED norm type for convergence test
PC Object: 2 MPI processes
  type: fieldsplit
    FieldSplit with Schur preconditioner, factorization FULL
    Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse
    Split info:
    Split number 0 Defined by IS
    Split number 1 Defined by IS
    KSP solver for A00 block
      KSP Object: (fieldsplit_0_) 2 MPI processes
        type: preonly
        maximum iterations=10000, initial guess is zero
        tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
        left preconditioning
        using NONE norm type for convergence test
      PC Object: (fieldsplit_0_) 2 MPI processes
        type: lu
          LU: out-of-place factorization
          tolerance for zero pivot 2.22045e-14
          matrix ordering: natural
          factor fill ratio given 0., needed 0.
            Factored matrix follows:
              Mat Object: 2 MPI processes
                type: mpiaij
                rows=624, cols=624
                package used to perform factorization: mumps
                total: nonzeros=146498, allocated nonzeros=146498
                total number of mallocs used during MatSetValues calls =0
                  MUMPS run parameters:
                    SYM (matrix type): 0
                    PAR (host participation): 1
                    ICNTL(1) (output for error): 6
                    ICNTL(2) (output of diagnostic msg): 0
                    ICNTL(3) (output for global info): 0
                    ICNTL(4) (level of printing): 0
                    ICNTL(5) (input mat struct): 0
                    ICNTL(6) (matrix prescaling): 7
                    ICNTL(7) (sequentia matrix ordering):7
                    ICNTL(8) (scalling strategy): 77
                    ICNTL(10) (max num of refinements): 0
                    ICNTL(11) (error analysis): 0
                    ICNTL(12) (efficiency control): 1
                    ICNTL(13) (efficiency control): 0
                    ICNTL(14) (percentage of estimated workspace increase): 20
                    ICNTL(18) (input mat struct): 3
                    ICNTL(19) (Shur complement info): 0
                    ICNTL(20) (rhs sparse pattern): 0
                    ICNTL(21) (solution struct): 1
                    ICNTL(22) (in-core/out-of-core facility): 0
                    ICNTL(23) (max size of memory can be allocated locally):0
                    ICNTL(24) (detection of null pivot rows): 0
                    ICNTL(25) (computation of a null space basis): 0
                    ICNTL(26) (Schur options for rhs or solution): 0
                    ICNTL(27) (experimental parameter): -32
                    ICNTL(28) (use parallel or sequential ordering): 1
                    ICNTL(29) (parallel ordering): 0
                    ICNTL(30) (user-specified set of entries in inv(A)): 0
                    ICNTL(31) (factors is discarded in the solve phase): 0
                    ICNTL(33) (compute determinant): 0
                    CNTL(1) (relative pivoting threshold): 0.01
                    CNTL(2) (stopping criterion of refinement): 1.49012e-08
                    CNTL(3) (absolute pivoting threshold): 0.
                    CNTL(4) (value of static pivoting): -1.
                    CNTL(5) (fixation for null pivots): 0.
                    RINFO(1) (local estimated flops for the elimination after analysis):
                      [0] 1.51255e+07
                      [1] 6.89106e+06
                    RINFO(2) (local estimated flops for the assembly after factorization):
                      [0] 87833.
                      [1] 72313.
                    RINFO(3) (local estimated flops for the elimination after factorization):
                      [0] 1.51255e+07
                      [1] 6.89106e+06
                    INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization):
                      [0] 8
                      [1] 8
                    INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization):
                      [0] 8
                      [1] 8
                    INFO(23) (num of pivots eliminated on this processor after factorization):
                      [0] 413
                      [1] 211
                    RINFOG(1) (global estimated flops for the elimination after analysis): 2.20165e+07
                    RINFOG(2) (global estimated flops for the assembly after factorization): 160146.
                    RINFOG(3) (global estimated flops for the elimination after factorization): 2.20165e+07
                    (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0)
                    INFOG(3) (estimated real workspace for factors on all processors after analysis): 146498
                    INFOG(4) (estimated integer workspace for factors on all processors after analysis): 5065
                    INFOG(5) (estimated maximum front size in the complete tree): 263
                    INFOG(6) (number of nodes in the complete tree): 23
                    INFOG(7) (ordering option effectively use after analysis): 2
                    INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100
                    INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 146498
                    INFOG(10) (total integer space store the matrix factors after factorization): 5065
                    INFOG(11) (order of largest frontal matrix after factorization): 263
                    INFOG(12) (number of off-diagonal pivots): 0
                    INFOG(13) (number of delayed pivots after factorization): 0
                    INFOG(14) (number of memory compress after factorization): 0
                    INFOG(15) (number of steps of iterative refinement after solution): 0
                    INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 8
                    INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 16
                    INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 8
                    INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 16
                    INFOG(20) (estimated number of entries in the factors): 146498
                    INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 8
                    INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 15
                    INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0
                    INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1
                    INFOG(25) (after factorization: number of pivots modified by static pivoting): 0
                    INFOG(28) (after factorization: number of null pivots encountered): 0
                    INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 146498
                    INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 2, 3
                    INFOG(32) (after analysis: type of analysis done): 1
                    INFOG(33) (value used for ICNTL(8)): 7
                    INFOG(34) (exponent of the determinant if determinant is requested): 0
        linear system matrix = precond matrix:
        Mat Object: (fieldsplit_0_) 2 MPI processes
          type: mpiaij
          rows=624, cols=624
          total: nonzeros=73470, allocated nonzeros=73470
          total number of mallocs used during MatSetValues calls =0
            using I-node (on process 0) routines: found 112 nodes, limit used is 5
    KSP solver for S = A11 - A10 inv(A00) A01
      KSP Object: (fieldsplit_1_) 2 MPI processes
        type: preonly
        maximum iterations=10000, initial guess is zero
        tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
        left preconditioning
        using NONE norm type for convergence test
      PC Object: (fieldsplit_1_) 2 MPI processes
        type: lu
          LU: out-of-place factorization
          tolerance for zero pivot 2.22045e-14
          matrix ordering: natural
          factor fill ratio given 0., needed 0.
            Factored matrix follows:
              Mat Object: 2 MPI processes
                type: mpiaij
                rows=64, cols=64
                package used to perform factorization: mumps
                total: nonzeros=3038, allocated nonzeros=3038
                total number of mallocs used during MatSetValues calls =0
                  MUMPS run parameters:
                    SYM (matrix type): 0
                    PAR (host participation): 1
                    ICNTL(1) (output for error): 6
                    ICNTL(2) (output of diagnostic msg): 0
                    ICNTL(3) (output for global info): 0
                    ICNTL(4) (level of printing): 0
                    ICNTL(5) (input mat struct): 0
                    ICNTL(6) (matrix prescaling): 7
                    ICNTL(7) (sequentia matrix ordering):7
                    ICNTL(8) (scalling strategy): 77
                    ICNTL(10) (max num of refinements): 0
                    ICNTL(11) (error analysis): 0
                    ICNTL(12) (efficiency control): 1
                    ICNTL(13) (efficiency control): 0
                    ICNTL(14) (percentage of estimated workspace increase): 20
                    ICNTL(18) (input mat struct): 3
                    ICNTL(19) (Shur complement info): 0
                    ICNTL(20) (rhs sparse pattern): 0
                    ICNTL(21) (solution struct): 1
                    ICNTL(22) (in-core/out-of-core facility): 0
                    ICNTL(23) (max size of memory can be allocated locally):0
                    ICNTL(24) (detection of null pivot rows): 0
                    ICNTL(25) (computation of a null space basis): 0
                    ICNTL(26) (Schur options for rhs or solution): 0
                    ICNTL(27) (experimental parameter): -32
                    ICNTL(28) (use parallel or sequential ordering): 1
                    ICNTL(29) (parallel ordering): 0
                    ICNTL(30) (user-specified set of entries in inv(A)): 0
                    ICNTL(31) (factors is discarded in the solve phase): 0
                    ICNTL(33) (compute determinant): 0
                    CNTL(1) (relative pivoting threshold): 0.01
                    CNTL(2) (stopping criterion of refinement): 1.49012e-08
                    CNTL(3) (absolute pivoting threshold): 0.
                    CNTL(4) (value of static pivoting): -1.
                    CNTL(5) (fixation for null pivots): 0.
                    RINFO(1) (local estimated flops for the elimination after analysis):
                      [0] 71763.
                      [1] 15274.
                    RINFO(2) (local estimated flops for the assembly after factorization):
                      [0] 1205.
                      [1] 256.
                    RINFO(3) (local estimated flops for the elimination after factorization):
                      [0] 71763.
                      [1] 15274.
                    INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization):
                      [0] 1
                      [1] 1
                    INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization):
                      [0] 1
                      [1] 1
                    INFO(23) (num of pivots eliminated on this processor after factorization):
                      [0] 52
                      [1] 12
                    RINFOG(1) (global estimated flops for the elimination after analysis): 87037.
                    RINFOG(2) (global estimated flops for the assembly after factorization): 1461.
                    RINFOG(3) (global estimated flops for the elimination after factorization): 87037.
                    (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0)
                    INFOG(3) (estimated real workspace for factors on all processors after analysis): 3038
                    INFOG(4) (estimated integer workspace for factors on all processors after analysis): 318
                    INFOG(5) (estimated maximum front size in the complete tree): 45
                    INFOG(6) (number of nodes in the complete tree): 4
                    INFOG(7) (ordering option effectively use after analysis): 2
                    INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100
                    INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3038
                    INFOG(10) (total integer space store the matrix factors after factorization): 318
                    INFOG(11) (order of largest frontal matrix after factorization): 45
                    INFOG(12) (number of off-diagonal pivots): 0
                    INFOG(13) (number of delayed pivots after factorization): 0
                    INFOG(14) (number of memory compress after factorization): 0
                    INFOG(15) (number of steps of iterative refinement after solution): 0
                    INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1
                    INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 2
                    INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1
                    INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 2
                    INFOG(20) (estimated number of entries in the factors): 3038
                    INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1
                    INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 2
                    INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0
                    INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1
                    INFOG(25) (after factorization: number of pivots modified by static pivoting): 0
                    INFOG(28) (after factorization: number of null pivots encountered): 0
                    INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3038
                    INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0
                    INFOG(32) (after analysis: type of analysis done): 1
                    INFOG(33) (value used for ICNTL(8)): 7
                    INFOG(34) (exponent of the determinant if determinant is requested): 0
        linear system matrix followed by preconditioner matrix:
        Mat Object: (fieldsplit_1_) 2 MPI processes
          type: schurcomplement
          rows=64, cols=64
            Schur complement A11 - A10 inv(A00) A01
            A11
              Mat Object: (fieldsplit_1_) 2 MPI processes
                type: mpiaij
                rows=64, cols=64
                total: nonzeros=1110, allocated nonzeros=1110
                total number of mallocs used during MatSetValues calls =0
                  not using I-node (on process 0) routines
            A10
              Mat Object: 2 MPI processes
                type: mpiaij
                rows=64, cols=624
                total: nonzeros=6080, allocated nonzeros=6080
                total number of mallocs used during MatSetValues calls =0
                  not using I-node (on process 0) routines
            KSP of A00
              KSP Object: (fieldsplit_0_) 2 MPI processes
                type: preonly
                maximum iterations=10000, initial guess is zero
                tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
                left preconditioning
                using NONE norm type for convergence test
              PC Object: (fieldsplit_0_) 2 MPI processes
                type: lu
                  LU: out-of-place factorization
                  tolerance for zero pivot 2.22045e-14
                  matrix ordering: natural
                  factor fill ratio given 0., needed 0.
                    Factored matrix follows:
                      Mat Object: 2 MPI processes
                        type: mpiaij
                        rows=624, cols=624
                        package used to perform factorization: mumps
                        total: nonzeros=146498, allocated nonzeros=146498
                        total number of mallocs used during MatSetValues calls =0
                          MUMPS run parameters:
                            SYM (matrix type): 0
                            PAR (host participation): 1
                            ICNTL(1) (output for error): 6
                            ICNTL(2) (output of diagnostic msg): 0
                            ICNTL(3) (output for global info): 0
                            ICNTL(4) (level of printing): 0
                            ICNTL(5) (input mat struct): 0
                            ICNTL(6) (matrix prescaling): 7
                            ICNTL(7) (sequentia matrix ordering):7
                            ICNTL(8) (scalling strategy): 77
                            ICNTL(10) (max num of refinements): 0
                            ICNTL(11) (error analysis): 0
                            ICNTL(12) (efficiency control): 1
                            ICNTL(13) (efficiency control): 0
                            ICNTL(14) (percentage of estimated workspace increase): 20
                            ICNTL(18) (input mat struct): 3
                            ICNTL(19) (Shur complement info): 0
                            ICNTL(20) (rhs sparse pattern): 0
                            ICNTL(21) (solution struct): 1
                            ICNTL(22) (in-core/out-of-core facility): 0
                            ICNTL(23) (max size of memory can be allocated locally):0
                            ICNTL(24) (detection of null pivot rows): 0
                            ICNTL(25) (computation of a null space basis): 0
                            ICNTL(26) (Schur options for rhs or solution): 0
                            ICNTL(27) (experimental parameter): -32
                            ICNTL(28) (use parallel or sequential ordering): 1
                            ICNTL(29) (parallel ordering): 0
                            ICNTL(30) (user-specified set of entries in inv(A)): 0
                            ICNTL(31) (factors is discarded in the solve phase): 0
                            ICNTL(33) (compute determinant): 0
                            CNTL(1) (relative pivoting threshold): 0.01
                            CNTL(2) (stopping criterion of refinement): 1.49012e-08
                            CNTL(3) (absolute pivoting threshold): 0.
                            CNTL(4) (value of static pivoting): -1.
                            CNTL(5) (fixation for null pivots): 0.
                            RINFO(1) (local estimated flops for the elimination after analysis):
                              [0] 1.51255e+07
                              [1] 6.89106e+06
                            RINFO(2) (local estimated flops for the assembly after factorization):
                              [0] 87833.
                              [1] 72313.
                            RINFO(3) (local estimated flops for the elimination after factorization):
                              [0] 1.51255e+07
                              [1] 6.89106e+06
                            INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization):
                              [0] 8
                              [1] 8
                            INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization):
                              [0] 8
                              [1] 8
                            INFO(23) (num of pivots eliminated on this processor after factorization):
                              [0] 413
                              [1] 211
                            RINFOG(1) (global estimated flops for the elimination after analysis): 2.20165e+07
                            RINFOG(2) (global estimated flops for the assembly after factorization): 160146.
                            RINFOG(3) (global estimated flops for the elimination after factorization): 2.20165e+07
                            (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0)
                            INFOG(3) (estimated real workspace for factors on all processors after analysis): 146498
                            INFOG(4) (estimated integer workspace for factors on all processors after analysis): 5065
                            INFOG(5) (estimated maximum front size in the complete tree): 263
                            INFOG(6) (number of nodes in the complete tree): 23
                            INFOG(7) (ordering option effectively use after analysis): 2
                            INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100
                            INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 146498
                            INFOG(10) (total integer space store the matrix factors after factorization): 5065
                            INFOG(11) (order of largest frontal matrix after factorization): 263
                            INFOG(12) (number of off-diagonal pivots): 0
                            INFOG(13) (number of delayed pivots after factorization): 0
                            INFOG(14) (number of memory compress after factorization): 0
                            INFOG(15) (number of steps of iterative refinement after solution): 0
                            INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 8
                            INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 16
                            INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 8
                            INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 16
                            INFOG(20) (estimated number of entries in the factors): 146498
                            INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 8
                            INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 15
                            INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0
                            INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1
                            INFOG(25) (after factorization: number of pivots modified by static pivoting): 0
                            INFOG(28) (after factorization: number of null pivots encountered): 0
                            INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 146498
                            INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 2, 3
                            INFOG(32) (after analysis: type of analysis done): 1
                            INFOG(33) (value used for ICNTL(8)): 7
                            INFOG(34) (exponent of the determinant if determinant is requested): 0
                linear system matrix = precond matrix:
                Mat Object: (fieldsplit_0_) 2 MPI processes
                  type: mpiaij
                  rows=624, cols=624
                  total: nonzeros=73470, allocated nonzeros=73470
                  total number of mallocs used during MatSetValues calls =0
                    using I-node (on process 0) routines: found 112 nodes, limit used is 5
            A01
              Mat Object: 2 MPI processes
                type: mpiaij
                rows=624, cols=64
                total: nonzeros=6080, allocated nonzeros=6080
                total number of mallocs used during MatSetValues calls =0
                  using I-node (on process 0) routines: found 111 nodes, limit used is 5
        Mat Object: 2 MPI processes
          type: mpiaij
          rows=64, cols=64
          total: nonzeros=2846, allocated nonzeros=2846
          total number of mallocs used during MatSetValues calls =0
            using I-node (on process 0) routines: found 37 nodes, limit used is 5
  linear system matrix = precond matrix:
  Mat Object: 2 MPI processes
    type: mpiaij
    rows=688, cols=688
    total: nonzeros=86740, allocated nonzeros=94187
    total number of mallocs
used during MatSetValues calls =0 using I-node (on process 0) routines: found 117 nodes, limit used is 5 ************************************************************************************************************************ *** WIDEN YOUR WINDOW TO 120 CHARACTERS. Use 'enscript -r -fCourier9' to print this document *** ************************************************************************************************************************ ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- on a arch-linux2-c-debug named dsp0780450 with 2 processors, by B07947 Wed Jan 4 22:23:35 2017 Using Petsc Release Version 3.7.2, Jun, 05, 2016 Max Max/Min Avg Total Time (sec): 3.263e-01 1.00000 3.263e-01 Objects: 2.620e+02 1.00769 2.610e+02 Flops: 7.742e+06 1.75026 6.083e+06 1.217e+07 Flops/sec: 2.372e+07 1.75027 1.864e+07 3.728e+07 Memory: 3.540e+06 1.24844 6.376e+06 MPI Messages: 4.890e+02 1.00411 4.880e+02 9.760e+02 MPI Message Lengths: 1.677e+06 1.01260 3.415e+03 3.333e+06 MPI Reductions: 1.975e+03 1.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flops and VecAXPY() for complex vectors of length N --> 8N flops Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 3.2632e-01 100.0% 1.2165e+07 100.0% 9.760e+02 100.0% 3.415e+03 100.0% 1.973e+03 99.9% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flops: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. 
len: average message length (bytes)
   Reduct: number of global reductions
   Global: entire computation
   Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop().
      %T - percent time in this phase         %F - percent flops in this phase
      %M - percent messages in this phase     %L - percent message lengths in this phase
      %R - percent reductions in this phase
   Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors)
------------------------------------------------------------------------------------------------------------------------

      ##########################################################
      #                                                        #
      #                      WARNING!!!                        #
      #                                                        #
      #   This code was compiled with a debugging option,      #
      #   To get timing results run ./configure                #
      #   using --with-debugging=no, the performance will      #
      #   be generally two or three times faster.              #
      #                                                        #
      ##########################################################

Event                Count      Time (sec)     Flops                             --- Global ---  --- Stage ---   Total
                   Max Ratio  Max     Ratio   Max  Ratio  Mess   Avg len Reduct  %T %F %M %L %R  %T %F %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------

--- Event Stage 0: Main Stage

VecMDot               27 1.0 3.4330e-03 4.2 3.07e+05 1.4 0.0e+00 0.0e+00 5.4e+01  1  4  0  0  3   1  4  0  0  3   151
VecNorm               57 1.0 4.5562e-04 1.2 4.63e+04 1.4 0.0e+00 0.0e+00 1.1e+02  0  1  0  0  6   0  1  0  0  6   172
VecScale              55 1.0 6.2103e-03 1.1 1.31e+04 1.7 0.0e+00 0.0e+00 0.0e+00  2  0  0  0  0   2  0  0  0  0     3
VecCopy               29 1.0 1.5283e-04 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecSet               121 1.0 4.5204e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecAXPY               55 1.0 3.2640e-03 1.0 4.12e+04 1.3 0.0e+00 0.0e+00 0.0e+00  1  1  0  0  0   1  1  0  0  0    22
VecAYPX               28 1.0 1.8382e-04 1.1 1.14e+04 1.4 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0   105
VecMAXPY              55 1.0 1.5895e-03 1.2 6.36e+05 1.4 0.0e+00 0.0e+00 0.0e+00  0  9  0  0  0   0  9  0  0  0   678
VecAssemblyBegin       1 1.0 5.9128e-05 1.0 0.00e+00 0.0
2.0e+00 1.0e+03 6.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecAssemblyEnd         1 1.0 1.0014e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
VecScatterBegin      379 1.0 2.6152e-03 1.1 0.00e+00 0.0 3.0e+02 1.8e+03 8.1e+01  1  0 31 16  4   1  0 31 16  4     0
VecScatterEnd        298 1.0 2.0123e-03 1.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  1  0  0  0  0   1  0  0  0  0     0
MatMult              109 1.0 1.4856e-02 1.4 6.46e+06 1.8 1.6e+02 1.3e+03 0.0e+00  4 83 17  7  0   4 83 17  7  0   683
MatSolve              81 1.0 3.3831e-02 1.0 0.00e+00 0.0 1.4e+02 2.4e+03 9.5e+01 10  0 14 10  5  10  0 14 10  5     0
MatLUFactorSym         2 1.0 4.2470e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.6e+01  1  0  0  0  1   1  0  0  0  1     0
MatLUFactorNum         2 1.0 9.1913e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 28  0  0  0  0  28  0  0  0  0     0
MatConvert             1 1.0 4.1819e-04 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 4.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatScale               2 1.0 2.6798e-04 1.2 8.40e+0316.1 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0    33
MatAssemblyBegin      14 1.0 3.8922e-03 3.9 0.00e+00 0.0 3.0e+00 9.5e+04 3.2e+01  1  0  0  9  2   1  0  0  9  2     0
MatAssemblyEnd        14 1.0 1.0869e-02 1.0 0.00e+00 0.0 1.2e+01 3.3e+02 1.6e+02  3  0  1  0  8   3  0  1  0  8     0
MatGetRow            128 0.0 2.7323e-04 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetRowIJ            2 2.0 3.8147e-06 2.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetSubMatrix        4 1.0 4.1627e-02 1.0 0.00e+00 0.0 8.0e+00 2.9e+02 1.8e+02 13  0  1  0  9  13  0  1  0  9     0
MatGetOrdering         2 2.0 4.3321e-04 1.7 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatView                6 1.0 1.7805e-02 1.4 0.00e+00 0.0 7.2e+01 2.1e+03 1.8e+01  5  0  7  4  1   5  0  7  4  1     0
MatAXPY                1 1.0 1.7102e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.7e+01  1  0  0  0  2   1  0  0  0  2     0
MatMatMult             1 1.0 5.5580e-03 1.0 2.20e+05 0.0 4.0e+00 2.8e+03 3.7e+01  2  2  0  0  2   2  2  0  0  2    40
MatMatMultSym          1 1.0 4.2591e-03 1.0 0.00e+00 0.0 3.0e+00 2.3e+03 3.3e+01  1  0  0  0  2   1  0  0  0  2     0
MatMatMultNum          1 1.0 1.2820e-03 1.0 2.20e+05 0.0 1.0e+00 4.2e+03 4.0e+00  0  2  0  0  0   0  2  0  0  0   171
MatGetLocalMat         2 1.0 3.0327e-04 1.7 0.00e+00 0.0 0.0e+00 0.0e+00
0.0e+00  0  0  0  0  0   0  0  0  0  0     0
MatGetBrAoCol          2 1.0 4.1389e-04 1.0 0.00e+00 0.0 4.0e+00 2.8e+03 0.0e+00  0  0  0  0  0   0  0  0  0  0     0
KSPGMRESOrthog        27 1.0 4.7998e-03 2.1 6.13e+05 1.4 0.0e+00 0.0e+00 4.3e+02  1  9  0  0 22   1  9  0  0 22   217
KSPSetUp               3 1.0 3.2687e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.0e+01  0  0  0  0  1   0  0  0  0  1     0
KSPSolve               1 1.0 1.8870e-01 1.0 7.51e+06 1.7 3.0e+02 1.8e+03 1.5e+03 58 98 31 16 77  58 98 31 16 77    63
PCSetUp                3 1.0 1.5060e-01 1.0 2.28e+05436.0 1.8e+01 1.1e+03 3.8e+02 46  2  2  1 19  46  2  2  1 19     2
PCApply               27 1.0 1.6088e-01 1.0 6.38e+0514.7 2.0e+02 1.9e+03 4.3e+02 49  6 20 11 22  49  6 20 11 22     4
KSPSolve_FS_0         27 1.0 1.9183e-02 1.1 0.00e+00 0.0 5.4e+01 3.0e+03 1.1e+02  6  0  6  5  5   6  0  6  5  5     0
KSPSolve_FS_Schu      27 1.0 3.1643e-02 1.1 0.00e+00 0.0 2.9e+01 3.3e+02 1.5e+02  9  0  3  0  7   9  0  3  0  7     0
KSPSolve_FS_Low       27 1.0 1.0088e-01 1.0 0.00e+00 0.0 5.8e+01 2.8e+03 1.5e+02 31  0  6  5  7  31  0  6  5  7     0
------------------------------------------------------------------------------------------------------------------------

Memory usage is given in bytes:

Object Type          Creations   Destructions     Memory  Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

              Vector   154            154       692384     0.
      Vector Scatter    14             14        19448     0.
              Matrix    37             37      2353112     0.
           Index Set    42             40        39600     0.
       Krylov Solver     5              5        24280     0.
      Preconditioner     5              5         4840     0.
              Viewer     1              0            0     0.
    Distributed Mesh     1              1         4624     0.
Star Forest Bipartite Graph     2              2         1616     0.
     Discrete System     1              1          872     0.
========================================================================================================================
Average time to get PetscTime(): 9.53674e-08
Average time for MPI_Barrier(): 6.19888e-07
Average time for zero size MPI_Send(): 1.43051e-06
#PETSc Option Table entries:
-fieldsplit_0_ksp_converged_reason
-fieldsplit_0_ksp_type preonly
-fieldsplit_0_pc_factor_mat_solver_package mumps
-fieldsplit_0_pc_type lu
-fieldsplit_1_ksp_converged_reason
-fieldsplit_1_ksp_type preonly
-fieldsplit_1_pc_factor_mat_solver_package mumps
-fieldsplit_1_pc_type lu
-ksp_rtol 1.0e-5
-ksp_type fgmres
-ksp_view
-log_summary
-pc_fieldsplit_schur_factorization_type full
-pc_fieldsplit_schur_precondition selfp
-pc_fieldsplit_type schur
-pc_type fieldsplit
#End of PETSc Option Table entries
Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4
Configure options: --prefix=/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/Install --with-mpi=yes --with-x=yes --download-ml=/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/ml-6.2-p3.tar.gz --with-mumps-lib="-L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Mumps-502_consortium_aster1/MPI/lib -lzmumps -ldmumps -lmumps_common -lpord -L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Scotch_aster-604_aster6/MPI/lib -lesmumps -lptscotch -lptscotcherr -lptscotcherrexit -lscotch -lscotcherr -lscotcherrexit -L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Parmetis_aster-403_aster/lib -lparmetis -L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Metis_aster-510_aster1/lib -lmetis -L/usr/lib -lscalapack-openmpi -L/usr/lib -lblacs-openmpi -lblacsCinit-openmpi -lblacsF77init-openmpi -L/usr/lib/x86_64-linux-gnu -lgomp " --with-mumps-include=/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Mumps-502_consortium_aster1/MPI/include
--with-scalapack-lib="-L/usr/lib -lscalapack-openmpi" --with-blacs-lib="-L/usr/lib -lblacs-openmpi -lblacsCinit-openmpi -lblacsF77init-openmpi" --with-blas-lib="-L/usr/lib -lopenblas -lcblas" --with-lapack-lib="-L/usr/lib -llapack"
-----------------------------------------
Libraries compiled on Wed Nov 30 11:59:58 2016 on dsp0780450
Machine characteristics: Linux-3.16.0-4-amd64-x86_64-with-debian-8.6
Using PETSc directory: /home/B07947/dev/codeaster-prerequisites/petsc-3.7.2
Using PETSc arch: arch-linux2-c-debug
-----------------------------------------
Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 ${COPTFLAGS} ${CFLAGS}
Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g ${FOPTFLAGS} ${FFLAGS}
-----------------------------------------
Using include paths: -I/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/arch-linux2-c-debug/include -I/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/include -I/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/include -I/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/arch-linux2-c-debug/include -I/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Mumps-502_consortium_aster1/MPI/include -I/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/Install/include -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi
-----------------------------------------
Using C linker: mpicc
Using Fortran linker: mpif90
Using libraries: -Wl,-rpath,/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/arch-linux2-c-debug/lib -L/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/arch-linux2-c-debug/lib -lpetsc -L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Mumps-502_consortium_aster1/MPI/lib -lzmumps -ldmumps -lmumps_common -lpord -L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Scotch_aster-604_aster6/MPI/lib -lesmumps -lptscotch -lptscotcherr -lptscotcherrexit -lscotch -lscotcherr
-lscotcherrexit -L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Parmetis_aster-403_aster/lib -lparmetis -L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Metis_aster-510_aster1/lib -lmetis -L/usr/lib -lscalapack-openmpi -lblacs-openmpi -lblacsCinit-openmpi -lblacsF77init-openmpi -L/usr/lib/x86_64-linux-gnu -lgomp -Wl,-rpath,/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/Install/lib -L/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/Install/lib -lml -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/4.9 -L/usr/lib/gcc/x86_64-linux-gnu/4.9 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -lscalapack-openmpi -llapack -lopenblas -lcblas -lX11 -lssl -lcrypto -lm -lmpi_f90 -lmpi_f77 -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/4.9 -L/usr/lib/gcc/x86_64-linux-gnu/4.9 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl -lmpi -lhwloc -lgcc_s -lpthread -ldl
-----------------------------------------
-------------- next part --------------
  0 KSP unpreconditioned resid norm 2.375592557658e+10 true resid norm 2.375592557658e+10 ||r(i)||/||b|| 1.000000000000e+00
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  1 KSP unpreconditioned resid norm 2.354387765268e+10 true resid norm 2.354387765268e+10 ||r(i)||/||b|| 9.910738934074e-01
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS
iterations 1
  2 KSP unpreconditioned resid norm 7.565226261144e+09 true resid norm 7.565226261144e+09 ||r(i)||/||b|| 3.184563883548e-01
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  3 KSP unpreconditioned resid norm 7.046563779272e+09 true resid norm 7.046563779272e+09 ||r(i)||/||b|| 2.966234153478e-01
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  4 KSP unpreconditioned resid norm 6.712061059639e+09 true resid norm 6.712061059639e+09 ||r(i)||/||b|| 2.825426034445e-01
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  5 KSP unpreconditioned resid norm 4.119098261045e+09 true resid norm 4.119098261045e+09 ||r(i)||/||b|| 1.733924551905e-01
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  6 KSP unpreconditioned resid norm 4.093786883357e+09 true resid norm 4.093786883357e+09 ||r(i)||/||b|| 1.723269788062e-01
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  7 KSP unpreconditioned resid norm 2.519521928135e+09 true resid norm 2.519521928135e+09 ||r(i)||/||b|| 1.060586723937e-01
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to
CONVERGED_ITS iterations 1
  8 KSP unpreconditioned resid norm 2.373916289395e+09 true resid norm 2.373916289395e+09 ||r(i)||/||b|| 9.992943788879e-02
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
  9 KSP unpreconditioned resid norm 2.092596303640e+09 true resid norm 2.092596303640e+09 ||r(i)||/||b|| 8.808734043616e-02
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 10 KSP unpreconditioned resid norm 2.075761860693e+09 true resid norm 2.075761860693e+09 ||r(i)||/||b|| 8.737869859045e-02
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 11 KSP unpreconditioned resid norm 1.710779772940e+09 true resid norm 1.710779772940e+09 ||r(i)||/||b|| 7.201486498286e-02
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 12 KSP unpreconditioned resid norm 1.648406379841e+09 true resid norm 1.648406379841e+09 ||r(i)||/||b|| 6.938927193247e-02
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 13 KSP unpreconditioned resid norm 1.168410200806e+09 true resid norm 1.168410200806e+09 ||r(i)||/||b|| 4.918394768662e-02
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve
converged due to CONVERGED_ITS iterations 1
 14 KSP unpreconditioned resid norm 1.166360451836e+09 true resid norm 1.166360451836e+09 ||r(i)||/||b|| 4.909766399445e-02
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 15 KSP unpreconditioned resid norm 8.277208864954e+08 true resid norm 8.277208864951e+08 ||r(i)||/||b|| 3.484271256141e-02
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 16 KSP unpreconditioned resid norm 8.276908129860e+08 true resid norm 8.276908129857e+08 ||r(i)||/||b|| 3.484144662423e-02
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 17 KSP unpreconditioned resid norm 6.225378714956e+08 true resid norm 6.225378714955e+08 ||r(i)||/||b|| 2.620558266562e-02
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 18 KSP unpreconditioned resid norm 5.761996033674e+08 true resid norm 5.761996033668e+08 ||r(i)||/||b|| 2.425498436208e-02
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 19 KSP unpreconditioned resid norm 4.207038617987e+08 true resid norm 4.207038617981e+08 ||r(i)||/||b|| 1.770942834629e-02
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear
fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 20 KSP unpreconditioned resid norm 3.481124393225e+08 true resid norm 3.481124393216e+08 ||r(i)||/||b|| 1.465370979546e-02
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 21 KSP unpreconditioned resid norm 3.417301046733e+08 true resid norm 3.417301046727e+08 ||r(i)||/||b|| 1.438504694633e-02
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 22 KSP unpreconditioned resid norm 2.567284390162e+08 true resid norm 2.567284390162e+08 ||r(i)||/||b|| 1.080692217984e-02
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 23 KSP unpreconditioned resid norm 2.564957247063e+08 true resid norm 2.564957247063e+08 ||r(i)||/||b|| 1.079712612668e-02
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 24 KSP unpreconditioned resid norm 2.418470506609e+08 true resid norm 2.418470506610e+08 ||r(i)||/||b|| 1.018049369962e-02
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 25 KSP unpreconditioned resid norm 1.561397633279e+08 true resid norm 1.561397633288e+08 ||r(i)||/||b|| 6.572665957613e-03
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS
iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 26 KSP unpreconditioned resid norm 1.559442432818e+08 true resid norm 1.559442432828e+08 ||r(i)||/||b|| 6.564435588086e-03
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 27 KSP unpreconditioned resid norm 7.736290224512e+07 true resid norm 7.736290224773e+07 ||r(i)||/||b|| 3.256572849513e-03
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 28 KSP unpreconditioned resid norm 5.012849210895e+07 true resid norm 5.012849211088e+07 ||r(i)||/||b|| 2.110146874694e-03
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 29 KSP unpreconditioned resid norm 4.745677367047e+07 true resid norm 4.745677367368e+07 ||r(i)||/||b|| 1.997681526687e-03
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 30 KSP unpreconditioned resid norm 2.290461197130e+07 true resid norm 2.290461197130e+07 ||r(i)||/||b|| 9.641641575896e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 31 KSP unpreconditioned resid norm 2.222737485540e+07 true resid norm 2.222737485540e+07 ||r(i)||/||b|| 9.356560233254e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due
to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 32 KSP unpreconditioned resid norm 2.201309723363e+07 true resid norm 2.201309723363e+07 ||r(i)||/||b|| 9.266360581351e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 33 KSP unpreconditioned resid norm 2.194874591650e+07 true resid norm 2.194874591650e+07 ||r(i)||/||b|| 9.239272048461e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 34 KSP unpreconditioned resid norm 2.128835019268e+07 true resid norm 2.128835019267e+07 ||r(i)||/||b|| 8.961280049498e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 35 KSP unpreconditioned resid norm 2.120111549764e+07 true resid norm 2.120111549764e+07 ||r(i)||/||b|| 8.924558813461e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 36 KSP unpreconditioned resid norm 1.976723092962e+07 true resid norm 1.976723092962e+07 ||r(i)||/||b|| 8.320968537262e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 37 KSP unpreconditioned resid norm 1.944717353711e+07 true resid norm 1.944717353711e+07 ||r(i)||/||b|| 8.186241144095e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_
solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 38 KSP unpreconditioned resid norm 1.903474216858e+07 true resid norm 1.903474216858e+07 ||r(i)||/||b|| 8.012629146873e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 39 KSP unpreconditioned resid norm 1.898023583422e+07 true resid norm 1.898023583422e+07 ||r(i)||/||b|| 7.989684835911e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 40 KSP unpreconditioned resid norm 1.694730587985e+07 true resid norm 1.694730587985e+07 ||r(i)||/||b|| 7.133927838434e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 41 KSP unpreconditioned resid norm 1.638896360821e+07 true resid norm 1.638896360821e+07 ||r(i)||/||b|| 6.898894995852e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 42 KSP unpreconditioned resid norm 1.519868139235e+07 true resid norm 1.519868139236e+07 ||r(i)||/||b|| 6.397848546613e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 43 KSP unpreconditioned resid norm 1.435025731851e+07 true resid norm 1.435025731851e+07 ||r(i)||/||b|| 6.040706463847e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 44 KSP unpreconditioned resid norm 1.398045017059e+07 true resid norm 1.398045017059e+07 ||r(i)||/||b|| 5.885037030245e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 45 KSP unpreconditioned resid norm 9.779767739199e+06 true resid norm 9.779767739203e+06 ||r(i)||/||b|| 4.116769817146e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 46 KSP unpreconditioned resid norm 9.762980432404e+06 true resid norm 9.762980432407e+06 ||r(i)||/||b|| 4.109703240539e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 47 KSP unpreconditioned resid norm 8.724157152360e+06 true resid norm 8.724157152364e+06 ||r(i)||/||b|| 3.672413067738e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 48 KSP unpreconditioned resid norm 8.701406903245e+06 true resid norm 8.701406903245e+06 ||r(i)||/||b|| 3.662836404835e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 49 KSP unpreconditioned resid norm 7.415213990246e+06 true resid norm 7.415213990273e+06 ||r(i)||/||b|| 3.121416577253e-04
    Linear fieldsplit_0_ solve converged due to
CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 50 KSP unpreconditioned resid norm 7.191611476981e+06 true resid norm 7.191611477012e+06 ||r(i)||/||b|| 3.027291634598e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 51 KSP unpreconditioned resid norm 5.811126704500e+06 true resid norm 5.811126704556e+06 ||r(i)||/||b|| 2.446179874501e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 52 KSP unpreconditioned resid norm 4.174772896367e+06 true resid norm 4.174772896381e+06 ||r(i)||/||b|| 1.757360656365e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 53 KSP unpreconditioned resid norm 4.033730917892e+06 true resid norm 4.033730917902e+06 ||r(i)||/||b|| 1.697989373177e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 54 KSP unpreconditioned resid norm 3.072328590301e+06 true resid norm 3.072328590311e+06 ||r(i)||/||b|| 1.293289364966e-04
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 55 KSP unpreconditioned resid norm 2.184704206300e+06 true resid norm 2.184704206347e+06 ||r(i)||/||b|| 9.196460055008e-05
    Linear fieldsplit_0_ solve
converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 56 KSP unpreconditioned resid norm 2.087788381502e+06 true resid norm 2.087788381688e+06 ||r(i)||/||b|| 8.788495211261e-05
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 57 KSP unpreconditioned resid norm 1.295154635850e+06 true resid norm 1.295154636297e+06 ||r(i)||/||b|| 5.451922435611e-05
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 58 KSP unpreconditioned resid norm 1.293544515105e+06 true resid norm 1.293544515547e+06 ||r(i)||/||b|| 5.445144670860e-05
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 59 KSP unpreconditioned resid norm 1.034191001856e+06 true resid norm 1.034191002189e+06 ||r(i)||/||b|| 4.353402265279e-05
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 60 KSP unpreconditioned resid norm 8.496752472634e+05 true resid norm 8.496752472634e+05 ||r(i)||/||b|| 3.576687612210e-05
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 61 KSP unpreconditioned resid norm 8.496711167836e+05 true resid norm 8.496711167853e+05 ||r(i)||/||b|| 3.576670225062e-05
    Linear
fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 62 KSP unpreconditioned resid norm 8.482716981331e+05 true resid norm 8.482716981338e+05 ||r(i)||/||b|| 3.570779405750e-05
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 63 KSP unpreconditioned resid norm 8.475428645046e+05 true resid norm 8.475428645056e+05 ||r(i)||/||b|| 3.567711398040e-05
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 64 KSP unpreconditioned resid norm 8.465314549473e+05 true resid norm 8.465314549483e+05 ||r(i)||/||b|| 3.563453893722e-05
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 65 KSP unpreconditioned resid norm 8.211742970257e+05 true resid norm 8.211742970257e+05 ||r(i)||/||b|| 3.456713544495e-05
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 66 KSP unpreconditioned resid norm 8.199302190668e+05 true resid norm 8.199302190672e+05 ||r(i)||/||b|| 3.451476628112e-05
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1
    Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1
 67 KSP unpreconditioned resid norm 7.928265198998e+05 true resid norm 7.928265198998e+05 ||r(i)||/||b||
3.337384255326e-05 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 68 KSP unpreconditioned resid norm 7.879389590406e+05 true resid norm 7.879389590406e+05 ||r(i)||/||b|| 3.316810184897e-05 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 69 KSP unpreconditioned resid norm 7.459634072370e+05 true resid norm 7.459634072382e+05 ||r(i)||/||b|| 3.140115104476e-05 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 70 KSP unpreconditioned resid norm 7.257146826826e+05 true resid norm 7.257146826828e+05 ||r(i)||/||b|| 3.054878583212e-05 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 71 KSP unpreconditioned resid norm 6.685588318458e+05 true resid norm 6.685588318459e+05 ||r(i)||/||b|| 2.814282397420e-05 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 72 KSP unpreconditioned resid norm 6.124291328776e+05 true resid norm 6.124291328784e+05 ||r(i)||/||b|| 2.578005773356e-05 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 73 KSP unpreconditioned resid norm 6.094795320798e+05 true resid norm 
6.094795320799e+05 ||r(i)||/||b|| 2.565589499408e-05 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 74 KSP unpreconditioned resid norm 5.357696539008e+05 true resid norm 5.357696539018e+05 ||r(i)||/||b|| 2.255309531825e-05 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 75 KSP unpreconditioned resid norm 5.242423536496e+05 true resid norm 5.242423536500e+05 ||r(i)||/||b|| 2.206785637378e-05 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 76 KSP unpreconditioned resid norm 3.951117863892e+05 true resid norm 3.951117863909e+05 ||r(i)||/||b|| 1.663213605874e-05 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 77 KSP unpreconditioned resid norm 3.370805526075e+05 true resid norm 3.370805526076e+05 ||r(i)||/||b|| 1.418932516525e-05 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 78 KSP unpreconditioned resid norm 3.130844272170e+05 true resid norm 3.130844272167e+05 ||r(i)||/||b|| 1.317921401157e-05 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 79 KSP unpreconditioned resid norm 2.798298725063e+05 
true resid norm 2.798298725051e+05 ||r(i)||/||b|| 1.177937149210e-05 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 80 KSP unpreconditioned resid norm 2.379383176697e+05 true resid norm 2.379383176686e+05 ||r(i)||/||b|| 1.001595652005e-05 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 81 KSP unpreconditioned resid norm 2.174610490053e+05 true resid norm 2.174610490057e+05 ||r(i)||/||b|| 9.153970797925e-06 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 82 KSP unpreconditioned resid norm 2.135469574287e+05 true resid norm 2.135469574286e+05 ||r(i)||/||b|| 8.989208050016e-06 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 83 KSP unpreconditioned resid norm 1.169940105631e+05 true resid norm 1.169940105627e+05 ||r(i)||/||b|| 4.924834866383e-06 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 84 KSP unpreconditioned resid norm 7.882375855358e+04 true resid norm 7.882375855376e+04 ||r(i)||/||b|| 3.318067245987e-06 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 85 KSP unpreconditioned resid norm 
7.804803294150e+04 true resid norm 7.804803294074e+04 ||r(i)||/||b|| 3.285413262015e-06 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 86 KSP unpreconditioned resid norm 5.378742608807e+04 true resid norm 5.378742608692e+04 ||r(i)||/||b|| 2.264168824470e-06 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 87 KSP unpreconditioned resid norm 3.900919284375e+04 true resid norm 3.900919284392e+04 ||r(i)||/||b|| 1.642082634001e-06 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 88 KSP unpreconditioned resid norm 3.889074491459e+04 true resid norm 3.889074491442e+04 ||r(i)||/||b|| 1.637096596765e-06 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 89 KSP unpreconditioned resid norm 2.953428329101e+04 true resid norm 2.953428328769e+04 ||r(i)||/||b|| 1.243238584516e-06 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_1_ solve converged due to CONVERGED_ITS iterations 1 Linear fieldsplit_0_ solve converged due to CONVERGED_ITS iterations 1 90 KSP unpreconditioned resid norm 2.172827479107e+04 true resid norm 2.172827478767e+04 ||r(i)||/||b|| 9.146465254586e-07 KSP Object: 3 MPI processes type: fgmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=1000, initial guess is 
zero tolerances: relative=1e-06, absolute=1e-50, divergence=10000. right preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: 3 MPI processes type: fieldsplit FieldSplit with Schur preconditioner, factorization FULL Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse Split info: Split number 0 Defined by IS Split number 1 Defined by IS KSP solver for A00 block KSP Object: (fieldsplit_0_) 3 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_0_) 3 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 0., needed 0. Factored matrix follows: Mat Object: 3 MPI processes type: mpiaij rows=624, cols=624 package used to perform factorization: mumps total: nonzeros=148148, allocated nonzeros=148148 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 0 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 1 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 3 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 1 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space 
basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -32 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. RINFO(1) (local estimated flops for the elimination after analysis): [0] 4.02519e+06 [1] 2.70367e+06 [2] 1.62473e+07 RINFO(2) (local estimated flops for the assembly after factorization): [0] 55439. [1] 30276. [2] 65773. RINFO(3) (local estimated flops for the elimination after factorization): [0] 4.02519e+06 [1] 2.70367e+06 [2] 1.62473e+07 INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 9 [1] 9 [2] 10 INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 9 [1] 9 [2] 10 INFO(23) (num of pivots eliminated on this processor after factorization): [0] 278 [1] 87 [2] 259 RINFOG(1) (global estimated flops for the elimination after analysis): 2.29761e+07 RINFOG(2) (global estimated flops for the assembly after factorization): 151488. 
RINFOG(3) (global estimated flops for the elimination after factorization): 2.29761e+07 (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) INFOG(3) (estimated real workspace for factors on all processors after analysis): 148148 INFOG(4) (estimated integer workspace for factors on all processors after analysis): 4993 INFOG(5) (estimated maximum front size in the complete tree): 286 INFOG(6) (number of nodes in the complete tree): 23 INFOG(7) (ordering option effectively use after analysis): 2 INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 148148 INFOG(10) (total integer space store the matrix factors after factorization): 4993 INFOG(11) (order of largest frontal matrix after factorization): 286 INFOG(12) (number of off-diagonal pivots): 0 INFOG(13) (number of delayed pivots after factorization): 0 INFOG(14) (number of memory compress after factorization): 0 INFOG(15) (number of steps of iterative refinement after solution): 0 INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 10 INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 28 INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 10 INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 28 INFOG(20) (estimated number of entries in the factors): 148148 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 10 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 28 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) 
(after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 148148 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 1, 2 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): 7 INFOG(34) (exponent of the determinant if determinant is requested): 0 linear system matrix = precond matrix: Mat Object: (fieldsplit_0_) 3 MPI processes type: mpiaij rows=624, cols=624 total: nonzeros=72004, allocated nonzeros=72004 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 84 nodes, limit used is 5 KSP solver for S = A11 - A10 inv(A00) A01 KSP Object: (fieldsplit_1_) 3 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_1_) 3 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 0., needed 0. 
Factored matrix follows: Mat Object: 3 MPI processes type: mpiaij rows=64, cols=64 package used to perform factorization: mumps total: nonzeros=3736, allocated nonzeros=3736 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 0 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 1 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 3 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 1 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -32 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. RINFO(1) (local estimated flops for the elimination after analysis): [0] 0. [1] 0. [2] 137244. RINFO(2) (local estimated flops for the assembly after factorization): [0] 0. [1] 0. [2] 1225. RINFO(3) (local estimated flops for the elimination after factorization): [0] 0. [1] 0. [2] 137244. 
INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 1 [1] 1 [2] 1 INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 1 [1] 1 [2] 1 INFO(23) (num of pivots eliminated on this processor after factorization): [0] 0 [1] 0 [2] 64 RINFOG(1) (global estimated flops for the elimination after analysis): 137244. RINFOG(2) (global estimated flops for the assembly after factorization): 1225. RINFOG(3) (global estimated flops for the elimination after factorization): 137244. (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) INFOG(3) (estimated real workspace for factors on all processors after analysis): 3736 INFOG(4) (estimated integer workspace for factors on all processors after analysis): 228 INFOG(5) (estimated maximum front size in the complete tree): 55 INFOG(6) (number of nodes in the complete tree): 2 INFOG(7) (ordering option effectively use after analysis): 2 INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3736 INFOG(10) (total integer space store the matrix factors after factorization): 228 INFOG(11) (order of largest frontal matrix after factorization): 55 INFOG(12) (number of off-diagonal pivots): 0 INFOG(13) (number of delayed pivots after factorization): 0 INFOG(14) (number of memory compress after factorization): 0 INFOG(15) (number of steps of iterative refinement after solution): 0 INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 3 INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over 
all processors): 3 INFOG(20) (estimated number of entries in the factors): 3736 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 3 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3736 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): 7 INFOG(34) (exponent of the determinant if determinant is requested): 0 linear system matrix followed by preconditioner matrix: Mat Object: (fieldsplit_1_) 3 MPI processes type: schurcomplement rows=64, cols=64 Schur complement A11 - A10 inv(A00) A01 A11 Mat Object: (fieldsplit_1_) 3 MPI processes type: mpiaij rows=64, cols=64 total: nonzeros=1498, allocated nonzeros=1498 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 49 nodes, limit used is 5 A10 Mat Object: 3 MPI processes type: mpiaij rows=64, cols=624 total: nonzeros=6619, allocated nonzeros=6619 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 49 nodes, limit used is 5 KSP of A00 KSP Object: (fieldsplit_0_) 3 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_0_) 3 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 0., needed 0. Factored matrix follows: Mat Object: 3 MPI processes type: mpiaij rows=624, cols=624 package used to perform factorization: mumps total: nonzeros=148148, allocated nonzeros=148148 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 0 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 1 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 3 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 1 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -32 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. 
RINFO(1) (local estimated flops for the elimination after analysis): [0] 4.02519e+06 [1] 2.70367e+06 [2] 1.62473e+07 RINFO(2) (local estimated flops for the assembly after factorization): [0] 55439. [1] 30276. [2] 65773. RINFO(3) (local estimated flops for the elimination after factorization): [0] 4.02519e+06 [1] 2.70367e+06 [2] 1.62473e+07 INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 9 [1] 9 [2] 10 INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 9 [1] 9 [2] 10 INFO(23) (num of pivots eliminated on this processor after factorization): [0] 278 [1] 87 [2] 259 RINFOG(1) (global estimated flops for the elimination after analysis): 2.29761e+07 RINFOG(2) (global estimated flops for the assembly after factorization): 151488. RINFOG(3) (global estimated flops for the elimination after factorization): 2.29761e+07 (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) INFOG(3) (estimated real workspace for factors on all processors after analysis): 148148 INFOG(4) (estimated integer workspace for factors on all processors after analysis): 4993 INFOG(5) (estimated maximum front size in the complete tree): 286 INFOG(6) (number of nodes in the complete tree): 23 INFOG(7) (ordering option effectively use after analysis): 2 INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 148148 INFOG(10) (total integer space store the matrix factors after factorization): 4993 INFOG(11) (order of largest frontal matrix after factorization): 286 INFOG(12) (number of off-diagonal pivots): 0 INFOG(13) (number of delayed pivots after factorization): 0 INFOG(14) (number of memory compress after factorization): 0 INFOG(15) (number of steps of iterative refinement after solution): 0 INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: 
value on the most memory consuming processor): 10 INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 28 INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 10 INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 28 INFOG(20) (estimated number of entries in the factors): 148148 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 10 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 28 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 148148 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 1, 2 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): 7 INFOG(34) (exponent of the determinant if determinant is requested): 0 linear system matrix = precond matrix: Mat Object: (fieldsplit_0_) 3 MPI processes type: mpiaij rows=624, cols=624 total: nonzeros=72004, allocated nonzeros=72004 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 84 nodes, limit used is 5 A01 Mat Object: 3 MPI processes type: mpiaij rows=624, cols=64 total: nonzeros=6619, allocated nonzeros=6619 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 84 nodes, limit used is 5 Mat Object: 3 MPI processes type: mpiaij rows=64, cols=64 total: nonzeros=3254, allocated nonzeros=3254 total number of mallocs used during 
MatSetValues calls =0 using I-node (on process 0) routines: found 36 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 3 MPI processes type: mpiaij rows=688, cols=688 total: nonzeros=86740, allocated nonzeros=100623 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 89 nodes, limit used is 5 ************************************************************************************************************************ *** WIDEN YOUR WINDOW TO 120 CHARACTERS. Use 'enscript -r -fCourier9' to print this document *** ************************************************************************************************************************ ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- on a arch-linux2-c-debug named dsp0780450 with 3 processors, by B07947 Wed Jan 4 22:24:59 2017 Using Petsc Release Version 3.7.2, Jun, 05, 2016 Max Max/Min Avg Total Time (sec): 6.375e-01 1.00000 6.375e-01 Objects: 3.880e+02 1.00518 3.867e+02 Flops: 1.960e+07 3.76942 1.356e+07 4.069e+07 Flops/sec: 3.074e+07 3.76942 2.128e+07 6.383e+07 Memory: 2.840e+06 2.46007 6.751e+06 MPI Messages: 1.442e+03 1.39324 1.209e+03 3.627e+03 MPI Message Lengths: 2.737e+06 1.92279 1.757e+03 6.373e+06 MPI Reductions: 5.530e+03 1.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flops and VecAXPY() for complex vectors of length N --> 8N flops Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 6.3745e-01 100.0% 4.0689e+07 100.0% 3.627e+03 100.0% 1.757e+03 100.0% 5.528e+03 100.0% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' 
manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flops: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). %T - percent time in this phase %F - percent flops in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ ########################################################## # # # WARNING!!! # # # # This code was compiled with a debugging option, # # To get timing results run ./configure # # using --with-debugging=no, the performance will # # be generally two or three times faster. 
# # # ##########################################################

Event Count Time (sec) Flops --- Global --- --- Stage --- Total
Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s
------------------------------------------------------------------------------------------------------------------------

--- Event Stage 0: Main Stage

VecMDot 90 1.0 1.0739e-02 2.8 8.55e+05 2.8 0.0e+00 0.0e+00 1.8e+02 1 5 0 0 3 1 5 0 0 3 178
VecNorm 185 1.0 4.0359e-03 2.1 1.14e+05 2.8 0.0e+00 0.0e+00 3.7e+02 0 1 0 0 7 0 1 0 0 7 63
VecScale 183 1.0 2.8165e-02 1.2 3.43e+04 3.4 0.0e+00 0.0e+00 0.0e+00 4 0 0 0 0 4 0 0 0 0 2
VecCopy 92 1.0 4.0483e-04 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecSet 373 1.0 1.3938e-03 1.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecAXPY 181 1.0 1.4770e-02 1.2 9.96e+04 2.5 0.0e+00 0.0e+00 0.0e+00 2 1 0 0 0 2 1 0 0 0 16
VecAYPX 91 1.0 4.9877e-04 1.3 2.79e+04 2.8 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 126
VecWAXPY 2 1.0 1.1921e-05 1.4 6.14e+02 2.8 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 115
VecMAXPY 181 1.0 5.1925e-03 1.6 1.73e+06 2.8 0.0e+00 0.0e+00 0.0e+00 1 10 0 0 0 1 10 0 0 0 747
VecAssemblyBegin 1 1.0 4.1962e-05 1.1 0.00e+00 0.0 6.0e+00 7.2e+02 6.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecAssemblyEnd 1 1.0 1.0014e-05 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
VecScatterBegin 1263 1.0 8.3861e-03 1.0 0.00e+00 0.0 2.6e+03 9.1e+02 2.7e+02 1 0 72 38 5 1 0 72 38 5 0
VecScatterEnd 993 1.0 9.8801e-03 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0
MatMult 363 1.0 4.3428e-02 1.7 1.64e+07 3.9 1.5e+03 7.9e+02 0.0e+00 6 84 40 18 0 6 84 40 18 0 782
MatSolve 270 1.0 1.1593e-01 1.0 0.00e+00 0.0 1.2e+03 1.0e+03 2.8e+02 18 0 33 19 5 18 0 33 19 5 0
MatLUFactorSym 2 1.0 1.4674e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.6e+01 2 0 0 0 0 2 0 0 0 0 0
MatLUFactorNum 2 1.0 1.6366e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 26 0 0 0 0 26 0 0 0 0 0
MatConvert 1 1.0 3.1614e-04 1.3 0.00e+00 0.0 0.0e+00 0.0e+00 4.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatScale 2 1.0 2.5296e-04 1.5 8.44e+03 19.6 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 39
MatAssemblyBegin 14 1.0 3.6721e-03 3.6 0.00e+00 0.0 9.0e+00 5.3e+04 3.2e+01 0 0 0 8 1 0 0 0 8 1 0
MatAssemblyEnd 14 1.0 9.9134e-03 1.1 0.00e+00 0.0 3.2e+01 2.0e+02 1.6e+02 2 0 1 0 3 2 0 1 0 3 0
MatGetRow 128 0.0 2.0242e-04 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatGetRowIJ 2 2.0 5.2452e-06 5.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatGetSubMatrix 4 1.0 3.2568e-02 1.0 0.00e+00 0.0 2.0e+01 1.8e+02 1.8e+02 5 0 1 0 3 5 0 1 0 3 0
MatGetOrdering 2 2.0 2.8801e-04 2.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatView 6 1.0 5.2862e-03 1.2 0.00e+00 0.0 1.4e+02 2.1e+03 1.8e+01 1 0 4 5 0 1 0 4 5 0 0
MatAXPY 1 1.0 1.1809e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 3.7e+01 0 0 0 0 1 0 0 0 0 1 0
MatMatMult 1 1.0 4.0901e-03 1.0 2.82e+05 0.0 8.0e+00 3.7e+03 3.7e+01 1 1 0 0 1 1 1 0 0 1 69
MatMatMultSym 1 1.0 3.1250e-03 1.0 0.00e+00 0.0 6.0e+00 3.0e+03 3.3e+01 0 0 0 0 1 0 0 0 0 1 0
MatMatMultNum 1 1.0 9.5606e-04 1.0 2.82e+05 0.0 2.0e+00 5.7e+03 4.0e+00 0 1 0 0 0 0 1 0 0 0 295
MatGetLocalMat 2 1.0 1.9097e-04 2.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
MatGetBrAoCol 2 1.0 2.4199e-04 1.5 0.00e+00 0.0 8.0e+00 3.7e+03 0.0e+00 0 0 0 0 0 0 0 0 0 0 0
KSPGMRESOrthog 90 1.0 1.7338e-02 1.8 1.71e+06 2.8 0.0e+00 0.0e+00 1.6e+03 2 9 0 0 28 2 9 0 0 28 221
KSPSetUp 3 1.0 3.6407e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 1.0e+01 0 0 0 0 0 0 0 0 0 0 0
KSPSolve 1 1.0 5.1757e-01 1.0 1.93e+07 3.7 2.6e+03 9.1e+02 5.1e+03 81 99 73 38 92 81 99 73 38 92 78
PCSetUp 3 1.0 2.2046e-01 1.0 2.90e+05 673.4 3.6e+01 1.1e+03 3.8e+02 35 1 1 1 7 35 1 1 1 7 1
PCApply 90 1.0 4.1464e-01 1.0 2.15e+06 22.0 1.5e+03 9.1e+02 1.2e+03 65 6 43 22 23 65 6 43 22 23 6
KSPSolve_FS_0 90 1.0 6.7575e-02 1.0 0.00e+00 0.0 5.4e+02 1.1e+03 3.6e+02 10 0 15 9 7 10 0 15 9 7 0
KSPSolve_FS_Schu 90 1.0 8.9136e-02 1.1 0.00e+00 0.0 9.2e+01 7.5e+02 4.0e+02 14 0 3 1 7 14 0 3 1 7 0
KSPSolve_FS_Low 90 1.0 2.1874e-01 1.0 0.00e+00 0.0 5.5e+02 1.1e+03 4.0e+02 34 0 15 9 7 34 0 15 9 7 0
------------------------------------------------------------------------------------------------------------------------

Memory usage is given in bytes:

Object Type Creations Destructions Memory Descendants' Mem.
Reports information only for process 0.

--- Event Stage 0: Main Stage

Vector 280 280 1105608 0.
Vector Scatter 14 14 17840 0.
Matrix 37 37 1853640 0.
Index Set 42 40 37772 0.
Krylov Solver 5 5 24280 0.
Preconditioner 5 5 4840 0.
Viewer 1 0 0 0.
Distributed Mesh 1 1 4624 0.
Star Forest Bipartite Graph 2 2 1616 0.
Discrete System 1 1 872 0.
========================================================================================================================

Average time to get PetscTime(): 0.
Average time for MPI_Barrier(): 1.19209e-06
Average time for zero size MPI_Send(): 6.35783e-07

#PETSc Option Table entries:
-fieldsplit_0_ksp_converged_reason
-fieldsplit_0_ksp_type preonly
-fieldsplit_0_pc_factor_mat_solver_package mumps
-fieldsplit_0_pc_type lu
-fieldsplit_1_ksp_converged_reason
-fieldsplit_1_ksp_type preonly
-fieldsplit_1_pc_factor_mat_solver_package mumps
-fieldsplit_1_pc_type lu
-ksp_rtol 1.0e-5
-ksp_type fgmres
-ksp_view
-log_summary
-pc_fieldsplit_schur_factorization_type full
-pc_fieldsplit_schur_precondition selfp
-pc_fieldsplit_type schur
-pc_type fieldsplit
#End of PETSc Option Table entries

Compiled without FORTRAN kernels
Compiled with full precision matrices (default)
sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4
Configure options: --prefix=/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/Install --with-mpi=yes --with-x=yes --download-ml=/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/ml-6.2-p3.tar.gz --with-mumps-lib="-L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Mumps-502_consortium_aster1/MPI/lib -lzmumps
-ldmumps -lmumps_common -lpord -L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Scotch_aster-604_aster6/MPI/lib -lesmumps -lptscotch -lptscotcherr -lptscotcherrexit -lscotch -lscotcherr -lscotcherrexit -L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Parmetis_aster-403_aster/lib -lparmetis -L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Metis_aster-510_aster1/lib -lmetis -L/usr/lib -lscalapack-openmpi -L/usr/lib -lblacs-openmpi -lblacsCinit-openmpi -lblacsF77init-openmpi -L/usr/lib/x86_64-linux-gnu -lgomp " --with-mumps-include=/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Mumps-502_consortium_aster1/MPI/include --with-scalapack-lib="-L/usr/lib -lscalapack-openmpi" --with-blacs-lib="-L/usr/lib -lblacs-openmpi -lblacsCinit-openmpi -lblacsF77init-openmpi" --with-blas-lib="-L/usr/lib -lopenblas -lcblas" --with-lapack-lib="-L/usr/lib -llapack" ----------------------------------------- Libraries compiled on Wed Nov 30 11:59:58 2016 on dsp0780450 Machine characteristics: Linux-3.16.0-4-amd64-x86_64-with-debian-8.6 Using PETSc directory: /home/B07947/dev/codeaster-prerequisites/petsc-3.7.2 Using PETSc arch: arch-linux2-c-debug ----------------------------------------- Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 ${COPTFLAGS} ${CFLAGS} Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g ${FOPTFLAGS} ${FFLAGS} ----------------------------------------- Using include paths: -I/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/arch-linux2-c-debug/include -I/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/include -I/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/include -I/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/arch-linux2-c-debug/include -I/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Mumps-502_consortium_aster1/MPI/include 
-I/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/Install/include -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi ----------------------------------------- Using C linker: mpicc Using Fortran linker: mpif90 Using libraries: -Wl,-rpath,/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/arch-linux2-c-debug/lib -L/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/arch-linux2-c-debug/lib -lpetsc -L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Mumps-502_consortium_aster1/MPI/lib -lzmumps -ldmumps -lmumps_common -lpord -L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Scotch_aster-604_aster6/MPI/lib -lesmumps -lptscotch -lptscotcherr -lptscotcherrexit -lscotch -lscotcherr -lscotcherrexit -L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Parmetis_aster-403_aster/lib -lparmetis -L/home/B07947/dev/codeaster-prerequisites/v13/prerequisites/Metis_aster-510_aster1/lib -lmetis -L/usr/lib -lscalapack-openmpi -lblacs-openmpi -lblacsCinit-openmpi -lblacsF77init-openmpi -L/usr/lib/x86_64-linux-gnu -lgomp -Wl,-rpath,/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/Install/lib -L/home/B07947/dev/codeaster-prerequisites/petsc-3.7.2/Install/lib -lml -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/4.9 -L/usr/lib/gcc/x86_64-linux-gnu/4.9 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -lscalapack-openmpi -llapack -lopenblas -lcblas -lX11 -lssl -lcrypto -lm -lmpi_f90 -lmpi_f77 -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/4.9 -L/usr/lib/gcc/x86_64-linux-gnu/4.9 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl -lmpi -lhwloc -lgcc_s -lpthread -ldl 
----------------------------------------- -------------- next part -------------- A non-text attachment was scrubbed... Name: Matrix.bin.gz Type: application/x-gzip Size: 177035 bytes Desc: not available URL: From bsmith at mcs.anl.gov Wed Jan 4 15:59:13 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 4 Jan 2017 15:59:13 -0600 Subject: [petsc-users] Fwd: Fieldsplit with sub pc MUMPS in parallel In-Reply-To: References: Message-ID: <523B5A35-79B9-454C-9CB0-2535877568DF@mcs.anl.gov> > I do not understand this behavior : since MUMPS is a parallel direct solver, shouldn't the solver converge in max 3 iterations whatever the number of procs? Please add the following options -fieldsplit_0_ksp_type gmres -fieldsplit_0_ksp_pc_side right -fieldsplit_0_ksp_monitor -fieldsplit_1_ksp_type gmres -fieldsplit_1_ksp_pc_side right -fieldsplit_1_ksp_monitor and send the output (don't bother with the -log_summary). Barry > On Jan 4, 2017, at 3:39 PM, Karin&NiKo wrote: > > Dear Petsc team, > > I am (still) trying to solve Biot's poroelasticity problem : > > > I am using a mixed P2-P1 finite element discretization. The matrix of the discretized system in binary format is attached to this email. > > I am using the fieldsplit framework to solve the linear system. Since I am facing some troubles, I have decided to go back to simple things. 
Here are the options I am using :
>
> -ksp_rtol 1.0e-5
> -ksp_type fgmres
> -pc_type fieldsplit
> -pc_fieldsplit_schur_factorization_type full
> -pc_fieldsplit_type schur
> -pc_fieldsplit_schur_precondition selfp
> -fieldsplit_0_pc_type lu
> -fieldsplit_0_pc_factor_mat_solver_package mumps
> -fieldsplit_0_ksp_type preonly
> -fieldsplit_0_ksp_converged_reason
> -fieldsplit_1_pc_type lu
> -fieldsplit_1_pc_factor_mat_solver_package mumps
> -fieldsplit_1_ksp_type preonly
> -fieldsplit_1_ksp_converged_reason
>
> On a single proc, everything runs fine : the solver converges in 3 iterations, according to the theory (see Run-1-proc.txt [contains -log_view]).
>
> On 2 procs, the solver converges in 28 iterations (see Run-2-proc.txt).
>
> On 3 procs, the solver converges in 91 iterations (see Run-3-proc.txt).
>
> I do not understand this behavior : since MUMPS is a parallel direct solver, shouldn't the solver converge in max 3 iterations whatever the number of procs?
>
> Thanks for your precious help,
> Nicolas
>
> <1_Warning.txt>

From dave.mayhem23 at gmail.com Wed Jan 4 16:06:12 2017
From: dave.mayhem23 at gmail.com (Dave May)
Date: Wed, 04 Jan 2017 22:06:12 +0000
Subject: [petsc-users] Fwd: Fieldsplit with sub pc MUMPS in parallel
In-Reply-To: References: Message-ID:

The issue is your fieldsplit_1 solve. You are applying mumps to an approximate Schur complement - not the true Schur complement. Seemingly the approximation is dependent on the communicator size.

If you want to see iteration counts of 2, independent of mesh size and communicator size, you need to solve the true Schur complement system (fieldsplit_1) to a specified tolerance (e.g. 1e-10) - don't use preonly.

In practice you probably don't want to iterate on the Schur complement either as it is likely too expensive.
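Dave's point is that with selfp PETSc preconditions using Sp = A11 - A10*inv(diag(A00))*A01 rather than the true Schur complement S = A11 - A10*inv(A00)*A01. The difference is easy to see on a tiny block system; the sketch below is pure Python with made-up illustrative numbers, not PETSc code:

```python
from fractions import Fraction as F

# Tiny 3x3 saddle-point matrix split into 2x2 blocks (illustrative values):
#   [ A00 A01 ]   A00 is 2x2, A01 2x1, A10 1x2, A11 1x1.
#   [ A10 A11 ]
A00 = [[F(4), F(1)], [F(1), F(3)]]
A01 = [F(1), F(2)]   # column vector
A10 = [F(1), F(2)]   # row vector
A11 = F(5)

# True Schur complement: S = A11 - A10 * inv(A00) * A01
det = A00[0][0] * A00[1][1] - A00[0][1] * A00[1][0]
inv00 = [[A00[1][1] / det, -A00[0][1] / det],
         [-A00[1][0] / det, A00[0][0] / det]]
v = [inv00[0][0] * A01[0] + inv00[0][1] * A01[1],
     inv00[1][0] * A01[0] + inv00[1][1] * A01[1]]
S_true = A11 - (A10[0] * v[0] + A10[1] * v[1])

# 'selfp' approximation: Sp = A11 - A10 * inv(diag(A00)) * A01
w = [A01[0] / A00[0][0], A01[1] / A00[1][1]]
S_selfp = A11 - (A10[0] * w[0] + A10[1] * w[1])

print(S_true, S_selfp)   # 40/11 41/12 -- the two differ
```

An exact LU factorization of Sp is therefore only a preconditioner for S, so the outer FGMRES count need not stay at the theoretical value; why that count additionally varies with the communicator size is exactly what Barry questions further down the thread.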
If you provided fieldsplit with a spectrally equivalent approximation to S, iteration counts would be larger than two, but they would be independent of the number of elements and comm size.

Thanks,
Dave

On Wed, 4 Jan 2017 at 22:39, Karin&NiKo wrote:

> Dear Petsc team,
>
> I am (still) trying to solve Biot's poroelasticity problem :
> [image: Images intégrées 1]
>
> I am using a mixed P2-P1 finite element discretization. The matrix of the discretized system in binary format is attached to this email.
>
> I am using the fieldsplit framework to solve the linear system. Since I am facing some troubles, I have decided to go back to simple things. Here are the options I am using :
>
> -ksp_rtol 1.0e-5
> -ksp_type fgmres
> -pc_type fieldsplit
> -pc_fieldsplit_schur_factorization_type full
> -pc_fieldsplit_type schur
> -pc_fieldsplit_schur_precondition selfp
> -fieldsplit_0_pc_type lu
> -fieldsplit_0_pc_factor_mat_solver_package mumps
> -fieldsplit_0_ksp_type preonly
> -fieldsplit_0_ksp_converged_reason
> -fieldsplit_1_pc_type lu
> -fieldsplit_1_pc_factor_mat_solver_package mumps
> -fieldsplit_1_ksp_type preonly
> -fieldsplit_1_ksp_converged_reason
>
> On a single proc, everything runs fine : the solver converges in 3 iterations, according to the theory (see Run-1-proc.txt [contains -log_view]).
>
> On 2 procs, the solver converges in 28 iterations (see Run-2-proc.txt).
>
> On 3 procs, the solver converges in 91 iterations (see Run-3-proc.txt).
>
> I do not understand this behavior : since MUMPS is a parallel direct solver, shouldn't the solver converge in max 3 iterations whatever the number of procs?
>
> Thanks for your precious help,
> Nicolas

-------------- next part --------------
An HTML attachment was scrubbed...
URL:
-------------- next part --------------
A non-text attachment was scrubbed...
Name: image.png Type: image/png Size: 9086 bytes Desc: not available URL: From mvalera at mail.sdsu.edu Wed Jan 4 16:21:27 2017 From: mvalera at mail.sdsu.edu (Manuel Valera) Date: Wed, 4 Jan 2017 14:21:27 -0800 Subject: [petsc-users] VecSetSizes hangs in MPI Message-ID: Hello all, happy new year, I'm working on parallelizing my code, it worked and provided some results when i just called more than one processor, but created artifacts because i didn't need one image of the whole program in each processor, conflicting with each other. Since the pressure solver is the main part i need in parallel im chosing mpi to run everything in root processor until its time to solve for pressure, at this point im trying to create a distributed vector using either call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr) or call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) call VecSetType(xp,VECMPI,ierr) call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) In both cases program hangs at this point, something it never happened on the naive way i described before. I've made sure the global size, nbdp, is the same in every processor. What can be wrong? Thanks for your kind help, Manuel. -------------- next part -------------- An HTML attachment was scrubbed... URL: From dave.mayhem23 at gmail.com Wed Jan 4 16:29:46 2017 From: dave.mayhem23 at gmail.com (Dave May) Date: Wed, 04 Jan 2017 22:29:46 +0000 Subject: [petsc-users] VecSetSizes hangs in MPI In-Reply-To: References: Message-ID: You need to swap the order of your function calls. Call VecSetSizes() before VecSetType() Thanks, Dave On Wed, 4 Jan 2017 at 23:21, Manuel Valera wrote: Hello all, happy new year, I'm working on parallelizing my code, it worked and provided some results when i just called more than one processor, but created artifacts because i didn't need one image of the whole program in each processor, conflicting with each other. 
Since the pressure solver is the main part i need in parallel im chosing mpi to run everything in root processor until its time to solve for pressure, at this point im trying to create a distributed vector using either call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr) or call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) call VecSetType(xp,VECMPI,ierr) call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) In both cases program hangs at this point, something it never happened on the naive way i described before. I've made sure the global size, nbdp, is the same in every processor. What can be wrong? Thanks for your kind help, Manuel. -------------- next part -------------- An HTML attachment was scrubbed... URL: From mvalera at mail.sdsu.edu Wed Jan 4 16:34:12 2017 From: mvalera at mail.sdsu.edu (Manuel Valera) Date: Wed, 4 Jan 2017 14:34:12 -0800 Subject: [petsc-users] VecSetSizes hangs in MPI In-Reply-To: References: Message-ID: Thanks Dave for the quick answer, appreciate it, I just tried that and it didn't make a difference, any other suggestions ? Thanks, Manuel On Wed, Jan 4, 2017 at 2:29 PM, Dave May wrote: > You need to swap the order of your function calls. > Call VecSetSizes() before VecSetType() > > Thanks, > Dave > > > On Wed, 4 Jan 2017 at 23:21, Manuel Valera wrote: > > Hello all, happy new year, > > I'm working on parallelizing my code, it worked and provided some results > when i just called more than one processor, but created artifacts because i > didn't need one image of the whole program in each processor, conflicting > with each other. 
> > Since the pressure solver is the main part i need in parallel im chosing > mpi to run everything in root processor until its time to solve for > pressure, at this point im trying to create a distributed vector using > either > > call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr) > or > > call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) > > call VecSetType(xp,VECMPI,ierr) > > call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) > > > > In both cases program hangs at this point, something it never happened on > the naive way i described before. I've made sure the global size, nbdp, is > the same in every processor. What can be wrong? > > > Thanks for your kind help, > > > Manuel. > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Wed Jan 4 16:36:33 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 4 Jan 2017 16:36:33 -0600 Subject: [petsc-users] VecSetSizes hangs in MPI In-Reply-To: References: Message-ID: > On Jan 4, 2017, at 4:21 PM, Manuel Valera wrote: > > Hello all, happy new year, > > I'm working on parallelizing my code, it worked and provided some results when i just called more than one processor, but created artifacts because i didn't need one image of the whole program in each processor, conflicting with each other. > > Since the pressure solver is the main part i need in parallel im chosing mpi to run everything in root processor until its time to solve for pressure, at this point im trying to create a distributed vector using either > > call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr) > or > call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) > call VecSetType(xp,VECMPI,ierr) > call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) > > > In both cases program hangs at this point, something it never happened on the naive way i described before. I've made sure the global size, nbdp, is the same in every processor. What can be wrong? 
How are you insuring that all processes are calling the VecCreateMPI()? My guess is that one process in MPI_COMM_WORLD is calling it and the other is not hence it is hanging waiting for that process. Run with two processes with option -start_in_debugger and two xterms should pop up, type cont in both, when it hangs wait a little while and then do control c in each terminal and type bt to see where each process is. Barry > > Thanks for your kind help, > > Manuel. From knepley at gmail.com Wed Jan 4 16:37:06 2017 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 4 Jan 2017 16:37:06 -0600 Subject: [petsc-users] VecSetSizes hangs in MPI In-Reply-To: References: Message-ID: On Wed, Jan 4, 2017 at 4:21 PM, Manuel Valera wrote: > Hello all, happy new year, > > I'm working on parallelizing my code, it worked and provided some results > when i just called more than one processor, but created artifacts because i > didn't need one image of the whole program in each processor, conflicting > with each other. > > Since the pressure solver is the main part i need in parallel im chosing > mpi to run everything in root processor until its time to solve for > pressure, at this point im trying to create a distributed vector using > either > > call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr) > or > > call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) > > call VecSetType(xp,VECMPI,ierr) > > call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) > > > > In both cases program hangs at this point, something it never happened on > the naive way i described before. I've made sure the global size, nbdp, is > the same in every processor. What can be wrong? > It sounds like every process is not calling this function. This will cause a hang since its collective. Matt > Thanks for your kind help, > > > Manuel. 
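Matt's diagnosis above (a collective operation reached by only some ranks leaves the others waiting forever) can be mimicked without MPI or PETSc at all; this pure-Python sketch uses threading.Barrier purely as a stand-in for a collective call:

```python
import threading

# A barrier behaves like a collective call: every participant must
# reach it, or everyone who did reach it blocks indefinitely.
barrier = threading.Barrier(2)   # a "communicator" of 2 "ranks"
passed = []

def rank_body(rank):
    # If this wait were guarded by `if rank == 0:`, rank 1 would never
    # arrive and rank 0 would hang -- the VecCreateMPI symptom above.
    barrier.wait()
    passed.append(rank)

threads = [threading.Thread(target=rank_body, args=(r,)) for r in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(passed))   # [0, 1]: both ranks made the "collective" call
```

The same reasoning applies to VecCreateMPI()/VecCreate(): every rank of the communicator passed in must make the call, never just rank 0.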
> -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From dave.mayhem23 at gmail.com Wed Jan 4 16:39:15 2017 From: dave.mayhem23 at gmail.com (Dave May) Date: Wed, 04 Jan 2017 22:39:15 +0000 Subject: [petsc-users] VecSetSizes hangs in MPI In-Reply-To: References: Message-ID: Are you certain ALL ranks in PETSC_COMM_WORLD call these function(s). These functions cannot be inside if statements like if (rank == 0){ VecCreateMPI(...) } On Wed, 4 Jan 2017 at 23:34, Manuel Valera wrote: > Thanks Dave for the quick answer, appreciate it, > > I just tried that and it didn't make a difference, any other suggestions ? > > Thanks, > Manuel > > On Wed, Jan 4, 2017 at 2:29 PM, Dave May wrote: > > You need to swap the order of your function calls. > Call VecSetSizes() before VecSetType() > > Thanks, > Dave > > > On Wed, 4 Jan 2017 at 23:21, Manuel Valera wrote: > > Hello all, happy new year, > > I'm working on parallelizing my code, it worked and provided some results > when i just called more than one processor, but created artifacts because i > didn't need one image of the whole program in each processor, conflicting > with each other. > > Since the pressure solver is the main part i need in parallel im chosing > mpi to run everything in root processor until its time to solve for > pressure, at this point im trying to create a distributed vector using > either > > call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr) > or > > call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) > > call VecSetType(xp,VECMPI,ierr) > > call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) > > > > In both cases program hangs at this point, something it never happened on > the naive way i described before. I've made sure the global size, nbdp, is > the same in every processor. 
What can be wrong? > > > Thanks for your kind help, > > > Manuel. > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mvalera at mail.sdsu.edu Wed Jan 4 16:55:42 2017 From: mvalera at mail.sdsu.edu (Manuel Valera) Date: Wed, 4 Jan 2017 14:55:42 -0800 Subject: [petsc-users] VecSetSizes hangs in MPI In-Reply-To: References: Message-ID: Thanks for the answers ! heres the screenshot of what i got from bt in gdb (great hint in how to debug in petsc, didn't know that) I don't really know what to look at here, Thanks, Manuel On Wed, Jan 4, 2017 at 2:39 PM, Dave May wrote: > Are you certain ALL ranks in PETSC_COMM_WORLD call these function(s). > These functions cannot be inside if statements like > if (rank == 0){ > VecCreateMPI(...) > } > > > On Wed, 4 Jan 2017 at 23:34, Manuel Valera wrote: > >> Thanks Dave for the quick answer, appreciate it, >> >> I just tried that and it didn't make a difference, any other suggestions ? >> >> Thanks, >> Manuel >> >> On Wed, Jan 4, 2017 at 2:29 PM, Dave May wrote: >> >> You need to swap the order of your function calls. >> Call VecSetSizes() before VecSetType() >> >> Thanks, >> Dave >> >> >> On Wed, 4 Jan 2017 at 23:21, Manuel Valera wrote: >> >> Hello all, happy new year, >> >> I'm working on parallelizing my code, it worked and provided some results >> when i just called more than one processor, but created artifacts because i >> didn't need one image of the whole program in each processor, conflicting >> with each other. 
>> >> Since the pressure solver is the main part i need in parallel im chosing mpi to run everything in root processor until its time to solve for pressure, at this point im trying to create a distributed vector using either
>>
>> call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr)
>> or
>> call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr)
>> call VecSetType(xp,VECMPI,ierr)
>> call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr)
>>
>> In both cases program hangs at this point, something it never happened on the naive way i described before. I've made sure the global size, nbdp, is the same in every processor. What can be wrong?
>>
>> Thanks for your kind help,
>>
>> Manuel.

-------------- next part --------------
An HTML attachment was scrubbed...
URL:
-------------- next part --------------
A non-text attachment was scrubbed...
Name: Screen Shot 2017-01-04 at 2.53.05 PM.png
Type: image/png
Size: 141017 bytes
Desc: not available
URL:

From mvalera at mail.sdsu.edu Wed Jan 4 17:21:48 2017
From: mvalera at mail.sdsu.edu (Manuel Valera)
Date: Wed, 4 Jan 2017 15:21:48 -0800
Subject: [petsc-users] VecSetSizes hangs in MPI
In-Reply-To: References: Message-ID:

I did a PetscBarrier just before calling the VecCreate routine and im pretty sure im calling it from every processor, code looks like this:

call PetscBarrier(PETSC_NULL_OBJECT,ierr)
print*,'entering POInit from',rank
!call exit()
call PetscObjsInit()

And output gives:

entering POInit from 0
entering POInit from 1
entering POInit from 2
entering POInit from 3

Still hangs in the same way,

Thanks,
Manuel

On Wed, Jan 4, 2017 at 2:55 PM, Manuel Valera wrote:
> Thanks for the answers !
> > heres the screenshot of what i got from bt in gdb (great hint in how to > debug in petsc, didn't know that) > > I don't really know what to look at here, > > Thanks, > > Manuel > > On Wed, Jan 4, 2017 at 2:39 PM, Dave May wrote: > >> Are you certain ALL ranks in PETSC_COMM_WORLD call these function(s). >> These functions cannot be inside if statements like >> if (rank == 0){ >> VecCreateMPI(...) >> } >> >> >> On Wed, 4 Jan 2017 at 23:34, Manuel Valera wrote: >> >>> Thanks Dave for the quick answer, appreciate it, >>> >>> I just tried that and it didn't make a difference, any other suggestions >>> ? >>> >>> Thanks, >>> Manuel >>> >>> On Wed, Jan 4, 2017 at 2:29 PM, Dave May >>> wrote: >>> >>> You need to swap the order of your function calls. >>> Call VecSetSizes() before VecSetType() >>> >>> Thanks, >>> Dave >>> >>> >>> On Wed, 4 Jan 2017 at 23:21, Manuel Valera >>> wrote: >>> >>> Hello all, happy new year, >>> >>> I'm working on parallelizing my code, it worked and provided some >>> results when i just called more than one processor, but created artifacts >>> because i didn't need one image of the whole program in each processor, >>> conflicting with each other. >>> >>> Since the pressure solver is the main part i need in parallel im chosing >>> mpi to run everything in root processor until its time to solve for >>> pressure, at this point im trying to create a distributed vector using >>> either >>> >>> call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr) >>> or >>> >>> call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) >>> >>> call VecSetType(xp,VECMPI,ierr) >>> >>> call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) >>> >>> >>> >>> In both cases program hangs at this point, something it never happened >>> on the naive way i described before. I've made sure the global size, nbdp, >>> is the same in every processor. What can be wrong? >>> >>> >>> Thanks for your kind help, >>> >>> >>> Manuel. 
>>> >>> >>> >>> >>> >>> >>> >>> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Jan 4 17:23:32 2017 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 4 Jan 2017 17:23:32 -0600 Subject: [petsc-users] VecSetSizes hangs in MPI In-Reply-To: References: Message-ID: On Wed, Jan 4, 2017 at 5:21 PM, Manuel Valera wrote: > I did a PetscBarrier just before calling the vicariate routine and im > pretty sure im calling it from every processor, code looks like this: > >From the gdb trace. Proc 0: Is in some MPI routine you call yourself, line 113 Proc 1: Is in VecCreate(), line 130 You need to fix your communication code. Matt > call PetscBarrier(PETSC_NULL_OBJECT,ierr) > > > print*,'entering POInit from',rank > > !call exit() > > > call PetscObjsInit() > > > > And output gives: > > > entering POInit from 0 > > entering POInit from 1 > > entering POInit from 2 > > entering POInit from 3 > > > Still hangs in the same way, > > Thanks, > > Manuel > > > > On Wed, Jan 4, 2017 at 2:55 PM, Manuel Valera > wrote: > >> Thanks for the answers ! >> >> heres the screenshot of what i got from bt in gdb (great hint in how to >> debug in petsc, didn't know that) >> >> I don't really know what to look at here, >> >> Thanks, >> >> Manuel >> >> On Wed, Jan 4, 2017 at 2:39 PM, Dave May wrote: >> >>> Are you certain ALL ranks in PETSC_COMM_WORLD call these function(s). >>> These functions cannot be inside if statements like >>> if (rank == 0){ >>> VecCreateMPI(...) >>> } >>> >>> >>> On Wed, 4 Jan 2017 at 23:34, Manuel Valera >>> wrote: >>> >>>> Thanks Dave for the quick answer, appreciate it, >>>> >>>> I just tried that and it didn't make a difference, any other >>>> suggestions ? >>>> >>>> Thanks, >>>> Manuel >>>> >>>> On Wed, Jan 4, 2017 at 2:29 PM, Dave May >>>> wrote: >>>> >>>> You need to swap the order of your function calls. 
>>>> Call VecSetSizes() before VecSetType() >>>> >>>> Thanks, >>>> Dave >>>> >>>> >>>> On Wed, 4 Jan 2017 at 23:21, Manuel Valera >>>> wrote: >>>> >>>> Hello all, happy new year, >>>> >>>> I'm working on parallelizing my code, it worked and provided some >>>> results when i just called more than one processor, but created artifacts >>>> because i didn't need one image of the whole program in each processor, >>>> conflicting with each other. >>>> >>>> Since the pressure solver is the main part i need in parallel im >>>> chosing mpi to run everything in root processor until its time to solve for >>>> pressure, at this point im trying to create a distributed vector using >>>> either >>>> >>>> call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr) >>>> or >>>> >>>> call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) >>>> >>>> call VecSetType(xp,VECMPI,ierr) >>>> >>>> call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) >>>> >>>> >>>> >>>> In both cases program hangs at this point, something it never happened >>>> on the naive way i described before. I've made sure the global size, nbdp, >>>> is the same in every processor. What can be wrong? >>>> >>>> >>>> Thanks for your kind help, >>>> >>>> >>>> Manuel. >>>> >>>> >>>> >>>> >>>> >>>> >>>> >>>> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From mvalera at mail.sdsu.edu Wed Jan 4 17:30:51 2017 From: mvalera at mail.sdsu.edu (Manuel Valera) Date: Wed, 4 Jan 2017 15:30:51 -0800 Subject: [petsc-users] VecSetSizes hangs in MPI In-Reply-To: References: Message-ID: Thanks i had no idea how to debug and read those logs, that solved this issue at least (i was sending a message from root to everyone else, but trying to catch from everyone else including root) Until next time, many thanks, Manuel On Wed, Jan 4, 2017 at 3:23 PM, Matthew Knepley wrote: > On Wed, Jan 4, 2017 at 5:21 PM, Manuel Valera > wrote: > >> I did a PetscBarrier just before calling the vicariate routine and im >> pretty sure im calling it from every processor, code looks like this: >> > > From the gdb trace. > > Proc 0: Is in some MPI routine you call yourself, line 113 > > Proc 1: Is in VecCreate(), line 130 > > You need to fix your communication code. > > Matt > > >> call PetscBarrier(PETSC_NULL_OBJECT,ierr) >> >> >> print*,'entering POInit from',rank >> >> !call exit() >> >> >> call PetscObjsInit() >> >> >> >> And output gives: >> >> >> entering POInit from 0 >> >> entering POInit from 1 >> >> entering POInit from 2 >> >> entering POInit from 3 >> >> >> Still hangs in the same way, >> >> Thanks, >> >> Manuel >> >> >> >> On Wed, Jan 4, 2017 at 2:55 PM, Manuel Valera >> wrote: >> >>> Thanks for the answers ! >>> >>> heres the screenshot of what i got from bt in gdb (great hint in how to >>> debug in petsc, didn't know that) >>> >>> I don't really know what to look at here, >>> >>> Thanks, >>> >>> Manuel >>> >>> On Wed, Jan 4, 2017 at 2:39 PM, Dave May >>> wrote: >>> >>>> Are you certain ALL ranks in PETSC_COMM_WORLD call these function(s). >>>> These functions cannot be inside if statements like >>>> if (rank == 0){ >>>> VecCreateMPI(...) 
>>>> } >>>> >>>> >>>> On Wed, 4 Jan 2017 at 23:34, Manuel Valera >>>> wrote: >>>> >>>>> Thanks Dave for the quick answer, appreciate it, >>>>> >>>>> I just tried that and it didn't make a difference, any other >>>>> suggestions ? >>>>> >>>>> Thanks, >>>>> Manuel >>>>> >>>>> On Wed, Jan 4, 2017 at 2:29 PM, Dave May >>>>> wrote: >>>>> >>>>> You need to swap the order of your function calls. >>>>> Call VecSetSizes() before VecSetType() >>>>> >>>>> Thanks, >>>>> Dave >>>>> >>>>> >>>>> On Wed, 4 Jan 2017 at 23:21, Manuel Valera >>>>> wrote: >>>>> >>>>> Hello all, happy new year, >>>>> >>>>> I'm working on parallelizing my code, it worked and provided some >>>>> results when i just called more than one processor, but created artifacts >>>>> because i didn't need one image of the whole program in each processor, >>>>> conflicting with each other. >>>>> >>>>> Since the pressure solver is the main part i need in parallel im >>>>> chosing mpi to run everything in root processor until its time to solve for >>>>> pressure, at this point im trying to create a distributed vector using >>>>> either >>>>> >>>>> call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr) >>>>> or >>>>> >>>>> call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) >>>>> >>>>> call VecSetType(xp,VECMPI,ierr) >>>>> >>>>> call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) >>>>> >>>>> >>>>> >>>>> In both cases program hangs at this point, something it never happened >>>>> on the naive way i described before. I've made sure the global size, nbdp, >>>>> is the same in every processor. What can be wrong? >>>>> >>>>> >>>>> Thanks for your kind help, >>>>> >>>>> >>>>> Manuel. >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>>>> >>> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From bsmith at mcs.anl.gov Wed Jan 4 17:32:44 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 4 Jan 2017 17:32:44 -0600 Subject: [petsc-users] Fwd: Fieldsplit with sub pc MUMPS in parallel In-Reply-To: References: Message-ID: <328C5389-E158-4A85-8536-7DCC7FC261B8@mcs.anl.gov> > On Jan 4, 2017, at 4:06 PM, Dave May wrote: > > The issue is your fieldsplit_1 solve. You are applying mumps to an approximate Schur complement - not the true Schur complement. Seemingly the approximation is dependent on the communicator size. Yes, but why and how is it dependent on the communicator size? From the output Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse. Note to PETSc developers: this output is horrible and needs to be fixed. "(lumped, if requested)" WTF! If lumping was requested the output should just say lumping was used; if lumping was not used it shouldn't say anything!! I've fixed this in master. If Sp = A11 - A10*inv(diagonal(A00))*A01, shouldn't this be independent of the number of processes? Another note to PETSc developers: I am thinking PCFIELDSPLIT is way too complex. Perhaps it should be factored into a PCSCHURCOMPLEMENT that only does 2 by 2 blocks via Schur complements and a PCFIELDSPLIT that does non-Schur complement methods for any number of blocks. Barry > > If you want to see iteration counts of 2, independent of mesh size and communicator size, you need to solve the true Schur complement system (fieldsplit_1) to a specified tolerance (e.g. 1e-10) - don't use preonly. > > In practice you probably don't want to iterate on the Schur complement either as it is likely too expensive. 
If you provided fieldsplit with a spectrally equivalent approximation to S, iteration counts would be larger than two, but they would be independent of the number of elements and comm size > > Thanks, > Dave > > > > > On Wed, 4 Jan 2017 at 22:39, Karin&NiKo wrote: > Dear Petsc team, > > I am (still) trying to solve Biot's poroelasticity problem : > > > I am using a mixed P2-P1 finite element discretization. The matrix of the discretized system in binary format is attached to this email. > > I am using the fieldsplit framework to solve the linear system. Since I am facing some troubles, I have decided to go back to simple things. Here are the options I am using : > > -ksp_rtol 1.0e-5 > -ksp_type fgmres > -pc_type fieldsplit > -pc_fieldsplit_schur_factorization_type full > -pc_fieldsplit_type schur > -pc_fieldsplit_schur_precondition selfp > -fieldsplit_0_pc_type lu > -fieldsplit_0_pc_factor_mat_solver_package mumps > -fieldsplit_0_ksp_type preonly > -fieldsplit_0_ksp_converged_reason > -fieldsplit_1_pc_type lu > -fieldsplit_1_pc_factor_mat_solver_package mumps > -fieldsplit_1_ksp_type preonly > -fieldsplit_1_ksp_converged_reason > > On a single proc, everything runs fine : the solver converges in 3 iterations, according to the theory (see Run-1-proc.txt [contains -log_view]). > > On 2 procs, the solver converges in 28 iterations (see Run-2-proc.txt). > > On 3 procs, the solver converges in 91 iterations (see Run-3-proc.txt). > > I do not understand this behavior : since MUMPS is a parallel direct solver, shouldn't the solver converge in max 3 iterations whatever the number of procs? 
> > > Thanks for your precious help, > Nicolas > > > > > From knepley at gmail.com Wed Jan 4 17:59:53 2017 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 4 Jan 2017 17:59:53 -0600 Subject: [petsc-users] a question on DMPlexSetAnchors In-Reply-To: References: Message-ID: On Tue, Jan 3, 2017 at 4:02 PM, Rochan Upadhyay wrote: > I think I sent my previous question (on Dec 28th) to the wrong place > (petsc-users-request at mcs.anl.gov). > Yes, this is the correct mailing list. > To repeat, > > I am having bit of a difficulty in understanding the introduction of > constraints in DMPlex. From a quick study of the User Manual I gather > that it is easiest done using DMPlexSetAnchors ? The description of this > routine says that there is an anchorIS that specifies the anchor points > (rows in the > matrix). This is okay and easily understood. > I think this is not the right mechanism for you. Anchors: This is intended for constraints in the discretization, such as hanging nodes, which are purely local, and intended to take place across the entire domain. That determines the interface. Dirichlet Boundary Conditions: For these, I would recommend using the Constraint interface in PetscSection, which eliminates these unknowns from the global system, but includes the values in the local vectors used in assembly. You can also just alter your equations for constrained unknowns. Constraints among Fields: I would recommend just putting the constraint in as an equation. In your case the effect can be non-local, so this seems like the best strategy. Thanks, Matt > There is also an anchorSection which is described as a map from constraint > points > (columns ?) to the anchor points listed in the anchorIS. Should this not > be a map between > solution indices (i.e. indices appearing in the vectors and matrices) ? 
> > For example I am completely unable to set up a simple constraint matrix > for the following (say): > > Point 1, Field A, B > Point 2-10 Field A > At point 1, Field B depends on Field A at points 1-10 > > When I set it up it appears to create a matrix where field A depends on > field A values at points 1-10. > > How does the mapping work in this case ? Will the DMPlexSetAnchors() > routine work > for this simple scenario ? > > If not, is the only recourse to create the constraint matrix oneself > using DMSetDefaultConstraints ? > > Also documentation for DMSetDefaultConstraints is incomplete. > The function accepts three arguments (dm, section and Mat) but > what the section is is not described at all. > > I don't know if my question makes any sense. If it does not then it is > only a reflection of my utter confusion regarding the routine > DMPlexSetAnchors :-( > > Regards, > Rochan > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Wed Jan 4 18:33:18 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 4 Jan 2017 18:33:18 -0600 Subject: [petsc-users] Fwd: Fieldsplit with sub pc MUMPS in parallel In-Reply-To: References: Message-ID: <7F52B55E-EF7F-4067-80A0-8828FC0A4861@mcs.anl.gov> Dave, When I run your example with what I think are the same options I do get the same convergence independent of processes (with exact solvers) ./ex42 -stokes_pc_type fieldsplit -stokes_ksp_monitor -stokes_fieldsplit_u_pc_type lu -stokes_fieldsplit_p_pc_type lu -stokes_pc_fieldsplit_type schur -stokes_pc_fieldsplit_schur_factorization_type full -stokes_pc_fieldsplit_schur_precondition selfp Residual norms for stokes_ solve. 
0 KSP Residual norm 2.219697707545e-01 1 KSP Residual norm 2.160497527164e-01 2 KSP Residual norm 2.931344898250e-02 3 KSP Residual norm 3.504825774118e-03 4 KSP Residual norm 5.626318301896e-04 5 KSP Residual norm 9.099204283519e-05 6 KSP Residual norm 1.731194762517e-05 7 KSP Residual norm 2.920732887195e-06 8 KSP Residual norm 4.154207455723e-07 ~/Src/petsc/src/ksp/ksp/examples/tutorials (master=) arch-mpich $ petscmpiexec -n 2 ./ex42 -stokes_pc_type fieldsplit -stokes_ksp_monitor -stokes_fieldsplit_u_pc_type lu -stokes_fieldsplit_p_pc_type lu -stokes_pc_fieldsplit_type schur -stokes_pc_fieldsplit_schur_factorization_type full -stokes_pc_fieldsplit_schur_precondition selfp Residual norms for stokes_ solve. 0 KSP Residual norm 2.219697707545e-01 1 KSP Residual norm 2.160497527164e-01 2 KSP Residual norm 2.931344898250e-02 3 KSP Residual norm 3.504825774118e-03 4 KSP Residual norm 5.626318301897e-04 5 KSP Residual norm 9.099204283514e-05 6 KSP Residual norm 1.731194762514e-05 7 KSP Residual norm 2.920732887196e-06 8 KSP Residual norm 4.154207455766e-07 ~/Src/petsc/src/ksp/ksp/examples/tutorials (master=) arch-mpich $ petscmpiexec -n 3 ./ex42 -stokes_pc_type fieldsplit -stokes_ksp_monitor -stokes_fieldsplit_u_pc_type lu -stokes_fieldsplit_p_pc_type lu -stokes_pc_fieldsplit_type schur -stokes_pc_fieldsplit_schur_factorization_type full -stokes_pc_fieldsplit_schur_precondition selfp Residual norms for stokes_ solve. 
0 KSP Residual norm 2.219697707545e-01 1 KSP Residual norm 2.160497527164e-01 2 KSP Residual norm 2.931344898250e-02 3 KSP Residual norm 3.504825774117e-03 4 KSP Residual norm 5.626318301897e-04 5 KSP Residual norm 9.099204283517e-05 6 KSP Residual norm 1.731194762515e-05 7 KSP Residual norm 2.920732887202e-06 8 KSP Residual norm 4.154207455736e-07 ~/Src/petsc/src/ksp/ksp/examples/tutorials (master=) arch-mpich $ petscmpiexec -n 4 ./ex42 -stokes_pc_type fieldsplit -stokes_ksp_monitor -stokes_fieldsplit_u_pc_type lu -stokes_fieldsplit_p_pc_type lu -stokes_pc_fieldsplit_type schur -stokes_pc_fieldsplit_schur_factorization_type full -stokes_pc_fieldsplit_schur_precondition selfp Residual norms for stokes_ solve. 0 KSP Residual norm 2.219697707545e-01 1 KSP Residual norm 2.160497527164e-01 2 KSP Residual norm 2.931344898250e-02 3 KSP Residual norm 3.504825774118e-03 4 KSP Residual norm 5.626318301897e-04 5 KSP Residual norm 9.099204283517e-05 6 KSP Residual norm 1.731194762513e-05 7 KSP Residual norm 2.920732887190e-06 8 KSP Residual norm 4.154207455781e-07 KSP Object: (stokes_) 4 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
left preconditioning using PRECONDITIONED norm type for convergence test PC Object: (stokes_) 4 MPI processes type: fieldsplit FieldSplit with Schur preconditioner, blocksize = 4, factorization FULL Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses A00's diagonal's inverse Split info: Split number 0 Fields 0, 1, 2 Split number 1 Fields 3 KSP solver for A00 block KSP Object: (stokes_fieldsplit_u_) 4 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test PC Object: (stokes_fieldsplit_u_) 4 MPI processes type: lu out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 0., needed 0. Factored matrix follows: Mat Object: 4 MPI processes type: superlu_dist rows=3993, cols=3993 package used to perform factorization: superlu_dist total: nonzeros=0, allocated nonzeros=0 total number of mallocs used during MatSetValues calls =0 SuperLU_DIST run parameters: Process grid nprow 2 x npcol 2 Equilibrate matrix TRUE Matrix input mode 1 Replace tiny pivots FALSE Use iterative refinement FALSE Processors in row 2 col partition 2 Row permutation LargeDiag Column permutation METIS_AT_PLUS_A Parallel symbolic factorization FALSE Repeated factorization SamePattern linear system matrix = precond matrix: Mat Object: (stokes_fieldsplit_u_) 4 MPI processes type: mpiaij rows=3993, cols=3993, bs=3 total: nonzeros=268119, allocated nonzeros=268119 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 396 nodes, limit used is 5 KSP solver for S = A11 - A10 inv(A00) A01 KSP Object: (stokes_fieldsplit_p_) 4 MPI processes type: gmres GMRES: 
restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test PC Object: (stokes_fieldsplit_p_) 4 MPI processes type: lu out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 0., needed 0. Factored matrix follows: Mat Object: 4 MPI processes type: superlu_dist rows=1331, cols=1331 package used to perform factorization: superlu_dist total: nonzeros=0, allocated nonzeros=0 total number of mallocs used during MatSetValues calls =0 SuperLU_DIST run parameters: Process grid nprow 2 x npcol 2 Equilibrate matrix TRUE Matrix input mode 1 Replace tiny pivots FALSE Use iterative refinement FALSE Processors in row 2 col partition 2 Row permutation LargeDiag Column permutation METIS_AT_PLUS_A Parallel symbolic factorization FALSE Repeated factorization SamePattern linear system matrix followed by preconditioner matrix: Mat Object: (stokes_fieldsplit_p_) 4 MPI processes type: schurcomplement rows=1331, cols=1331 Schur complement A11 - A10 inv(A00) A01 A11 Mat Object: (stokes_fieldsplit_p_) 4 MPI processes type: mpiaij rows=1331, cols=1331 total: nonzeros=29791, allocated nonzeros=29791 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines A10 Mat Object: 4 MPI processes type: mpiaij rows=1331, cols=3993 total: nonzeros=89373, allocated nonzeros=89373 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines KSP of A00 KSP Object: (stokes_fieldsplit_u_) 4 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero 
tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test PC Object: (stokes_fieldsplit_u_) 4 MPI processes type: lu out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 0., needed 0. Factored matrix follows: Mat Object: 4 MPI processes type: superlu_dist rows=3993, cols=3993 package used to perform factorization: superlu_dist total: nonzeros=0, allocated nonzeros=0 total number of mallocs used during MatSetValues calls =0 SuperLU_DIST run parameters: Process grid nprow 2 x npcol 2 Equilibrate matrix TRUE Matrix input mode 1 Replace tiny pivots FALSE Use iterative refinement FALSE Processors in row 2 col partition 2 Row permutation LargeDiag Column permutation METIS_AT_PLUS_A Parallel symbolic factorization FALSE Repeated factorization SamePattern linear system matrix = precond matrix: Mat Object: (stokes_fieldsplit_u_) 4 MPI processes type: mpiaij rows=3993, cols=3993, bs=3 total: nonzeros=268119, allocated nonzeros=268119 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 396 nodes, limit used is 5 A01 Mat Object: 4 MPI processes type: mpiaij rows=3993, cols=1331, rbs=3, cbs = 1 total: nonzeros=89373, allocated nonzeros=89373 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 396 nodes, limit used is 5 Mat Object: 4 MPI processes type: mpiaij rows=1331, cols=1331 total: nonzeros=117649, allocated nonzeros=117649 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines linear system matrix followed by preconditioner matrix: Mat Object: 4 MPI processes type: mpiaij rows=5324, cols=5324, bs=4 total: nonzeros=476656, allocated nonzeros=476656 total number of mallocs used during MatSetValues calls =0 Mat Object: 4 MPI processes type: mpiaij rows=5324, cols=5324, bs=4 total: nonzeros=476656, 
allocated nonzeros=476656 total number of mallocs used during MatSetValues calls =0 > On Jan 4, 2017, at 4:06 PM, Dave May wrote: > > The issue is your fieldsplit_1 solve. You are applying mumps to an approximate Schur complement - not the true Schur complement. Seemingly the approximation is dependent on the communicator size. > > If you want to see iteration counts of 2, independent of mesh size and communicator size you need to solve the true Schur complement system (fieldsplit_1) to a specified tolerance (Erik 1e-10) - don't use preonly. > > In practice you probably don't want to iterate on the Schur complement either as it is likely too expensive. If you provided fieldsplit with a spectrally equivalent approximation to S, iteration counts would be larger than two, but they would be independent of the number of elements and comm size > > Thanks, > Dave > > > > > On Wed, 4 Jan 2017 at 22:39, Karin&NiKo wrote: > Dear Petsc team, > > I am (still) trying to solve Biot's poroelasticity problem : > > > I am using a mixed P2-P1 finite element discretization. The matrix of the discretized system in binary format is attached to this email. > > I am using the fieldsplit framework to solve the linear system. Since I am facing some troubles, I have decided to go back to simple things. Here are the options I am using : > > -ksp_rtol 1.0e-5 > -ksp_type fgmres > -pc_type fieldsplit > -pc_fieldsplit_schur_factorization_type full > -pc_fieldsplit_type schur > -pc_fieldsplit_schur_precondition selfp > -fieldsplit_0_pc_type lu > -fieldsplit_0_pc_factor_mat_solver_package mumps > -fieldsplit_0_ksp_type preonly > -fieldsplit_0_ksp_converged_reason > -fieldsplit_1_pc_type lu > -fieldsplit_1_pc_factor_mat_solver_package mumps > -fieldsplit_1_ksp_type preonly > -fieldsplit_1_ksp_converged_reason > > On a single proc, everything runs fine : the solver converges in 3 iterations, according to the theory (see Run-1-proc.txt [contains -log_view]). 
> > On 2 procs, the solver converges in 28 iterations (see Run-2-proc.txt). > > On 3 procs, the solver converges in 91 iterations (see Run-3-proc.txt). > > I do not understand this behavior : since MUMPS is a parallel direct solver, shouldn't the solver converge in max 3 iterations whatever the number of procs? > > > Thanks for your precious help, > Nicolas > > > > > From bsmith at mcs.anl.gov Wed Jan 4 18:36:29 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 4 Jan 2017 18:36:29 -0600 Subject: [petsc-users] Fieldsplit with sub pc MUMPS in parallel In-Reply-To: References: Message-ID: <38F933FD-80B5-401A-8DD3-7C14E0B301F7@mcs.anl.gov> There is something wrong with your set up. 1 process total: nonzeros=140616, allocated nonzeros=140616 total: nonzeros=68940, allocated nonzeros=68940 total: nonzeros=3584, allocated nonzeros=3584 total: nonzeros=1000, allocated nonzeros=1000 total: nonzeros=8400, allocated nonzeros=8400 2 processes total: nonzeros=146498, allocated nonzeros=146498 total: nonzeros=73470, allocated nonzeros=73470 total: nonzeros=3038, allocated nonzeros=3038 total: nonzeros=1110, allocated nonzeros=1110 total: nonzeros=6080, allocated nonzeros=6080 total: nonzeros=146498, allocated nonzeros=146498 total: nonzeros=73470, allocated nonzeros=73470 total: nonzeros=6080, allocated nonzeros=6080 total: nonzeros=2846, allocated nonzeros=2846 total: nonzeros=86740, allocated nonzeros=94187 It looks like you are setting up the problem differently in parallel and seq. If it is suppose to be an identical problem then the number nonzeros should be the same in at least the first two matrices. > On Jan 4, 2017, at 3:39 PM, Karin&NiKo wrote: > > Dear Petsc team, > > I am (still) trying to solve Biot's poroelasticity problem : > > > I am using a mixed P2-P1 finite element discretization. The matrix of the discretized system in binary format is attached to this email. > > I am using the fieldsplit framework to solve the linear system. 
Since I am facing some troubles, I have decided to go back to simple things. Here are the options I am using : > > -ksp_rtol 1.0e-5 > -ksp_type fgmres > -pc_type fieldsplit > -pc_fieldsplit_schur_factorization_type full > -pc_fieldsplit_type schur > -pc_fieldsplit_schur_precondition selfp > -fieldsplit_0_pc_type lu > -fieldsplit_0_pc_factor_mat_solver_package mumps > -fieldsplit_0_ksp_type preonly > -fieldsplit_0_ksp_converged_reason > -fieldsplit_1_pc_type lu > -fieldsplit_1_pc_factor_mat_solver_package mumps > -fieldsplit_1_ksp_type preonly > -fieldsplit_1_ksp_converged_reason > > On a single proc, everything runs fine : the solver converges in 3 iterations, according to the theory (see Run-1-proc.txt [contains -log_view]). > > On 2 procs, the solver converges in 28 iterations (see Run-2-proc.txt). > > On 3 procs, the solver converges in 91 iterations (see Run-3-proc.txt). > > I do not understand this behavior : since MUMPS is a parallel direct solver, shouldn't the solver converge in max 3 iterations whatever the number of procs? > > > Thanks for your precious help, > Nicolas > > <1_Warning.txt> From C.Klaij at marin.nl Thu Jan 5 02:37:49 2017 From: C.Klaij at marin.nl (Klaij, Christiaan) Date: Thu, 5 Jan 2017 08:37:49 +0000 Subject: [petsc-users] problems after glibc upgrade to 2.17-157 In-Reply-To: References: <1483455899357.86673@marin.nl> <1483525971870.90369@marin.nl> <1483537021972.79609@marin.nl> <1483541610596.23482@marin.nl> <1483542319358.30478@marin.nl> <1483543177535.33715@marin.nl> , Message-ID: <1483605469330.79182@marin.nl> Satish, Matt Our sysadmin tells me Scientific Linux is still busy with the RedHat 7.3 update, so yes, this is a partial update somewhere between 7.2 and 7.3... No luck with the quotes on my system, but the option --with-shared-libraries=0 does work! 
make test gives: Running test examples to verify correct installation Using PETSC_DIR=/projects/developers/cklaij/ReFRESCO/Dev/trunk/Libs/install/petsc-3.7.4 and PETSC_ARCH= C/C++ example src/snes/examples/tutorials/ex19 run successfully with 1 MPI process C/C++ example src/snes/examples/tutorials/ex19 run successfully with 2 MPI processes Fortran example src/snes/examples/tutorials/ex5f run successfully with 1 MPI process Completed test examples I've also tested the standalone program of metis, that works too: $ ./gpmetis Missing parameters. Usage: gpmetis [options] use 'gpmetis -help' for a summary of the options. So problem solved for now, thanks to you and Matt for all your help! On the long run I will go for Intel-17 on SL7.3. What worries me though is that a simple update (which happens all the time according to sysadmin) can have such a dramatic effect. Thanks again, Chris dr. ir. Christiaan Klaij | CFD Researcher | Research & Development MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of-uRANS-and-BEMBEM-for-propeller-pressure-pulse-prediction.htm ________________________________________ From: Satish Balay Sent: Wednesday, January 04, 2017 7:24 PM To: petsc-users Cc: Klaij, Christiaan Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 On Wed, 4 Jan 2017, Satish Balay wrote: > So I guess your best bet is static libraries.. Or upgrade to intel-17 compilers. Satish ------- [balay at el7 benchmarks]$ icc --version icc (ICC) 17.0.1 20161005 Copyright (C) 1985-2016 Intel Corporation. All rights reserved. 
[balay at el7 benchmarks]$ icc sizeof.c -lifcore [balay at el7 benchmarks]$ ldd a.out linux-vdso.so.1 => (0x00007ffed4b02000) libifcore.so.5 => /soft/com/packages/intel/17/u1/compilers_and_libraries_2017.1.132/linux/compiler/lib/intel64/libifcore.so.5 (0x00007f5a65430000) libm.so.6 => /lib64/libm.so.6 (0x00007f5a65124000) libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00007f5a64f0e000) libc.so.6 => /lib64/libc.so.6 (0x00007f5a64b4c000) libdl.so.2 => /lib64/libdl.so.2 (0x00007f5a64948000) libimf.so => /soft/com/packages/intel/17/u1/compilers_and_libraries_2017.1.132/linux/compiler/lib/intel64/libimf.so (0x00007f5a6445c000) libsvml.so => /soft/com/packages/intel/17/u1/compilers_and_libraries_2017.1.132/linux/compiler/lib/intel64/libsvml.so (0x00007f5a63550000) libintlc.so.5 => /soft/com/packages/intel/17/u1/compilers_and_libraries_2017.1.132/linux/compiler/lib/intel64/libintlc.so.5 (0x00007f5a632e6000) /lib64/ld-linux-x86-64.so.2 (0x00007f5a65793000) [balay at el7 benchmarks]$ ./a.out long double : 16 double : 8 int : 4 char : 1 short : 2 long : 8 long long : 8 int * : 8 size_t : 8 [balay at el7 benchmarks]$ From niko.karin at gmail.com Thu Jan 5 05:11:04 2017 From: niko.karin at gmail.com (Karin&NiKo) Date: Thu, 5 Jan 2017 12:11:04 +0100 Subject: [petsc-users] Fieldsplit with sub pc MUMPS in parallel In-Reply-To: <38F933FD-80B5-401A-8DD3-7C14E0B301F7@mcs.anl.gov> References: <38F933FD-80B5-401A-8DD3-7C14E0B301F7@mcs.anl.gov> Message-ID: Dear Barry, dear Dave, THANK YOU! You two pointed out the right problem.By using the options you provided (-fieldsplit_0_ksp_type gmres -fieldsplit_0_ksp_pc_side right -fieldsplit_1_ksp_type gmres -fieldsplit_1_ksp_pc_side right), the solver converges in 3 iterations whatever the size of the communicator. All the trick is in the precise resolution of the Schur complement, by using a Krylov method (and not only preonly) *and* applying the preconditioner on the right (so evaluating the convergence on the unpreconditioned residual). 
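The "selfp" approximation Barry wrote out above, Sp = A11 - A10*inv(diagonal(A00))*A01, is easy to check against the true Schur complement S = A11 - A10*inv(A00)*A01. A minimal numpy sketch (the block matrices below are made-up numbers, not from Nicolas's system):

```python
import numpy as np

# Hypothetical small 2x2-block saddle-point system, for illustration only
A00 = np.array([[4.0, 1.0], [1.0, 3.0]])   # not diagonal, on purpose
A01 = np.eye(2)
A10 = np.eye(2)
A11 = 5.0 * np.eye(2)

S  = A11 - A10 @ np.linalg.inv(A00) @ A01            # true Schur complement
Sp = A11 - A10 @ np.diag(1.0 / np.diag(A00)) @ A01   # "selfp" approximation

print(np.allclose(S, Sp))  # -> False: Sp matches S only when A00 is diagonal
```

Since Sp agrees with S only when A00 is diagonal, an exact LU factorization of Sp is still just a preconditioner for S — which is why an inner Krylov solve on fieldsplit_1 with right preconditioning, rather than preonly, was needed to recover the 3-iteration outer convergence.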
@Barry : the difference you see on the nonzero allocations for the different runs is just an artefact : when using more than one proc, we slighly over-estimate the number of non-zero terms. If I run the same problem with the -info option, I get extra information : [2] MatAssemblyEnd_SeqAIJ(): Matrix size: 110 X 110; storage space: 0 unneeded,5048 used [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 271 X 271; storage space: 4249 unneeded,26167 used [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 307 X 307; storage space: 7988 unneeded,31093 used [2] MatAssemblyEnd_SeqAIJ(): Matrix size: 110 X 244; storage space: 0 unneeded,6194 used [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 271 X 233; storage space: 823 unneeded,9975 used [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 307 X 197; storage space: 823 unneeded,8263 used And 5048+26167+31093+6194+9975+8263=86740 which is the number of exactly estimated nonzero terms for 1 proc. Thank you again! Best regards, Nicolas 2017-01-05 1:36 GMT+01:00 Barry Smith : > > There is something wrong with your set up. > > 1 process > > total: nonzeros=140616, allocated nonzeros=140616 > total: nonzeros=68940, allocated nonzeros=68940 > total: nonzeros=3584, allocated nonzeros=3584 > total: nonzeros=1000, allocated nonzeros=1000 > total: nonzeros=8400, allocated nonzeros=8400 > > 2 processes > total: nonzeros=146498, allocated nonzeros=146498 > total: nonzeros=73470, allocated nonzeros=73470 > total: nonzeros=3038, allocated nonzeros=3038 > total: nonzeros=1110, allocated nonzeros=1110 > total: nonzeros=6080, allocated nonzeros=6080 > total: nonzeros=146498, allocated nonzeros=146498 > total: nonzeros=73470, allocated nonzeros=73470 > total: nonzeros=6080, allocated nonzeros=6080 > total: nonzeros=2846, allocated nonzeros=2846 > total: nonzeros=86740, allocated nonzeros=94187 > > It looks like you are setting up the problem differently in parallel and > seq. 
If it is suppose to be an identical problem then the number nonzeros > should be the same in at least the first two matrices. > > > > > On Jan 4, 2017, at 3:39 PM, Karin&NiKo wrote: > > > > Dear Petsc team, > > > > I am (still) trying to solve Biot's poroelasticity problem : > > > > > > I am using a mixed P2-P1 finite element discretization. The matrix of > the discretized system in binary format is attached to this email. > > > > I am using the fieldsplit framework to solve the linear system. Since I > am facing some troubles, I have decided to go back to simple things. Here > are the options I am using : > > > > -ksp_rtol 1.0e-5 > > -ksp_type fgmres > > -pc_type fieldsplit > > -pc_fieldsplit_schur_factorization_type full > > -pc_fieldsplit_type schur > > -pc_fieldsplit_schur_precondition selfp > > -fieldsplit_0_pc_type lu > > -fieldsplit_0_pc_factor_mat_solver_package mumps > > -fieldsplit_0_ksp_type preonly > > -fieldsplit_0_ksp_converged_reason > > -fieldsplit_1_pc_type lu > > -fieldsplit_1_pc_factor_mat_solver_package mumps > > -fieldsplit_1_ksp_type preonly > > -fieldsplit_1_ksp_converged_reason > > > > On a single proc, everything runs fine : the solver converges in 3 > iterations, according to the theory (see Run-1-proc.txt [contains > -log_view]). > > > > On 2 procs, the solver converges in 28 iterations (see Run-2-proc.txt). > > > > On 3 procs, the solver converges in 91 iterations (see Run-3-proc.txt). > > > > I do not understand this behavior : since MUMPS is a parallel direct > solver, shouldn't the solver converge in max 3 iterations whatever the > number of procs? > > > > > > Thanks for your precious help, > > Nicolas > > > > <1_Warning.txt> > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From dave.mayhem23 at gmail.com Thu Jan 5 05:58:45 2017 From: dave.mayhem23 at gmail.com (Dave May) Date: Thu, 05 Jan 2017 11:58:45 +0000 Subject: [petsc-users] Fieldsplit with sub pc MUMPS in parallel In-Reply-To: References: <38F933FD-80B5-401A-8DD3-7C14E0B301F7@mcs.anl.gov> Message-ID: Do you now see identical residual histories for a job using 1 rank and 4 ranks? If not, I am inclined to believe that the IS's you are defining for the splits in the parallel case are incorrect. The operator created to approximate the Schur complement with selfp should not depend on the number of ranks. Or possibly your problem is horribly ill-conditioned. If it is, then this could result in slightly different residual histories when using different numbers of ranks - even if the operators are in fact identical. Thanks, Dave On Thu, 5 Jan 2017 at 12:14, Karin&NiKo wrote: > Dear Barry, dear Dave, > > THANK YOU! > You two pointed out the right problem. By using the options you provided > (-fieldsplit_0_ksp_type gmres -fieldsplit_0_ksp_pc_side right > -fieldsplit_1_ksp_type gmres -fieldsplit_1_ksp_pc_side right), the solver > converges in 3 iterations whatever the size of the communicator. > All the trick is in the precise resolution of the Schur complement, by > using a Krylov method (and not only preonly) *and* applying the > preconditioner on the right (so evaluating the convergence on the > unpreconditioned residual). > > @Barry : the difference you see on the nonzero allocations for the > different runs is just an artefact: when using more than one proc, we > slightly over-estimate the number of non-zero terms. 
If I run the same > problem with the -info option, I get extra information : > [2] MatAssemblyEnd_SeqAIJ(): Matrix size: 110 X 110; storage space: 0 > unneeded,5048 used > [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 271 X 271; storage space: 4249 > unneeded,26167 used > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 307 X 307; storage space: 7988 > unneeded,31093 used > [2] MatAssemblyEnd_SeqAIJ(): Matrix size: 110 X 244; storage space: 0 > unneeded,6194 used > [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 271 X 233; storage space: 823 > unneeded,9975 used > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 307 X 197; storage space: 823 > unneeded,8263 used > And 5048+26167+31093+6194+9975+8263=86740 which is the number of exactly > estimated nonzero terms for 1 proc. > > > Thank you again! > > Best regards, > Nicolas > > > 2017-01-05 1:36 GMT+01:00 Barry Smith : > > > > > There is something wrong with your set up. > > > > > > 1 process > > > > > > total: nonzeros=140616, allocated nonzeros=140616 > > > total: nonzeros=68940, allocated nonzeros=68940 > > > total: nonzeros=3584, allocated nonzeros=3584 > > > total: nonzeros=1000, allocated nonzeros=1000 > > > total: nonzeros=8400, allocated nonzeros=8400 > > > > > > 2 processes > > > total: nonzeros=146498, allocated nonzeros=146498 > > > total: nonzeros=73470, allocated nonzeros=73470 > > > total: nonzeros=3038, allocated nonzeros=3038 > > > total: nonzeros=1110, allocated nonzeros=1110 > > > total: nonzeros=6080, allocated nonzeros=6080 > > > total: nonzeros=146498, allocated nonzeros=146498 > > > total: nonzeros=73470, allocated nonzeros=73470 > > > total: nonzeros=6080, allocated nonzeros=6080 > > > total: nonzeros=2846, allocated nonzeros=2846 > > > total: nonzeros=86740, allocated nonzeros=94187 > > > > > > It looks like you are setting up the problem differently in parallel and > seq. If it is suppose to be an identical problem then the number nonzeros > should be the same in at least the first two matrices. 
> > > > > > > > > > > > > On Jan 4, 2017, at 3:39 PM, Karin&NiKo wrote: > > > > > > > > Dear Petsc team, > > > > > > > > I am (still) trying to solve Biot's poroelasticity problem : > > > > > > > > > > > > I am using a mixed P2-P1 finite element discretization. The matrix of > the discretized system in binary format is attached to this email. > > > > > > > > I am using the fieldsplit framework to solve the linear system. Since I > am facing some troubles, I have decided to go back to simple things. Here > are the options I am using : > > > > > > > > -ksp_rtol 1.0e-5 > > > > -ksp_type fgmres > > > > -pc_type fieldsplit > > > > -pc_fieldsplit_schur_factorization_type full > > > > -pc_fieldsplit_type schur > > > > -pc_fieldsplit_schur_precondition selfp > > > > -fieldsplit_0_pc_type lu > > > > -fieldsplit_0_pc_factor_mat_solver_package mumps > > > > -fieldsplit_0_ksp_type preonly > > > > -fieldsplit_0_ksp_converged_reason > > > > -fieldsplit_1_pc_type lu > > > > -fieldsplit_1_pc_factor_mat_solver_package mumps > > > > -fieldsplit_1_ksp_type preonly > > > > -fieldsplit_1_ksp_converged_reason > > > > > > > > On a single proc, everything runs fine : the solver converges in 3 > iterations, according to the theory (see Run-1-proc.txt [contains > -log_view]). > > > > > > > > On 2 procs, the solver converges in 28 iterations (see Run-2-proc.txt). > > > > > > > > On 3 procs, the solver converges in 91 iterations (see Run-3-proc.txt). > > > > > > > > I do not understand this behavior : since MUMPS is a parallel direct > solver, shouldn't the solver converge in max 3 iterations whatever the > number of procs? > > > > > > > > > > > > Thanks for your precious help, > > > > Nicolas > > > > > > > > <1_Warning.txt> > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... 
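[Editor's note] Pulling the thread above together for reference: the option set that gave rank-independent convergence (3 outer iterations on any number of processes) is Nicolas's original option list combined with the inner-solver fix Barry and Dave suggested, namely GMRES with right preconditioning for each split instead of preonly. These are only the options as quoted in the thread, collected in one place:

```
# Outer Krylov solve
-ksp_rtol 1.0e-5
-ksp_type fgmres
# Schur-complement fieldsplit
-pc_type fieldsplit
-pc_fieldsplit_type schur
-pc_fieldsplit_schur_factorization_type full
-pc_fieldsplit_schur_precondition selfp
# Inner solves: GMRES with right preconditioning (the fix),
# MUMPS LU as the preconditioner for each split
-fieldsplit_0_ksp_type gmres
-fieldsplit_0_ksp_pc_side right
-fieldsplit_0_pc_type lu
-fieldsplit_0_pc_factor_mat_solver_package mumps
-fieldsplit_1_ksp_type gmres
-fieldsplit_1_ksp_pc_side right
-fieldsplit_1_pc_type lu
-fieldsplit_1_pc_factor_mat_solver_package mumps
```

The point of the fix, per the thread: with right preconditioning the inner convergence test uses the unpreconditioned residual, so the Schur complement is actually solved accurately rather than only approximately.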
URL: From Patrick.Begou at legi.grenoble-inp.fr Thu Jan 5 06:31:59 2017 From: Patrick.Begou at legi.grenoble-inp.fr (Patrick Begou) Date: Thu, 5 Jan 2017 13:31:59 +0100 Subject: [petsc-users] make test freeze Message-ID: <586E3CBF.9020605@legi.grenoble-inp.fr> I am unable to run any test on petsc. It looks as if the ex19 run freezes on the server, as it does not use any CPU time, and pstree shows sshd---bash-+-gedit `-make---sh-+-gmake---sh---gmake---sh---mpiexec---ex19 `-tee I've tested petsc-3.7.5.tar.gz and the latest sources from the Git repository. Setup from the Git repo: ./configure --prefix=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git-binaries \ --PETSC_ARCH=GCC48 \ --PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git \ --with-shared-libraries=0 \ --with-fortran-interfaces=1 \ --with-fortran-kernels=1 \ --with-cc=mpicc \ --with-fc=mpif90 \ --with-cxx=mpicxx make PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git PETSC_ARCH=GCC48 all make PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git PETSC_ARCH=GCC48 install make PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git-binaries PETSC_ARCH="" test In the log file I have just: Running test examples to verify correct installation Using PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git-binaries and PETSC_ARCH= I'm using: gcc version 4.8.1 Open MPI: 1.7.3 (built with gcc 4.8.1) (This environment has been in production for a while for many local software packages and works fine) Any suggestion is welcome Patrick -- =================================================================== | Equipe M.O.S.T. 
| | | Patrick BEGOU | mailto:Patrick.Begou at grenoble-inp.fr | | LEGI | | | BP 53 X | Tel 04 76 82 51 35 | | 38041 GRENOBLE CEDEX | Fax 04 76 82 52 71 | =================================================================== From mfadams at lbl.gov Thu Jan 5 08:18:33 2017 From: mfadams at lbl.gov (Mark Adams) Date: Thu, 5 Jan 2017 09:18:33 -0500 Subject: [petsc-users] pc_gamg_threshol In-Reply-To: References: <1483564436.1134.3.camel@seamplex.com> <1483564604.1134.4.camel@seamplex.com> Message-ID: You want the bottom of page 84 in the manual. On Wed, Jan 4, 2017 at 4:33 PM, Barry Smith wrote: > > The manual page gives a high-level description > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/ > PCGAMGSetThreshold.html the exact details can be found in the code here > http://www.mcs.anl.gov/petsc/petsc-dev/src/ksp/pc/impls/gamg/util.c.html# > PCGAMGFilterGraph I'm adding a link from the former to the latter in the > documentation. > > Barry > > > > > On Jan 4, 2017, at 3:16 PM, Jeremy Theler wrote: > > > > * Any reference to what pc_gamg_treshold means and/or does? > > > > > > > > On Wed, 2017-01-04 at 18:13 -0300, Jeremy Theler wrote: > >> Hi! Any reference to what does -pc_gamg_threshold mean and/or? > >> > > > > -------------- next part -------------- An HTML attachment was scrubbed... 
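[Editor's note] To make the threshold discussion above concrete, here is a small, simplified sketch of the kind of graph filtering that -pc_gamg_threshold controls: off-diagonal connections whose magnitude falls below a diagonal-scaled threshold are dropped before coarsening. This is not PETSc's implementation (the real logic lives in PCGAMGFilterGraph in gamg/util.c and is more involved); the sqrt(|a_ii|*|a_jj|) scaling below is an assumption for illustration only.

```python
import numpy as np

def filter_graph(A, threshold):
    """Drop weak off-diagonal connections: keep a_ij only when
    |a_ij| > threshold * sqrt(|a_ii| * |a_jj|); the diagonal is kept.
    Simplified illustration of threshold-based graph filtering."""
    d = np.sqrt(np.abs(np.diag(A)))  # diagonal scaling factors
    F = A.copy()
    n = A.shape[0]
    for i in range(n):
        for j in range(n):
            if i != j and abs(A[i, j]) <= threshold * d[i] * d[j]:
                F[i, j] = 0.0  # connection too weak: remove from the graph
    return F

# A small symmetric test matrix with one weak coupling (0.01).
A = np.array([[ 4.0,  -1.0, -0.01],
              [-1.0,   4.0, -1.0 ],
              [-0.01, -1.0,  4.0 ]])

# threshold 0.0 keeps every nonzero connection (the "most robust" default);
# a positive threshold prunes the weak 0.01 couplings.
print(np.count_nonzero(filter_graph(A, 0.0)))   # 9
print(np.count_nonzero(filter_graph(A, 0.05)))  # 7
```

A larger threshold makes the strength graph sparser and the coarsening more aggressive, which is why 0.0 (keep everything) is the conservative, robust choice discussed in the thread.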
URL: From knepley at gmail.com Thu Jan 5 08:20:30 2017 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 5 Jan 2017 08:20:30 -0600 Subject: [petsc-users] problems after glibc upgrade to 2.17-157 In-Reply-To: <1483605469330.79182@marin.nl> References: <1483455899357.86673@marin.nl> <1483525971870.90369@marin.nl> <1483537021972.79609@marin.nl> <1483541610596.23482@marin.nl> <1483542319358.30478@marin.nl> <1483543177535.33715@marin.nl> <1483605469330.79182@marin.nl> Message-ID: On Thu, Jan 5, 2017 at 2:37 AM, Klaij, Christiaan wrote: > Satish, Matt > > Our sysadmin tells me Scientific Linux is still busy with the > RedHat 7.3 update, so yes, this is a partial update somewhere > between 7.2 and 7.3... > > No luck with the quotes on my system, but the option > --with-shared-libraries=0 does work! make test gives: > > Running test examples to verify correct installation > Using PETSC_DIR=/projects/developers/cklaij/ReFRESCO/ > Dev/trunk/Libs/install/petsc-3.7.4 and PETSC_ARCH= > C/C++ example src/snes/examples/tutorials/ex19 run successfully with 1 > MPI process > C/C++ example src/snes/examples/tutorials/ex19 run successfully with 2 > MPI processes > Fortran example src/snes/examples/tutorials/ex5f run successfully with 1 > MPI process > Completed test examples > > I've also tested the standalone metis program; that works too: > > $ ./gpmetis > Missing parameters. > Usage: gpmetis [options] > use 'gpmetis -help' for a summary of the options. > > So problem solved for now, thanks to you and Matt for all your > help! In the long run I will go for Intel-17 on SL7.3. > > What worries me though is that a simple update (which happens all > the time according to sysadmin) can have such a dramatic effect. > I agree. It seems SL has broken the ability to use shared libraries with a simple point release. It seems the robustness of all this process is a myth. Thanks, Matt > Thanks again, > Chris > > > > dr. ir. 
Christiaan Klaij | CFD Researcher | Research & Development > MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl > > MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of- > uRANS-and-BEMBEM-for-propeller-pressure-pulse-prediction.htm > > ________________________________________ > From: Satish Balay > Sent: Wednesday, January 04, 2017 7:24 PM > To: petsc-users > Cc: Klaij, Christiaan > Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 > > On Wed, 4 Jan 2017, Satish Balay wrote: > > > So I guess your best bet is static libraries.. > > Or upgrade to intel-17 compilers. > > Satish > > ------- > > [balay at el7 benchmarks]$ icc --version > icc (ICC) 17.0.1 20161005 > Copyright (C) 1985-2016 Intel Corporation. All rights reserved. > > [balay at el7 benchmarks]$ icc sizeof.c -lifcore > [balay at el7 benchmarks]$ ldd a.out > linux-vdso.so.1 => (0x00007ffed4b02000) > libifcore.so.5 => /soft/com/packages/intel/17/ > u1/compilers_and_libraries_2017.1.132/linux/compiler/lib/intel64/libifcore.so.5 > (0x00007f5a65430000) > libm.so.6 => /lib64/libm.so.6 (0x00007f5a65124000) > libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00007f5a64f0e000) > libc.so.6 => /lib64/libc.so.6 (0x00007f5a64b4c000) > libdl.so.2 => /lib64/libdl.so.2 (0x00007f5a64948000) > libimf.so => /soft/com/packages/intel/17/ > u1/compilers_and_libraries_2017.1.132/linux/compiler/lib/intel64/libimf.so > (0x00007f5a6445c000) > libsvml.so => /soft/com/packages/intel/17/ > u1/compilers_and_libraries_2017.1.132/linux/compiler/lib/intel64/libsvml.so > (0x00007f5a63550000) > libintlc.so.5 => /soft/com/packages/intel/17/ > u1/compilers_and_libraries_2017.1.132/linux/compiler/lib/intel64/libintlc.so.5 > (0x00007f5a632e6000) > /lib64/ld-linux-x86-64.so.2 (0x00007f5a65793000) > [balay at el7 benchmarks]$ ./a.out > long double : 16 > double : 8 > int : 4 > char : 1 > short : 2 > long : 8 > long long : 8 > int * : 8 > size_t : 8 > [balay at el7 benchmarks]$ > -- What most 
experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From jeremy at seamplex.com Thu Jan 5 08:23:07 2017 From: jeremy at seamplex.com (Jeremy Theler) Date: Thu, 05 Jan 2017 11:23:07 -0300 Subject: [petsc-users] pc_gamg_threshol In-Reply-To: References: <1483564436.1134.3.camel@seamplex.com> <1483564604.1134.4.camel@seamplex.com> Message-ID: <1483626187.2370.11.camel@seamplex.com> Yes, I read that page and it was that paragraph that made me want to learn more. For example, that page says: "-pc_gamg_threshold 0.0 is the most robust option (the reason for this is not obvious) ..." Where can I find more math-based background on this subject? I mean, some text that describes the methods and not just the implementation, like the source code at gamg/util.c, so I can better understand what is going on. Thanks -- Jeremy Theler www.seamplex.com On Thu, 2017-01-05 at 09:18 -0500, Mark Adams wrote: > You want the bottom of page 84 in the manual. > > On Wed, Jan 4, 2017 at 4:33 PM, Barry Smith > wrote: > > The manual page gives a high-level description > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCGAMGSetThreshold.html the exact details can be found in the code here http://www.mcs.anl.gov/petsc/petsc-dev/src/ksp/pc/impls/gamg/util.c.html#PCGAMGFilterGraph I'm adding a link from the former to the latter in the > documentation. > > Barry > > > > > On Jan 4, 2017, at 3:16 PM, Jeremy Theler > wrote: > > > > * Any reference to what pc_gamg_treshold means and/or does? > > > > > > > > On Wed, 2017-01-04 at 18:13 -0300, Jeremy Theler wrote: > >> Hi! Any reference to what does -pc_gamg_threshold mean > and/or? 
> >> > > > > > > From knepley at gmail.com Thu Jan 5 08:32:09 2017 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 5 Jan 2017 08:32:09 -0600 Subject: [petsc-users] make test freeze In-Reply-To: <586E3CBF.9020605@legi.grenoble-inp.fr> References: <586E3CBF.9020605@legi.grenoble-inp.fr> Message-ID: On Thu, Jan 5, 2017 at 6:31 AM, Patrick Begou < Patrick.Begou at legi.grenoble-inp.fr> wrote: > I am unable to run any test on petsc. It looks like if the ex19 run freeze > on the server as it do not use any cpu time and pstree shows > > sshd---bash-+-gedit > `-make---sh-+-gmake---sh---gmake---sh---mpiexec---ex19 > `-tee > I've tested petsc-3.7.5.tar.gz and the latest sources on the Git > repository. > All make is doing is running ex19, which you can do by hand. What do you get for cd $PETSC_DIR cd src/snes/examples/tutorials make ex19 mpiexec -n 2 ./ex19 -snes_monitor Thanks, Matt > Setup from the Git repo: > ./configure --prefix=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git-binaries > \ > --PETSC_ARCH=GCC48 \ > --PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git \ > --with-shared-libraries=0 \ > --with-fortran-interfaces=1 \ > --with-fortran-kernels=1 \ > --with-cc=mpicc \ > --with-fc=mpif90 \ > --with-cxx=mpicxx > > make PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git > PETSC_ARCH=GCC48 all > > make PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git > PETSC_ARCH=GCC48 install > > make PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git-binaries > PETSC_ARCH="" test > > > In the log file I've just: > > Running test examples to verify correct installation > Using PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git-binaries > and PETSC_ARCH= > > I'm using: > gcc version 4.8.1 > Open MPI: 1.7.3 (build with gcc 4.8.1) > (This environment is in production for a while for many local software and > works fine) > > Any suggestion is welcome > > Patrick > > -- > 
=================================================================== > | Equipe M.O.S.T. | | > | Patrick BEGOU | mailto:Patrick.Begou at grenoble-inp.fr | > | LEGI | | > | BP 53 X | Tel 04 76 82 51 35 | > | 38041 GRENOBLE CEDEX | Fax 04 76 82 52 71 | > =================================================================== > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Thu Jan 5 08:46:01 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 5 Jan 2017 08:46:01 -0600 Subject: [petsc-users] Fieldsplit with sub pc MUMPS in parallel In-Reply-To: References: <38F933FD-80B5-401A-8DD3-7C14E0B301F7@mcs.anl.gov> Message-ID: > On Jan 5, 2017, at 5:58 AM, Dave May wrote: > > Do you now see identical residual histories for a job using 1 rank and 4 ranks? Please send the residual histories with the extra options, I'm curious too, because a Krylov method should not be needed in the inner solve, I just asked for it so we can see what the residuals look like. Barry > > If not, I am inclined to believe that the IS's you are defining for the splits in the parallel case are incorrect. The operator created to approximate the Schur complement with selfp should not depend on the number of ranks. > > Or possibly your problem is horribly I'll-conditioned. If it is, then this could result in slightly different residual histories when using different numbers of ranks - even if the operators are in fact identical > > > Thanks, > Dave > > > > > On Thu, 5 Jan 2017 at 12:14, Karin&NiKo wrote: > Dear Barry, dear Dave, > > THANK YOU! 
> You two pointed out the right problem.By using the options you provided (-fieldsplit_0_ksp_type gmres -fieldsplit_0_ksp_pc_side right -fieldsplit_1_ksp_type gmres -fieldsplit_1_ksp_pc_side right), the solver converges in 3 iterations whatever the size of the communicator. > All the trick is in the precise resolution of the Schur complement, by using a Krylov method (and not only preonly) *and* applying the preconditioner on the right (so evaluating the convergence on the unpreconditioned residual). > > @Barry : the difference you see on the nonzero allocations for the different runs is just an artefact : when using more than one proc, we slighly over-estimate the number of non-zero terms. If I run the same problem with the -info option, I get extra information : > [2] MatAssemblyEnd_SeqAIJ(): Matrix size: 110 X 110; storage space: 0 unneeded,5048 used > [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 271 X 271; storage space: 4249 unneeded,26167 used > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 307 X 307; storage space: 7988 unneeded,31093 used > [2] MatAssemblyEnd_SeqAIJ(): Matrix size: 110 X 244; storage space: 0 unneeded,6194 used > [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 271 X 233; storage space: 823 unneeded,9975 used > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 307 X 197; storage space: 823 unneeded,8263 used > And 5048+26167+31093+6194+9975+8263=86740 which is the number of exactly estimated nonzero terms for 1 proc. > > > Thank you again! > > Best regards, > Nicolas > > > 2017-01-05 1:36 GMT+01:00 Barry Smith : > > > > There is something wrong with your set up. 
> > > > > > 1 process > > > > > > total: nonzeros=140616, allocated nonzeros=140616 > > > total: nonzeros=68940, allocated nonzeros=68940 > > > total: nonzeros=3584, allocated nonzeros=3584 > > > total: nonzeros=1000, allocated nonzeros=1000 > > > total: nonzeros=8400, allocated nonzeros=8400 > > > > > > 2 processes > > > total: nonzeros=146498, allocated nonzeros=146498 > > > total: nonzeros=73470, allocated nonzeros=73470 > > > total: nonzeros=3038, allocated nonzeros=3038 > > > total: nonzeros=1110, allocated nonzeros=1110 > > > total: nonzeros=6080, allocated nonzeros=6080 > > > total: nonzeros=146498, allocated nonzeros=146498 > > > total: nonzeros=73470, allocated nonzeros=73470 > > > total: nonzeros=6080, allocated nonzeros=6080 > > > total: nonzeros=2846, allocated nonzeros=2846 > > > total: nonzeros=86740, allocated nonzeros=94187 > > > > > > It looks like you are setting up the problem differently in parallel and seq. If it is suppose to be an identical problem then the number nonzeros should be the same in at least the first two matrices. > > > > > > > > > > > > > On Jan 4, 2017, at 3:39 PM, Karin&NiKo wrote: > > > > > > > > Dear Petsc team, > > > > > > > > I am (still) trying to solve Biot's poroelasticity problem : > > > > > > > > > > > > I am using a mixed P2-P1 finite element discretization. The matrix of the discretized system in binary format is attached to this email. > > > > > > > > I am using the fieldsplit framework to solve the linear system. Since I am facing some troubles, I have decided to go back to simple things. 
Here are the options I am using : > > > > > > > > -ksp_rtol 1.0e-5 > > > > -ksp_type fgmres > > > > -pc_type fieldsplit > > > > -pc_fieldsplit_schur_factorization_type full > > > > -pc_fieldsplit_type schur > > > > -pc_fieldsplit_schur_precondition selfp > > > > -fieldsplit_0_pc_type lu > > > > -fieldsplit_0_pc_factor_mat_solver_package mumps > > > > -fieldsplit_0_ksp_type preonly > > > > -fieldsplit_0_ksp_converged_reason > > > > -fieldsplit_1_pc_type lu > > > > -fieldsplit_1_pc_factor_mat_solver_package mumps > > > > -fieldsplit_1_ksp_type preonly > > > > -fieldsplit_1_ksp_converged_reason > > > > > > > > On a single proc, everything runs fine : the solver converges in 3 iterations, according to the theory (see Run-1-proc.txt [contains -log_view]). > > > > > > > > On 2 procs, the solver converges in 28 iterations (see Run-2-proc.txt). > > > > > > > > On 3 procs, the solver converges in 91 iterations (see Run-3-proc.txt). > > > > > > > > I do not understand this behavior : since MUMPS is a parallel direct solver, shouldn't the solver converge in max 3 iterations whatever the number of procs? > > > > > > > > > > > > Thanks for your precious help, > > > > Nicolas > > > > > > > > <1_Warning.txt> > > > > > > > > From niko.karin at gmail.com Thu Jan 5 10:36:04 2017 From: niko.karin at gmail.com (Karin&NiKo) Date: Thu, 5 Jan 2017 17:36:04 +0100 Subject: [petsc-users] Fieldsplit with sub pc MUMPS in parallel In-Reply-To: References: <38F933FD-80B5-401A-8DD3-7C14E0B301F7@mcs.anl.gov> Message-ID: Dave, Indeed the residual histories differ. Concerning the IS's, I have checked them on small cases, so that I am quite sure they are OK. What could I do with PETSc to evaluate the ill-conditioning of the system or of the sub-systems? Thanks again for your help, Nicolas 2017-01-05 15:46 GMT+01:00 Barry Smith : > > > On Jan 5, 2017, at 5:58 AM, Dave May wrote: > > > > Do you now see identical residual histories for a job using 1 rank and 4 > ranks? 
> > Please send the residual histories with the extra options, I'm curious > too, because a Krylov method should not be needed in the inner solve, I > just asked for it so we can see what the residuals look like. > > Barry > > > > > If not, I am inclined to believe that the IS's you are defining for the > splits in the parallel case are incorrect. The operator created to > approximate the Schur complement with selfp should not depend on the > number of ranks. > > > > Or possibly your problem is horribly I'll-conditioned. If it is, then > this could result in slightly different residual histories when using > different numbers of ranks - even if the operators are in fact identical > > > > > > Thanks, > > Dave > > > > > > > > > > On Thu, 5 Jan 2017 at 12:14, Karin&NiKo wrote: > > Dear Barry, dear Dave, > > > > THANK YOU! > > You two pointed out the right problem.By using the options you provided > (-fieldsplit_0_ksp_type gmres -fieldsplit_0_ksp_pc_side right > -fieldsplit_1_ksp_type gmres -fieldsplit_1_ksp_pc_side right), the solver > converges in 3 iterations whatever the size of the communicator. > > All the trick is in the precise resolution of the Schur complement, by > using a Krylov method (and not only preonly) *and* applying the > preconditioner on the right (so evaluating the convergence on the > unpreconditioned residual). > > > > @Barry : the difference you see on the nonzero allocations for the > different runs is just an artefact : when using more than one proc, we > slighly over-estimate the number of non-zero terms. 
If I run the same > problem with the -info option, I get extra information : > > [2] MatAssemblyEnd_SeqAIJ(): Matrix size: 110 X 110; storage space: 0 > unneeded,5048 used > > [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 271 X 271; storage space: 4249 > unneeded,26167 used > > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 307 X 307; storage space: 7988 > unneeded,31093 used > > [2] MatAssemblyEnd_SeqAIJ(): Matrix size: 110 X 244; storage space: 0 > unneeded,6194 used > > [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 271 X 233; storage space: 823 > unneeded,9975 used > > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 307 X 197; storage space: 823 > unneeded,8263 used > > And 5048+26167+31093+6194+9975+8263=86740 which is the number of > exactly estimated nonzero terms for 1 proc. > > > > > > Thank you again! > > > > Best regards, > > Nicolas > > > > > > 2017-01-05 1:36 GMT+01:00 Barry Smith : > > > > > > > > There is something wrong with your set up. > > > > > > > > > > > > 1 process > > > > > > > > > > > > total: nonzeros=140616, allocated nonzeros=140616 > > > > > > total: nonzeros=68940, allocated nonzeros=68940 > > > > > > total: nonzeros=3584, allocated nonzeros=3584 > > > > > > total: nonzeros=1000, allocated nonzeros=1000 > > > > > > total: nonzeros=8400, allocated nonzeros=8400 > > > > > > > > > > > > 2 processes > > > > > > total: nonzeros=146498, allocated nonzeros=146498 > > > > > > total: nonzeros=73470, allocated nonzeros=73470 > > > > > > total: nonzeros=3038, allocated nonzeros=3038 > > > > > > total: nonzeros=1110, allocated nonzeros=1110 > > > > > > total: nonzeros=6080, allocated nonzeros=6080 > > > > > > total: nonzeros=146498, allocated nonzeros=146498 > > > > > > total: nonzeros=73470, allocated nonzeros=73470 > > > > > > total: nonzeros=6080, allocated nonzeros=6080 > > > > > > total: nonzeros=2846, allocated nonzeros=2846 > > > > > > total: nonzeros=86740, allocated nonzeros=94187 > > > > > > > > > > > > It looks like you are setting up the problem 
differently in parallel > and seq. If it is suppose to be an identical problem then the number > nonzeros should be the same in at least the first two matrices. > > > > > > > > > > > > > > > > > > > > > > > > > On Jan 4, 2017, at 3:39 PM, Karin&NiKo wrote: > > > > > > > > > > > > > > Dear Petsc team, > > > > > > > > > > > > > > I am (still) trying to solve Biot's poroelasticity problem : > > > > > > > > > > > > > > > > > > > > > I am using a mixed P2-P1 finite element discretization. The matrix of > the discretized system in binary format is attached to this email. > > > > > > > > > > > > > > I am using the fieldsplit framework to solve the linear system. Since > I am facing some troubles, I have decided to go back to simple things. Here > are the options I am using : > > > > > > > > > > > > > > -ksp_rtol 1.0e-5 > > > > > > > -ksp_type fgmres > > > > > > > -pc_type fieldsplit > > > > > > > -pc_fieldsplit_schur_factorization_type full > > > > > > > -pc_fieldsplit_type schur > > > > > > > -pc_fieldsplit_schur_precondition selfp > > > > > > > -fieldsplit_0_pc_type lu > > > > > > > -fieldsplit_0_pc_factor_mat_solver_package mumps > > > > > > > -fieldsplit_0_ksp_type preonly > > > > > > > -fieldsplit_0_ksp_converged_reason > > > > > > > -fieldsplit_1_pc_type lu > > > > > > > -fieldsplit_1_pc_factor_mat_solver_package mumps > > > > > > > -fieldsplit_1_ksp_type preonly > > > > > > > -fieldsplit_1_ksp_converged_reason > > > > > > > > > > > > > > On a single proc, everything runs fine : the solver converges in 3 > iterations, according to the theory (see Run-1-proc.txt [contains > -log_view]). > > > > > > > > > > > > > > On 2 procs, the solver converges in 28 iterations (see Run-2-proc.txt). > > > > > > > > > > > > > > On 3 procs, the solver converges in 91 iterations (see Run-3-proc.txt). 
> > > > > > > > > > > > > > I do not understand this behavior : since MUMPS is a parallel direct > solver, shouldn't the solver converge in max 3 iterations whatever the > number of procs? > > > > > > > > > > > > > > > > > > > > > Thanks for your precious help, > > > > > > > Nicolas > > > > > > > > > > > > > > <1_Warning.txt> > > > > > > > > > > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- 0 KSP unpreconditioned resid norm 2.375592557658e+10 true resid norm 2.375592557658e+10 ||r(i)||/||b|| 1.000000000000e+00 Residual norms for fieldsplit_1_ solve. 0 KSP Residual norm 0.000000000000e+00 Linear fieldsplit_1_ solve converged due to CONVERGED_ATOL iterations 0 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.000000000000e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.000000000000e+00 1 KSP Residual norm 4.313410630558e-15 1 KSP Residual norm 4.313410630558e-15 1 KSP unpreconditioned resid norm 6.190344827565e+04 true resid norm 6.190344827565e+04 ||r(i)||/||b|| 2.605810835536e-06 Residual norms for fieldsplit_1_ solve. 0 KSP Residual norm 1.000000000000e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.407021980813e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.407021980813e+05 1 KSP Residual norm 1.553056052550e-09 1 KSP Residual norm 1.553056052550e-09 1 KSP Residual norm 1.810321861046e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.443756559170e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.443756559170e+05 1 KSP Residual norm 6.852859005090e-10 1 KSP Residual norm 6.852859005090e-10 2 KSP Residual norm 4.110160641015e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.459077029344e+05 Residual norms for fieldsplit_0_ solve. 
0 KSP Residual norm 3.459077029344e+05 1 KSP Residual norm 3.391519472149e-10 1 KSP Residual norm 3.391519472149e-10 3 KSP Residual norm 9.399363055282e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.045754196688e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.045754196688e+05 1 KSP Residual norm 4.488756555375e-10 1 KSP Residual norm 4.488756555375e-10 4 KSP Residual norm 1.571092856159e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.162336928973e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.162336928973e+05 1 KSP Residual norm 2.684362494425e-10 1 KSP Residual norm 2.684362494425e-10 5 KSP Residual norm 1.963417150656e-04 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.145763546913e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.145763546913e+05 1 KSP Residual norm 1.680082274413e-10 1 KSP Residual norm 1.680082274413e-10 6 KSP Residual norm 2.086077021964e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.507749963975e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.507749963975e+05 1 KSP Residual norm 1.773409123937e-10 1 KSP Residual norm 1.773409123937e-10 7 KSP Residual norm 2.638900162683e-06 Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 7 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.087534725346e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.087534725346e+05 1 KSP Residual norm 8.396841831477e-10 1 KSP Residual norm 8.396841831477e-10 2 KSP unpreconditioned resid norm 1.633570314420e-01 true resid norm 1.633570534028e-01 ||r(i)||/||b|| 6.876476055467e-12 KSP Object: 1 MPI processes type: fgmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=1000, initial guess is zero tolerances: relative=1e-06, absolute=1e-50, divergence=10000. 
right preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: 1 MPI processes type: fieldsplit FieldSplit with Schur preconditioner, factorization UPPER Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse Split info: Split number 0 Defined by IS Split number 1 Defined by IS KSP solver for A00 block KSP Object: (fieldsplit_0_) 1 MPI processes type: fgmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. right preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: (fieldsplit_0_) 1 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: nd factor fill ratio given 0., needed 0. Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=624, cols=624 package used to perform factorization: mumps total: nonzeros=140616, allocated nonzeros=140616 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 0 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 1 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 0 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 0 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated 
locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -32 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. RINFO(1) (local estimated flops for the elimination after analysis): [0] 1.99982e+07 RINFO(2) (local estimated flops for the assembly after factorization): [0] 153549. RINFO(3) (local estimated flops for the elimination after factorization): [0] 1.99982e+07 INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 3 INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 3 INFO(23) (num of pivots eliminated on this processor after factorization): [0] 624 RINFOG(1) (global estimated flops for the elimination after analysis): 1.99982e+07 RINFOG(2) (global estimated flops for the assembly after factorization): 153549. 
              RINFOG(3) (global estimated flops for the elimination after factorization): 1.99982e+07
              (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0)
              INFOG(3) (estimated real workspace for factors on all processors after analysis): 140616
              INFOG(4) (estimated integer workspace for factors on all processors after analysis): 4995
              INFOG(5) (estimated maximum front size in the complete tree): 252
              INFOG(6) (number of nodes in the complete tree): 23
              INFOG(7) (ordering option effectively used after analysis): 2
              INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100
              INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 140616
              INFOG(10) (total integer space to store the matrix factors after factorization): 4995
              INFOG(11) (order of largest frontal matrix after factorization): 252
              INFOG(12) (number of off-diagonal pivots): 0
              INFOG(13) (number of delayed pivots after factorization): 0
              INFOG(14) (number of memory compresses after factorization): 0
              INFOG(15) (number of steps of iterative refinement after solution): 0
              INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 3
              INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 3
              INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 3
              INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 3
              INFOG(20) (estimated number of entries in the factors): 140616
              INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 3
              INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 3
              INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0
              INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1
              INFOG(25) (after factorization: number of pivots modified by static pivoting): 0
              INFOG(28) (after factorization: number of null pivots encountered): 0
              INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 140616
              INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 2, 2
              INFOG(32) (after analysis: type of analysis done): 1
              INFOG(33) (value used for ICNTL(8)): 7
              INFOG(34) (exponent of the determinant if determinant is requested): 0
      linear system matrix = precond matrix:
      Mat Object: (fieldsplit_0_) 1 MPI processes
        type: seqaij
        rows=624, cols=624
        total: nonzeros=68940, allocated nonzeros=68940
        total number of mallocs used during MatSetValues calls =0
          using I-node routines: found 208 nodes, limit used is 5
  KSP solver for S = A11 - A10 inv(A00) A01
    KSP Object: (fieldsplit_1_) 1 MPI processes
      type: fgmres
        GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
        GMRES: happy breakdown tolerance 1e-30
      maximum iterations=10000, initial guess is zero
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      right preconditioning
      using UNPRECONDITIONED norm type for convergence test
    PC Object: (fieldsplit_1_) 1 MPI processes
      type: lu
        LU: out-of-place factorization
        tolerance for zero pivot 2.22045e-14
        matrix ordering: nd
        factor fill ratio given 0., needed 0.
        Factored matrix follows:
          Mat Object: 1 MPI processes
            type: seqaij
            rows=64, cols=64
            package used to perform factorization: mumps
            total: nonzeros=3584, allocated nonzeros=3584
            total number of mallocs used during MatSetValues calls =0
            MUMPS run parameters:
              SYM (matrix type): 0
              PAR (host participation): 1
              ICNTL(1) (output for error): 6
              ICNTL(2) (output of diagnostic msg): 0
              ICNTL(3) (output for global info): 0
              ICNTL(4) (level of printing): 0
              ICNTL(5) (input mat struct): 0
              ICNTL(6) (matrix prescaling): 7
              ICNTL(7) (sequential matrix ordering): 7
              ICNTL(8) (scaling strategy): 77
              ICNTL(10) (max num of refinements): 0
              ICNTL(11) (error analysis): 0
              ICNTL(12) (efficiency control): 1
              ICNTL(13) (efficiency control): 0
              ICNTL(14) (percentage of estimated workspace increase): 20
              ICNTL(18) (input mat struct): 0
              ICNTL(19) (Schur complement info): 0
              ICNTL(20) (rhs sparse pattern): 0
              ICNTL(21) (solution struct): 0
              ICNTL(22) (in-core/out-of-core facility): 0
              ICNTL(23) (max size of memory that can be allocated locally): 0
              ICNTL(24) (detection of null pivot rows): 0
              ICNTL(25) (computation of a null space basis): 0
              ICNTL(26) (Schur options for rhs or solution): 0
              ICNTL(27) (experimental parameter): -32
              ICNTL(28) (use parallel or sequential ordering): 1
              ICNTL(29) (parallel ordering): 0
              ICNTL(30) (user-specified set of entries in inv(A)): 0
              ICNTL(31) (factors are discarded in the solve phase): 0
              ICNTL(33) (compute determinant): 0
              CNTL(1) (relative pivoting threshold): 0.01
              CNTL(2) (stopping criterion of refinement): 1.49012e-08
              CNTL(3) (absolute pivoting threshold): 0.
              CNTL(4) (value of static pivoting): -1.
              CNTL(5) (fixation for null pivots): 0.
              RINFO(1) (local estimated flops for the elimination after analysis): [0] 123808.
              RINFO(2) (local estimated flops for the assembly after factorization): [0] 1024.
              RINFO(3) (local estimated flops for the elimination after factorization): [0] 123808.
              INFO(15) (estimated size (in MB) of MUMPS internal data for running numerical factorization): [0] 1
              INFO(16) (size (in MB) of MUMPS internal data used during numerical factorization): [0] 1
              INFO(23) (num of pivots eliminated on this processor after factorization): [0] 64
              RINFOG(1) (global estimated flops for the elimination after analysis): 123808.
              RINFOG(2) (global estimated flops for the assembly after factorization): 1024.
              RINFOG(3) (global estimated flops for the elimination after factorization): 123808.
              (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0)
              INFOG(3) (estimated real workspace for factors on all processors after analysis): 3584
              INFOG(4) (estimated integer workspace for factors on all processors after analysis): 222
              INFOG(5) (estimated maximum front size in the complete tree): 48
              INFOG(6) (number of nodes in the complete tree): 2
              INFOG(7) (ordering option effectively used after analysis): 2
              INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100
              INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3584
              INFOG(10) (total integer space to store the matrix factors after factorization): 222
              INFOG(11) (order of largest frontal matrix after factorization): 48
              INFOG(12) (number of off-diagonal pivots): 0
              INFOG(13) (number of delayed pivots after factorization): 0
              INFOG(14) (number of memory compresses after factorization): 0
              INFOG(15) (number of steps of iterative refinement after solution): 0
              INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1
              INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1
              INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1
              INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1
              INFOG(20) (estimated number of entries in the factors): 3584
              INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1
              INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1
              INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0
              INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1
              INFOG(25) (after factorization: number of pivots modified by static pivoting): 0
              INFOG(28) (after factorization: number of null pivots encountered): 0
              INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3584
              INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0
              INFOG(32) (after analysis: type of analysis done): 1
              INFOG(33) (value used for ICNTL(8)): 7
              INFOG(34) (exponent of the determinant if determinant is requested): 0
      linear system matrix followed by preconditioner matrix:
      Mat Object: (fieldsplit_1_) 1 MPI processes
        type: schurcomplement
        rows=64, cols=64
        Schur complement A11 - A10 inv(A00) A01
          A11
            Mat Object: (fieldsplit_1_) 1 MPI processes
              type: seqaij
              rows=64, cols=64
              total: nonzeros=1000, allocated nonzeros=1000
              total number of mallocs used during MatSetValues calls =0
                not using I-node routines
          A10
            Mat Object: 1 MPI processes
              type: seqaij
              rows=64, cols=624
              total: nonzeros=8400, allocated nonzeros=8400
              total number of mallocs used during MatSetValues calls =0
                not using I-node routines
          KSP of A00
            KSP Object: (fieldsplit_0_) 1 MPI processes
              type: fgmres
                GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
                GMRES: happy breakdown tolerance 1e-30
              maximum iterations=10000, initial guess is zero
              tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
              right preconditioning
              using UNPRECONDITIONED norm type for convergence test
            PC Object: (fieldsplit_0_) 1 MPI processes
              type: lu
                LU: out-of-place factorization
                tolerance for zero pivot 2.22045e-14
                matrix ordering: nd
                factor fill ratio given 0., needed 0.
                Factored matrix follows:
                  Mat Object: 1 MPI processes
                    type: seqaij
                    rows=624, cols=624
                    package used to perform factorization: mumps
                    total: nonzeros=140616, allocated nonzeros=140616
                    total number of mallocs used during MatSetValues calls =0
                    MUMPS run parameters:
                      SYM (matrix type): 0
                      PAR (host participation): 1
                      ICNTL(1) (output for error): 6
                      ICNTL(2) (output of diagnostic msg): 0
                      ICNTL(3) (output for global info): 0
                      ICNTL(4) (level of printing): 0
                      ICNTL(5) (input mat struct): 0
                      ICNTL(6) (matrix prescaling): 7
                      ICNTL(7) (sequential matrix ordering): 7
                      ICNTL(8) (scaling strategy): 77
                      ICNTL(10) (max num of refinements): 0
                      ICNTL(11) (error analysis): 0
                      ICNTL(12) (efficiency control): 1
                      ICNTL(13) (efficiency control): 0
                      ICNTL(14) (percentage of estimated workspace increase): 20
                      ICNTL(18) (input mat struct): 0
                      ICNTL(19) (Schur complement info): 0
                      ICNTL(20) (rhs sparse pattern): 0
                      ICNTL(21) (solution struct): 0
                      ICNTL(22) (in-core/out-of-core facility): 0
                      ICNTL(23) (max size of memory that can be allocated locally): 0
                      ICNTL(24) (detection of null pivot rows): 0
                      ICNTL(25) (computation of a null space basis): 0
                      ICNTL(26) (Schur options for rhs or solution): 0
                      ICNTL(27) (experimental parameter): -32
                      ICNTL(28) (use parallel or sequential ordering): 1
                      ICNTL(29) (parallel ordering): 0
                      ICNTL(30) (user-specified set of entries in inv(A)): 0
                      ICNTL(31) (factors are discarded in the solve phase): 0
                      ICNTL(33) (compute determinant): 0
                      CNTL(1) (relative pivoting threshold): 0.01
                      CNTL(2) (stopping criterion of refinement): 1.49012e-08
                      CNTL(3) (absolute pivoting threshold): 0.
                      CNTL(4) (value of static pivoting): -1.
                      CNTL(5) (fixation for null pivots): 0.
                      RINFO(1) (local estimated flops for the elimination after analysis): [0] 1.99982e+07
                      RINFO(2) (local estimated flops for the assembly after factorization): [0] 153549.
                      RINFO(3) (local estimated flops for the elimination after factorization): [0] 1.99982e+07
                      INFO(15) (estimated size (in MB) of MUMPS internal data for running numerical factorization): [0] 3
                      INFO(16) (size (in MB) of MUMPS internal data used during numerical factorization): [0] 3
                      INFO(23) (num of pivots eliminated on this processor after factorization): [0] 624
                      RINFOG(1) (global estimated flops for the elimination after analysis): 1.99982e+07
                      RINFOG(2) (global estimated flops for the assembly after factorization): 153549.
                      RINFOG(3) (global estimated flops for the elimination after factorization): 1.99982e+07
                      (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0)
                      INFOG(3) (estimated real workspace for factors on all processors after analysis): 140616
                      INFOG(4) (estimated integer workspace for factors on all processors after analysis): 4995
                      INFOG(5) (estimated maximum front size in the complete tree): 252
                      INFOG(6) (number of nodes in the complete tree): 23
                      INFOG(7) (ordering option effectively used after analysis): 2
                      INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100
                      INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 140616
                      INFOG(10) (total integer space to store the matrix factors after factorization): 4995
                      INFOG(11) (order of largest frontal matrix after factorization): 252
                      INFOG(12) (number of off-diagonal pivots): 0
                      INFOG(13) (number of delayed pivots after factorization): 0
                      INFOG(14) (number of memory compresses after factorization): 0
                      INFOG(15) (number of steps of iterative refinement after solution): 0
                      INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 3
                      INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 3
                      INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 3
                      INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 3
                      INFOG(20) (estimated number of entries in the factors): 140616
                      INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 3
                      INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 3
                      INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0
                      INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1
                      INFOG(25) (after factorization: number of pivots modified by static pivoting): 0
                      INFOG(28) (after factorization: number of null pivots encountered): 0
                      INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 140616
                      INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 2, 2
                      INFOG(32) (after analysis: type of analysis done): 1
                      INFOG(33) (value used for ICNTL(8)): 7
                      INFOG(34) (exponent of the determinant if determinant is requested): 0
              linear system matrix = precond matrix:
              Mat Object: (fieldsplit_0_) 1 MPI processes
                type: seqaij
                rows=624, cols=624
                total: nonzeros=68940, allocated nonzeros=68940
                total number of mallocs used during MatSetValues calls =0
                  using I-node routines: found 208 nodes, limit used is 5
          A01
            Mat Object: 1 MPI processes
              type: seqaij
              rows=624, cols=64
              total: nonzeros=8400, allocated nonzeros=8400
              total number of mallocs used during MatSetValues calls =0
                using I-node routines: found 208 nodes, limit used is 5
      Mat Object: 1 MPI processes
        type: seqaij
        rows=64, cols=64
        total: nonzeros=2744, allocated nonzeros=2744
        total number of mallocs used during MatSetValues calls =0
          using I-node routines: found 28 nodes, limit used is 5
  linear system matrix = precond matrix:
  Mat Object: 1 MPI processes
    type: seqaij
    rows=688, cols=688
    total: nonzeros=86740, allocated nonzeros=86740
    total number of mallocs used during MatSetValues calls =0
      using I-node routines: found 208 nodes, limit used is 5
-------------- next part --------------
0 KSP unpreconditioned resid norm 2.375592557658e+10 true resid norm 2.375592557658e+10 ||r(i)||/||b|| 1.000000000000e+00
  Residual norms for fieldsplit_1_ solve.
  0 KSP Residual norm 5.937865172382e-02
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 4.234217463695e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 4.234217463695e+01
    1 KSP Residual norm 3.118700901238e-14
    1 KSP Residual norm 3.118700901238e-14
  1 KSP Residual norm 5.933157534122e-02
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.551000577788e+02
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.551000577788e+02
    1 KSP Residual norm 1.093099680676e-13
    1 KSP Residual norm 1.093099680676e-13
  2 KSP Residual norm 4.547416945551e-02
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.737103989449e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.737103989449e+01
    1 KSP Residual norm 1.028451435945e-14
    1 KSP Residual norm 1.028451435945e-14
  3 KSP Residual norm 4.536313349461e-02
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.076405285175e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.076405285175e+01
    1 KSP Residual norm 1.273088385228e-14
    1 KSP Residual norm 1.273088385228e-14
  4 KSP Residual norm 4.508959132560e-02
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.776609322045e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.776609322045e+01
    1 KSP Residual norm 1.223256724488e-14
    1 KSP Residual norm 1.223256724488e-14
  5 KSP Residual norm 4.326575141601e-02
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.629553485051e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.629553485051e+01
    1 KSP Residual norm 2.354296902115e-14
    1 KSP Residual norm 2.354296902115e-14
  6 KSP Residual norm 3.936112364961e-02
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.975372549146e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.975372549146e+01
    1 KSP Residual norm 1.108812883295e-14
    1 KSP Residual norm 1.108812883295e-14
  7 KSP Residual norm 2.851493746707e-02
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.189629983813e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.189629983813e+01
    1 KSP Residual norm 1.321171004401e-14
    1 KSP Residual norm 1.321171004401e-14
  8 KSP Residual norm 2.641045917065e-02
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 5.000370968172e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 5.000370968172e+01
    1 KSP Residual norm 3.111931447011e-14
    1 KSP Residual norm 3.111931447011e-14
  9 KSP Residual norm 2.589794225526e-02
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.582922395480e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.582922395480e+01
    1 KSP Residual norm 2.124772509356e-14
    1 KSP Residual norm 2.124772509356e-14
  10 KSP Residual norm 1.702204742288e-02
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.033329772194e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.033329772194e+01
    1 KSP Residual norm 5.205672206586e-15
    1 KSP Residual norm 5.205672206586e-15
  11 KSP Residual norm 1.697748745328e-02
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.058818490130e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.058818490130e+01
    1 KSP Residual norm 1.676089941959e-14
    1 KSP Residual norm 1.676089941959e-14
  12 KSP Residual norm 1.474477289739e-02
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.512605566330e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.512605566330e+01
    1 KSP Residual norm 1.101013330523e-14
    1 KSP Residual norm 1.101013330523e-14
  13 KSP Residual norm 7.993867975878e-03
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.908076291684e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.908076291684e+01
    1 KSP Residual norm 2.142452615562e-14
    1 KSP Residual norm 2.142452615562e-14
  14 KSP Residual norm 6.629219736366e-03
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.192268282659e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.192268282659e+01
    1 KSP Residual norm 9.502213484402e-15
    1 KSP Residual norm 9.502213484402e-15
  15 KSP Residual norm 6.052348486548e-03
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 5.453321213362e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 5.453321213362e+01
    1 KSP Residual norm 3.272046064366e-14
    1 KSP Residual norm 3.272046064366e-14
  16 KSP Residual norm 4.020336463565e-03
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 9.784852832330e+00
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 9.784852832330e+00
    1 KSP Residual norm 4.860790727577e-15
    1 KSP Residual norm 4.860790727577e-15
  17 KSP Residual norm 2.549661912882e-03
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.032060539012e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.032060539012e+01
    1 KSP Residual norm 5.409446981951e-15
    1 KSP Residual norm 5.409446981951e-15
  18 KSP Residual norm 2.545838697911e-03
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.615049979264e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.615049979264e+01
    1 KSP Residual norm 1.933081342600e-14
    1 KSP Residual norm 1.933081342600e-14
  19 KSP Residual norm 1.887748224202e-03
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 7.863199580723e+00
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 7.863199580723e+00
    1 KSP Residual norm 4.154361434029e-15
    1 KSP Residual norm 4.154361434029e-15
  20 KSP Residual norm 7.717282078825e-04
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 4.443583386855e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 4.443583386855e+01
    1 KSP Residual norm 2.580291980533e-14
    1 KSP Residual norm 2.580291980533e-14
  21 KSP Residual norm 2.423723070704e-04
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.942812936843e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.942812936843e+01
    1 KSP Residual norm 1.166564006857e-14
    1 KSP Residual norm 1.166564006857e-14
  22 KSP Residual norm 6.757780242449e-05
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.466822334895e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.466822334895e+01
    1 KSP Residual norm 8.386112126578e-15
    1 KSP Residual norm 8.386112126578e-15
  23 KSP Residual norm 2.265569464460e-05
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.961396757334e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.961396757334e+01
    1 KSP Residual norm 1.495066107508e-14
    1 KSP Residual norm 1.495066107508e-14
  24 KSP Residual norm 5.899547964062e-06
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.160271694698e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.160271694698e+01
    1 KSP Residual norm 7.056918950209e-15
    1 KSP Residual norm 7.056918950209e-15
  25 KSP Residual norm 1.575417880059e-06
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.455608530437e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.455608530437e+01
    1 KSP Residual norm 1.644989124097e-14
    1 KSP Residual norm 1.644989124097e-14
  26 KSP Residual norm 4.026925350528e-07
  Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 26
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 9.386097125234e-01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 9.386097125234e-01
    1 KSP Residual norm 1.605159181466e-15
    1 KSP Residual norm 1.605159181466e-15
1 KSP unpreconditioned resid norm 4.441623086433e+09 true resid norm 4.441623086433e+09 ||r(i)||/||b|| 1.869690605030e-01
  Residual norms for fieldsplit_1_ solve.
  0 KSP Residual norm 9.995733040893e-01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.454918438859e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.454918438859e+01
    1 KSP Residual norm 1.223419335125e-14
    1 KSP Residual norm 1.223419335125e-14
  1 KSP Residual norm 9.944523252125e-01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.583995535156e+02
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.583995535156e+02
    1 KSP Residual norm 8.825640694457e-14
    1 KSP Residual norm 8.825640694457e-14
  2 KSP Residual norm 8.949281315445e-01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.226784992842e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.226784992842e+01
    1 KSP Residual norm 2.614144737961e-14
    1 KSP Residual norm 2.614144737961e-14
  3 KSP Residual norm 8.933447355096e-01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.396572368493e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.396572368493e+01
    1 KSP Residual norm 1.229693916841e-14
    1 KSP Residual norm 1.229693916841e-14
  4 KSP Residual norm 8.214589151615e-01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.235328400341e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.235328400341e+01
    1 KSP Residual norm 1.043544686072e-14
    1 KSP Residual norm 1.043544686072e-14
  5 KSP Residual norm 5.263010202342e-01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.734218208288e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.734218208288e+01
    1 KSP Residual norm 1.882179620447e-14
    1 KSP Residual norm 1.882179620447e-14
  6 KSP Residual norm 4.783501706809e-01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.797826701278e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.797826701278e+01
    1 KSP Residual norm 8.815613165812e-15
    1 KSP Residual norm 8.815613165812e-15
  7 KSP Residual norm 4.766735056010e-01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.378301418359e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.378301418359e+01
    1 KSP Residual norm 7.112099880638e-15
    1 KSP Residual norm 7.112099880638e-15
  8 KSP Residual norm 3.972131687194e-01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 8.309752053339e+00
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 8.309752053339e+00
    1 KSP Residual norm 4.727621077237e-15
    1 KSP Residual norm 4.727621077237e-15
  9 KSP Residual norm 3.792263297146e-01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 4.566574961568e+00
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 4.566574961568e+00
    1 KSP Residual norm 2.476526063026e-15
    1 KSP Residual norm 2.476526063026e-15
  10 KSP Residual norm 3.408233654575e-01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.954137437840e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.954137437840e+01
    1 KSP Residual norm 9.786196116090e-15
    1 KSP Residual norm 9.786196116090e-15
  11 KSP Residual norm 2.642739341639e-01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 5.235692697522e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 5.235692697522e+01
    1 KSP Residual norm 2.984886822060e-14
    1 KSP Residual norm 2.984886822060e-14
  12 KSP Residual norm 2.575125899492e-01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.132368047040e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.132368047040e+01
    1 KSP Residual norm 1.722885597134e-14
    1 KSP Residual norm 1.722885597134e-14
  13 KSP Residual norm 1.403660227010e-01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 6.714687023560e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 6.714687023560e+01
    1 KSP Residual norm 3.125740341553e-14
    1 KSP Residual norm 3.125740341553e-14
  14 KSP Residual norm 7.177870026278e-02
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.261683387580e+00
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.261683387580e+00
    1 KSP Residual norm 1.978759574236e-15
    1 KSP Residual norm 1.978759574236e-15
  15 KSP Residual norm 5.005112707018e-02
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 5.419699934297e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 5.419699934297e+01
    1 KSP Residual norm 3.831022881510e-14
    1 KSP Residual norm 3.831022881510e-14
  16 KSP Residual norm 4.942664995089e-02
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.048711823470e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.048711823470e+01
    1 KSP Residual norm 6.225736356283e-15
    1 KSP Residual norm 6.225736356283e-15
  17 KSP Residual norm 2.551613762746e-02
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.269449392975e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.269449392975e+01
    1 KSP Residual norm 6.869644736488e-15
    1 KSP Residual norm 6.869644736488e-15
  18 KSP Residual norm 1.175268249018e-02
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.634428310012e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.634428310012e+01
    1 KSP Residual norm 1.159312925122e-14
    1 KSP Residual norm 1.159312925122e-14
  19 KSP Residual norm 9.914714704583e-03
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.295339266370e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.295339266370e+01
    1 KSP Residual norm 1.443683776456e-14
    1 KSP Residual norm 1.443683776456e-14
  20 KSP Residual norm 9.398895997393e-03
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 4.839116504759e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 4.839116504759e+01
    1 KSP Residual norm 2.898144099453e-14
    1 KSP Residual norm 2.898144099453e-14
  21 KSP Residual norm 4.572179069537e-03
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.460918536613e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.460918536613e+01
    1 KSP Residual norm 8.358504740258e-15
    1 KSP Residual norm 8.358504740258e-15
  22 KSP Residual norm 1.720926954498e-03
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 9.505332571786e+00
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 9.505332571786e+00
    1 KSP Residual norm 5.136221996807e-15
    1 KSP Residual norm 5.136221996807e-15
  23 KSP Residual norm 5.546608543469e-04
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 9.855169634997e+00
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 9.855169634997e+00
    1 KSP Residual norm 6.085063593122e-15
    1 KSP Residual norm 6.085063593122e-15
  24 KSP Residual norm 1.225330068103e-04
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 4.658392935709e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 4.658392935709e+01
    1 KSP Residual norm 2.921900986545e-14
    1 KSP Residual norm 2.921900986545e-14
  25 KSP Residual norm 3.155201041607e-05
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.059617562808e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.059617562808e+01
    1 KSP Residual norm 9.566032504495e-15
    1 KSP Residual norm 9.566032504495e-15
  26 KSP Residual norm 1.035291641520e-05
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.916173965386e+01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.916173965386e+01
    1 KSP Residual norm 1.270318519153e-14
    1 KSP Residual norm 1.270318519153e-14
  27 KSP Residual norm 3.422078169478e-06
  Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 27
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.452631488055e+00
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.452631488055e+00
    1 KSP Residual norm 3.295757024840e-15
    1 KSP Residual norm 3.295757024840e-15
2 KSP unpreconditioned resid norm 1.553990983947e+04 true resid norm 1.553990984101e+04 ||r(i)||/||b|| 6.541487845177e-07
KSP Object: 2 MPI processes
  type: fgmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=1000, initial guess is zero
  tolerances: relative=1e-06, absolute=1e-50, divergence=10000.
  right preconditioning
  using UNPRECONDITIONED norm type for convergence test
PC Object: 2 MPI processes
  type: fieldsplit
    FieldSplit with Schur preconditioner, factorization UPPER
    Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse
    Split info:
    Split number 0 Defined by IS
    Split number 1 Defined by IS
  KSP solver for A00 block
    KSP Object: (fieldsplit_0_) 2 MPI processes
      type: fgmres
        GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
        GMRES: happy breakdown tolerance 1e-30
      maximum iterations=10000, initial guess is zero
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      right preconditioning
      using UNPRECONDITIONED norm type for convergence test
    PC Object: (fieldsplit_0_) 2 MPI processes
      type: lu
        LU: out-of-place factorization
        tolerance for zero pivot 2.22045e-14
        matrix ordering: natural
        factor fill ratio given 0., needed 0.
Factored matrix follows: Mat Object: 2 MPI processes type: mpiaij rows=624, cols=624 package used to perform factorization: mumps total: nonzeros=146498, allocated nonzeros=146498 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 0 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 1 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 3 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 1 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -32 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. RINFO(1) (local estimated flops for the elimination after analysis): [0] 1.51255e+07 [1] 6.89106e+06 RINFO(2) (local estimated flops for the assembly after factorization): [0] 87833. [1] 72313. 
                  RINFO(3) (local estimated flops for the elimination after factorization):
                    [0] 1.51255e+07
                    [1] 6.89106e+06
                  INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization):
                    [0] 8
                    [1] 8
                  INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization):
                    [0] 8
                    [1] 8
                  INFO(23) (num of pivots eliminated on this processor after factorization):
                    [0] 413
                    [1] 211
                  RINFOG(1) (global estimated flops for the elimination after analysis): 2.20165e+07
                  RINFOG(2) (global estimated flops for the assembly after factorization): 160146.
                  RINFOG(3) (global estimated flops for the elimination after factorization): 2.20165e+07
                  (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0)
                  INFOG(3) (estimated real workspace for factors on all processors after analysis): 146498
                  INFOG(4) (estimated integer workspace for factors on all processors after analysis): 5065
                  INFOG(5) (estimated maximum front size in the complete tree): 263
                  INFOG(6) (number of nodes in the complete tree): 23
                  INFOG(7) (ordering option effectively use after analysis): 2
                  INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100
                  INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 146498
                  INFOG(10) (total integer space store the matrix factors after factorization): 5065
                  INFOG(11) (order of largest frontal matrix after factorization): 263
                  INFOG(12) (number of off-diagonal pivots): 0
                  INFOG(13) (number of delayed pivots after factorization): 0
                  INFOG(14) (number of memory compress after factorization): 0
                  INFOG(15) (number of steps of iterative refinement after solution): 0
                  INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 8
                  INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 16
                  INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most
memory consuming processor): 8
                  INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 16
                  INFOG(20) (estimated number of entries in the factors): 146498
                  INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 8
                  INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 15
                  INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0
                  INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1
                  INFOG(25) (after factorization: number of pivots modified by static pivoting): 0
                  INFOG(28) (after factorization: number of null pivots encountered): 0
                  INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 146498
                  INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 2, 3
                  INFOG(32) (after analysis: type of analysis done): 1
                  INFOG(33) (value used for ICNTL(8)): 7
                  INFOG(34) (exponent of the determinant if determinant is requested): 0
        linear system matrix = precond matrix:
        Mat Object: (fieldsplit_0_) 2 MPI processes
          type: mpiaij
          rows=624, cols=624
          total: nonzeros=73470, allocated nonzeros=73470
          total number of mallocs used during MatSetValues calls =0
            using I-node (on process 0) routines: found 112 nodes, limit used is 5
    KSP solver for S = A11 - A10 inv(A00) A01
      KSP Object: (fieldsplit_1_) 2 MPI processes
        type: fgmres
          GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
          GMRES: happy breakdown tolerance 1e-30
        maximum iterations=10000, initial guess is zero
        tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
        right preconditioning
        using UNPRECONDITIONED norm type for convergence test
      PC Object: (fieldsplit_1_) 2 MPI processes
        type: lu
          LU: out-of-place factorization
          tolerance for zero pivot 2.22045e-14
          matrix ordering: natural
          factor fill ratio given 0., needed 0.
          Factored matrix follows:
            Mat Object: 2 MPI processes
              type: mpiaij
              rows=64, cols=64
              package used to perform factorization: mumps
              total: nonzeros=3038, allocated nonzeros=3038
              total number of mallocs used during MatSetValues calls =0
                MUMPS run parameters:
                  SYM (matrix type): 0
                  PAR (host participation): 1
                  ICNTL(1) (output for error): 6
                  ICNTL(2) (output of diagnostic msg): 0
                  ICNTL(3) (output for global info): 0
                  ICNTL(4) (level of printing): 0
                  ICNTL(5) (input mat struct): 0
                  ICNTL(6) (matrix prescaling): 7
                  ICNTL(7) (sequentia matrix ordering):7
                  ICNTL(8) (scalling strategy): 77
                  ICNTL(10) (max num of refinements): 0
                  ICNTL(11) (error analysis): 0
                  ICNTL(12) (efficiency control): 1
                  ICNTL(13) (efficiency control): 0
                  ICNTL(14) (percentage of estimated workspace increase): 20
                  ICNTL(18) (input mat struct): 3
                  ICNTL(19) (Shur complement info): 0
                  ICNTL(20) (rhs sparse pattern): 0
                  ICNTL(21) (solution struct): 1
                  ICNTL(22) (in-core/out-of-core facility): 0
                  ICNTL(23) (max size of memory can be allocated locally):0
                  ICNTL(24) (detection of null pivot rows): 0
                  ICNTL(25) (computation of a null space basis): 0
                  ICNTL(26) (Schur options for rhs or solution): 0
                  ICNTL(27) (experimental parameter): -32
                  ICNTL(28) (use parallel or sequential ordering): 1
                  ICNTL(29) (parallel ordering): 0
                  ICNTL(30) (user-specified set of entries in inv(A)): 0
                  ICNTL(31) (factors is discarded in the solve phase): 0
                  ICNTL(33) (compute determinant): 0
                  CNTL(1) (relative pivoting threshold): 0.01
                  CNTL(2) (stopping criterion of refinement): 1.49012e-08
                  CNTL(3) (absolute pivoting threshold): 0.
                  CNTL(4) (value of static pivoting): -1.
                  CNTL(5) (fixation for null pivots): 0.
                  RINFO(1) (local estimated flops for the elimination after analysis):
                    [0] 71763.
                    [1] 15274.
                  RINFO(2) (local estimated flops for the assembly after factorization):
                    [0] 1205.
                    [1] 256.
                  RINFO(3) (local estimated flops for the elimination after factorization):
                    [0] 71763.
                    [1] 15274.
                  INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization):
                    [0] 1
                    [1] 1
                  INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization):
                    [0] 1
                    [1] 1
                  INFO(23) (num of pivots eliminated on this processor after factorization):
                    [0] 52
                    [1] 12
                  RINFOG(1) (global estimated flops for the elimination after analysis): 87037.
                  RINFOG(2) (global estimated flops for the assembly after factorization): 1461.
                  RINFOG(3) (global estimated flops for the elimination after factorization): 87037.
                  (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0)
                  INFOG(3) (estimated real workspace for factors on all processors after analysis): 3038
                  INFOG(4) (estimated integer workspace for factors on all processors after analysis): 318
                  INFOG(5) (estimated maximum front size in the complete tree): 45
                  INFOG(6) (number of nodes in the complete tree): 4
                  INFOG(7) (ordering option effectively use after analysis): 2
                  INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100
                  INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3038
                  INFOG(10) (total integer space store the matrix factors after factorization): 318
                  INFOG(11) (order of largest frontal matrix after factorization): 45
                  INFOG(12) (number of off-diagonal pivots): 0
                  INFOG(13) (number of delayed pivots after factorization): 0
                  INFOG(14) (number of memory compress after factorization): 0
                  INFOG(15) (number of steps of iterative refinement after solution): 0
                  INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1
                  INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 2
                  INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1
                  INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 2
                  INFOG(20) (estimated number of entries in the factors): 3038
                  INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1
                  INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 2
                  INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0
                  INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1
                  INFOG(25) (after factorization: number of pivots modified by static pivoting): 0
                  INFOG(28) (after factorization: number of null pivots encountered): 0
                  INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3038
                  INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0
                  INFOG(32) (after analysis: type of analysis done): 1
                  INFOG(33) (value used for ICNTL(8)): 7
                  INFOG(34) (exponent of the determinant if determinant is requested): 0
        linear system matrix followed by preconditioner matrix:
        Mat Object: (fieldsplit_1_) 2 MPI processes
          type: schurcomplement
          rows=64, cols=64
            Schur complement A11 - A10 inv(A00) A01
            A11
              Mat Object: (fieldsplit_1_) 2 MPI processes
                type: mpiaij
                rows=64, cols=64
                total: nonzeros=1110, allocated nonzeros=1110
                total number of mallocs used during MatSetValues calls =0
                  not using I-node (on process 0) routines
            A10
              Mat Object: 2 MPI processes
                type: mpiaij
                rows=64, cols=624
                total: nonzeros=6080, allocated nonzeros=6080
                total number of mallocs used during MatSetValues calls =0
                  not using I-node (on process 0) routines
            KSP of A00
              KSP Object: (fieldsplit_0_) 2 MPI processes
                type: fgmres
                  GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
                  GMRES: happy breakdown tolerance 1e-30
                maximum iterations=10000, initial guess is zero
                tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
                right preconditioning
                using UNPRECONDITIONED norm type for convergence test
              PC Object: (fieldsplit_0_) 2 MPI processes
                type: lu
                  LU: out-of-place factorization
                  tolerance for zero pivot 2.22045e-14
                  matrix ordering: natural
                  factor fill ratio given 0., needed 0.
                  Factored matrix follows:
                    Mat Object: 2 MPI processes
                      type: mpiaij
                      rows=624, cols=624
                      package used to perform factorization: mumps
                      total: nonzeros=146498, allocated nonzeros=146498
                      total number of mallocs used during MatSetValues calls =0
                        MUMPS run parameters:
                          SYM (matrix type): 0
                          PAR (host participation): 1
                          ICNTL(1) (output for error): 6
                          ICNTL(2) (output of diagnostic msg): 0
                          ICNTL(3) (output for global info): 0
                          ICNTL(4) (level of printing): 0
                          ICNTL(5) (input mat struct): 0
                          ICNTL(6) (matrix prescaling): 7
                          ICNTL(7) (sequentia matrix ordering):7
                          ICNTL(8) (scalling strategy): 77
                          ICNTL(10) (max num of refinements): 0
                          ICNTL(11) (error analysis): 0
                          ICNTL(12) (efficiency control): 1
                          ICNTL(13) (efficiency control): 0
                          ICNTL(14) (percentage of estimated workspace increase): 20
                          ICNTL(18) (input mat struct): 3
                          ICNTL(19) (Shur complement info): 0
                          ICNTL(20) (rhs sparse pattern): 0
                          ICNTL(21) (solution struct): 1
                          ICNTL(22) (in-core/out-of-core facility): 0
                          ICNTL(23) (max size of memory can be allocated locally):0
                          ICNTL(24) (detection of null pivot rows): 0
                          ICNTL(25) (computation of a null space basis): 0
                          ICNTL(26) (Schur options for rhs or solution): 0
                          ICNTL(27) (experimental parameter): -32
                          ICNTL(28) (use parallel or sequential ordering): 1
                          ICNTL(29) (parallel ordering): 0
                          ICNTL(30) (user-specified set of entries in inv(A)): 0
                          ICNTL(31) (factors is discarded in the solve phase): 0
                          ICNTL(33) (compute determinant): 0
                          CNTL(1) (relative pivoting threshold): 0.01
                          CNTL(2) (stopping criterion of refinement): 1.49012e-08
                          CNTL(3) (absolute pivoting threshold): 0.
                          CNTL(4) (value of static pivoting): -1.
                          CNTL(5) (fixation for null pivots): 0.
                          RINFO(1) (local estimated flops for the elimination after analysis):
                            [0] 1.51255e+07
                            [1] 6.89106e+06
                          RINFO(2) (local estimated flops for the assembly after factorization):
                            [0] 87833.
                            [1] 72313.
                          RINFO(3) (local estimated flops for the elimination after factorization):
                            [0] 1.51255e+07
                            [1] 6.89106e+06
                          INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization):
                            [0] 8
                            [1] 8
                          INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization):
                            [0] 8
                            [1] 8
                          INFO(23) (num of pivots eliminated on this processor after factorization):
                            [0] 413
                            [1] 211
                          RINFOG(1) (global estimated flops for the elimination after analysis): 2.20165e+07
                          RINFOG(2) (global estimated flops for the assembly after factorization): 160146.
                          RINFOG(3) (global estimated flops for the elimination after factorization): 2.20165e+07
                          (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0)
                          INFOG(3) (estimated real workspace for factors on all processors after analysis): 146498
                          INFOG(4) (estimated integer workspace for factors on all processors after analysis): 5065
                          INFOG(5) (estimated maximum front size in the complete tree): 263
                          INFOG(6) (number of nodes in the complete tree): 23
                          INFOG(7) (ordering option effectively use after analysis): 2
                          INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100
                          INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 146498
                          INFOG(10) (total integer space store the matrix factors after factorization): 5065
                          INFOG(11) (order of largest frontal matrix after factorization): 263
                          INFOG(12) (number of off-diagonal pivots): 0
                          INFOG(13) (number of delayed pivots after factorization): 0
                          INFOG(14) (number of memory compress after factorization): 0
                          INFOG(15) (number of steps of iterative refinement after solution): 0
                          INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 8
                          INFOG(17)
(estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 16
                          INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 8
                          INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 16
                          INFOG(20) (estimated number of entries in the factors): 146498
                          INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 8
                          INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 15
                          INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0
                          INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1
                          INFOG(25) (after factorization: number of pivots modified by static pivoting): 0
                          INFOG(28) (after factorization: number of null pivots encountered): 0
                          INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 146498
                          INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 2, 3
                          INFOG(32) (after analysis: type of analysis done): 1
                          INFOG(33) (value used for ICNTL(8)): 7
                          INFOG(34) (exponent of the determinant if determinant is requested): 0
                linear system matrix = precond matrix:
                Mat Object: (fieldsplit_0_) 2 MPI processes
                  type: mpiaij
                  rows=624, cols=624
                  total: nonzeros=73470, allocated nonzeros=73470
                  total number of mallocs used during MatSetValues calls =0
                    using I-node (on process 0) routines: found 112 nodes, limit used is 5
            A01
              Mat Object: 2 MPI processes
                type: mpiaij
                rows=624, cols=64
                total: nonzeros=6080, allocated nonzeros=6080
                total number of mallocs used during MatSetValues calls =0
                  using I-node (on process 0) routines: found 111 nodes, limit used is 5
        Mat Object: 2 MPI processes
          type: mpiaij
          rows=64, cols=64
          total: nonzeros=2846, allocated nonzeros=2846
          total number of mallocs used during MatSetValues calls =0
            using I-node (on process 0) routines:
found 37 nodes, limit used is 5
  linear system matrix = precond matrix:
  Mat Object: 2 MPI processes
    type: mpiaij
    rows=688, cols=688
    total: nonzeros=86740, allocated nonzeros=94187
    total number of mallocs used during MatSetValues calls =0
      using I-node (on process 0) routines: found 117 nodes, limit used is 5
-------------- next part --------------
  0 KSP unpreconditioned resid norm 2.375592557658e+10 true resid norm 2.375592557658e+10 ||r(i)||/||b|| 1.000000000000e+00
    Residual norms for fieldsplit_1_ solve.
    0 KSP Residual norm 2.647227604295e-01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.563845763796e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.563845763796e+01
      1 KSP Residual norm 1.590629734547e-14
      1 KSP Residual norm 1.590629734547e-14
    1 KSP Residual norm 2.605565139600e-01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.909035314623e+02
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.909035314623e+02
      1 KSP Residual norm 2.267826634834e-13
      1 KSP Residual norm 2.267826634834e-13
    2 KSP Residual norm 2.250585637608e-01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 6.330887927554e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 6.330887927554e+01
      1 KSP Residual norm 6.513658585363e-14
      1 KSP Residual norm 6.513658585363e-14
    3 KSP Residual norm 2.208000740627e-01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 5.561164381969e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 5.561164381969e+01
      1 KSP Residual norm 3.969087970696e-14
      1 KSP Residual norm 3.969087970696e-14
    4 KSP Residual norm 1.869828337939e-01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.153388606860e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.153388606860e+01
      1 KSP Residual norm 5.713318140590e-15
      1 KSP Residual norm 5.713318140590e-15
    5 KSP Residual norm 1.794036720446e-01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 5.781962412388e+00
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 5.781962412388e+00
      1 KSP Residual norm 3.300831086316e-15
      1 KSP Residual norm 3.300831086316e-15
    6 KSP Residual norm 1.561653176489e-01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 4.495691274322e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 4.495691274322e+01
      1 KSP Residual norm 2.376574086230e-14
      1 KSP Residual norm 2.376574086230e-14
    7 KSP Residual norm 1.560891088246e-01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 6.979404438790e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 6.979404438790e+01
      1 KSP Residual norm 4.591614152870e-14
      1 KSP Residual norm 4.591614152870e-14
    8 KSP Residual norm 1.395166058530e-01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 6.619352784224e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 6.619352784224e+01
      1 KSP Residual norm 3.919850310257e-14
      1 KSP Residual norm 3.919850310257e-14
    9 KSP Residual norm 1.220617129680e-01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 5.928236375030e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 5.928236375030e+01
      1 KSP Residual norm 4.743493991329e-14
      1 KSP Residual norm 4.743493991329e-14
    10 KSP Residual norm 1.218069676114e-01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 4.028882510371e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 4.028882510371e+01
      1 KSP Residual norm 2.263693695029e-14
      1 KSP Residual norm 2.263693695029e-14
    11 KSP Residual norm 1.050411964537e-01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 8.854058069775e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 8.854058069775e+01
      1 KSP Residual norm 5.292366071944e-14
      1 KSP Residual norm 5.292366071944e-14
    12 KSP Residual norm 9.809355206907e-02
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 7.639593572920e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 7.639593572920e+01
      1 KSP Residual norm 4.085666194460e-14
      1 KSP Residual norm 4.085666194460e-14
    13 KSP Residual norm 9.365799197733e-02
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 3.203422670496e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 3.203422670496e+01
      1 KSP Residual norm 2.135526921770e-14
      1 KSP Residual norm 2.135526921770e-14
    14 KSP Residual norm 8.796649024171e-02
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 9.851329263544e+00
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 9.851329263544e+00
      1 KSP Residual norm 6.772567754349e-15
      1 KSP Residual norm 6.772567754349e-15
    15 KSP Residual norm 6.733987269492e-02
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.766565466183e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.766565466183e+01
      1 KSP Residual norm 1.075817081759e-14
      1 KSP Residual norm 1.075817081759e-14
    16 KSP Residual norm 5.509826260896e-02
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 4.983496450668e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 4.983496450668e+01
      1 KSP Residual norm 2.820060590514e-14
      1 KSP Residual norm 2.820060590514e-14
    17 KSP Residual norm 3.150107095478e-02
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.081910220332e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.081910220332e+01
      1 KSP Residual norm 6.000725096350e-15
      1 KSP Residual norm 6.000725096350e-15
    18 KSP Residual norm 3.129419737054e-02
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.808540096114e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.808540096114e+01
      1 KSP Residual norm 1.456892648344e-14
      1 KSP Residual norm 1.456892648344e-14
    19 KSP Residual norm 2.649702912434e-02
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 3.203107785732e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 3.203107785732e+01
      1 KSP Residual norm 1.755306403735e-14
      1 KSP Residual norm 1.755306403735e-14
    20 KSP Residual norm 2.648345476991e-02
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 5.091952142201e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 5.091952142201e+01
      1 KSP Residual norm 3.516517521666e-14
      1 KSP Residual norm 3.516517521666e-14
    21 KSP Residual norm 1.901524967217e-02
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.143854860841e+02
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.143854860841e+02
      1 KSP Residual norm 7.556936156281e-14
      1 KSP Residual norm 7.556936156281e-14
    22 KSP Residual norm 1.411221461199e-02
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.494808712357e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.494808712357e+01
      1 KSP Residual norm 1.581447993093e-14
      1 KSP Residual norm 1.581447993093e-14
    23 KSP Residual norm 1.354650215503e-02
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.175022762008e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.175022762008e+01
      1 KSP Residual norm 1.253216489904e-14
      1 KSP Residual norm 1.253216489904e-14
    24 KSP Residual norm 9.897514103297e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 7.286752538664e+00
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 7.286752538664e+00
      1 KSP Residual norm 3.983901036990e-15
      1 KSP Residual norm 3.983901036990e-15
    25 KSP Residual norm 7.594972887124e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 9.865959125903e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 9.865959125903e+01
      1 KSP Residual norm 5.854515898291e-14
      1 KSP Residual norm 5.854515898291e-14
    26 KSP Residual norm 6.357010324881e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 6.153248929079e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 6.153248929079e+01
      1 KSP Residual norm 4.342917924936e-14
      1 KSP Residual norm 4.342917924936e-14
    27 KSP Residual norm 3.837249141333e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.028466059984e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.028466059984e+01
      1 KSP Residual norm 1.260250746468e-14
      1 KSP Residual norm 1.260250746468e-14
    28 KSP Residual norm 3.825093258049e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.086259623580e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.086259623580e+01
      1 KSP Residual norm 1.096933134163e-14
      1 KSP Residual norm 1.096933134163e-14
    29 KSP Residual norm 3.200692740417e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.749989619084e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.749989619084e+01
      1 KSP Residual norm 1.653818542601e-14
      1 KSP Residual norm 1.653818542601e-14
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 9.212456473468e-01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 9.212456473468e-01
      1 KSP Residual norm 7.811434870340e-16
      1 KSP Residual norm 7.811434870340e-16
    30 KSP Residual norm 2.636282707457e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.401615923172e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.401615923172e+01
      1 KSP Residual norm 9.222569222439e-15
      1 KSP Residual norm 9.222569222439e-15
    31 KSP Residual norm 2.636039697841e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.844677267452e+02
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.844677267452e+02
      1 KSP Residual norm 1.325038325340e-13
      1 KSP Residual norm 1.325038325340e-13
    32 KSP Residual norm 2.630597523590e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 8.630770614906e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 8.630770614906e+01
      1 KSP Residual norm 5.821918623882e-14
      1 KSP Residual norm 5.821918623882e-14
    33 KSP Residual norm 2.630412262238e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.087220583599e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.087220583599e+01
      1 KSP Residual norm 6.754654176999e-15
      1 KSP Residual norm 6.754654176999e-15
    34 KSP Residual norm 2.625793279994e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 9.906682862299e+00
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 9.906682862299e+00
      1 KSP Residual norm 6.179914130748e-15
      1 KSP Residual norm 6.179914130748e-15
    35 KSP Residual norm 2.614954909094e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 9.991241843863e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 9.991241843863e+01
      1 KSP Residual norm 6.140979627531e-14
      1 KSP Residual norm 6.140979627531e-14
    36 KSP Residual norm 2.599866229697e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 3.219579076718e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 3.219579076718e+01
      1 KSP Residual norm 2.864524563276e-14
      1 KSP Residual norm 2.864524563276e-14
    37 KSP Residual norm 2.598183274119e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 5.159783668981e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 5.159783668981e+01
      1 KSP Residual norm 2.836923165594e-14
      1 KSP Residual norm 2.836923165594e-14
    38 KSP Residual norm 2.541308443042e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 3.893999639592e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 3.893999639592e+01
      1 KSP Residual norm 2.241867414113e-14
      1 KSP Residual norm 2.241867414113e-14
    39 KSP Residual norm 2.390909782968e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.737908166385e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.737908166385e+01
      1 KSP Residual norm 1.043913942513e-14
      1 KSP Residual norm 1.043913942513e-14
    40 KSP Residual norm 2.347462798286e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 7.727905145982e+00
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 7.727905145982e+00
      1 KSP Residual norm 3.872497638305e-15
      1 KSP Residual norm 3.872497638305e-15
    41 KSP Residual norm 2.222999088699e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.174674207663e+02
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.174674207663e+02
      1 KSP Residual norm 7.070746301170e-14
      1 KSP Residual norm 7.070746301170e-14
    42 KSP Residual norm 2.008122363433e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.544557284475e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.544557284475e+01
      1 KSP Residual norm 1.512873082080e-14
      1 KSP Residual norm 1.512873082080e-14
    43 KSP Residual norm 2.001840657014e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 4.532764632359e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 4.532764632359e+01
      1 KSP Residual norm 2.613884490190e-14
      1 KSP Residual norm 2.613884490190e-14
    44 KSP Residual norm 1.852152153525e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 5.899681731882e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 5.899681731882e+01
      1 KSP Residual norm 3.353112192675e-14
      1 KSP Residual norm 3.353112192675e-14
    45 KSP Residual norm 1.815248619938e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.420027510756e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.420027510756e+01
      1 KSP Residual norm 1.409629405446e-14
      1 KSP Residual norm 1.409629405446e-14
    46 KSP Residual norm 1.699026079178e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 4.566749427758e+00
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 4.566749427758e+00
      1 KSP Residual norm 2.642825459400e-15
      1 KSP Residual norm 2.642825459400e-15
    47 KSP Residual norm 1.528463836818e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 3.519786948736e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 3.519786948736e+01
      1 KSP Residual norm 2.496355267304e-14
      1 KSP Residual norm 2.496355267304e-14
    48 KSP Residual norm 1.316483206944e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.407846345747e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.407846345747e+01
      1 KSP Residual norm 2.087084676860e-14
      1 KSP Residual norm 2.087084676860e-14
    49 KSP Residual norm 1.012947165095e-03
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.134672390740e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.134672390740e+01
      1 KSP Residual norm 1.222476443679e-14
      1 KSP Residual norm 1.222476443679e-14
    50 KSP Residual norm 9.849964861771e-04
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.453172987213e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.453172987213e+01
      1 KSP Residual norm 1.525121906953e-14
      1 KSP Residual norm 1.525121906953e-14
    51 KSP Residual norm 8.512189759291e-04
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 7.976532936626e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 7.976532936626e+01
      1 KSP Residual norm 4.023893121718e-14
      1 KSP Residual norm 4.023893121718e-14
    52 KSP Residual norm 8.436370424839e-04
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 5.382442903744e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 5.382442903744e+01
      1 KSP Residual norm 3.055032223097e-14
      1 KSP Residual norm 3.055032223097e-14
    53 KSP Residual norm 6.357908139806e-04
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.858252044144e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.858252044144e+01
      1 KSP Residual norm 1.759490679685e-14
      1 KSP Residual norm 1.759490679685e-14
    54 KSP Residual norm 4.940876814429e-04
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 5.071590224960e+00
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 5.071590224960e+00
      1 KSP Residual norm 2.511589599165e-15
      1 KSP Residual norm 2.511589599165e-15
    55 KSP Residual norm 4.875725277714e-04
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 4.307035788258e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 4.307035788258e+01
      1 KSP Residual norm 2.735393934819e-14
      1 KSP Residual norm 2.735393934819e-14
    56 KSP Residual norm 2.070493208681e-04
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 6.301273218135e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 6.301273218135e+01
      1 KSP Residual norm 5.447468527974e-14
      1 KSP Residual norm 5.447468527974e-14
    57 KSP Residual norm 1.729830089614e-04
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.659065943991e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.659065943991e+01
      1 KSP Residual norm 9.308110561435e-15
      1 KSP Residual norm 9.308110561435e-15
    58 KSP Residual norm 1.429022167527e-04
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 8.099205223016e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 8.099205223016e+01
      1 KSP Residual norm 5.453570992272e-14
      1 KSP Residual norm 5.453570992272e-14
    59 KSP Residual norm 8.072422593685e-05
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.600325142041e+01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.600325142041e+01
      1 KSP Residual norm 8.834547680334e-15
      1 KSP Residual norm 8.834547680334e-15
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 9.191268161987e-01
      Residual norms for fieldsplit_0_ solve.
0 KSP Residual norm 9.191268161987e-01 1 KSP Residual norm 5.992259523703e-16 1 KSP Residual norm 5.992259523703e-16 60 KSP Residual norm 6.262411345439e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.080986200168e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.080986200168e+00 1 KSP Residual norm 4.002334951234e-15 1 KSP Residual norm 4.002334951234e-15 61 KSP Residual norm 6.254142996255e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.417623919935e+02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.417623919935e+02 1 KSP Residual norm 1.024406175755e-13 1 KSP Residual norm 1.024406175755e-13 62 KSP Residual norm 6.243167082803e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.386503977267e+02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.386503977267e+02 1 KSP Residual norm 7.439418887555e-14 1 KSP Residual norm 7.439418887555e-14 63 KSP Residual norm 6.153278086760e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.778224278172e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.778224278172e+01 1 KSP Residual norm 8.314681512243e-15 1 KSP Residual norm 8.314681512243e-15 64 KSP Residual norm 6.085814154261e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.515169701694e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.515169701694e+01 1 KSP Residual norm 4.550637381414e-14 1 KSP Residual norm 4.550637381414e-14 65 KSP Residual norm 5.749081462333e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.184301048304e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.184301048304e+01 1 KSP Residual norm 3.917546833977e-14 1 KSP Residual norm 3.917546833977e-14 66 KSP Residual norm 5.715814756364e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.189810131099e+01 Residual norms for fieldsplit_0_ solve. 
0 KSP Residual norm 3.189810131099e+01 1 KSP Residual norm 2.060235636196e-14 1 KSP Residual norm 2.060235636196e-14 67 KSP Residual norm 5.285103554622e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.770245081696e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.770245081696e+00 1 KSP Residual norm 4.935171402031e-15 1 KSP Residual norm 4.935171402031e-15 68 KSP Residual norm 4.439814025855e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.829018666461e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.829018666461e+01 1 KSP Residual norm 5.272724492711e-14 1 KSP Residual norm 5.272724492711e-14 69 KSP Residual norm 4.430835969157e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.378293107947e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.378293107947e+01 1 KSP Residual norm 2.969215266404e-14 1 KSP Residual norm 2.969215266404e-14 70 KSP Residual norm 3.533204784519e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.390952576475e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.390952576475e+01 1 KSP Residual norm 3.453873499577e-14 1 KSP Residual norm 3.453873499577e-14 71 KSP Residual norm 3.531906628482e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.728597955221e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.728597955221e+01 1 KSP Residual norm 3.956727722166e-14 1 KSP Residual norm 3.956727722166e-14 72 KSP Residual norm 2.169987908929e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.594899641221e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.594899641221e+01 1 KSP Residual norm 1.030594558616e-14 1 KSP Residual norm 1.030594558616e-14 73 KSP Residual norm 2.037731575928e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.293092803067e+01 Residual norms for fieldsplit_0_ solve. 
0 KSP Residual norm 5.293092803067e+01 1 KSP Residual norm 3.495063112140e-14 1 KSP Residual norm 3.495063112140e-14 74 KSP Residual norm 1.693927376521e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.570958507353e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.570958507353e+01 1 KSP Residual norm 3.768974204549e-14 1 KSP Residual norm 3.768974204549e-14 75 KSP Residual norm 1.656092094331e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.612816495733e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.612816495733e+00 1 KSP Residual norm 3.455660567119e-15 1 KSP Residual norm 3.455660567119e-15 76 KSP Residual norm 1.324603506478e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.342722299112e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.342722299112e+01 1 KSP Residual norm 3.543613274327e-14 1 KSP Residual norm 3.543613274327e-14 77 KSP Residual norm 1.112116907646e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 8.558637380532e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 8.558637380532e+00 1 KSP Residual norm 4.765708951924e-15 1 KSP Residual norm 4.765708951924e-15 78 KSP Residual norm 9.792527426920e-06 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.586790586197e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.586790586197e+01 1 KSP Residual norm 3.393127203229e-14 1 KSP Residual norm 3.393127203229e-14 79 KSP Residual norm 8.999446867141e-06 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.392452152663e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.392452152663e+01 1 KSP Residual norm 4.783813012661e-14 1 KSP Residual norm 4.783813012661e-14 80 KSP Residual norm 8.839832531248e-06 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.045145816041e+01 Residual norms for fieldsplit_0_ solve. 
0 KSP Residual norm 1.045145816041e+01 1 KSP Residual norm 6.153936531165e-15 1 KSP Residual norm 6.153936531165e-15 81 KSP Residual norm 7.647795119149e-06 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.959135996874e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.959135996874e+00 1 KSP Residual norm 2.275593074487e-15 1 KSP Residual norm 2.275593074487e-15 82 KSP Residual norm 7.593975663833e-06 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.850292431359e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.850292431359e+01 1 KSP Residual norm 1.184622563874e-14 1 KSP Residual norm 1.184622563874e-14 83 KSP Residual norm 4.394648633950e-06 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.468590897433e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.468590897433e+01 1 KSP Residual norm 7.458904334684e-15 1 KSP Residual norm 7.458904334684e-15 84 KSP Residual norm 3.429174284862e-06 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.184953274476e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.184953274476e+01 1 KSP Residual norm 4.288600833302e-14 1 KSP Residual norm 4.288600833302e-14 85 KSP Residual norm 2.922046005477e-06 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.108136514544e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.108136514544e+01 1 KSP Residual norm 3.561336998791e-14 1 KSP Residual norm 3.561336998791e-14 86 KSP Residual norm 1.644609515697e-06 Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 86 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.071781318954e+00 Residual norms for fieldsplit_0_ solve. 
0 KSP Residual norm 1.071781318954e+00 1 KSP Residual norm 1.362129448598e-15 1 KSP Residual norm 1.362129448598e-15 1 KSP unpreconditioned resid norm 6.572198917831e+09 true resid norm 6.572198917831e+09 ||r(i)||/||b|| 2.766551400679e-01 Residual norms for fieldsplit_1_ solve. 0 KSP Residual norm 9.848380124959e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.501420186583e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.501420186583e+01 1 KSP Residual norm 3.596072484238e-14 1 KSP Residual norm 3.596072484238e-14 1 KSP Residual norm 9.664519068852e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.884754111473e+02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.884754111473e+02 1 KSP Residual norm 1.094865748271e-13 1 KSP Residual norm 1.094865748271e-13 2 KSP Residual norm 9.151140887760e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.498192495160e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.498192495160e+01 1 KSP Residual norm 2.800616143568e-14 1 KSP Residual norm 2.800616143568e-14 3 KSP Residual norm 8.929766395420e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.902323826652e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.902323826652e+01 1 KSP Residual norm 2.975726631139e-14 1 KSP Residual norm 2.975726631139e-14 4 KSP Residual norm 8.549376988986e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.782359119738e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.782359119738e+01 1 KSP Residual norm 3.635309889749e-14 1 KSP Residual norm 3.635309889749e-14 5 KSP Residual norm 8.352879905370e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.127106609342e+01 Residual norms for fieldsplit_0_ solve. 
0 KSP Residual norm 5.127106609342e+01 1 KSP Residual norm 3.574010985759e-14 1 KSP Residual norm 3.574010985759e-14 6 KSP Residual norm 6.987129723317e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.322880140073e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.322880140073e+01 1 KSP Residual norm 2.896415963782e-14 1 KSP Residual norm 2.896415963782e-14 7 KSP Residual norm 6.678448419857e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.493509795762e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.493509795762e+01 1 KSP Residual norm 8.217241409848e-15 1 KSP Residual norm 8.217241409848e-15 8 KSP Residual norm 5.359170547747e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.346360940645e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.346360940645e+01 1 KSP Residual norm 1.347793520261e-14 1 KSP Residual norm 1.347793520261e-14 9 KSP Residual norm 5.314808437591e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.969064647594e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.969064647594e+01 1 KSP Residual norm 2.032159273570e-14 1 KSP Residual norm 2.032159273570e-14 10 KSP Residual norm 5.076434654384e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.222260891370e+02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.222260891370e+02 1 KSP Residual norm 6.987067519281e-14 1 KSP Residual norm 6.987067519281e-14 11 KSP Residual norm 4.408500752515e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.916464818129e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.916464818129e+01 1 KSP Residual norm 4.025208433555e-14 1 KSP Residual norm 4.025208433555e-14 12 KSP Residual norm 3.837143107187e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.081812445989e+01 Residual norms for fieldsplit_0_ solve. 
0 KSP Residual norm 2.081812445989e+01 1 KSP Residual norm 1.251769212430e-14 1 KSP Residual norm 1.251769212430e-14 13 KSP Residual norm 3.662775827315e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 8.731979415993e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 8.731979415993e+01 1 KSP Residual norm 5.172532783199e-14 1 KSP Residual norm 5.172532783199e-14 14 KSP Residual norm 3.067481960889e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.865367130438e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.865367130438e+01 1 KSP Residual norm 5.368393423596e-14 1 KSP Residual norm 5.368393423596e-14 15 KSP Residual norm 2.735961518468e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.013285313161e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.013285313161e+01 1 KSP Residual norm 2.725238238414e-14 1 KSP Residual norm 2.725238238414e-14 16 KSP Residual norm 2.165247750228e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 8.786197137801e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 8.786197137801e+00 1 KSP Residual norm 5.554414761541e-15 1 KSP Residual norm 5.554414761541e-15 17 KSP Residual norm 1.331008305488e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.280743891939e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.280743891939e+01 1 KSP Residual norm 6.465753713550e-15 1 KSP Residual norm 6.465753713550e-15 18 KSP Residual norm 1.330540406549e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.764463340215e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.764463340215e+01 1 KSP Residual norm 1.515891339632e-14 1 KSP Residual norm 1.515891339632e-14 19 KSP Residual norm 9.944993879790e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.023175409073e+01 Residual norms for fieldsplit_0_ solve. 
0 KSP Residual norm 5.023175409073e+01 1 KSP Residual norm 2.996757017926e-14 1 KSP Residual norm 2.996757017926e-14 20 KSP Residual norm 9.923808900026e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.544689121560e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.544689121560e+00 1 KSP Residual norm 2.668373629069e-15 1 KSP Residual norm 2.668373629069e-15 21 KSP Residual norm 6.439740657262e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.750179877441e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.750179877441e+01 1 KSP Residual norm 4.499589395881e-14 1 KSP Residual norm 4.499589395881e-14 22 KSP Residual norm 5.161995067260e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.646724380082e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.646724380082e+00 1 KSP Residual norm 3.381391291577e-15 1 KSP Residual norm 3.381391291577e-15 23 KSP Residual norm 5.120003030315e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.153610821276e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.153610821276e+01 1 KSP Residual norm 6.242742332199e-15 1 KSP Residual norm 6.242742332199e-15 24 KSP Residual norm 4.301852875762e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.015534865920e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.015534865920e+01 1 KSP Residual norm 1.704075348318e-14 1 KSP Residual norm 1.704075348318e-14 25 KSP Residual norm 3.123500223251e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.166606416724e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.166606416724e+01 1 KSP Residual norm 3.660192724908e-14 1 KSP Residual norm 3.660192724908e-14 26 KSP Residual norm 3.085043028547e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.000380616221e+01 Residual norms for fieldsplit_0_ solve. 
0 KSP Residual norm 1.000380616221e+01 1 KSP Residual norm 7.356450370532e-15 1 KSP Residual norm 7.356450370532e-15 27 KSP Residual norm 1.992660303657e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.996383239267e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.996383239267e+00 1 KSP Residual norm 4.226945372101e-15 1 KSP Residual norm 4.226945372101e-15 28 KSP Residual norm 1.967028510457e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 8.152191223945e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 8.152191223945e+01 1 KSP Residual norm 5.548580748043e-14 1 KSP Residual norm 5.548580748043e-14 29 KSP Residual norm 1.436872937611e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.081565543262e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.081565543262e+01 1 KSP Residual norm 5.855985392948e-15 1 KSP Residual norm 5.855985392948e-15 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.231450038089e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.231450038089e+00 1 KSP Residual norm 3.927959157970e-15 1 KSP Residual norm 3.927959157970e-15 30 KSP Residual norm 1.135047212506e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.003951319371e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.003951319371e+01 1 KSP Residual norm 6.548703411090e-15 1 KSP Residual norm 6.548703411090e-15 31 KSP Residual norm 1.134768565553e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.047999325283e+02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.047999325283e+02 1 KSP Residual norm 1.233946936211e-13 1 KSP Residual norm 1.233946936211e-13 32 KSP Residual norm 1.131345996525e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 9.445728355176e+00 Residual norms for fieldsplit_0_ solve. 
0 KSP Residual norm 9.445728355176e+00 1 KSP Residual norm 6.068465020255e-15 1 KSP Residual norm 6.068465020255e-15 33 KSP Residual norm 1.129815169680e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.572595663348e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.572595663348e+01 1 KSP Residual norm 7.704099295392e-15 1 KSP Residual norm 7.704099295392e-15 34 KSP Residual norm 1.129416642425e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.411160109435e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.411160109435e+01 1 KSP Residual norm 3.868040705854e-14 1 KSP Residual norm 3.868040705854e-14 35 KSP Residual norm 1.114403858382e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.925908488458e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.925908488458e+01 1 KSP Residual norm 1.725333952254e-14 1 KSP Residual norm 1.725333952254e-14 36 KSP Residual norm 1.109684952237e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.094752481871e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.094752481871e+01 1 KSP Residual norm 2.548496980880e-14 1 KSP Residual norm 2.548496980880e-14 37 KSP Residual norm 1.039962803868e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.677928660892e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.677928660892e+01 1 KSP Residual norm 2.575035605781e-14 1 KSP Residual norm 2.575035605781e-14 38 KSP Residual norm 9.607317624260e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.396143505284e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.396143505284e+01 1 KSP Residual norm 2.306756663535e-14 1 KSP Residual norm 2.306756663535e-14 39 KSP Residual norm 9.455091778539e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.132420908643e+01 Residual norms for fieldsplit_0_ solve. 
0 KSP Residual norm 1.132420908643e+01 1 KSP Residual norm 7.892947404093e-15 1 KSP Residual norm 7.892947404093e-15 40 KSP Residual norm 8.725129855043e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.027802213355e+02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.027802213355e+02 1 KSP Residual norm 5.644204417728e-14 1 KSP Residual norm 5.644204417728e-14 41 KSP Residual norm 8.700872603655e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.092617041672e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.092617041672e+01 1 KSP Residual norm 2.128084237682e-14 1 KSP Residual norm 2.128084237682e-14 42 KSP Residual norm 8.305367523871e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 9.896865476397e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 9.896865476397e+01 1 KSP Residual norm 7.027730871792e-14 1 KSP Residual norm 7.027730871792e-14 43 KSP Residual norm 8.017116947073e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.489635812253e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.489635812253e+01 1 KSP Residual norm 1.488894143307e-14 1 KSP Residual norm 1.488894143307e-14 44 KSP Residual norm 7.101120826232e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.078927249023e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.078927249023e+01 1 KSP Residual norm 3.393801507715e-14 1 KSP Residual norm 3.393801507715e-14 45 KSP Residual norm 7.099249405523e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.399091879839e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.399091879839e+01 1 KSP Residual norm 3.970740979389e-14 1 KSP Residual norm 3.970740979389e-14 46 KSP Residual norm 6.239964305039e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.006107627223e+01 Residual norms for fieldsplit_0_ solve. 
0 KSP Residual norm 1.006107627223e+01 1 KSP Residual norm 6.275065766989e-15 1 KSP Residual norm 6.275065766989e-15 47 KSP Residual norm 6.220767037758e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.039510262414e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.039510262414e+01 1 KSP Residual norm 1.226928303532e-14 1 KSP Residual norm 1.226928303532e-14 48 KSP Residual norm 4.582507692912e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.479122642260e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.479122642260e+01 1 KSP Residual norm 1.980214432629e-14 1 KSP Residual norm 1.980214432629e-14 49 KSP Residual norm 3.950466021185e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.949023209700e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.949023209700e+01 1 KSP Residual norm 2.830451594639e-14 1 KSP Residual norm 2.830451594639e-14 50 KSP Residual norm 3.156823554111e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.397673792480e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.397673792480e+01 1 KSP Residual norm 8.333365174822e-15 1 KSP Residual norm 8.333365174822e-15 51 KSP Residual norm 3.074542857812e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.745938868708e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.745938868708e+00 1 KSP Residual norm 5.757620759309e-15 1 KSP Residual norm 5.757620759309e-15 52 KSP Residual norm 2.436229423268e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.138575813336e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.138575813336e+01 1 KSP Residual norm 7.419021580885e-15 1 KSP Residual norm 7.419021580885e-15 53 KSP Residual norm 1.991468511931e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.222970375394e+01 Residual norms for fieldsplit_0_ solve. 
0 KSP Residual norm 2.222970375394e+01 1 KSP Residual norm 1.687252149997e-14 1 KSP Residual norm 1.687252149997e-14 54 KSP Residual norm 1.990980997161e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.541206630034e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.541206630034e+01 1 KSP Residual norm 2.378475012463e-14 1 KSP Residual norm 2.378475012463e-14 55 KSP Residual norm 1.578606371328e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.852187822467e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.852187822467e+01 1 KSP Residual norm 9.691012609173e-15 1 KSP Residual norm 9.691012609173e-15 56 KSP Residual norm 9.884828200093e-04 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 9.295541145164e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 9.295541145164e+00 1 KSP Residual norm 6.736125221102e-15 1 KSP Residual norm 6.736125221102e-15 57 KSP Residual norm 9.797809483947e-04 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.214064944530e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.214064944530e+01 1 KSP Residual norm 1.296576938899e-14 1 KSP Residual norm 1.296576938899e-14 58 KSP Residual norm 5.763774407829e-04 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.921061980045e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.921061980045e+01 1 KSP Residual norm 4.469546477185e-14 1 KSP Residual norm 4.469546477185e-14 59 KSP Residual norm 3.792297931482e-04 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.191677932851e+02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.191677932851e+02 1 KSP Residual norm 6.630922675029e-14 1 KSP Residual norm 6.630922675029e-14 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.224933659981e+00 Residual norms for fieldsplit_0_ solve. 
0 KSP Residual norm 4.224933659981e+00 1 KSP Residual norm 2.497733403124e-15 1 KSP Residual norm 2.497733403124e-15 60 KSP Residual norm 3.591550716791e-04 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.276388696588e+02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.276388696588e+02 1 KSP Residual norm 7.469339373319e-14 1 KSP Residual norm 7.469339373319e-14 61 KSP Residual norm 3.591548935140e-04 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.990598481455e+02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.990598481455e+02 1 KSP Residual norm 1.408982976160e-13 1 KSP Residual norm 1.408982976160e-13 62 KSP Residual norm 3.591026732446e-04 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.734860422205e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.734860422205e+00 1 KSP Residual norm 3.533425444020e-15 1 KSP Residual norm 3.533425444020e-15 63 KSP Residual norm 3.590971023537e-04 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.623760676941e+02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.623760676941e+02 1 KSP Residual norm 1.015281556116e-13 1 KSP Residual norm 1.015281556116e-13 64 KSP Residual norm 3.590301520440e-04 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.832622645904e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.832622645904e+01 1 KSP Residual norm 1.089337554485e-14 1 KSP Residual norm 1.089337554485e-14 65 KSP Residual norm 3.589525226283e-04 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.883806323876e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.883806323876e+01 1 KSP Residual norm 3.544699676840e-14 1 KSP Residual norm 3.544699676840e-14 66 KSP Residual norm 3.587395507263e-04 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.240277600891e+00 Residual norms for fieldsplit_0_ solve. 
0 KSP Residual norm 6.240277600891e+00 1 KSP Residual norm 3.504024802977e-15 1 KSP Residual norm 3.504024802977e-15 67 KSP Residual norm 3.582983449722e-04 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.550629626011e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.550629626011e+00 1 KSP Residual norm 4.267672013506e-15 1 KSP Residual norm 4.267672013506e-15 68 KSP Residual norm 3.569893638194e-04 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.365836388242e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.365836388242e+01 1 KSP Residual norm 1.364611914826e-14 1 KSP Residual norm 1.364611914826e-14 69 KSP Residual norm 3.512264058619e-04 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.885570512570e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.885570512570e+01 1 KSP Residual norm 2.231047121564e-14 1 KSP Residual norm 2.231047121564e-14 70 KSP Residual norm 3.453695781694e-04 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.785672778004e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.785672778004e+01 1 KSP Residual norm 1.128829898101e-14 1 KSP Residual norm 1.128829898101e-14 71 KSP Residual norm 3.379821113338e-04 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.818703910385e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.818703910385e+01 1 KSP Residual norm 1.835880458824e-14 1 KSP Residual norm 1.835880458824e-14 72 KSP Residual norm 3.067650549825e-04 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.242867319078e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.242867319078e+01 1 KSP Residual norm 1.247178795371e-14 1 KSP Residual norm 1.247178795371e-14 73 KSP Residual norm 3.054979710119e-04 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.669198661138e+01 Residual norms for fieldsplit_0_ solve. 
(-ksp_monitor output: repeated blocks of inner fieldsplit_0_ solves, each converging in one iteration from O(1e+01) to O(1e-14), interleaved with the outer fieldsplit_1_ residuals; a representative excerpt follows)

    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 5.342726072881e+00
    1 KSP Residual norm 2.902529214942e-15
  ...
  105 KSP Residual norm 8.901585776449e-06
Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 105
  2 KSP unpreconditioned resid norm 6.438608672179e+04 true resid norm 6.438608672019e+04 ||r(i)||/||b|| 2.710316906518e-06
    Residual norms for fieldsplit_1_ solve.
    0 KSP Residual norm 9.999555556883e-01
    1 KSP Residual norm 9.994399522514e-01
  ...
  108 KSP Residual norm 9.854785712624e-06
Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 108
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.069036799388e+01
        0 KSP Residual norm 1.069036799388e+01
        1 KSP Residual norm 7.405094057249e-15
      1 KSP Residual norm 7.405094057249e-15
  3 KSP unpreconditioned resid norm 6.325000280140e-01 true resid norm 6.325019129368e-01 ||r(i)||/||b|| 2.662501660471e-11
KSP Object: 3 MPI processes
  type: fgmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=1000, initial guess is zero
  tolerances:  relative=1e-06, absolute=1e-50, divergence=10000.
  right preconditioning
  using UNPRECONDITIONED norm type for convergence test
PC Object: 3 MPI processes
  type: fieldsplit
    FieldSplit with Schur preconditioner, factorization UPPER
    Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse
    Split info:
    Split number 0 Defined by IS
    Split number 1 Defined by IS
    KSP solver for A00 block
      KSP Object: (fieldsplit_0_) 3 MPI processes
        type: fgmres
          GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
          GMRES: happy breakdown tolerance 1e-30
        maximum iterations=10000, initial guess is zero
        tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
        right preconditioning
        using UNPRECONDITIONED norm type for convergence test
      PC Object: (fieldsplit_0_) 3 MPI processes
        type: lu
          LU: out-of-place factorization
          tolerance for zero pivot 2.22045e-14
          matrix ordering: natural
          factor fill ratio given 0., needed 0.
          Factored matrix follows:
            Mat Object: 3 MPI processes
              type: mpiaij
              rows=624, cols=624
              package used to perform factorization: mumps
              total: nonzeros=148148, allocated nonzeros=148148
              total number of mallocs used during MatSetValues calls =0
                MUMPS run parameters:
                  SYM (matrix type):                   0
                  PAR (host participation):            1
                  ICNTL(1) (output for error):         6
                  ICNTL(2) (output of diagnostic msg): 0
                  ICNTL(3) (output for global info):   0
                  ICNTL(4) (level of printing):        0
                  ICNTL(5) (input mat struct):         0
                  ICNTL(6) (matrix prescaling):        7
                  ICNTL(7) (sequential matrix ordering): 7
                  ICNTL(8) (scaling strategy):         77
                  ICNTL(10) (max num of refinements):  0
                  ICNTL(11) (error analysis):          0
                  ICNTL(12) (efficiency control):      1
                  ICNTL(13) (efficiency control):      0
                  ICNTL(14) (percentage of estimated workspace increase): 20
                  ICNTL(18) (input mat struct):        3
                  ICNTL(19) (Schur complement info):   0
                  ICNTL(20) (rhs sparse pattern):      0
                  ICNTL(21) (solution struct):         1
                  ICNTL(22) (in-core/out-of-core facility): 0
                  ICNTL(23) (max size of memory that can be allocated locally): 0
                  ICNTL(24) (detection of null pivot rows): 0
                  ICNTL(25) (computation of a null space basis): 0
                  ICNTL(26) (Schur options for rhs or solution): 0
                  ICNTL(27) (experimental parameter):  -32
                  ICNTL(28) (use parallel or sequential ordering): 1
                  ICNTL(29) (parallel ordering):       0
                  ICNTL(30) (user-specified set of entries in inv(A)): 0
                  ICNTL(31) (factors are discarded in the solve phase): 0
                  ICNTL(33) (compute determinant):     0
                  CNTL(1) (relative pivoting threshold):      0.01
                  CNTL(2) (stopping criterion of refinement): 1.49012e-08
                  CNTL(3) (absolute pivoting threshold):      0.
                  CNTL(4) (value of static pivoting):         -1.
                  CNTL(5) (fixation for null pivots):         0.
                  RINFO(1) (local estimated flops for the elimination after analysis):
                    [0] 4.02519e+06  [1] 2.70367e+06  [2] 1.62473e+07
                  RINFO(2) (local estimated flops for the assembly after factorization):
                    [0] 55439.  [1] 30276.  [2] 65773.
                  RINFO(3) (local estimated flops for the elimination after factorization):
                    [0] 4.02519e+06  [1] 2.70367e+06  [2] 1.62473e+07
                  INFO(15) (estimated size (in MB) of MUMPS internal data for running numerical factorization):
                    [0] 9  [1] 9  [2] 10
                  INFO(16) (size (in MB) of MUMPS internal data used during numerical factorization):
                    [0] 9  [1] 9  [2] 10
                  INFO(23) (num of pivots eliminated on this processor after factorization):
                    [0] 278  [1] 87  [2] 259
                  RINFOG(1) (global estimated flops for the elimination after analysis): 2.29761e+07
                  RINFOG(2) (global estimated flops for the assembly after factorization): 151488.
                  RINFOG(3) (global estimated flops for the elimination after factorization): 2.29761e+07
                  (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0)
                  INFOG(3) (estimated real workspace for factors on all processors after analysis): 148148
                  INFOG(4) (estimated integer workspace for factors on all processors after analysis): 4993
                  INFOG(5) (estimated maximum front size in the complete tree): 286
                  INFOG(6) (number of nodes in the complete tree): 23
                  INFOG(7) (ordering option effectively used after analysis): 2
                  INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100
                  INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 148148
                  INFOG(10) (total integer space to store the matrix factors after factorization): 4993
                  INFOG(11) (order of largest frontal matrix after factorization): 286
                  INFOG(12) (number of off-diagonal pivots): 0
                  INFOG(13) (number of delayed pivots after factorization): 0
                  INFOG(14) (number of memory compresses after factorization): 0
                  INFOG(15) (number of steps of iterative refinement after solution): 0
                  INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 10
                  INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 28
                  INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 10
                  INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 28
                  INFOG(20) (estimated number of entries in the factors): 148148
                  INFOG(21) (size in MB of memory effectively used during factorization: value on the most memory consuming processor): 10
                  INFOG(22) (size in MB of memory effectively used during factorization: sum over all processors): 28
                  INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0
                  INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1
                  INFOG(25) (after factorization: number of pivots modified by static pivoting): 0
                  INFOG(28) (after factorization: number of null pivots encountered): 0
                  INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 148148
                  INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 1, 2
                  INFOG(32) (after analysis: type of analysis done): 1
                  INFOG(33) (value used for ICNTL(8)): 7
                  INFOG(34) (exponent of the determinant if determinant is requested): 0
        linear system matrix = precond matrix:
        Mat Object: (fieldsplit_0_) 3 MPI processes
          type: mpiaij
          rows=624, cols=624
          total: nonzeros=72004, allocated nonzeros=72004
          total number of mallocs used during MatSetValues calls =0
            using I-node (on process 0) routines: found 84 nodes, limit used is 5
    KSP solver for S = A11 - A10 inv(A00) A01
      KSP Object: (fieldsplit_1_) 3 MPI processes
        type: fgmres
          GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
          GMRES: happy breakdown tolerance 1e-30
        maximum iterations=10000, initial guess is zero
        tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
        right preconditioning
        using UNPRECONDITIONED norm type for convergence test
      PC Object: (fieldsplit_1_) 3 MPI processes
        type: lu
          LU: out-of-place factorization
          tolerance for zero pivot 2.22045e-14
          matrix ordering: natural
          factor fill ratio given 0., needed 0.
          Factored matrix follows:
            Mat Object: 3 MPI processes
              type: mpiaij
              rows=64, cols=64
              package used to perform factorization: mumps
              total: nonzeros=3736, allocated nonzeros=3736
              total number of mallocs used during MatSetValues calls =0
                MUMPS run parameters:
                  SYM (matrix type):                   0
                  PAR (host participation):            1
                  ICNTL(1) (output for error):         6
                  ICNTL(2) (output of diagnostic msg): 0
                  ICNTL(3) (output for global info):   0
                  ICNTL(4) (level of printing):        0
                  ICNTL(5) (input mat struct):         0
                  ICNTL(6) (matrix prescaling):        7
                  ICNTL(7) (sequential matrix ordering): 7
                  ICNTL(8) (scaling strategy):         77
                  ICNTL(10) (max num of refinements):  0
                  ICNTL(11) (error analysis):          0
                  ICNTL(12) (efficiency control):      1
                  ICNTL(13) (efficiency control):      0
                  ICNTL(14) (percentage of estimated workspace increase): 20
                  ICNTL(18) (input mat struct):        3
                  ICNTL(19) (Schur complement info):   0
                  ICNTL(20) (rhs sparse pattern):      0
                  ICNTL(21) (solution struct):         1
                  ICNTL(22) (in-core/out-of-core facility): 0
                  ICNTL(23) (max size of memory that can be allocated locally): 0
                  ICNTL(24) (detection of null pivot rows): 0
                  ICNTL(25) (computation of a null space basis): 0
                  ICNTL(26) (Schur options for rhs or solution): 0
                  ICNTL(27) (experimental parameter):  -32
                  ICNTL(28) (use parallel or sequential ordering): 1
                  ICNTL(29) (parallel ordering):       0
                  ICNTL(30) (user-specified set of entries in inv(A)): 0
                  ICNTL(31) (factors are discarded in the solve phase): 0
                  ICNTL(33) (compute determinant):     0
                  CNTL(1) (relative pivoting threshold):      0.01
                  CNTL(2) (stopping criterion of refinement): 1.49012e-08
                  CNTL(3) (absolute pivoting threshold):      0.
                  CNTL(4) (value of static pivoting):         -1.
                  CNTL(5) (fixation for null pivots):         0.
                  RINFO(1) (local estimated flops for the elimination after analysis):
                    [0] 0.  [1] 0.  [2] 137244.
                  RINFO(2) (local estimated flops for the assembly after factorization):
                    [0] 0.  [1] 0.  [2] 1225.
                  RINFO(3) (local estimated flops for the elimination after factorization):
                    [0] 0.  [1] 0.  [2] 137244.
                  INFO(15) (estimated size (in MB) of MUMPS internal data for running numerical factorization):
                    [0] 1  [1] 1  [2] 1
                  INFO(16) (size (in MB) of MUMPS internal data used during numerical factorization):
                    [0] 1  [1] 1  [2] 1
                  INFO(23) (num of pivots eliminated on this processor after factorization):
                    [0] 0  [1] 0  [2] 64
                  RINFOG(1) (global estimated flops for the elimination after analysis): 137244.
                  RINFOG(2) (global estimated flops for the assembly after factorization): 1225.
                  RINFOG(3) (global estimated flops for the elimination after factorization): 137244.
                  (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0)
                  INFOG(3) (estimated real workspace for factors on all processors after analysis): 3736
                  INFOG(4) (estimated integer workspace for factors on all processors after analysis): 228
                  INFOG(5) (estimated maximum front size in the complete tree): 55
                  INFOG(6) (number of nodes in the complete tree): 2
                  INFOG(7) (ordering option effectively used after analysis): 2
                  INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100
                  INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3736
                  INFOG(10) (total integer space to store the matrix factors after factorization): 228
                  INFOG(11) (order of largest frontal matrix after factorization): 55
                  INFOG(12) (number of off-diagonal pivots): 0
                  INFOG(13) (number of delayed pivots after factorization): 0
                  INFOG(14) (number of memory compresses after factorization): 0
                  INFOG(15) (number of steps of iterative refinement after solution): 0
                  INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1
                  INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 3
                  INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1
                  INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 3
                  INFOG(20) (estimated number of entries in the factors): 3736
                  INFOG(21) (size in MB of memory effectively used during factorization: value on the most memory consuming processor): 1
                  INFOG(22) (size in MB of memory effectively used during factorization: sum over all processors): 3
                  INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0
                  INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1
                  INFOG(25) (after factorization: number of pivots modified by static pivoting): 0
                  INFOG(28) (after factorization: number of null pivots encountered): 0
                  INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3736
                  INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0
                  INFOG(32) (after analysis: type of analysis done): 1
                  INFOG(33) (value used for ICNTL(8)): 7
                  INFOG(34) (exponent of the determinant if determinant is requested): 0
        linear system matrix followed by preconditioner matrix:
        Mat Object: (fieldsplit_1_) 3 MPI processes
          type: schurcomplement
          rows=64, cols=64
            Schur complement A11 - A10 inv(A00) A01
            A11
              Mat Object: (fieldsplit_1_) 3 MPI processes
                type: mpiaij
                rows=64, cols=64
                total: nonzeros=1498, allocated nonzeros=1498
                total number of mallocs used during MatSetValues calls =0
                  using I-node (on process 0) routines: found 49 nodes, limit used is 5
            A10
              Mat Object: 3 MPI processes
                type: mpiaij
                rows=64, cols=624
                total: nonzeros=6619, allocated nonzeros=6619
                total number of mallocs used during MatSetValues calls =0
                  using I-node (on process 0) routines: found 49 nodes, limit used is 5
            KSP of A00
              KSP Object: (fieldsplit_0_) 3 MPI processes
                type: fgmres
                  GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
                  GMRES:
                  happy breakdown tolerance 1e-30
                maximum iterations=10000, initial guess is zero
                tolerances:  relative=1e-05, absolute=1e-50, divergence=10000.
                right preconditioning
                using UNPRECONDITIONED norm type for convergence test
              PC Object: (fieldsplit_0_) 3 MPI processes
                type: lu
                  LU: out-of-place factorization
                  tolerance for zero pivot 2.22045e-14
                  matrix ordering: natural
                  factor fill ratio given 0., needed 0.
                  Factored matrix follows:
                    Mat Object: 3 MPI processes
                      type: mpiaij
                      rows=624, cols=624
                      package used to perform factorization: mumps
                      total: nonzeros=148148, allocated nonzeros=148148
                      total number of mallocs used during MatSetValues calls =0
                        MUMPS run parameters:
                          SYM (matrix type):                   0
                          PAR (host participation):            1
                          ICNTL(1) (output for error):         6
                          ICNTL(2) (output of diagnostic msg): 0
                          ICNTL(3) (output for global info):   0
                          ICNTL(4) (level of printing):        0
                          ICNTL(5) (input mat struct):         0
                          ICNTL(6) (matrix prescaling):        7
                          ICNTL(7) (sequential matrix ordering): 7
                          ICNTL(8) (scaling strategy):         77
                          ICNTL(10) (max num of refinements):  0
                          ICNTL(11) (error analysis):          0
                          ICNTL(12) (efficiency control):      1
                          ICNTL(13) (efficiency control):      0
                          ICNTL(14) (percentage of estimated workspace increase): 20
                          ICNTL(18) (input mat struct):        3
                          ICNTL(19) (Schur complement info):   0
                          ICNTL(20) (rhs sparse pattern):      0
                          ICNTL(21) (solution struct):         1
                          ICNTL(22) (in-core/out-of-core facility): 0
                          ICNTL(23) (max size of memory that can be allocated locally): 0
                          ICNTL(24) (detection of null pivot rows): 0
                          ICNTL(25) (computation of a null space basis): 0
                          ICNTL(26) (Schur options for rhs or solution): 0
                          ICNTL(27) (experimental parameter):  -32
                          ICNTL(28) (use parallel or sequential ordering): 1
                          ICNTL(29) (parallel ordering):       0
                          ICNTL(30) (user-specified set of entries in inv(A)): 0
                          ICNTL(31) (factors are discarded in the solve phase): 0
                          ICNTL(33) (compute determinant):     0
                          CNTL(1) (relative pivoting threshold):      0.01
                          CNTL(2) (stopping criterion of refinement): 1.49012e-08
                          CNTL(3) (absolute pivoting threshold):      0.
                          CNTL(4) (value of static pivoting):         -1.
                          CNTL(5) (fixation for null pivots):         0.
                          RINFO(1) (local estimated flops for the elimination after analysis):
                            [0] 4.02519e+06  [1] 2.70367e+06  [2] 1.62473e+07
                          RINFO(2) (local estimated flops for the assembly after factorization):
                            [0] 55439.  [1] 30276.  [2] 65773.
                          RINFO(3) (local estimated flops for the elimination after factorization):
                            [0] 4.02519e+06  [1] 2.70367e+06  [2] 1.62473e+07
                          INFO(15) (estimated size (in MB) of MUMPS internal data for running numerical factorization):
                            [0] 9  [1] 9  [2] 10
                          INFO(16) (size (in MB) of MUMPS internal data used during numerical factorization):
                            [0] 9  [1] 9  [2] 10
                          INFO(23) (num of pivots eliminated on this processor after factorization):
                            [0] 278  [1] 87  [2] 259
                          RINFOG(1) (global estimated flops for the elimination after analysis): 2.29761e+07
                          RINFOG(2) (global estimated flops for the assembly after factorization): 151488.
                          RINFOG(3) (global estimated flops for the elimination after factorization): 2.29761e+07
                          (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0)
                          INFOG(3) (estimated real workspace for factors on all processors after analysis): 148148
                          INFOG(4) (estimated integer workspace for factors on all processors after analysis): 4993
                          INFOG(5) (estimated maximum front size in the complete tree): 286
                          INFOG(6) (number of nodes in the complete tree): 23
                          INFOG(7) (ordering option effectively used after analysis): 2
                          INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100
                          INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 148148
                          INFOG(10) (total integer space to store the matrix factors after factorization): 4993
                          INFOG(11) (order of largest frontal matrix after factorization): 286
                          INFOG(12) (number of off-diagonal pivots): 0
                          INFOG(13) (number of delayed pivots after factorization): 0
                          INFOG(14) (number of memory compresses after factorization): 0
                          INFOG(15) (number of steps of iterative refinement after solution): 0
                          INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 10
                          INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 28
                          INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 10
                          INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 28
                          INFOG(20) (estimated number of entries in the factors): 148148
                          INFOG(21) (size in MB of memory effectively used during factorization: value on the most memory consuming processor): 10
                          INFOG(22) (size in MB of memory effectively used during factorization: sum over all processors): 28
                          INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0
                          INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1
                          INFOG(25) (after factorization: number of pivots modified by static pivoting): 0
                          INFOG(28) (after factorization: number of null pivots encountered): 0
                          INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 148148
                          INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 1, 2
                          INFOG(32) (after analysis: type of analysis done): 1
                          INFOG(33) (value used for ICNTL(8)): 7
                          INFOG(34) (exponent of the determinant if determinant is requested): 0
                linear system matrix = precond matrix:
                Mat Object: (fieldsplit_0_) 3 MPI processes
                  type: mpiaij
                  rows=624, cols=624
                  total: nonzeros=72004, allocated nonzeros=72004
                  total number of mallocs used during MatSetValues calls =0
                    using I-node (on process 0) routines: found 84 nodes, limit used is 5
            A01
              Mat Object: 3 MPI processes
                type: mpiaij
                rows=624, cols=64
                total: nonzeros=6619, allocated nonzeros=6619
                total number of mallocs used during MatSetValues calls =0
                  using I-node (on process 0) routines: found 84 nodes, limit used is 5
        Mat Object: 3 MPI processes
          type: mpiaij
          rows=64, cols=64
          total: nonzeros=3254, allocated nonzeros=3254
          total number of mallocs used during MatSetValues calls =0
            using I-node (on process 0) routines: found 36 nodes, limit used is 5
  linear system matrix = precond matrix:
  Mat Object: 3 MPI processes
    type: mpiaij
    rows=688, cols=688
    total: nonzeros=86740, allocated nonzeros=100623
    total number of mallocs used during MatSetValues calls =0
      using I-node (on process 0) routines: found 89 nodes, limit used is 5
-------------- next part --------------
  0 KSP unpreconditioned resid norm 2.375592557658e+10 true resid norm 2.375592557658e+10 ||r(i)||/||b|| 1.000000000000e+00
    Residual norms for fieldsplit_1_ solve.
    0 KSP Residual norm 2.521869279267e-01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.806099413608e+01
      Residual norms for fieldsplit_0_ solve.
        0 KSP Residual norm 2.806099413608e+01
        1 KSP Residual norm 1.432171001499e-14
      1 KSP Residual norm 1.432171001499e-14
    1 KSP Residual norm 2.445634516606e-01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 6.329002388032e+01
      Residual norms for fieldsplit_0_ solve.
        0 KSP Residual norm 6.329002388032e+01
        1 KSP Residual norm 3.678452143235e-14
      1 KSP Residual norm 3.678452143235e-14
    2 KSP Residual norm 2.399701984035e-01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 5.154412453190e+01
      Residual norms for fieldsplit_0_ solve.
        0 KSP Residual norm 5.154412453190e+01
        1 KSP Residual norm 2.397884362424e-14
      1 KSP Residual norm 2.397884362424e-14
    3 KSP Residual norm 2.213788128356e-01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.647018644622e+01
      Residual norms for fieldsplit_0_ solve.
        0 KSP Residual norm 2.647018644622e+01
        1 KSP Residual norm 1.698023338640e-14
      1 KSP Residual norm 1.698023338640e-14
    4 KSP Residual norm 2.213179093002e-01
      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 2.175770266807e+01
      Residual norms for fieldsplit_0_ solve.
0 KSP Residual norm 2.175770266807e+01 1 KSP Residual norm 1.381365862606e-14 1 KSP Residual norm 1.381365862606e-14 5 KSP Residual norm 1.504363405295e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.622388030195e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.622388030195e+01 1 KSP Residual norm 9.598596291082e-15 1 KSP Residual norm 9.598596291082e-15 6 KSP Residual norm 1.491015929887e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.711987132934e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.711987132934e+01 1 KSP Residual norm 9.648436647953e-15 1 KSP Residual norm 9.648436647953e-15 7 KSP Residual norm 1.203534873699e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 8.561471454393e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 8.561471454393e+00 1 KSP Residual norm 6.095012008259e-15 1 KSP Residual norm 6.095012008259e-15 8 KSP Residual norm 1.129338478244e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 9.113869007125e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 9.113869007125e+00 1 KSP Residual norm 5.635175307550e-15 1 KSP Residual norm 5.635175307550e-15 9 KSP Residual norm 9.412298012302e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.867446395666e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.867446395666e+00 1 KSP Residual norm 5.308787069292e-15 1 KSP Residual norm 5.308787069292e-15 10 KSP Residual norm 9.076344069699e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 9.111884792764e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 9.111884792764e+00 1 KSP Residual norm 4.585644747924e-15 1 KSP Residual norm 4.585644747924e-15 11 KSP Residual norm 8.864297546579e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.569461905956e+00 Residual norms for fieldsplit_0_ solve. 
0 KSP Residual norm 5.569461905956e+00 1 KSP Residual norm 2.698380151461e-15 1 KSP Residual norm 2.698380151461e-15 12 KSP Residual norm 5.115236257501e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 8.935129989983e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 8.935129989983e+00 1 KSP Residual norm 4.332144654425e-15 1 KSP Residual norm 4.332144654425e-15 13 KSP Residual norm 4.481643764200e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.390181565186e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.390181565186e+01 1 KSP Residual norm 1.248562714000e-14 1 KSP Residual norm 1.248562714000e-14 14 KSP Residual norm 4.241523078453e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.774493089199e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.774493089199e+01 1 KSP Residual norm 8.532117108176e-15 1 KSP Residual norm 8.532117108176e-15 15 KSP Residual norm 3.417337301035e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.919235593232e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 5.919235593232e+00 1 KSP Residual norm 3.970215047557e-15 1 KSP Residual norm 3.970215047557e-15 16 KSP Residual norm 3.108911404130e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.591664532146e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.591664532146e+01 1 KSP Residual norm 9.667065076883e-15 1 KSP Residual norm 9.667065076883e-15 17 KSP Residual norm 2.277620657164e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.797600047879e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 7.797600047879e+00 1 KSP Residual norm 5.176679186923e-15 1 KSP Residual norm 5.176679186923e-15 18 KSP Residual norm 1.639271572922e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.318920068597e+01 Residual norms for fieldsplit_0_ solve. 
[-ksp_monitor output, reflowed and trimmed for readability. Each duplicated fieldsplit_0_ inner-solve block and the long runs of similar iterations have been condensed; every inner fieldsplit_0_ solve converges in a single iteration, with residuals dropping from O(1e+01) to O(1e-14):]

      Residual norms for fieldsplit_0_ solve.
      0 KSP Residual norm 1.318920068597e+01
      1 KSP Residual norm 6.253170030367e-15
    19 KSP Residual norm 1.619319380401e-02
      [... similar fieldsplit_0_ inner solves trimmed ...]
    56 KSP Residual norm 2.130089431312e-06
  Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 56
  1 KSP unpreconditioned resid norm 5.518013746793e+09 true resid norm 5.518013746793e+09 ||r(i)||/||b|| 2.322794676640e-01
    Residual norms for fieldsplit_1_ solve.
    0 KSP Residual norm 9.921649121308e-01
      [... similar fieldsplit_0_ inner solves trimmed ...]
    55 KSP Residual norm 9.315642093529e-06
  Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 55
  2 KSP unpreconditioned resid norm 5.468663366545e+04 true resid norm 5.468663366544e+04 ||r(i)||/||b|| 2.302020752218e-06
    Residual norms for fieldsplit_1_ solve.
    0 KSP Residual norm 9.999660137454e-01
      [... similar fieldsplit_0_ inner solves trimmed ...]
    46 KSP Residual norm 3.855331321954e-05
      Residual norms for fieldsplit_0_ solve.
  [output continues]
0 KSP Residual norm 6.995034077642e+00 1 KSP Residual norm 3.493881144042e-15 1 KSP Residual norm 3.493881144042e-15 47 KSP Residual norm 3.380297750312e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.559035831681e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 6.559035831681e+00 1 KSP Residual norm 3.762657305592e-15 1 KSP Residual norm 3.762657305592e-15 48 KSP Residual norm 2.758375321853e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.463600681096e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.463600681096e+01 1 KSP Residual norm 9.925614576717e-15 1 KSP Residual norm 9.925614576717e-15 49 KSP Residual norm 2.713608472722e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.641339591064e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.641339591064e+01 1 KSP Residual norm 2.288329222235e-14 1 KSP Residual norm 2.288329222235e-14 50 KSP Residual norm 1.644008204304e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.777796092614e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.777796092614e+01 1 KSP Residual norm 1.736200237872e-14 1 KSP Residual norm 1.736200237872e-14 51 KSP Residual norm 1.601928968478e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 8.608269728513e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 8.608269728513e+01 1 KSP Residual norm 4.736176807131e-14 1 KSP Residual norm 4.736176807131e-14 52 KSP Residual norm 1.435535309320e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.005888854858e+01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.005888854858e+01 1 KSP Residual norm 2.201592997318e-14 1 KSP Residual norm 2.201592997318e-14 53 KSP Residual norm 5.863164482628e-06 Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 53 Residual norms for fieldsplit_0_ solve. 
      0 KSP Residual norm 9.832346329590e+00
      1 KSP Residual norm 4.680739207301e-15
  3 KSP unpreconditioned resid norm 3.034412141321e-01 true resid norm 3.034423463859e-01 ||r(i)||/||b|| 1.277333292730e-11
KSP Object: 4 MPI processes
  type: fgmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=1000, initial guess is zero
  tolerances: relative=1e-06, absolute=1e-50, divergence=10000.
  right preconditioning
  using UNPRECONDITIONED norm type for convergence test
PC Object: 4 MPI processes
  type: fieldsplit
    FieldSplit with Schur preconditioner, factorization UPPER
    Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse
    Split info:
    Split number 0 Defined by IS
    Split number 1 Defined by IS
    KSP solver for A00 block
      KSP Object: (fieldsplit_0_) 4 MPI processes
        type: fgmres
          GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
          GMRES: happy breakdown tolerance 1e-30
        maximum iterations=10000, initial guess is zero
        tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
        right preconditioning
        using UNPRECONDITIONED norm type for convergence test
      PC Object: (fieldsplit_0_) 4 MPI processes
        type: lu
          LU: out-of-place factorization
          tolerance for zero pivot 2.22045e-14
          matrix ordering: natural
          factor fill ratio given 0., needed 0.
            Factored matrix follows:
              Mat Object: 4 MPI processes
                type: mpiaij
                rows=624, cols=624
                package used to perform factorization: mumps
                total: nonzeros=134430, allocated nonzeros=134430
                total number of mallocs used during MatSetValues calls =0
                  MUMPS run parameters:
                    SYM (matrix type): 0
                    PAR (host participation): 1
                    ICNTL(1) (output for error): 6
                    ICNTL(2) (output of diagnostic msg): 0
                    ICNTL(3) (output for global info): 0
                    ICNTL(4) (level of printing): 0
                    ICNTL(5) (input mat struct): 0
                    ICNTL(6) (matrix prescaling): 7
                    ICNTL(7) (sequentia matrix ordering):7
                    ICNTL(8) (scalling strategy): 77
                    ICNTL(10) (max num of refinements): 0
                    ICNTL(11) (error analysis): 0
                    ICNTL(12) (efficiency control): 1
                    ICNTL(13) (efficiency control): 0
                    ICNTL(14) (percentage of estimated workspace increase): 20
                    ICNTL(18) (input mat struct): 3
                    ICNTL(19) (Shur complement info): 0
                    ICNTL(20) (rhs sparse pattern): 0
                    ICNTL(21) (solution struct): 1
                    ICNTL(22) (in-core/out-of-core facility): 0
                    ICNTL(23) (max size of memory can be allocated locally):0
                    ICNTL(24) (detection of null pivot rows): 0
                    ICNTL(25) (computation of a null space basis): 0
                    ICNTL(26) (Schur options for rhs or solution): 0
                    ICNTL(27) (experimental parameter): -32
                    ICNTL(28) (use parallel or sequential ordering): 1
                    ICNTL(29) (parallel ordering): 0
                    ICNTL(30) (user-specified set of entries in inv(A)): 0
                    ICNTL(31) (factors is discarded in the solve phase): 0
                    ICNTL(33) (compute determinant): 0
                    CNTL(1) (relative pivoting threshold): 0.01
                    CNTL(2) (stopping criterion of refinement): 1.49012e-08
                    CNTL(3) (absolute pivoting threshold): 0.
                    CNTL(4) (value of static pivoting): -1.
                    CNTL(5) (fixation for null pivots): 0.
                    RINFO(1) (local estimated flops for the elimination after analysis): [0] 2.11737e+06 [1] 5.16896e+06 [2] 8.22021e+06 [3] 2.19477e+06
                    RINFO(2) (local estimated flops for the assembly after factorization): [0] 26687. [1] 45712. [2] 81753. [3] 30601.
                    RINFO(3) (local estimated flops for the elimination after factorization): [0] 2.11737e+06 [1] 5.16896e+06 [2] 8.22021e+06 [3] 2.19477e+06
                    INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 11 [1] 12 [2] 12 [3] 11
                    INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 11 [1] 12 [2] 12 [3] 11
                    INFO(23) (num of pivots eliminated on this processor after factorization): [0] 118 [1] 129 [2] 303 [3] 74
                    RINFOG(1) (global estimated flops for the elimination after analysis): 1.77013e+07
                    RINFOG(2) (global estimated flops for the assembly after factorization): 184753.
                    RINFOG(3) (global estimated flops for the elimination after factorization): 1.77013e+07
                    (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0)
                    INFOG(3) (estimated real workspace for factors on all processors after analysis): 134430
                    INFOG(4) (estimated integer workspace for factors on all processors after analysis): 5517
                    INFOG(5) (estimated maximum front size in the complete tree): 219
                    INFOG(6) (number of nodes in the complete tree): 25
                    INFOG(7) (ordering option effectively use after analysis): 2
                    INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100
                    INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 134430
                    INFOG(10) (total integer space store the matrix factors after factorization): 5517
                    INFOG(11) (order of largest frontal matrix after factorization): 219
                    INFOG(12) (number of off-diagonal pivots): 0
                    INFOG(13) (number of delayed pivots after factorization): 0
                    INFOG(14) (number of memory compress after factorization): 0
                    INFOG(15) (number of steps of iterative refinement after solution): 0
                    INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 12
                    INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 46
                    INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 12
                    INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 46
                    INFOG(20) (estimated number of entries in the factors): 134430
                    INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 12
                    INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 45
                    INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0
                    INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1
                    INFOG(25) (after factorization: number of pivots modified by static pivoting): 0
                    INFOG(28) (after factorization: number of null pivots encountered): 0
                    INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 134430
                    INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 1, 2
                    INFOG(32) (after analysis: type of analysis done): 1
                    INFOG(33) (value used for ICNTL(8)): 7
                    INFOG(34) (exponent of the determinant if determinant is requested): 0
        linear system matrix = precond matrix:
        Mat Object: (fieldsplit_0_) 4 MPI processes
          type: mpiaij
          rows=624, cols=624
          total: nonzeros=71358, allocated nonzeros=71358
          total number of mallocs used during MatSetValues calls =0
            using I-node (on process 0) routines: found 66 nodes, limit used is 5
    KSP solver for S = A11 - A10 inv(A00) A01
      KSP Object: (fieldsplit_1_) 4 MPI processes
        type: fgmres
          GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
          GMRES: happy breakdown tolerance 1e-30
        maximum iterations=10000, initial guess is zero
        tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
        right preconditioning
        using UNPRECONDITIONED norm type for convergence test
      PC Object: (fieldsplit_1_) 4 MPI processes
        type: lu
          LU: out-of-place factorization
          tolerance for zero pivot 2.22045e-14
          matrix ordering: natural
          factor fill ratio given 0., needed 0.
            Factored matrix follows:
              Mat Object: 4 MPI processes
                type: mpiaij
                rows=64, cols=64
                package used to perform factorization: mumps
                total: nonzeros=3450, allocated nonzeros=3450
                total number of mallocs used during MatSetValues calls =0
                  MUMPS run parameters:
                    SYM (matrix type): 0
                    PAR (host participation): 1
                    ICNTL(1) (output for error): 6
                    ICNTL(2) (output of diagnostic msg): 0
                    ICNTL(3) (output for global info): 0
                    ICNTL(4) (level of printing): 0
                    ICNTL(5) (input mat struct): 0
                    ICNTL(6) (matrix prescaling): 7
                    ICNTL(7) (sequentia matrix ordering):7
                    ICNTL(8) (scalling strategy): 77
                    ICNTL(10) (max num of refinements): 0
                    ICNTL(11) (error analysis): 0
                    ICNTL(12) (efficiency control): 1
                    ICNTL(13) (efficiency control): 0
                    ICNTL(14) (percentage of estimated workspace increase): 20
                    ICNTL(18) (input mat struct): 3
                    ICNTL(19) (Shur complement info): 0
                    ICNTL(20) (rhs sparse pattern): 0
                    ICNTL(21) (solution struct): 1
                    ICNTL(22) (in-core/out-of-core facility): 0
                    ICNTL(23) (max size of memory can be allocated locally):0
                    ICNTL(24) (detection of null pivot rows): 0
                    ICNTL(25) (computation of a null space basis): 0
                    ICNTL(26) (Schur options for rhs or solution): 0
                    ICNTL(27) (experimental parameter): -32
                    ICNTL(28) (use parallel or sequential ordering): 1
                    ICNTL(29) (parallel ordering): 0
                    ICNTL(30) (user-specified set of entries in inv(A)): 0
                    ICNTL(31) (factors is discarded in the solve phase): 0
                    ICNTL(33) (compute determinant): 0
                    CNTL(1) (relative pivoting threshold): 0.01
                    CNTL(2) (stopping criterion of refinement): 1.49012e-08
                    CNTL(3) (absolute pivoting threshold): 0.
                    CNTL(4) (value of static pivoting): -1.
                    CNTL(5) (fixation for null pivots): 0.
                    RINFO(1) (local estimated flops for the elimination after analysis): [0] 0. [1] 0. [2] 0. [3] 113595.
                    RINFO(2) (local estimated flops for the assembly after factorization): [0] 0. [1] 0. [2] 0. [3] 784.
                    RINFO(3) (local estimated flops for the elimination after factorization): [0] 0. [1] 0. [2] 0. [3] 113595.
                    INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 1 [1] 1 [2] 1 [3] 1
                    INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 1 [1] 1 [2] 1 [3] 1
                    INFO(23) (num of pivots eliminated on this processor after factorization): [0] 0 [1] 0 [2] 0 [3] 64
                    RINFOG(1) (global estimated flops for the elimination after analysis): 113595.
                    RINFOG(2) (global estimated flops for the assembly after factorization): 784.
                    RINFOG(3) (global estimated flops for the elimination after factorization): 113595.
                    (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0)
                    INFOG(3) (estimated real workspace for factors on all processors after analysis): 3450
                    INFOG(4) (estimated integer workspace for factors on all processors after analysis): 214
                    INFOG(5) (estimated maximum front size in the complete tree): 47
                    INFOG(6) (number of nodes in the complete tree): 2
                    INFOG(7) (ordering option effectively use after analysis): 2
                    INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100
                    INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3450
                    INFOG(10) (total integer space store the matrix factors after factorization): 214
                    INFOG(11) (order of largest frontal matrix after factorization): 47
                    INFOG(12) (number of off-diagonal pivots): 1
                    INFOG(13) (number of delayed pivots after factorization): 0
                    INFOG(14) (number of memory compress after factorization): 0
                    INFOG(15) (number of steps of iterative refinement after solution): 0
                    INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1
                    INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 4
                    INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1
                    INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 4
                    INFOG(20) (estimated number of entries in the factors): 3450
                    INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1
                    INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 4
                    INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0
                    INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1
                    INFOG(25) (after factorization: number of pivots modified by static pivoting): 0
                    INFOG(28) (after factorization: number of null pivots encountered): 0
                    INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3450
                    INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0
                    INFOG(32) (after analysis: type of analysis done): 1
                    INFOG(33) (value used for ICNTL(8)): 7
                    INFOG(34) (exponent of the determinant if determinant is requested): 0
        linear system matrix followed by preconditioner matrix:
        Mat Object: (fieldsplit_1_) 4 MPI processes
          type: schurcomplement
          rows=64, cols=64
            Schur complement A11 - A10 inv(A00) A01
              A11
                Mat Object: (fieldsplit_1_) 4 MPI processes
                  type: mpiaij
                  rows=64, cols=64
                  total: nonzeros=1550, allocated nonzeros=1550
                  total number of mallocs used during MatSetValues calls =0
                    using I-node (on process 0) routines: found 48 nodes, limit used is 5
              A10
                Mat Object: 4 MPI processes
                  type: mpiaij
                  rows=64, cols=624
                  total: nonzeros=6916, allocated nonzeros=6916
                  total number of mallocs used during MatSetValues calls =0
                    using I-node (on process 0) routines: found 48 nodes, limit used is 5
              KSP of A00
                KSP Object: (fieldsplit_0_) 4 MPI processes
                  type: fgmres
                    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
                    GMRES: happy breakdown tolerance 1e-30
                  maximum iterations=10000, initial guess is zero
                  tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
                  right preconditioning
                  using UNPRECONDITIONED norm type for convergence test
                PC Object: (fieldsplit_0_) 4 MPI processes
                  type: lu
                    LU: out-of-place factorization
                    tolerance for zero pivot 2.22045e-14
                    matrix ordering: natural
                    factor fill ratio given 0., needed 0.
                      Factored matrix follows:
                        Mat Object: 4 MPI processes
                          type: mpiaij
                          rows=624, cols=624
                          package used to perform factorization: mumps
                          total: nonzeros=134430, allocated nonzeros=134430
                          total number of mallocs used during MatSetValues calls =0
                            MUMPS run parameters:
                              SYM (matrix type): 0
                              PAR (host participation): 1
                              ICNTL(1) (output for error): 6
                              ICNTL(2) (output of diagnostic msg): 0
                              ICNTL(3) (output for global info): 0
                              ICNTL(4) (level of printing): 0
                              ICNTL(5) (input mat struct): 0
                              ICNTL(6) (matrix prescaling): 7
                              ICNTL(7) (sequentia matrix ordering):7
                              ICNTL(8) (scalling strategy): 77
                              ICNTL(10) (max num of refinements): 0
                              ICNTL(11) (error analysis): 0
                              ICNTL(12) (efficiency control): 1
                              ICNTL(13) (efficiency control): 0
                              ICNTL(14) (percentage of estimated workspace increase): 20
                              ICNTL(18) (input mat struct): 3
                              ICNTL(19) (Shur complement info): 0
                              ICNTL(20) (rhs sparse pattern): 0
                              ICNTL(21) (solution struct): 1
                              ICNTL(22) (in-core/out-of-core facility): 0
                              ICNTL(23) (max size of memory can be allocated locally):0
                              ICNTL(24) (detection of null pivot rows): 0
                              ICNTL(25) (computation of a null space basis): 0
                              ICNTL(26) (Schur options for rhs or solution): 0
                              ICNTL(27) (experimental parameter): -32
                              ICNTL(28) (use parallel or sequential ordering): 1
                              ICNTL(29) (parallel ordering): 0
                              ICNTL(30) (user-specified set of entries in inv(A)): 0
                              ICNTL(31) (factors is discarded in the solve phase): 0
                              ICNTL(33) (compute determinant): 0
                              CNTL(1) (relative pivoting threshold): 0.01
                              CNTL(2) (stopping criterion of refinement): 1.49012e-08
                              CNTL(3) (absolute pivoting threshold): 0.
                              CNTL(4) (value of static pivoting): -1.
                              CNTL(5) (fixation for null pivots): 0.
                              RINFO(1) (local estimated flops for the elimination after analysis): [0] 2.11737e+06 [1] 5.16896e+06 [2] 8.22021e+06 [3] 2.19477e+06
                              RINFO(2) (local estimated flops for the assembly after factorization): [0] 26687. [1] 45712. [2] 81753. [3] 30601.
                              RINFO(3) (local estimated flops for the elimination after factorization): [0] 2.11737e+06 [1] 5.16896e+06 [2] 8.22021e+06 [3] 2.19477e+06
                              INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 11 [1] 12 [2] 12 [3] 11
                              INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 11 [1] 12 [2] 12 [3] 11
                              INFO(23) (num of pivots eliminated on this processor after factorization): [0] 118 [1] 129 [2] 303 [3] 74
                              RINFOG(1) (global estimated flops for the elimination after analysis): 1.77013e+07
                              RINFOG(2) (global estimated flops for the assembly after factorization): 184753.
                              RINFOG(3) (global estimated flops for the elimination after factorization): 1.77013e+07
                              (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0)
                              INFOG(3) (estimated real workspace for factors on all processors after analysis): 134430
                              INFOG(4) (estimated integer workspace for factors on all processors after analysis): 5517
                              INFOG(5) (estimated maximum front size in the complete tree): 219
                              INFOG(6) (number of nodes in the complete tree): 25
                              INFOG(7) (ordering option effectively use after analysis): 2
                              INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100
                              INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 134430
                              INFOG(10) (total integer space store the matrix factors after factorization): 5517
                              INFOG(11) (order of largest frontal matrix after factorization): 219
                              INFOG(12) (number of off-diagonal pivots): 0
                              INFOG(13) (number of delayed pivots after factorization): 0
                              INFOG(14) (number of memory compress after factorization): 0
                              INFOG(15) (number of steps of iterative refinement after solution): 0
                              INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 12
                              INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 46
                              INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 12
                              INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 46
                              INFOG(20) (estimated number of entries in the factors): 134430
                              INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 12
                              INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 45
                              INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0
                              INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1
                              INFOG(25) (after factorization: number of pivots modified by static pivoting): 0
                              INFOG(28) (after factorization: number of null pivots encountered): 0
                              INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 134430
                              INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 1, 2
                              INFOG(32) (after analysis: type of analysis done): 1
                              INFOG(33) (value used for ICNTL(8)): 7
                              INFOG(34) (exponent of the determinant if determinant is requested): 0
                  linear system matrix = precond matrix:
                  Mat Object: (fieldsplit_0_) 4 MPI processes
                    type: mpiaij
                    rows=624, cols=624
                    total: nonzeros=71358, allocated nonzeros=71358
                    total number of mallocs used during MatSetValues calls =0
                      using I-node (on process 0) routines: found 66 nodes, limit used is 5
              A01
                Mat Object: 4 MPI processes
                  type: mpiaij
                  rows=624, cols=64
                  total: nonzeros=6916, allocated nonzeros=6916
                  total number of mallocs used during MatSetValues calls =0
                    using I-node (on process 0) routines: found 66 nodes, limit used is 5
        Mat Object: 4 MPI processes
          type: mpiaij
          rows=64, cols=64
          total: nonzeros=3246, allocated nonzeros=3246
          total number of mallocs used during MatSetValues calls =0
            using I-node (on process 0) routines: found 30 nodes, limit used is 5
  linear system matrix = precond matrix:
  Mat Object: 4 MPI processes
    type: mpiaij
    rows=688, cols=688
    total: nonzeros=86740, allocated nonzeros=101416
    total number of mallocs used during MatSetValues calls =0
      using I-node (on process 0) routines: found 72 nodes, limit used is 5

From balay at mcs.anl.gov  Thu Jan  5 12:02:08 2017
From: balay at mcs.anl.gov (Satish Balay)
Date: Thu, 5 Jan 2017 12:02:08 -0600
Subject: [petsc-users] problems after glibc upgrade to 2.17-157
In-Reply-To: 
References: <1483455899357.86673@marin.nl> <1483525971870.90369@marin.nl>
 <1483537021972.79609@marin.nl> <1483541610596.23482@marin.nl>
 <1483542319358.30478@marin.nl> <1483543177535.33715@marin.nl>
 <1483605469330.79182@marin.nl>
Message-ID: 

On Thu, 5 Jan 2017, Matthew Knepley wrote:

> On Thu, Jan 5, 2017 at 2:37 AM, Klaij, Christiaan wrote:
> > So problem solved for now, thanks to you and Matt for all your
> > help! In the long run I will go for Intel-17 on SL7.3.
> >
> > What worries me though is that a simple update (which happens all
> > the time according to sysadmin) can have such a dramatic effect.
>
> I agree. It seems SL has broken the ability to use shared libraries with a
> simple point release.
> It seems the robustness of all this process is a myth.

Well, it's more of a RHEL issue than an SL one. And it's just the Intel .so
files [as far as we know] that are triggering this issue.

RHEL generally doesn't make changes that break old binaries. But any code
change [which bug fixes are] can introduce changed behavior with some stuff..
[esp. stuff that might use internal, non-API features]

RHEL7 glibc had a huge number of fixes since 2.17-106.el7_2.8:
https://rpmfind.net/linux/RPM/centos/updates/7.3.1611/x86_64/Packages/glibc-2.17-157.el7_3.1.x86_64.html

Interestingly, the code crashes at dynamic linking time [even before main()
starts] - perhaps something to do with the way libintlc.so.5 uses memmove?

(gdb) where
#0  0x00007ffff722865e in ?? ()
#1  0x00007ffff7de9675 in elf_machine_rela (reloc=0x7ffff592ae38, reloc=0x7ffff592ae38, skip_ifunc=, reloc_addr_arg=0x7ffff5b8e8f0, version=0x0, sym=0x7ffff5925f58, map=0x7ffff7fee570) at ../sysdeps/x86_64/dl-machine.h:288
#2  elf_dynamic_do_Rela (skip_ifunc=, lazy=, nrelative=, relsize=, reladdr=, map=0x7ffff7fee570) at do-rel.h:170
#3  _dl_relocate_object (scope=, reloc_mode=, consider_profiling=, consider_profiling@entry=0) at dl-reloc.c:259
#4  0x00007ffff7de0792 in dl_main (phdr=, phdr@entry=0x400040, phnum=, phnum@entry=9, user_entry=user_entry@entry=0x7fffffffceb8, auxv=) at rtld.c:2192
#5  0x00007ffff7df3e36 in _dl_sysdep_start (start_argptr=start_argptr@entry=0x7fffffffcf70, dl_main=dl_main@entry=0x7ffff7dde820 ) at ../elf/dl-sysdep.c:244
#6  0x00007ffff7de1a31 in _dl_start_final (arg=0x7fffffffcf70) at rtld.c:318
#7  _dl_start (arg=0x7fffffffcf70) at rtld.c:544
#8  0x00007ffff7dde1e8 in _start () from /lib64/ld-linux-x86-64.so.2
#9  0x0000000000000001 in ?? ()
#10 0x00007fffffffd25c in ?? ()
#11 0x0000000000000000 in ?? ()
(gdb)

[balay at localhost benchmarks]$ LD_DEBUG=all ./a.out
     2468: symbol=__xpg_basename;  lookup in file=./a.out [0]
     2468: symbol=__xpg_basename;  lookup in file=/soft/com/packages/intel/16/u3/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64/libifcore.so.5 [0]
     2468: symbol=__xpg_basename;  lookup in file=/lib64/libm.so.6 [0]
     2468: symbol=__xpg_basename;  lookup in file=/lib64/libgcc_s.so.1 [0]
     2468: symbol=__xpg_basename;  lookup in file=/lib64/libc.so.6 [0]
     2468: binding file /soft/com/packages/intel/16/u3/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64/libintlc.so.5 [0] to /lib64/libc.so.6 [0]: normal symbol `__xpg_basename'
     2468: symbol=memmove;  lookup in file=./a.out [0]
     2468: symbol=memmove;  lookup in file=/soft/com/packages/intel/16/u3/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64/libifcore.so.5 [0]
     2468: symbol=memmove;  lookup in file=/lib64/libm.so.6 [0]
     2468: symbol=memmove;  lookup in file=/lib64/libgcc_s.so.1 [0]
     2468: symbol=memmove;  lookup in file=/lib64/libc.so.6 [0]
     2468: binding file /soft/com/packages/intel/16/u3/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64/libintlc.so.5 [0] to /lib64/libc.so.6 [0]: normal symbol `memmove'
Segmentation fault (core dumped)

If the Intel-16 compilers are critical, one can always downgrade to the old
glibc - but then you would miss out on all the fixes:

  yum downgrade glibc*

Satish

From bsmith at mcs.anl.gov  Thu Jan  5 13:17:50 2017
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Thu, 5 Jan 2017 13:17:50 -0600
Subject: [petsc-users] Fieldsplit with sub pc MUMPS in parallel
In-Reply-To: 
References: <38F933FD-80B5-401A-8DD3-7C14E0B301F7@mcs.anl.gov>
Message-ID: <77441AAA-D70B-449F-A330-B417B6DA64A6@mcs.anl.gov>

   This is not good. Something is out of whack.

   First, run with 1 and 2 processes with

     -ksp_view_mat binary -ksp_view_rhs binary

   In each case this will generate a file called binaryoutput.
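[Editorial aside: for a quick offline check that two such binaryoutput files contain the same matrix, a few lines of plain Python suffice. This sketch assumes the default PETSc binary Mat layout (big-endian, 32-bit PetscInt: a header of classid/rows/cols/nnz, then per-row nonzero counts, column indices, and double-precision values); the function names are made up here, and petsc4py or the PetscBinaryIO helpers shipped with PETSc are the supported routes.]

```python
import struct

MAT_FILE_CLASSID = 1211216  # magic number PETSc writes at the start of a Mat file


def read_petsc_aij(path):
    """Read a PETSc binary AIJ matrix (default big-endian, 32-bit index format)."""
    with open(path, "rb") as f:
        classid, m, n, nnz = struct.unpack(">4i", f.read(16))
        assert classid == MAT_FILE_CLASSID, "not a PETSc Mat file"
        rownz = struct.unpack(f">{m}i", f.read(4 * m))    # nonzeros per row
        cols = struct.unpack(f">{nnz}i", f.read(4 * nnz))  # column indices
        vals = struct.unpack(f">{nnz}d", f.read(8 * nnz))  # values
    return (m, n), rownz, cols, vals


def same_matrix(path1, path2, tol=0.0):
    """Compare two PETSc binary matrices: same shape, same sparsity, values within tol."""
    a, b = read_petsc_aij(path1), read_petsc_aij(path2)
    return (a[0] == b[0] and a[1] == b[1] and a[2] == b[2]
            and all(abs(x - y) <= tol for x, y in zip(a[3], b[3])))
```

Running `same_matrix("binaryoutput.1proc", "binaryoutput.2proc")` then answers Barry's question directly (use a small `tol` if the assembly order may differ).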
   Send both files to petsc-maint at mcs.anl.gov; I want to confirm that the
matrices are the same in both cases.

   Barry

> On Jan 5, 2017, at 10:36 AM, Karin&NiKo wrote:
>
> Dave,
>
> Indeed the residual histories differ. Concerning the IS's, I have checked
> them on small cases, so I am quite sure they are OK.
> What could I do with PETSc to evaluate the ill-conditioning of the system
> or of the sub-systems?
>
> Thanks again for your help,
> Nicolas
>
> 2017-01-05 15:46 GMT+01:00 Barry Smith :
>
> > On Jan 5, 2017, at 5:58 AM, Dave May wrote:
> >
> > Do you now see identical residual histories for a job using 1 rank and 4 ranks?
>
>    Please send the residual histories with the extra options; I'm curious
> too, because a Krylov method should not be needed in the inner solve - I
> just asked for it so we can see what the residuals look like.
>
>    Barry
>
> > If not, I am inclined to believe that the IS's you are defining for the
> > splits in the parallel case are incorrect. The operator created to
> > approximate the Schur complement with selfp should not depend on the
> > number of ranks.
> >
> > Or possibly your problem is horribly ill-conditioned. If it is, then this
> > could result in slightly different residual histories when using
> > different numbers of ranks - even if the operators are in fact identical.
> >
> > Thanks,
> > Dave
> >
> > On Thu, 5 Jan 2017 at 12:14, Karin&NiKo wrote:
> > Dear Barry, dear Dave,
> >
> > THANK YOU!
> > You two pointed out the right problem. By using the options you provided
> > (-fieldsplit_0_ksp_type gmres -fieldsplit_0_ksp_pc_side right
> > -fieldsplit_1_ksp_type gmres -fieldsplit_1_ksp_pc_side right), the solver
> > converges in 3 iterations whatever the size of the communicator.
> > All the trick is in the precise resolution of the Schur complement, by
> > using a Krylov method (and not only preonly) *and* applying the
> > preconditioner on the right (so evaluating the convergence on the
> > unpreconditioned residual).
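[Editorial aside: the option set described in this exchange can be collected in one place, e.g. a PETSc options file. The outer-solver and Schur settings below are read off the KSPView output earlier in the thread (fgmres, Schur fieldsplit with UPPER factorization, selfp approximation, MUMPS LU sub-solves); the `-pc_factor_mat_solver_package` spelling is the petsc-3.7-era name. Treat it as a sketch, not a verbatim reproduction of the poster's configuration.]

```text
# Outer solver: FGMRES with a Schur-complement fieldsplit
-ksp_type fgmres
-pc_type fieldsplit
-pc_fieldsplit_type schur
-pc_fieldsplit_schur_fact_type upper
-pc_fieldsplit_schur_precondition selfp
# The fix: real Krylov inner solves (not preonly), right-preconditioned so
# convergence is judged on the unpreconditioned residual
-fieldsplit_0_ksp_type gmres
-fieldsplit_0_ksp_pc_side right
-fieldsplit_1_ksp_type gmres
-fieldsplit_1_ksp_pc_side right
# Exact sub-solves via MUMPS, as in the KSPView above
-fieldsplit_0_pc_type lu
-fieldsplit_0_pc_factor_mat_solver_package mumps
-fieldsplit_1_pc_type lu
-fieldsplit_1_pc_factor_mat_solver_package mumps
```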
> > > > @Barry: the difference you see on the nonzero allocations for the different runs is just an artefact: when using more than one proc, we slightly over-estimate the number of non-zero terms. If I run the same problem with the -info option, I get extra information: > > [2] MatAssemblyEnd_SeqAIJ(): Matrix size: 110 X 110; storage space: 0 unneeded,5048 used > > [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 271 X 271; storage space: 4249 unneeded,26167 used > > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 307 X 307; storage space: 7988 unneeded,31093 used > > [2] MatAssemblyEnd_SeqAIJ(): Matrix size: 110 X 244; storage space: 0 unneeded,6194 used > > [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 271 X 233; storage space: 823 unneeded,9975 used > > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 307 X 197; storage space: 823 unneeded,8263 used > > And 5048+26167+31093+6194+9975+8263=86740, which is exactly the estimated number of nonzero terms for 1 proc. > > > > > > Thank you again! > > > > Best regards, > > Nicolas > > > > > > 2017-01-05 1:36 GMT+01:00 Barry Smith : > > > > > > > > There is something wrong with your set up.
> > > > > > > > > > > > 1 process > > > > > > > > > > > > total: nonzeros=140616, allocated nonzeros=140616 > > > > > > total: nonzeros=68940, allocated nonzeros=68940 > > > > > > total: nonzeros=3584, allocated nonzeros=3584 > > > > > > total: nonzeros=1000, allocated nonzeros=1000 > > > > > > total: nonzeros=8400, allocated nonzeros=8400 > > > > > > > > > > > > 2 processes > > > > > > total: nonzeros=146498, allocated nonzeros=146498 > > > > > > total: nonzeros=73470, allocated nonzeros=73470 > > > > > > total: nonzeros=3038, allocated nonzeros=3038 > > > > > > total: nonzeros=1110, allocated nonzeros=1110 > > > > > > total: nonzeros=6080, allocated nonzeros=6080 > > > > > > total: nonzeros=146498, allocated nonzeros=146498 > > > > > > total: nonzeros=73470, allocated nonzeros=73470 > > > > > > total: nonzeros=6080, allocated nonzeros=6080 > > > > > > total: nonzeros=2846, allocated nonzeros=2846 > > > > > > total: nonzeros=86740, allocated nonzeros=94187 > > > > > > > > > > > > It looks like you are setting up the problem differently in parallel and seq. If it is suppose to be an identical problem then the number nonzeros should be the same in at least the first two matrices. > > > > > > > > > > > > > > > > > > > > > > > > > On Jan 4, 2017, at 3:39 PM, Karin&NiKo wrote: > > > > > > > > > > > > > > Dear Petsc team, > > > > > > > > > > > > > > I am (still) trying to solve Biot's poroelasticity problem : > > > > > > > > > > > > > > > > > > > > > I am using a mixed P2-P1 finite element discretization. The matrix of the discretized system in binary format is attached to this email. > > > > > > > > > > > > > > I am using the fieldsplit framework to solve the linear system. Since I am facing some troubles, I have decided to go back to simple things. 
Here are the options I am using : > > > > > > > > > > > > > > -ksp_rtol 1.0e-5 > > > > > > > -ksp_type fgmres > > > > > > > -pc_type fieldsplit > > > > > > > -pc_fieldsplit_schur_factorization_type full > > > > > > > -pc_fieldsplit_type schur > > > > > > > -pc_fieldsplit_schur_precondition selfp > > > > > > > -fieldsplit_0_pc_type lu > > > > > > > -fieldsplit_0_pc_factor_mat_solver_package mumps > > > > > > > -fieldsplit_0_ksp_type preonly > > > > > > > -fieldsplit_0_ksp_converged_reason > > > > > > > -fieldsplit_1_pc_type lu > > > > > > > -fieldsplit_1_pc_factor_mat_solver_package mumps > > > > > > > -fieldsplit_1_ksp_type preonly > > > > > > > -fieldsplit_1_ksp_converged_reason > > > > > > > > > > > > > > On a single proc, everything runs fine : the solver converges in 3 iterations, according to the theory (see Run-1-proc.txt [contains -log_view]). > > > > > > > > > > > > > > On 2 procs, the solver converges in 28 iterations (see Run-2-proc.txt). > > > > > > > > > > > > > > On 3 procs, the solver converges in 91 iterations (see Run-3-proc.txt). > > > > > > > > > > > > > > I do not understand this behavior : since MUMPS is a parallel direct solver, shouldn't the solver converge in max 3 iterations whatever the number of procs? > > > > > > > > > > > > > > > > > > > > > Thanks for your precious help, > > > > > > > Nicolas > > > > > > > > > > > > > > <1_Warning.txt> > > > > > > > > > > > > > > > > > > > From balay at mcs.anl.gov Thu Jan 5 13:18:45 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Thu, 5 Jan 2017 13:18:45 -0600 Subject: [petsc-users] problems after glibc upgrade to 2.17-157 In-Reply-To: References: <1483455899357.86673@marin.nl> <1483525971870.90369@marin.nl> <1483537021972.79609@marin.nl> <1483541610596.23482@marin.nl> <1483542319358.30478@marin.nl> <1483543177535.33715@marin.nl> <1483605469330.79182@marin.nl> Message-ID: On Thu, 5 Jan 2017, Satish Balay wrote: > Well its more of RHEL - than SL. 
And it's just Intel .so files [as far > as we know] that's triggering this issue. > > RHEL generally doesn't make changes that break old binaries. But any > code change [which bug fixes are] - can introduce changed behavior > with some stuff.. [esp stuff that might use internal - non-api > features] Reported this issue to Redhat. https://bugzilla.redhat.com/show_bug.cgi?id=1410576 Satish From mvalera at mail.sdsu.edu Thu Jan 5 18:21:39 2017 From: mvalera at mail.sdsu.edu (Manuel Valera) Date: Thu, 5 Jan 2017 16:21:39 -0800 Subject: [petsc-users] Best way to scatter a Seq vector ? Message-ID: Hello Devs, it's me again, I'm trying to distribute a vector to all called processes. The vector would be originally in root as a sequential vector and I would like to scatter it; what would be the best call to do this? I already know how to gather a distributed vector to root with VecScatterCreateToZero, this would be the inverse operation. I'm currently trying with VecScatterCreate() and as of now I'm doing the following: if(rank==0)then call VecCreate(PETSC_COMM_SELF,bp0,ierr); CHKERRQ(ierr) !if i use WORLD !freezes in SetSizes call VecSetSizes(bp0,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) call VecSetType(bp0,VECSEQ,ierr) call VecSetFromOptions(bp0,ierr); CHKERRQ(ierr) call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) !call VecSet(bp0,5.0D0,ierr); CHKERRQ(ierr) call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) call VecAssemblyBegin(bp0,ierr) ; call VecAssemblyEnd(bp0,ierr) !rhs do i=0,nbdp-1,1 ind(i+1) = i enddo call ISCreateGeneral(PETSC_COMM_SELF,nbdp,ind,PETSC_COPY_VALUES,locis,ierr) !call VecScatterCreate(bp0,PETSC_NULL_OBJECT,bp2,is,ctr,ierr) !if i use SELF !freezes here. call VecScatterCreate(bp0,locis,bp2,PETSC_NULL_OBJECT,ctr,ierr) endif bp2 being the receiving MPI vector to scatter to. But it freezes in VecScatterCreate when trying to use more than one processor; what would be a better approach?
Thanks once again, Manuel On Wed, Jan 4, 2017 at 3:30 PM, Manuel Valera wrote: > Thanks i had no idea how to debug and read those logs, that solved this > issue at least (i was sending a message from root to everyone else, but > trying to catch from everyone else including root) > > Until next time, many thanks, > > Manuel > > On Wed, Jan 4, 2017 at 3:23 PM, Matthew Knepley wrote: > >> On Wed, Jan 4, 2017 at 5:21 PM, Manuel Valera >> wrote: >> >>> I did a PetscBarrier just before calling the vicariate routine and im >>> pretty sure im calling it from every processor, code looks like this: >>> >> >> From the gdb trace. >> >> Proc 0: Is in some MPI routine you call yourself, line 113 >> >> Proc 1: Is in VecCreate(), line 130 >> >> You need to fix your communication code. >> >> Matt >> >> >>> call PetscBarrier(PETSC_NULL_OBJECT,ierr) >>> >>> >>> print*,'entering POInit from',rank >>> >>> !call exit() >>> >>> >>> call PetscObjsInit() >>> >>> >>> >>> And output gives: >>> >>> >>> entering POInit from 0 >>> >>> entering POInit from 1 >>> >>> entering POInit from 2 >>> >>> entering POInit from 3 >>> >>> >>> Still hangs in the same way, >>> >>> Thanks, >>> >>> Manuel >>> >>> >>> >>> On Wed, Jan 4, 2017 at 2:55 PM, Manuel Valera >>> wrote: >>> >>>> Thanks for the answers ! >>>> >>>> heres the screenshot of what i got from bt in gdb (great hint in how to >>>> debug in petsc, didn't know that) >>>> >>>> I don't really know what to look at here, >>>> >>>> Thanks, >>>> >>>> Manuel >>>> >>>> On Wed, Jan 4, 2017 at 2:39 PM, Dave May >>>> wrote: >>>> >>>>> Are you certain ALL ranks in PETSC_COMM_WORLD call these function(s). >>>>> These functions cannot be inside if statements like >>>>> if (rank == 0){ >>>>> VecCreateMPI(...) >>>>> } >>>>> >>>>> >>>>> On Wed, 4 Jan 2017 at 23:34, Manuel Valera >>>>> wrote: >>>>> >>>>>> Thanks Dave for the quick answer, appreciate it, >>>>>> >>>>>> I just tried that and it didn't make a difference, any other >>>>>> suggestions ? 
>>>>>> >>>>>> Thanks, >>>>>> Manuel >>>>>> >>>>>> On Wed, Jan 4, 2017 at 2:29 PM, Dave May >>>>>> wrote: >>>>>> >>>>>> You need to swap the order of your function calls. >>>>>> Call VecSetSizes() before VecSetType() >>>>>> >>>>>> Thanks, >>>>>> Dave >>>>>> >>>>>> >>>>>> On Wed, 4 Jan 2017 at 23:21, Manuel Valera >>>>>> wrote: >>>>>> >>>>>> Hello all, happy new year, >>>>>> >>>>>> I'm working on parallelizing my code, it worked and provided some >>>>>> results when i just called more than one processor, but created artifacts >>>>>> because i didn't need one image of the whole program in each processor, >>>>>> conflicting with each other. >>>>>> >>>>>> Since the pressure solver is the main part i need in parallel im >>>>>> chosing mpi to run everything in root processor until its time to solve for >>>>>> pressure, at this point im trying to create a distributed vector using >>>>>> either >>>>>> >>>>>> call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr) >>>>>> or >>>>>> >>>>>> call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) >>>>>> >>>>>> call VecSetType(xp,VECMPI,ierr) >>>>>> >>>>>> call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) >>>>>> >>>>>> >>>>>> >>>>>> In both cases program hangs at this point, something it never >>>>>> happened on the naive way i described before. I've made sure the global >>>>>> size, nbdp, is the same in every processor. What can be wrong? >>>>>> >>>>>> >>>>>> Thanks for your kind help, >>>>>> >>>>>> >>>>>> Manuel. >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -------------- next part -------------- An HTML attachment was scrubbed... 
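For anyone skimming the archive, Dave's two points here - the call order, and calling collectively on all ranks - can be sketched in C as follows (an uncompiled fragment; `nbdp` is the global size from the thread, and a PETSc build is assumed):

```c
Vec xp;

/* These calls are collective on PETSC_COMM_WORLD: every rank must make
   them.  Never put them inside an if (rank == 0) branch, or the other
   ranks wait forever and the program hangs.                            */
ierr = VecCreate(PETSC_COMM_WORLD, &xp);CHKERRQ(ierr);
/* Set the sizes BEFORE the type is instantiated ...                    */
ierr = VecSetSizes(xp, PETSC_DECIDE, nbdp);CHKERRQ(ierr);
/* ... then pick the type (VecSetFromOptions() would also work here).   */
ierr = VecSetType(xp, VECMPI);CHKERRQ(ierr);
```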
URL: From u.rochan at gmail.com Thu Jan 5 18:35:24 2017 From: u.rochan at gmail.com (Rochan Upadhyay) Date: Thu, 5 Jan 2017 18:35:24 -0600 Subject: [petsc-users] a question on DMPlexSetAnchors In-Reply-To: References: Message-ID: Thanks for prompt reply. I don't need hanging nodes or Dirichlet conditions which can be easily done by adding constraint DoFs in the Section as you mention. My requirement is the following: >>> Constraints among Fields: >>> I would recommend just putting the constraint in as an equation. In your case the effect can >>> be non-local, so this seems like the best strategy. The constraint dof is described by an equation. In fact I have easily set up residuals for the system. My (perceived) difficulties are in the Jacobian. My additional Dof is a scalar quantity that is not physically tied to any specific point but needs to be solved tightly coupled to a FEM system. In order to use the global section (default section for the FEM system) to fill up the Mats and Vecs, I have artificially appended this extra dof to a particular point. Now in the Jacobian matrix there will be one extra row and column that, once filled, should be dense (rather block dense) due to the non-local dependence of this extra Dof on field values at some other points. My question is once the DM has allocated non-zeros for the matrix (based on the given section) would it be possible to add non-zeros in non-standard locations (namely a few dense sub-rows and sub-columns) in a way that does not destroy performance. Does using the built in routine DMSetDefaultConstraint (or for that matter the DMPlexSetAnchors) create another (separate) constraint matrix that presumably does an efficient job of incorporating these additional non-zeros ? Or does this Constraint matrix only come in during the DMLocalToGLobal (& vice versa) calls as mentioned in the documentation ? 
I appreciate your reading through my rather verbose mail, especially considering the numerous other queries that you receive each day. Thanks. On Wed, Jan 4, 2017 at 5:59 PM, Matthew Knepley wrote: > On Tue, Jan 3, 2017 at 4:02 PM, Rochan Upadhyay > wrote: > >> I think I sent my previous question (on Dec 28th) to the wrong place >> (petsc-users-request at mcs.anl.gov). >> > > Yes, this is the correct mailing list. > > >> To repeat, >> >> I am having bit of a difficulty in understanding the introduction of >> constraints in DMPlex. From a quick study of the User Manual I gather >> that it is easiest done using DMPlexSetAnchors ? The description of this >> routine says that there is an anchorIS that specifies the anchor points >> (rows in the >> matrix). This is okay and easily understood. >> > > I think this is not the right mechanism for you. > > Anchors: > > This is intended for constraints in the discretization, such as hanging > nodes, which are > purely local, and intended to take place across the entire domain. That > determines the > interface. > > Dirichlet Boundary Conditions: > > For these, I would recommend using the Constraint interface in > PetscSection, which > eliminates these unknowns from the global system, but includes the values > in the local > vectors used in assembly. > > You can also just alter your equations for constrained unknowns. > > Constraints among Fields: > > I would recommend just putting the constraint in as an equation. In your > case the effect can > be non-local, so this seems like the best strategy. > > Thanks, > > Matt > > >> There is also an anchorSection which is described as a map from >> constraint points >> (columns ?) to the anchor points listed in the anchorIS. Should this not >> be a map between >> solution indices (i.e. indices appearing in the vectors and matrices) ? 
>> >> For example I am completely unable to set up a simple constraint matrix >> for the following (say): >> >> Point 1, Field A, B >> Point 2-10 Field A >> At point 1, Field B depends on Field A at points 1-10 >> >> When I set it up it appears to create a matrix where field A depends on >> field A values at points 1-10. >> >> How does the mapping work in this case ? Will the DMPlexSetAnchors() >> routine work >> for this simple scenario ? >> >> If not, is the only recourse to create the constraint matrix oneself >> using DMSetDefaultConstraints ? >> >> Also documentation for DMSetDefaultConstraints is incomplete. >> The function accepts three arguments (dm, section and Mat) but >> what the section is, is not described at all. >> >> I don't know if my question makes any sense. If it does not then it is >> only a reflection of my utter confusion regarding the routine >> DMPlexSetAnchors :-( >> >> Regards, >> Rochan >> > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Thu Jan 5 18:39:12 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 5 Jan 2017 18:39:12 -0600 Subject: [petsc-users] Best way to scatter a Seq vector ? In-Reply-To: References: Message-ID: <7C6E6D1D-AA28-4889-A647-40AB8FA4ED3C@mcs.anl.gov> > On Jan 5, 2017, at 6:21 PM, Manuel Valera wrote: > > Hello Devs is me again, > > I'm trying to distribute a vector to all called processes, the vector would be originally in root as a sequential vector and i would like to scatter it, what would the best call to do this ? > > I already know how to gather a distributed vector to root with VecScatterCreateToZero, this would be the inverse operation, Use the same VecScatter object but with SCATTER_REVERSE; note that you need to reverse the two vector arguments as well.
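Barry's suggestion as a C sketch (untested; `vdist` is the distributed vector, `vseq` the rank-0 sequential copy that VecScatterCreateToZero creates):

```c
Vec        vdist, vseq;
VecScatter ctx;

/* Collective: every rank calls this; on non-root ranks vseq has length 0. */
ierr = VecScatterCreateToZero(vdist, &ctx, &vseq);CHKERRQ(ierr);

/* Gather: distributed -> rank 0 */
ierr = VecScatterBegin(ctx, vdist, vseq, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);
ierr = VecScatterEnd(ctx, vdist, vseq, INSERT_VALUES, SCATTER_FORWARD);CHKERRQ(ierr);

/* Scatter back: rank 0 -> distributed.  Same ctx with SCATTER_REVERSE,
   and note the two vector arguments are swapped relative to the gather. */
ierr = VecScatterBegin(ctx, vseq, vdist, INSERT_VALUES, SCATTER_REVERSE);CHKERRQ(ierr);
ierr = VecScatterEnd(ctx, vseq, vdist, INSERT_VALUES, SCATTER_REVERSE);CHKERRQ(ierr);
```

The freeze reported above is consistent with creating the scatter inside if(rank==0): VecScatter creation is collective, so the other ranks never enter the call.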
> i'm currently trying with VecScatterCreate() and as of now im doing the following: > > > if(rank==0)then > > > call VecCreate(PETSC_COMM_SELF,bp0,ierr); CHKERRQ(ierr) !if i use WORLD > !freezes in SetSizes > call VecSetSizes(bp0,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) > call VecSetType(bp0,VECSEQ,ierr) > call VecSetFromOptions(bp0,ierr); CHKERRQ(ierr) > > > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) > > !call VecSet(bp0,5.0D0,ierr); CHKERRQ(ierr) > > > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) > > call VecAssemblyBegin(bp0,ierr) ; call VecAssemblyEnd(bp0,ierr) !rhs > > do i=0,nbdp-1,1 > ind(i+1) = i > enddo > > call ISCreateGeneral(PETSC_COMM_SELF,nbdp,ind,PETSC_COPY_VALUES,locis,ierr) > > !call VecScatterCreate(bp0,PETSC_NULL_OBJECT,bp2,is,ctr,ierr) !if i use SELF > !freezes here. > > call VecScatterCreate(bp0,locis,bp2,PETSC_NULL_OBJECT,ctr,ierr) > > endif > > bp2 being the receptor MPI vector to scatter to > > But it freezes in VecScatterCreate when trying to use more than one processor, what would be a better approach ? > > > Thanks once again, > > Manuel > > > > > > > > > > > On Wed, Jan 4, 2017 at 3:30 PM, Manuel Valera wrote: > Thanks i had no idea how to debug and read those logs, that solved this issue at least (i was sending a message from root to everyone else, but trying to catch from everyone else including root) > > Until next time, many thanks, > > Manuel > > On Wed, Jan 4, 2017 at 3:23 PM, Matthew Knepley wrote: > On Wed, Jan 4, 2017 at 5:21 PM, Manuel Valera wrote: > I did a PetscBarrier just before calling the vicariate routine and im pretty sure im calling it from every processor, code looks like this: > > From the gdb trace. > > Proc 0: Is in some MPI routine you call yourself, line 113 > > Proc 1: Is in VecCreate(), line 130 > > You need to fix your communication code. 
> > Matt > > call PetscBarrier(PETSC_NULL_OBJECT,ierr) > > print*,'entering POInit from',rank > !call exit() > > call PetscObjsInit() > > > And output gives: > > entering POInit from 0 > entering POInit from 1 > entering POInit from 2 > entering POInit from 3 > > > Still hangs in the same way, > > Thanks, > > Manuel > > > > On Wed, Jan 4, 2017 at 2:55 PM, Manuel Valera wrote: > Thanks for the answers ! > > heres the screenshot of what i got from bt in gdb (great hint in how to debug in petsc, didn't know that) > > I don't really know what to look at here, > > Thanks, > > Manuel > > On Wed, Jan 4, 2017 at 2:39 PM, Dave May wrote: > Are you certain ALL ranks in PETSC_COMM_WORLD call these function(s). These functions cannot be inside if statements like > if (rank == 0){ > VecCreateMPI(...) > } > > > On Wed, 4 Jan 2017 at 23:34, Manuel Valera wrote: > Thanks Dave for the quick answer, appreciate it, > > I just tried that and it didn't make a difference, any other suggestions ? > > Thanks, > Manuel > > On Wed, Jan 4, 2017 at 2:29 PM, Dave May wrote: > You need to swap the order of your function calls. > Call VecSetSizes() before VecSetType() > > Thanks, > Dave > > > On Wed, 4 Jan 2017 at 23:21, Manuel Valera wrote: > Hello all, happy new year, > > I'm working on parallelizing my code, it worked and provided some results when i just called more than one processor, but created artifacts because i didn't need one image of the whole program in each processor, conflicting with each other. 
> > Since the pressure solver is the main part i need in parallel im chosing mpi to run everything in root processor until its time to solve for pressure, at this point im trying to create a distributed vector using either > > call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr) > or > call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) > call VecSetType(xp,VECMPI,ierr) > call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) > > > In both cases program hangs at this point, something it never happened on the naive way i described before. I've made sure the global size, nbdp, is the same in every processor. What can be wrong? > > Thanks for your kind help, > > Manuel. > > > > > > > > > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > From knepley at gmail.com Thu Jan 5 20:40:24 2017 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 5 Jan 2017 20:40:24 -0600 Subject: [petsc-users] a question on DMPlexSetAnchors In-Reply-To: References: Message-ID: On Thu, Jan 5, 2017 at 6:35 PM, Rochan Upadhyay wrote: > Thanks for prompt reply. I don't need hanging nodes or Dirichlet > conditions which can > be easily done by adding constraint DoFs in the Section as you mention. > My requirement is the following: > >>> Constraints among Fields: > >>> I would recommend just putting the constraint in as an equation. In > your case the effect can > >>> be non-local, so this seems like the best strategy. > The constraint dof is described by an equation. In fact I have easily > set up residuals for the system. My (perceived) difficulties are in the > Jacobian. My additional > Dof is a scalar quantity that is not physically tied to any specific point > but needs to be solved tightly coupled > to a FEM system. 
In order to use the global section (default section for > the FEM system) > to fill up the Mats and Vecs, I have artificially appended this extra dof > to a particular point. > Now in the Jacobian matrix there will be one extra row and column that, > once filled, should be dense > (rather block dense) due to the non-local dependence of this extra Dof on > field values at some other points. > Now, if you want good performance, you have to describe the constraint in terms of the topology. All our DMs are setup for local equations. Nonlocal equations are not correctly preallocated. You can a) Just turn off checking for proper preallocation, MatSetOption(A, MAT_NEW_NONZERO_LOCATION_ERR, PETSC_FALSE) b) Do the preallocation yourself If instead, the pattern "fits inside" a common pattern described by these http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMPlexGetAdjacencyUseClosure.html http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/DMPlexSetAdjacencyUseCone.html you can just use that. What creates your constraints? Matt My question is once the DM has allocated non-zeros for the matrix (based on > the given section) would it be > possible to add non-zeros in non-standard locations (namely a few dense > sub-rows and sub-columns) in a way > that does not destroy performance. Does using the built in routine > DMSetDefaultConstraint (or for that > matter the DMPlexSetAnchors) create another (separate) constraint matrix > that presumably does an efficient job > of incorporating these additional non-zeros ? Or does this Constraint > matrix only come in during the DMLocalToGLobal > (& vice versa) calls as mentioned in the documentation ? > I appreciate your reading through my rather verbose mail, especially > considering the numerous other queries that > you receive each day. > Thanks. 
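Matt's option (a) as a code fragment (a sketch; `A` here stands for the DM-created Jacobian matrix):

```c
/* Allow MatSetValues() to insert into locations that were not
   preallocated instead of raising an error.  Extra mallocs during
   assembly may still be slow, which is why option (b), doing the
   preallocation yourself, is preferred for performance.           */
ierr = MatSetOption(A, MAT_NEW_NONZERO_LOCATION_ERR, PETSC_FALSE);CHKERRQ(ierr);
```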
> > On Wed, Jan 4, 2017 at 5:59 PM, Matthew Knepley wrote: > >> On Tue, Jan 3, 2017 at 4:02 PM, Rochan Upadhyay >> wrote: >> >>> I think I sent my previous question (on Dec 28th) to the wrong place >>> (petsc-users-request at mcs.anl.gov). >>> >> >> Yes, this is the correct mailing list. >> >> >>> To repeat, >>> >>> I am having bit of a difficulty in understanding the introduction of >>> constraints in DMPlex. From a quick study of the User Manual I gather >>> that it is easiest done using DMPlexSetAnchors ? The description of this >>> routine says that there is an anchorIS that specifies the anchor points >>> (rows in the >>> matrix). This is okay and easily understood. >>> >> >> I think this is not the right mechanism for you. >> >> Anchors: >> >> This is intended for constraints in the discretization, such as hanging >> nodes, which are >> purely local, and intended to take place across the entire domain. That >> determines the >> interface. >> >> Dirichlet Boundary Conditions: >> >> For these, I would recommend using the Constraint interface in >> PetscSection, which >> eliminates these unknowns from the global system, but includes the values >> in the local >> vectors used in assembly. >> >> You can also just alter your equations for constrained unknowns. >> >> Constraints among Fields: >> >> I would recommend just putting the constraint in as an equation. In your >> case the effect can >> be non-local, so this seems like the best strategy. >> >> Thanks, >> >> Matt >> >> >>> There is also an anchorSection which is described as a map from >>> constraint points >>> (columns ?) to the anchor points listed in the anchorIS. Should this >>> not be a map between >>> solution indices (i.e. indices appearing in the vectors and matrices) ? 
>>> >>> For example I am completely unable to set up a simple constraint matrix >>> for the following (say): >>> >>> Point 1, Field A, B >>> Point 2-10 Field A >>> At point 1, Field B depends on Field A at points 1-10 >>> >>> When I set it up it appears to create a matrix where field A depends on >>> field A values at points 1-10. >>> >>> How does the mapping work in this case ? Will the DMPlexSetAnchors() >>> routine work >>> for this simple scenario ? >>> >>> If not, is the only recourse to create the constraint matrix oneself >>> using DMSetDefaultConstraints ? >>> >>> Also documentation for DMSetDefaultConstraints is incomplete. >>> The function accepts three arguments (dm, section and Mat) but >>> what the section is is not described at all. >>> >>> I don't know if my question makes any sense. If it does not then it is >>> only a reflection of my utter confusion regarding the routine >>> DMPlexSetAnchors :-( >>> >>> Regards, >>> Rochan >>> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From C.Klaij at marin.nl Fri Jan 6 02:28:43 2017 From: C.Klaij at marin.nl (Klaij, Christiaan) Date: Fri, 6 Jan 2017 08:28:43 +0000 Subject: [petsc-users] problems after glibc upgrade to 2.17-157 In-Reply-To: References: <1483455899357.86673@marin.nl> <1483525971870.90369@marin.nl> <1483537021972.79609@marin.nl> <1483541610596.23482@marin.nl> <1483542319358.30478@marin.nl> <1483543177535.33715@marin.nl> <1483605469330.79182@marin.nl> , Message-ID: <1483691323190.12133@marin.nl> Satish, Our sysadmin is not keen on downgrading glibc. 
I'll stick with "--with-shared-libraries=0" for now and wait for SL7.3 with intel 17. Thanks for filing the bugreport at RHEL, very curious to see their response. Chris dr. ir. Christiaan Klaij | CFD Researcher | Research & Development MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl MARIN news: http://www.marin.nl/web/News/News-items/Comparison-of-uRANS-and-BEMBEM-for-propeller-pressure-pulse-prediction.htm ________________________________________ From: Satish Balay Sent: Thursday, January 05, 2017 7:02 PM To: Matthew Knepley Cc: Klaij, Christiaan; petsc-users Subject: Re: [petsc-users] problems after glibc upgrade to 2.17-157 On Thu, 5 Jan 2017, Matthew Knepley wrote: > On Thu, Jan 5, 2017 at 2:37 AM, Klaij, Christiaan wrote: > > So problem solved for now, thanks to you and Matt for all your > > help! On the long run I will go for Intel-17 on SL7.3. > > > > What worries me though is that a simple update (which happens all > > the time according to sysadmin) can have such a dramatic effect. > > > I agree. It seems SL has broken the ability to use shared libraries with a > simple point release. > It seems the robustness of all this process is a myth. Well its more of RHEL - than SL. And its just Intel .so files [as far as we know] thats triggering this issue. RHEL generally doesn't make changes that break old binaries. But any code change [wihch bug fixes are] - can introduce changed behavior with some stuff.. [esp stuff that might use internal - non-api features] RHEL7 glibc had huge number of fixes since 2.17-106.el7_2.8 https://rpmfind.net/linux/RPM/centos/updates/7.3.1611/x86_64/Packages/glibc-2.17-157.el7_3.1.x86_64.html Interestingly the code crashes at dynamic linking time? [ even before main() starts] - perhaps something to do with the way libintlc.so.5 uses memmove? (gdb) where #0 0x00007ffff722865e in ?? 
() #1 0x00007ffff7de9675 in elf_machine_rela (reloc=0x7ffff592ae38, reloc=0x7ffff592ae38, skip_ifunc=, reloc_addr_arg=0x7ffff5b8e8f0, version=0x0, sym=0x7ffff5925f58, map=0x7ffff7fee570) at ../sysdeps/x86_64/dl-machine.h:288 #2 elf_dynamic_do_Rela (skip_ifunc=, lazy=, nrelative=, relsize=, reladdr=, map=0x7ffff7fee570) at do-rel.h:170 #3 _dl_relocate_object (scope=, reloc_mode=, consider_profiling=, consider_profiling at entry=0) at dl-reloc.c:259 #4 0x00007ffff7de0792 in dl_main (phdr=, phdr at entry=0x400040, phnum=, phnum at entry=9, user_entry=user_entry at entry=0x7fffffffceb8, auxv=) at rtld.c:2192 #5 0x00007ffff7df3e36 in _dl_sysdep_start (start_argptr=start_argptr at entry=0x7fffffffcf70, dl_main=dl_main at entry=0x7ffff7dde820 ) at ../elf/dl-sysdep.c:244 #6 0x00007ffff7de1a31 in _dl_start_final (arg=0x7fffffffcf70) at rtld.c:318 #7 _dl_start (arg=0x7fffffffcf70) at rtld.c:544 #8 0x00007ffff7dde1e8 in _start () from /lib64/ld-linux-x86-64.so.2 #9 0x0000000000000001 in ?? () #10 0x00007fffffffd25c in ?? () #11 0x0000000000000000 in ?? 
() (gdb) [balay at localhost benchmarks]$ LD_DEBUG=all ./a.out 2468: symbol=__xpg_basename; lookup in file=./a.out [0] 2468: symbol=__xpg_basename; lookup in file=/soft/com/packages/intel/16/u3/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64/libifcore.so.5 [0] 2468: symbol=__xpg_basename; lookup in file=/lib64/libm.so.6 [0] 2468: symbol=__xpg_basename; lookup in file=/lib64/libgcc_s.so.1 [0] 2468: symbol=__xpg_basename; lookup in file=/lib64/libc.so.6 [0] 2468: binding file /soft/com/packages/intel/16/u3/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64/libintlc.so.5 [0] to /lib64/libc.so.6 [0]: normal symbol `__xpg_basename' 2468: symbol=memmove; lookup in file=./a.out [0] 2468: symbol=memmove; lookup in file=/soft/com/packages/intel/16/u3/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64/libifcore.so.5 [0] 2468: symbol=memmove; lookup in file=/lib64/libm.so.6 [0] 2468: symbol=memmove; lookup in file=/lib64/libgcc_s.so.1 [0] 2468: symbol=memmove; lookup in file=/lib64/libc.so.6 [0] 2468: binding file /soft/com/packages/intel/16/u3/compilers_and_libraries_2016.3.210/linux/compiler/lib/intel64/libintlc.so.5 [0] to /lib64/libc.so.6 [0]: normal symbol `memmove' Segmentation fault (core dumped) If intel-16 compilers are critical - one can always downgrade to old glibc - but then would miss out on all the fixes.. yum downgrade glibc* Satish From Patrick.Begou at legi.grenoble-inp.fr Fri Jan 6 02:39:39 2017 From: Patrick.Begou at legi.grenoble-inp.fr (Patrick Begou) Date: Fri, 6 Jan 2017 09:39:39 +0100 Subject: [petsc-users] make test freeze In-Reply-To: References: <586E3CBF.9020605@legi.grenoble-inp.fr> Message-ID: <586F57CB.2030708@legi.grenoble-inp.fr> Hi Matthew, Launching manualy ex19 shows only one process consuming cpu time, after 952mn I've killed the job this morning. 
[begou at kareline tutorials]$ make ex19 mpicc -o ex19.o -c -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 -I/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git/include -I/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git/GCC48/include `pwd`/ex19.c mpicc -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 -o ex19 ex19.o -L/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git/GCC48/lib -lpetsc -llapack -lblas -lX11 -lhwloc -lssl -lcrypto -L/opt/openmpi173-GCC48-node/lib -L/opt/GCC48c/lib/gcc/x86_64-unknown-linux-gnu/4.8.1 -L/opt/GCC48c/lib64 -L/opt/GCC48c/lib -lmpi_usempi -lmpi_mpifh -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -L/opt/openmpi173-GCC48-node/lib -L/opt/GCC48c/lib/gcc/x86_64-unknown-linux-gnu/4.8.1 -L/opt/GCC48c/lib64 -L/opt/GCC48c/lib -ldl -lmpi -lgcc_s -lpthread -ldl /bin/rm -f ex19.o [begou at kareline tutorials]$ mpiexec -n 2 ./ex19 -snes_monitor top command shows: PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND 32184 begou 20 0 249m 7152 5132 R 99.8 0.0 952:15.97 ex19 32183 begou 20 0 71676 3508 2264 S 0.0 0.0 0:00.04 mpiexec 32185 begou 20 0 185m 7132 5124 S 0.0 0.0 0:00.04 ex19 looks like the first process waiting for something that never occur in MPI communication.... Patrick Matthew Knepley a ?crit : > On Thu, Jan 5, 2017 at 6:31 AM, Patrick Begou > wrote: > > I am unable to run any test on petsc. It looks like if the ex19 run freeze > on the server as it do not use any cpu time and pstree shows > > sshd---bash-+-gedit > `-make---sh-+-gmake---sh---gmake---sh---mpiexec---ex19 > `-tee > I've tested petsc-3.7.5.tar.gz and the latest sources on the Git repository. > > > All make is doing is running ex19, which you can do by hand. 
What do you get for > > cd $PETSC_DIR > cd src/snes/examples/tutorials > make ex19 > mpiexec -n 2 ./ex19 -snes_monitor > > Thanks, > > Matt > > Setup from the Git repo: > ./configure > --prefix=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git-binaries \ > --PETSC_ARCH=GCC48 \ > --PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git \ > --with-shared-libraries=0 \ > --with-fortran-interfaces=1 \ > --with-fortran-kernels=1 \ > --with-cc=mpicc \ > --with-fc=mpif90 \ > --with-cxx=mpicxx > > make PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git > PETSC_ARCH=GCC48 all > > make PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git > PETSC_ARCH=GCC48 install > > make > PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git-binaries > PETSC_ARCH="" test > > > In the log file I've just: > > Running test examples to verify correct installation > Using > PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git-binaries > and PETSC_ARCH= > > I'm using: > gcc version 4.8.1 > Open MPI: 1.7.3 (build with gcc 4.8.1) > (This environment is in production for a while for many local software and > works fine) > > Any suggestion is welcome > > Patrick > > -- > =================================================================== > | Equipe M.O.S.T. | | > | Patrick BEGOU | mailto:Patrick.Begou at grenoble-inp.fr > | > | LEGI | | > | BP 53 X | Tel 04 76 82 51 35 | > | 38041 GRENOBLE CEDEX | Fax 04 76 82 52 71 | > =================================================================== > > > > > -- > What most experimenters take for granted before they begin their experiments > is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener -- =================================================================== | Equipe M.O.S.T. 
| | | Patrick BEGOU |mailto:Patrick.Begou at grenoble-inp.fr | | LEGI | | | BP 53 X | Tel 04 76 82 51 35 | | 38041 GRENOBLE CEDEX | Fax 04 76 82 52 71 | =================================================================== -------------- next part -------------- An HTML attachment was scrubbed... URL: From niko.karin at gmail.com Fri Jan 6 05:17:07 2017 From: niko.karin at gmail.com (Karin&NiKo) Date: Fri, 6 Jan 2017 12:17:07 +0100 Subject: [petsc-users] Fieldsplit with sub pc MUMPS in parallel In-Reply-To: <77441AAA-D70B-449F-A330-B417B6DA64A6@mcs.anl.gov> References: <38F933FD-80B5-401A-8DD3-7C14E0B301F7@mcs.anl.gov> <77441AAA-D70B-449F-A330-B417B6DA64A6@mcs.anl.gov> Message-ID: Barry, you are goddamn right - there was something wrong with the numbering. I fixed it and look what I get. The residuals of outer iterations are exactly the same. Thanks again for your insight and perseverance. Nicolas 2017-01-05 20:17 GMT+01:00 Barry Smith : > > This is not good. Something is out of whack. > > First run 1 and 2 processes with -ksp_view_mat binary -ksp_view_rhs > binary in each case this will generate a file called binaryoutput . Send > both files to petsc-maint at mcs.anl.gov I want to confirm that the > matrices are the same in both cases. > > Barry > > > On Jan 5, 2017, at 10:36 AM, Karin&NiKo wrote: > > > > Dave, > > > > Indeed the residual histories differ. Concerning the IS's, I have > checked them on small cases, so that I am quite sure they are OK. > > What could I do with PETSc to evaluate the ill-conditioning of the > system or of the sub-systems? > > > > Thanks again for your help, > > Nicolas > > > > 2017-01-05 15:46 GMT+01:00 Barry Smith : > > > > > On Jan 5, 2017, at 5:58 AM, Dave May wrote: > > > > > > Do you now see identical residual histories for a job using 1 rank and > 4 ranks? 
> > > > Please send the residual histories with the extra options, I'm > curious too, because a Krylov method should not be needed in the inner > solve, I just asked for it so we can see what the residuals look like. > > Barry > > > > > > If not, I am inclined to believe that the IS's you are defining for > the splits in the parallel case are incorrect. The operator created to > approximate the Schur complement with selfp should not depend on the > number of ranks. > > > > > > Or possibly your problem is horribly ill-conditioned. If it is, then > this could result in slightly different residual histories when using > different numbers of ranks - even if the operators are in fact identical > > > > > > > > > Thanks, > > > Dave > > > > > > > > > > > > > > > On Thu, 5 Jan 2017 at 12:14, Karin&NiKo wrote: > > > Dear Barry, dear Dave, > > > > > > THANK YOU! > > > You two pointed out the right problem. By using the options you > provided (-fieldsplit_0_ksp_type gmres -fieldsplit_0_ksp_pc_side right > -fieldsplit_1_ksp_type gmres -fieldsplit_1_ksp_pc_side right), the solver > converges in 3 iterations whatever the size of the communicator. > > > The whole trick is in the precise solution of the Schur complement, by > using a Krylov method (and not only preonly) *and* applying the > preconditioner on the right (so evaluating convergence on the > unpreconditioned residual). > > > > > > @Barry : the difference you see on the nonzero allocations for the > different runs is just an artefact : when using more than one proc, we > slightly over-estimate the number of non-zero terms.
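The fix Nicolas describes above can be collected into one options fragment. This is an illustrative sketch only: the prefixes, tolerances, and Schur settings are taken from earlier messages in this thread, so adapt them to your own solver prefixes.

```shell
# Schur fieldsplit with exact LU inner solves via MUMPS. The key change is
# replacing the inner "preonly" with gmres plus right preconditioning, so
# the inner convergence test uses the unpreconditioned residual.
-ksp_type fgmres -ksp_rtol 1.0e-5
-pc_type fieldsplit -pc_fieldsplit_type schur
-pc_fieldsplit_schur_factorization_type full
-pc_fieldsplit_schur_precondition selfp
-fieldsplit_0_ksp_type gmres -fieldsplit_0_ksp_pc_side right
-fieldsplit_0_pc_type lu -fieldsplit_0_pc_factor_mat_solver_package mumps
-fieldsplit_1_ksp_type gmres -fieldsplit_1_ksp_pc_side right
-fieldsplit_1_pc_type lu -fieldsplit_1_pc_factor_mat_solver_package mumps
```

These can be given on the command line or in a PETSc options file.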
If I run the same > problem with the -info option, I get extra information : > > > [2] MatAssemblyEnd_SeqAIJ(): Matrix size: 110 X 110; storage space: 0 > unneeded,5048 used > > > [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 271 X 271; storage space: > 4249 unneeded,26167 used > > > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 307 X 307; storage space: > 7988 unneeded,31093 used > > > [2] MatAssemblyEnd_SeqAIJ(): Matrix size: 110 X 244; storage space: 0 > unneeded,6194 used > > > [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 271 X 233; storage space: > 823 unneeded,9975 used > > > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 307 X 197; storage space: > 823 unneeded,8263 used > > > And 5048+26167+31093+6194+9975+8263=86740 which is the number of > exactly estimated nonzero terms for 1 proc. > > > > > > > > > Thank you again! > > > > > > Best regards, > > > Nicolas > > > > > > > > > 2017-01-05 1:36 GMT+01:00 Barry Smith : > > > > > > > > > > > > There is something wrong with your set up. > > > > > > > > > > > > > > > > > > 1 process > > > > > > > > > > > > > > > > > > total: nonzeros=140616, allocated nonzeros=140616 > > > > > > > > > total: nonzeros=68940, allocated nonzeros=68940 > > > > > > > > > total: nonzeros=3584, allocated nonzeros=3584 > > > > > > > > > total: nonzeros=1000, allocated nonzeros=1000 > > > > > > > > > total: nonzeros=8400, allocated nonzeros=8400 > > > > > > > > > > > > > > > > > > 2 processes > > > > > > > > > total: nonzeros=146498, allocated nonzeros=146498 > > > > > > > > > total: nonzeros=73470, allocated nonzeros=73470 > > > > > > > > > total: nonzeros=3038, allocated nonzeros=3038 > > > > > > > > > total: nonzeros=1110, allocated nonzeros=1110 > > > > > > > > > total: nonzeros=6080, allocated nonzeros=6080 > > > > > > > > > total: nonzeros=146498, allocated > nonzeros=146498 > > > > > > > > > total: nonzeros=73470, allocated nonzeros=73470 > > > > > > > > > total: nonzeros=6080, allocated nonzeros=6080 > > > > > > > > > total: nonzeros=2846, 
allocated nonzeros=2846 > > > > > > > > > total: nonzeros=86740, allocated nonzeros=94187 > > > > > > > > > > > > > > > > > > It looks like you are setting up the problem differently in parallel > and seq. If it is suppose to be an identical problem then the number > nonzeros should be the same in at least the first two matrices. > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > On Jan 4, 2017, at 3:39 PM, Karin&NiKo wrote: > > > > > > > > > > > > > > > > > > > > Dear Petsc team, > > > > > > > > > > > > > > > > > > > > I am (still) trying to solve Biot's poroelasticity problem : > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > I am using a mixed P2-P1 finite element discretization. The matrix > of the discretized system in binary format is attached to this email. > > > > > > > > > > > > > > > > > > > > I am using the fieldsplit framework to solve the linear system. > Since I am facing some troubles, I have decided to go back to simple > things. Here are the options I am using : > > > > > > > > > > > > > > > > > > > > -ksp_rtol 1.0e-5 > > > > > > > > > > -ksp_type fgmres > > > > > > > > > > -pc_type fieldsplit > > > > > > > > > > -pc_fieldsplit_schur_factorization_type full > > > > > > > > > > -pc_fieldsplit_type schur > > > > > > > > > > -pc_fieldsplit_schur_precondition selfp > > > > > > > > > > -fieldsplit_0_pc_type lu > > > > > > > > > > -fieldsplit_0_pc_factor_mat_solver_package mumps > > > > > > > > > > -fieldsplit_0_ksp_type preonly > > > > > > > > > > -fieldsplit_0_ksp_converged_reason > > > > > > > > > > -fieldsplit_1_pc_type lu > > > > > > > > > > -fieldsplit_1_pc_factor_mat_solver_package mumps > > > > > > > > > > -fieldsplit_1_ksp_type preonly > > > > > > > > > > -fieldsplit_1_ksp_converged_reason > > > > > > > > > > > > > > > > > > > > On a single proc, everything runs fine : the solver converges in 3 > iterations, according to the theory (see Run-1-proc.txt [contains > -log_view]). 
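As a quick sanity check on the -info storage counts quoted earlier in this thread (plain shell arithmetic, nothing PETSc-specific), the six per-rank "used" entries do sum exactly to the 1-process count:

```shell
# "used" nonzero counts from the six MatAssemblyEnd_SeqAIJ lines above
total=$((5048 + 26167 + 31093 + 6194 + 9975 + 8263))
echo "$total"   # prints 86740, the exactly estimated count on 1 process
```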
> > > > > > > > > > > > > > > > > > > > On 2 procs, the solver converges in 28 iterations (see > Run-2-proc.txt). > > > > > > > > > > > > > > > > > > > > On 3 procs, the solver converges in 91 iterations (see > Run-3-proc.txt). > > > > > > > > > > > > > > > > > > > > I do not understand this behavior : since MUMPS is a parallel direct > solver, shouldn't the solver converge in max 3 iterations whatever the > number of procs? > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > Thanks for your precious help, > > > > > > > > > > Nicolas > > > > > > > > > > > > > > > > > > > > <1_Warning.txt> > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- 0 KSP unpreconditioned resid norm 2.375592557658e+10 true resid norm 2.375592557658e+10 ||r(i)||/||b|| 1.000000000000e+00 Residual norms for fieldsplit_1_ solve. 0 KSP Residual norm 0.000000000000e+00 Linear fieldsplit_1_ solve converged due to CONVERGED_ATOL iterations 0 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.000000000000e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.000000000000e+00 1 KSP Residual norm 4.313410630558e-15 1 KSP Residual norm 4.313410630558e-15 1 KSP unpreconditioned resid norm 6.190344827565e+04 true resid norm 6.190344827565e+04 ||r(i)||/||b|| 2.605810835536e-06 Residual norms for fieldsplit_1_ solve. 0 KSP Residual norm 1.000000000000e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.407021980813e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.407021980813e+05 1 KSP Residual norm 1.553056052550e-09 1 KSP Residual norm 1.553056052550e-09 1 KSP Residual norm 1.810321861046e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.443756559170e+05 Residual norms for fieldsplit_0_ solve. 
0 KSP Residual norm 4.443756559170e+05 1 KSP Residual norm 6.852859005090e-10 1 KSP Residual norm 6.852859005090e-10 2 KSP Residual norm 4.110160641015e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.459077029344e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.459077029344e+05 1 KSP Residual norm 3.391519472149e-10 1 KSP Residual norm 3.391519472149e-10 3 KSP Residual norm 9.399363055282e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.045754196688e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.045754196688e+05 1 KSP Residual norm 4.488756555375e-10 1 KSP Residual norm 4.488756555375e-10 4 KSP Residual norm 1.571092856159e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.162336928973e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.162336928973e+05 1 KSP Residual norm 2.684362494425e-10 1 KSP Residual norm 2.684362494425e-10 5 KSP Residual norm 1.963417150656e-04 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.145763546913e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.145763546913e+05 1 KSP Residual norm 1.680082274413e-10 1 KSP Residual norm 1.680082274413e-10 6 KSP Residual norm 2.086077021964e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.507749963975e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.507749963975e+05 1 KSP Residual norm 1.773409123937e-10 1 KSP Residual norm 1.773409123937e-10 7 KSP Residual norm 2.638900162683e-06 Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 7 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.087534725346e+05 Residual norms for fieldsplit_0_ solve. 
0 KSP Residual norm 2.087534725346e+05 1 KSP Residual norm 8.396841831477e-10 1 KSP Residual norm 8.396841831477e-10 2 KSP unpreconditioned resid norm 1.633570314420e-01 true resid norm 1.633570534028e-01 ||r(i)||/||b|| 6.876476055467e-12 KSP Object: 1 MPI processes type: fgmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=1000, initial guess is zero tolerances: relative=1e-06, absolute=1e-50, divergence=10000. right preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: 1 MPI processes type: fieldsplit FieldSplit with Schur preconditioner, factorization UPPER Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse Split info: Split number 0 Defined by IS Split number 1 Defined by IS KSP solver for A00 block KSP Object: (fieldsplit_0_) 1 MPI processes type: fgmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. right preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: (fieldsplit_0_) 1 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: nd factor fill ratio given 0., needed 0. 
Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=624, cols=624 package used to perform factorization: mumps total: nonzeros=140616, allocated nonzeros=140616 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 0 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 1 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 0 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 0 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -32 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. RINFO(1) (local estimated flops for the elimination after analysis): [0] 1.99982e+07 RINFO(2) (local estimated flops for the assembly after factorization): [0] 153549. 
RINFO(3) (local estimated flops for the elimination after factorization): [0] 1.99982e+07 INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 3 INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 3 INFO(23) (num of pivots eliminated on this processor after factorization): [0] 624 RINFOG(1) (global estimated flops for the elimination after analysis): 1.99982e+07 RINFOG(2) (global estimated flops for the assembly after factorization): 153549. RINFOG(3) (global estimated flops for the elimination after factorization): 1.99982e+07 (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) INFOG(3) (estimated real workspace for factors on all processors after analysis): 140616 INFOG(4) (estimated integer workspace for factors on all processors after analysis): 4995 INFOG(5) (estimated maximum front size in the complete tree): 252 INFOG(6) (number of nodes in the complete tree): 23 INFOG(7) (ordering option effectively use after analysis): 2 INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 140616 INFOG(10) (total integer space store the matrix factors after factorization): 4995 INFOG(11) (order of largest frontal matrix after factorization): 252 INFOG(12) (number of off-diagonal pivots): 0 INFOG(13) (number of delayed pivots after factorization): 0 INFOG(14) (number of memory compress after factorization): 0 INFOG(15) (number of steps of iterative refinement after solution): 0 INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 3 INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 3 INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 3 INFOG(19) 
(size of all MUMPS internal data allocated during factorization: sum over all processors): 3 INFOG(20) (estimated number of entries in the factors): 140616 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 3 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 3 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 140616 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 2, 2 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): 7 INFOG(34) (exponent of the determinant if determinant is requested): 0 linear system matrix = precond matrix: Mat Object: (fieldsplit_0_) 1 MPI processes type: seqaij rows=624, cols=624 total: nonzeros=68940, allocated nonzeros=68940 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 208 nodes, limit used is 5 KSP solver for S = A11 - A10 inv(A00) A01 KSP Object: (fieldsplit_1_) 1 MPI processes type: fgmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. right preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: (fieldsplit_1_) 1 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: nd factor fill ratio given 0., needed 0. 
Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=64, cols=64 package used to perform factorization: mumps total: nonzeros=3584, allocated nonzeros=3584 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 0 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 1 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 0 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 0 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -32 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. RINFO(1) (local estimated flops for the elimination after analysis): [0] 123808. RINFO(2) (local estimated flops for the assembly after factorization): [0] 1024. RINFO(3) (local estimated flops for the elimination after factorization): [0] 123808. 
INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 1 INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 1 INFO(23) (num of pivots eliminated on this processor after factorization): [0] 64 RINFOG(1) (global estimated flops for the elimination after analysis): 123808. RINFOG(2) (global estimated flops for the assembly after factorization): 1024. RINFOG(3) (global estimated flops for the elimination after factorization): 123808. (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) INFOG(3) (estimated real workspace for factors on all processors after analysis): 3584 INFOG(4) (estimated integer workspace for factors on all processors after analysis): 222 INFOG(5) (estimated maximum front size in the complete tree): 48 INFOG(6) (number of nodes in the complete tree): 2 INFOG(7) (ordering option effectively use after analysis): 2 INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3584 INFOG(10) (total integer space store the matrix factors after factorization): 222 INFOG(11) (order of largest frontal matrix after factorization): 48 INFOG(12) (number of off-diagonal pivots): 0 INFOG(13) (number of delayed pivots after factorization): 0 INFOG(14) (number of memory compress after factorization): 0 INFOG(15) (number of steps of iterative refinement after solution): 0 INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 INFOG(20) 
(estimated number of entries in the factors): 3584 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3584 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): 7 INFOG(34) (exponent of the determinant if determinant is requested): 0 linear system matrix followed by preconditioner matrix: Mat Object: (fieldsplit_1_) 1 MPI processes type: schurcomplement rows=64, cols=64 Schur complement A11 - A10 inv(A00) A01 A11 Mat Object: (fieldsplit_1_) 1 MPI processes type: seqaij rows=64, cols=64 total: nonzeros=1000, allocated nonzeros=1000 total number of mallocs used during MatSetValues calls =0 not using I-node routines A10 Mat Object: 1 MPI processes type: seqaij rows=64, cols=624 total: nonzeros=8400, allocated nonzeros=8400 total number of mallocs used during MatSetValues calls =0 not using I-node routines KSP of A00 KSP Object: (fieldsplit_0_) 1 MPI processes type: fgmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
right preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: (fieldsplit_0_) 1 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: nd factor fill ratio given 0., needed 0. Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=624, cols=624 package used to perform factorization: mumps total: nonzeros=140616, allocated nonzeros=140616 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 0 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 1 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 0 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 0 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -32 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. 
RINFO(1) (local estimated flops for the elimination after analysis): [0] 1.99982e+07 RINFO(2) (local estimated flops for the assembly after factorization): [0] 153549. RINFO(3) (local estimated flops for the elimination after factorization): [0] 1.99982e+07 INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 3 INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 3 INFO(23) (num of pivots eliminated on this processor after factorization): [0] 624 RINFOG(1) (global estimated flops for the elimination after analysis): 1.99982e+07 RINFOG(2) (global estimated flops for the assembly after factorization): 153549. RINFOG(3) (global estimated flops for the elimination after factorization): 1.99982e+07 (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) INFOG(3) (estimated real workspace for factors on all processors after analysis): 140616 INFOG(4) (estimated integer workspace for factors on all processors after analysis): 4995 INFOG(5) (estimated maximum front size in the complete tree): 252 INFOG(6) (number of nodes in the complete tree): 23 INFOG(7) (ordering option effectively use after analysis): 2 INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 140616 INFOG(10) (total integer space store the matrix factors after factorization): 4995 INFOG(11) (order of largest frontal matrix after factorization): 252 INFOG(12) (number of off-diagonal pivots): 0 INFOG(13) (number of delayed pivots after factorization): 0 INFOG(14) (number of memory compress after factorization): 0 INFOG(15) (number of steps of iterative refinement after solution): 0 INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 3 INFOG(17) (estimated size of all MUMPS internal data for factorization after 
analysis: sum over all processors): 3 INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 3 INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 3 INFOG(20) (estimated number of entries in the factors): 140616 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 3 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 3 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 140616 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 2, 2 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): 7 INFOG(34) (exponent of the determinant if determinant is requested): 0 linear system matrix = precond matrix: Mat Object: (fieldsplit_0_) 1 MPI processes type: seqaij rows=624, cols=624 total: nonzeros=68940, allocated nonzeros=68940 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 208 nodes, limit used is 5 A01 Mat Object: 1 MPI processes type: seqaij rows=624, cols=64 total: nonzeros=8400, allocated nonzeros=8400 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 208 nodes, limit used is 5 Mat Object: 1 MPI processes type: seqaij rows=64, cols=64 total: nonzeros=2744, allocated nonzeros=2744 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 28 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij 
rows=688, cols=688 total: nonzeros=86740, allocated nonzeros=86740 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 208 nodes, limit used is 5 -------------- next part -------------- 0 KSP unpreconditioned resid norm 2.375592557658e+10 true resid norm 2.375592557658e+10 ||r(i)||/||b|| 1.000000000000e+00 Residual norms for fieldsplit_1_ solve. 0 KSP Residual norm 0.000000000000e+00 Linear fieldsplit_1_ solve converged due to CONVERGED_ATOL iterations 0 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.000000000000e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.000000000000e+00 1 KSP Residual norm 5.153232313369e-15 1 KSP Residual norm 5.153232313369e-15 1 KSP unpreconditioned resid norm 6.190344827565e+04 true resid norm 6.190344827565e+04 ||r(i)||/||b|| 2.605810835536e-06 Residual norms for fieldsplit_1_ solve. 0 KSP Residual norm 1.000000000000e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.407021980813e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.407021980813e+05 1 KSP Residual norm 1.656008198025e-09 1 KSP Residual norm 1.656008198025e-09 1 KSP Residual norm 1.810321861046e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.443756559170e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.443756559170e+05 1 KSP Residual norm 8.800214973038e-10 1 KSP Residual norm 8.800214973038e-10 2 KSP Residual norm 4.110160641015e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.459077029344e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.459077029344e+05 1 KSP Residual norm 3.905900670485e-10 1 KSP Residual norm 3.905900670485e-10 3 KSP Residual norm 9.399363055282e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.045754196688e+05 Residual norms for fieldsplit_0_ solve. 
    0 KSP Residual norm 4.045754196688e+05
    1 KSP Residual norm 5.032293180728e-10
    1 KSP Residual norm 5.032293180728e-10
    4 KSP Residual norm 1.571092856159e-03
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.162336928973e+05
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.162336928973e+05
    1 KSP Residual norm 3.602474304247e-10
    1 KSP Residual norm 3.602474304247e-10
    5 KSP Residual norm 1.963417150656e-04
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.145763546913e+05
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.145763546913e+05
    1 KSP Residual norm 1.976198090797e-10
    1 KSP Residual norm 1.976198090797e-10
    6 KSP Residual norm 2.086077021964e-05
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.507749963978e+05
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.507749963978e+05
    1 KSP Residual norm 1.812230342781e-10
    1 KSP Residual norm 1.812230342781e-10
    7 KSP Residual norm 2.638900162683e-06
  Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 7
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.087534725346e+05
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.087534725346e+05
    1 KSP Residual norm 8.130910373979e-10
    1 KSP Residual norm 8.130910373979e-10
  2 KSP unpreconditioned resid norm 1.633570291986e-01 true resid norm 1.633570660459e-01 ||r(i)||/||b|| 6.876476587674e-12
KSP Object: 2 MPI processes
  type: fgmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=1000, initial guess is zero
  tolerances: relative=1e-06, absolute=1e-50, divergence=10000.
  right preconditioning
  using UNPRECONDITIONED norm type for convergence test
PC Object: 2 MPI processes
  type: fieldsplit
    FieldSplit with Schur preconditioner, factorization UPPER
    Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse
    Split info:
      Split number 0 Defined by IS
      Split number 1 Defined by IS
  KSP solver for A00 block
    KSP Object: (fieldsplit_0_) 2 MPI processes
      type: fgmres
        GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
        GMRES: happy breakdown tolerance 1e-30
      maximum iterations=10000, initial guess is zero
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      right preconditioning
      using UNPRECONDITIONED norm type for convergence test
    PC Object: (fieldsplit_0_) 2 MPI processes
      type: lu
        LU: out-of-place factorization
        tolerance for zero pivot 2.22045e-14
        matrix ordering: natural
        factor fill ratio given 0., needed 0.
        Factored matrix follows:
          Mat Object: 2 MPI processes
            type: mpiaij
            rows=624, cols=624
            package used to perform factorization: mumps
            total: nonzeros=140616, allocated nonzeros=140616
            total number of mallocs used during MatSetValues calls =0
            MUMPS run parameters:
              SYM (matrix type): 0
              PAR (host participation): 1
              ICNTL(1) (output for error): 6
              ICNTL(2) (output of diagnostic msg): 0
              ICNTL(3) (output for global info): 0
              ICNTL(4) (level of printing): 0
              ICNTL(5) (input mat struct): 0
              ICNTL(6) (matrix prescaling): 7
              ICNTL(7) (sequential matrix ordering): 7
              ICNTL(8) (scaling strategy): 77
              ICNTL(10) (max num of refinements): 0
              ICNTL(11) (error analysis): 0
              ICNTL(12) (efficiency control): 1
              ICNTL(13) (efficiency control): 0
              ICNTL(14) (percentage of estimated workspace increase): 20
              ICNTL(18) (input mat struct): 3
              ICNTL(19) (Schur complement info): 0
              ICNTL(20) (rhs sparse pattern): 0
              ICNTL(21) (solution struct): 1
              ICNTL(22) (in-core/out-of-core facility): 0
              ICNTL(23) (max size of memory that can be allocated locally): 0
              ICNTL(24) (detection of null pivot rows): 0
              ICNTL(25) (computation of a null space basis): 0
              ICNTL(26) (Schur options for rhs or solution): 0
              ICNTL(27) (experimental parameter): -32
              ICNTL(28) (use parallel or sequential ordering): 1
              ICNTL(29) (parallel ordering): 0
              ICNTL(30) (user-specified set of entries in inv(A)): 0
              ICNTL(31) (factors are discarded in the solve phase): 0
              ICNTL(33) (compute determinant): 0
              CNTL(1) (relative pivoting threshold): 0.01
              CNTL(2) (stopping criterion of refinement): 1.49012e-08
              CNTL(3) (absolute pivoting threshold): 0.
              CNTL(4) (value of static pivoting): -1.
              CNTL(5) (fixation for null pivots): 0.
              RINFO(1) (local estimated flops for the elimination after analysis): [0] 1.07031e+07 [1] 9.29512e+06
              RINFO(2) (local estimated flops for the assembly after factorization): [0] 61713. [1] 91836.
              RINFO(3) (local estimated flops for the elimination after factorization): [0] 1.07031e+07 [1] 9.29512e+06
              INFO(15) (estimated size (in MB) of MUMPS internal data for running numerical factorization): [0] 7 [1] 7
              INFO(16) (size (in MB) of MUMPS internal data used during numerical factorization): [0] 7 [1] 7
              INFO(23) (num of pivots eliminated on this processor after factorization): [0] 249 [1] 375
              RINFOG(1) (global estimated flops for the elimination after analysis): 1.99982e+07
              RINFOG(2) (global estimated flops for the assembly after factorization): 153549.
              RINFOG(3) (global estimated flops for the elimination after factorization): 1.99982e+07
              (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0)
              INFOG(3) (estimated real workspace for factors on all processors after analysis): 140616
              INFOG(4) (estimated integer workspace for factors on all processors after analysis): 4995
              INFOG(5) (estimated maximum front size in the complete tree): 252
              INFOG(6) (number of nodes in the complete tree): 23
              INFOG(7) (ordering option effectively used after analysis): 2
              INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100
              INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 140616
              INFOG(10) (total integer space to store the matrix factors after factorization): 4995
              INFOG(11) (order of largest frontal matrix after factorization): 252
              INFOG(12) (number of off-diagonal pivots): 0
              INFOG(13) (number of delayed pivots after factorization): 0
              INFOG(14) (number of memory compress after factorization): 0
              INFOG(15) (number of steps of iterative refinement after solution): 0
              INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 7
              INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 14
              INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 7
              INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 14
              INFOG(20) (estimated number of entries in the factors): 140616
              INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 7
              INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 14
              INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0
              INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1
              INFOG(25) (after factorization: number of pivots modified by static pivoting): 0
              INFOG(28) (after factorization: number of null pivots encountered): 0
              INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 140616
              INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 1, 2
              INFOG(32) (after analysis: type of analysis done): 1
              INFOG(33) (value used for ICNTL(8)): 7
              INFOG(34) (exponent of the determinant if determinant is requested): 0
      linear system matrix = precond matrix:
      Mat Object: (fieldsplit_0_) 2 MPI processes
        type: mpiaij
        rows=624, cols=624
        total: nonzeros=68940, allocated nonzeros=68940
        total number of mallocs used during MatSetValues calls =0
        using I-node (on process 0) routines: found 94 nodes, limit used is 5
  KSP solver for S = A11 - A10 inv(A00) A01
    KSP Object: (fieldsplit_1_) 2 MPI processes
      type: fgmres
        GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
        GMRES: happy breakdown tolerance 1e-30
      maximum iterations=10000, initial guess is zero
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      right preconditioning
      using UNPRECONDITIONED norm type for convergence test
    PC Object: (fieldsplit_1_) 2 MPI processes
      type: lu
        LU: out-of-place factorization
        tolerance for zero pivot 2.22045e-14
        matrix ordering: natural
        factor fill ratio given 0., needed 0.
        Factored matrix follows:
          Mat Object: 2 MPI processes
            type: mpiaij
            rows=64, cols=64
            package used to perform factorization: mumps
            total: nonzeros=3584, allocated nonzeros=3584
            total number of mallocs used during MatSetValues calls =0
            MUMPS run parameters:
              SYM (matrix type): 0
              PAR (host participation): 1
              ICNTL(1) (output for error): 6
              ICNTL(2) (output of diagnostic msg): 0
              ICNTL(3) (output for global info): 0
              ICNTL(4) (level of printing): 0
              ICNTL(5) (input mat struct): 0
              ICNTL(6) (matrix prescaling): 7
              ICNTL(7) (sequential matrix ordering): 7
              ICNTL(8) (scaling strategy): 77
              ICNTL(10) (max num of refinements): 0
              ICNTL(11) (error analysis): 0
              ICNTL(12) (efficiency control): 1
              ICNTL(13) (efficiency control): 0
              ICNTL(14) (percentage of estimated workspace increase): 20
              ICNTL(18) (input mat struct): 3
              ICNTL(19) (Schur complement info): 0
              ICNTL(20) (rhs sparse pattern): 0
              ICNTL(21) (solution struct): 1
              ICNTL(22) (in-core/out-of-core facility): 0
              ICNTL(23) (max size of memory that can be allocated locally): 0
              ICNTL(24) (detection of null pivot rows): 0
              ICNTL(25) (computation of a null space basis): 0
              ICNTL(26) (Schur options for rhs or solution): 0
              ICNTL(27) (experimental parameter): -32
              ICNTL(28) (use parallel or sequential ordering): 1
              ICNTL(29) (parallel ordering): 0
              ICNTL(30) (user-specified set of entries in inv(A)): 0
              ICNTL(31) (factors are discarded in the solve phase): 0
              ICNTL(33) (compute determinant): 0
              CNTL(1) (relative pivoting threshold): 0.01
              CNTL(2) (stopping criterion of refinement): 1.49012e-08
              CNTL(3) (absolute pivoting threshold): 0.
              CNTL(4) (value of static pivoting): -1.
              CNTL(5) (fixation for null pivots): 0.
              RINFO(1) (local estimated flops for the elimination after analysis): [0] 0. [1] 123808.
              RINFO(2) (local estimated flops for the assembly after factorization): [0] 0. [1] 1024.
              RINFO(3) (local estimated flops for the elimination after factorization): [0] 0. [1] 123808.
              INFO(15) (estimated size (in MB) of MUMPS internal data for running numerical factorization): [0] 1 [1] 1
              INFO(16) (size (in MB) of MUMPS internal data used during numerical factorization): [0] 1 [1] 1
              INFO(23) (num of pivots eliminated on this processor after factorization): [0] 0 [1] 64
              RINFOG(1) (global estimated flops for the elimination after analysis): 123808.
              RINFOG(2) (global estimated flops for the assembly after factorization): 1024.
              RINFOG(3) (global estimated flops for the elimination after factorization): 123808.
              (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0)
              INFOG(3) (estimated real workspace for factors on all processors after analysis): 3584
              INFOG(4) (estimated integer workspace for factors on all processors after analysis): 222
              INFOG(5) (estimated maximum front size in the complete tree): 48
              INFOG(6) (number of nodes in the complete tree): 2
              INFOG(7) (ordering option effectively used after analysis): 2
              INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100
              INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3584
              INFOG(10) (total integer space to store the matrix factors after factorization): 222
              INFOG(11) (order of largest frontal matrix after factorization): 48
              INFOG(12) (number of off-diagonal pivots): 0
              INFOG(13) (number of delayed pivots after factorization): 0
              INFOG(14) (number of memory compress after factorization): 0
              INFOG(15) (number of steps of iterative refinement after solution): 0
              INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1
              INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 2
              INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1
              INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors):
              2
              INFOG(20) (estimated number of entries in the factors): 3584
              INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1
              INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 2
              INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0
              INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1
              INFOG(25) (after factorization: number of pivots modified by static pivoting): 0
              INFOG(28) (after factorization: number of null pivots encountered): 0
              INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3584
              INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0
              INFOG(32) (after analysis: type of analysis done): 1
              INFOG(33) (value used for ICNTL(8)): 7
              INFOG(34) (exponent of the determinant if determinant is requested): 0
      linear system matrix followed by preconditioner matrix:
      Mat Object: (fieldsplit_1_) 2 MPI processes
        type: schurcomplement
        rows=64, cols=64
          Schur complement A11 - A10 inv(A00) A01
          A11
            Mat Object: (fieldsplit_1_) 2 MPI processes
              type: mpiaij
              rows=64, cols=64
              total: nonzeros=1000, allocated nonzeros=1000
              total number of mallocs used during MatSetValues calls =0
              not using I-node (on process 0) routines
          A10
            Mat Object: 2 MPI processes
              type: mpiaij
              rows=64, cols=624
              total: nonzeros=8400, allocated nonzeros=8400
              total number of mallocs used during MatSetValues calls =0
              not using I-node (on process 0) routines
          KSP of A00
            KSP Object: (fieldsplit_0_) 2 MPI processes
              type: fgmres
                GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
                GMRES: happy breakdown tolerance 1e-30
              maximum iterations=10000, initial guess is zero
              tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
              right preconditioning
              using UNPRECONDITIONED norm type for convergence test
            PC Object: (fieldsplit_0_) 2 MPI processes
              type: lu
                LU: out-of-place factorization
                tolerance for zero pivot 2.22045e-14
                matrix ordering: natural
                factor fill ratio given 0., needed 0.
                Factored matrix follows:
                  Mat Object: 2 MPI processes
                    type: mpiaij
                    rows=624, cols=624
                    package used to perform factorization: mumps
                    total: nonzeros=140616, allocated nonzeros=140616
                    total number of mallocs used during MatSetValues calls =0
                    MUMPS run parameters:
                      SYM (matrix type): 0
                      PAR (host participation): 1
                      ICNTL(1) (output for error): 6
                      ICNTL(2) (output of diagnostic msg): 0
                      ICNTL(3) (output for global info): 0
                      ICNTL(4) (level of printing): 0
                      ICNTL(5) (input mat struct): 0
                      ICNTL(6) (matrix prescaling): 7
                      ICNTL(7) (sequential matrix ordering): 7
                      ICNTL(8) (scaling strategy): 77
                      ICNTL(10) (max num of refinements): 0
                      ICNTL(11) (error analysis): 0
                      ICNTL(12) (efficiency control): 1
                      ICNTL(13) (efficiency control): 0
                      ICNTL(14) (percentage of estimated workspace increase): 20
                      ICNTL(18) (input mat struct): 3
                      ICNTL(19) (Schur complement info): 0
                      ICNTL(20) (rhs sparse pattern): 0
                      ICNTL(21) (solution struct): 1
                      ICNTL(22) (in-core/out-of-core facility): 0
                      ICNTL(23) (max size of memory that can be allocated locally): 0
                      ICNTL(24) (detection of null pivot rows): 0
                      ICNTL(25) (computation of a null space basis): 0
                      ICNTL(26) (Schur options for rhs or solution): 0
                      ICNTL(27) (experimental parameter): -32
                      ICNTL(28) (use parallel or sequential ordering): 1
                      ICNTL(29) (parallel ordering): 0
                      ICNTL(30) (user-specified set of entries in inv(A)): 0
                      ICNTL(31) (factors are discarded in the solve phase): 0
                      ICNTL(33) (compute determinant): 0
                      CNTL(1) (relative pivoting threshold): 0.01
                      CNTL(2) (stopping criterion of refinement): 1.49012e-08
                      CNTL(3) (absolute pivoting threshold): 0.
                      CNTL(4) (value of static pivoting): -1.
                      CNTL(5) (fixation for null pivots): 0.
                      RINFO(1) (local estimated flops for the elimination after analysis): [0] 1.07031e+07 [1] 9.29512e+06
                      RINFO(2) (local estimated flops for the assembly after factorization): [0] 61713. [1] 91836.
                      RINFO(3) (local estimated flops for the elimination after factorization): [0] 1.07031e+07 [1] 9.29512e+06
                      INFO(15) (estimated size (in MB) of MUMPS internal data for running numerical factorization): [0] 7 [1] 7
                      INFO(16) (size (in MB) of MUMPS internal data used during numerical factorization): [0] 7 [1] 7
                      INFO(23) (num of pivots eliminated on this processor after factorization): [0] 249 [1] 375
                      RINFOG(1) (global estimated flops for the elimination after analysis): 1.99982e+07
                      RINFOG(2) (global estimated flops for the assembly after factorization): 153549.
                      RINFOG(3) (global estimated flops for the elimination after factorization): 1.99982e+07
                      (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0)
                      INFOG(3) (estimated real workspace for factors on all processors after analysis): 140616
                      INFOG(4) (estimated integer workspace for factors on all processors after analysis): 4995
                      INFOG(5) (estimated maximum front size in the complete tree): 252
                      INFOG(6) (number of nodes in the complete tree): 23
                      INFOG(7) (ordering option effectively used after analysis): 2
                      INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100
                      INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 140616
                      INFOG(10) (total integer space to store the matrix factors after factorization): 4995
                      INFOG(11) (order of largest frontal matrix after factorization): 252
                      INFOG(12) (number of off-diagonal pivots): 0
                      INFOG(13) (number of delayed pivots after factorization): 0
                      INFOG(14) (number of memory compress after factorization): 0
                      INFOG(15) (number of steps of iterative refinement after solution): 0
                      INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 7
                      INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 14
                      INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 7
                      INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 14
                      INFOG(20) (estimated number of entries in the factors): 140616
                      INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 7
                      INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 14
                      INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0
                      INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1
                      INFOG(25) (after factorization: number of pivots modified by static pivoting): 0
                      INFOG(28) (after factorization: number of null pivots encountered): 0
                      INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 140616
                      INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 1, 2
                      INFOG(32) (after analysis: type of analysis done): 1
                      INFOG(33) (value used for ICNTL(8)): 7
                      INFOG(34) (exponent of the determinant if determinant is requested): 0
              linear system matrix = precond matrix:
              Mat Object: (fieldsplit_0_) 2 MPI processes
                type: mpiaij
                rows=624, cols=624
                total: nonzeros=68940, allocated nonzeros=68940
                total number of mallocs used during MatSetValues calls =0
                using I-node (on process 0) routines: found 94 nodes, limit used is 5
          A01
            Mat Object: 2 MPI processes
              type: mpiaij
              rows=624, cols=64
              total: nonzeros=8400, allocated nonzeros=8400
              total number of mallocs used during MatSetValues calls =0
              using I-node (on process 0) routines: found 94 nodes, limit used is 5
      Mat Object: 2 MPI processes
        type: mpiaij
        rows=64, cols=64
        total: nonzeros=2744, allocated nonzeros=2744
        total number of mallocs used during MatSetValues calls =0
        using I-node (on process 0) routines: found
        28 nodes, limit used is 5
  linear system matrix = precond matrix:
  Mat Object: 2 MPI processes
    type: mpiaij
    rows=688, cols=688
    total: nonzeros=86740, allocated nonzeros=86740
    total number of mallocs used during MatSetValues calls =0
    using I-node (on process 0) routines: found 94 nodes, limit used is 5
-------------- next part --------------
  0 KSP unpreconditioned resid norm 2.375592557658e+10 true resid norm 2.375592557658e+10 ||r(i)||/||b|| 1.000000000000e+00
    Residual norms for fieldsplit_1_ solve.
    0 KSP Residual norm 0.000000000000e+00
  Linear fieldsplit_1_ solve converged due to CONVERGED_ATOL iterations 0
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.000000000000e+00
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 1.000000000000e+00
    1 KSP Residual norm 4.416192634469e-15
    1 KSP Residual norm 4.416192634469e-15
  1 KSP unpreconditioned resid norm 6.190344827565e+04 true resid norm 6.190344827565e+04 ||r(i)||/||b|| 2.605810835536e-06
    Residual norms for fieldsplit_1_ solve.
    0 KSP Residual norm 1.000000000000e+00
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 4.407021980813e+05
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 4.407021980813e+05
    1 KSP Residual norm 1.700738199034e-09
    1 KSP Residual norm 1.700738199034e-09
    1 KSP Residual norm 1.810321861046e-01
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 4.443756559170e+05
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 4.443756559170e+05
    1 KSP Residual norm 7.962068082549e-10
    1 KSP Residual norm 7.962068082549e-10
    2 KSP Residual norm 4.110160641015e-02
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.459077029344e+05
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.459077029344e+05
    1 KSP Residual norm 3.289249772080e-10
    1 KSP Residual norm 3.289249772080e-10
    3 KSP Residual norm 9.399363055282e-03
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 4.045754196688e+05
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 4.045754196688e+05
    1 KSP Residual norm 4.755072958655e-10
    1 KSP Residual norm 4.755072958655e-10
    4 KSP Residual norm 1.571092856159e-03
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.162336928973e+05
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.162336928973e+05
    1 KSP Residual norm 3.150998708365e-10
    1 KSP Residual norm 3.150998708365e-10
    5 KSP Residual norm 1.963417150656e-04
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.145763546914e+05
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 3.145763546914e+05
    1 KSP Residual norm 1.938942931417e-10
    1 KSP Residual norm 1.938942931417e-10
    6 KSP Residual norm 2.086077021964e-05
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.507749963998e+05
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.507749963998e+05
    1 KSP Residual norm 1.928793262485e-10
    1 KSP Residual norm 1.928793262485e-10
    7 KSP Residual norm 2.638900162677e-06
  Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 7
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.087534725346e+05
    Residual norms for fieldsplit_0_ solve.
    0 KSP Residual norm 2.087534725346e+05
    1 KSP Residual norm 8.110797351987e-10
    1 KSP Residual norm 8.110797351987e-10
  2 KSP unpreconditioned resid norm 1.633570270762e-01 true resid norm 1.633570670086e-01 ||r(i)||/||b|| 6.876476628201e-12
KSP Object: 3 MPI processes
  type: fgmres
    GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
    GMRES: happy breakdown tolerance 1e-30
  maximum iterations=1000, initial guess is zero
  tolerances: relative=1e-06, absolute=1e-50, divergence=10000.
  right preconditioning
  using UNPRECONDITIONED norm type for convergence test
PC Object: 3 MPI processes
  type: fieldsplit
    FieldSplit with Schur preconditioner, factorization UPPER
    Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse
    Split info:
      Split number 0 Defined by IS
      Split number 1 Defined by IS
  KSP solver for A00 block
    KSP Object: (fieldsplit_0_) 3 MPI processes
      type: fgmres
        GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
        GMRES: happy breakdown tolerance 1e-30
      maximum iterations=10000, initial guess is zero
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      right preconditioning
      using UNPRECONDITIONED norm type for convergence test
    PC Object: (fieldsplit_0_) 3 MPI processes
      type: lu
        LU: out-of-place factorization
        tolerance for zero pivot 2.22045e-14
        matrix ordering: natural
        factor fill ratio given 0., needed 0.
        Factored matrix follows:
          Mat Object: 3 MPI processes
            type: mpiaij
            rows=624, cols=624
            package used to perform factorization: mumps
            total: nonzeros=140616, allocated nonzeros=140616
            total number of mallocs used during MatSetValues calls =0
            MUMPS run parameters:
              SYM (matrix type): 0
              PAR (host participation): 1
              ICNTL(1) (output for error): 6
              ICNTL(2) (output of diagnostic msg): 0
              ICNTL(3) (output for global info): 0
              ICNTL(4) (level of printing): 0
              ICNTL(5) (input mat struct): 0
              ICNTL(6) (matrix prescaling): 7
              ICNTL(7) (sequential matrix ordering): 7
              ICNTL(8) (scaling strategy): 77
              ICNTL(10) (max num of refinements): 0
              ICNTL(11) (error analysis): 0
              ICNTL(12) (efficiency control): 1
              ICNTL(13) (efficiency control): 0
              ICNTL(14) (percentage of estimated workspace increase): 20
              ICNTL(18) (input mat struct): 3
              ICNTL(19) (Schur complement info): 0
              ICNTL(20) (rhs sparse pattern): 0
              ICNTL(21) (solution struct): 1
              ICNTL(22) (in-core/out-of-core facility): 0
              ICNTL(23) (max size of memory that can be allocated locally): 0
              ICNTL(24) (detection of null pivot rows): 0
              ICNTL(25) (computation of a null space basis): 0
              ICNTL(26) (Schur options for rhs or solution): 0
              ICNTL(27) (experimental parameter): -32
              ICNTL(28) (use parallel or sequential ordering): 1
              ICNTL(29) (parallel ordering): 0
              ICNTL(30) (user-specified set of entries in inv(A)): 0
              ICNTL(31) (factors are discarded in the solve phase): 0
              ICNTL(33) (compute determinant): 0
              CNTL(1) (relative pivoting threshold): 0.01
              CNTL(2) (stopping criterion of refinement): 1.49012e-08
              CNTL(3) (absolute pivoting threshold): 0.
              CNTL(4) (value of static pivoting): -1.
              CNTL(5) (fixation for null pivots): 0.
              RINFO(1) (local estimated flops for the elimination after analysis): [0] 7.17743e+06 [1] 9.83155e+06 [2] 2.98924e+06
              RINFO(2) (local estimated flops for the assembly after factorization): [0] 65835. [1] 54225. [2] 33489.
              RINFO(3) (local estimated flops for the elimination after factorization): [0] 7.17743e+06 [1] 9.83155e+06 [2] 2.98924e+06
              INFO(15) (estimated size (in MB) of MUMPS internal data for running numerical factorization): [0] 9 [1] 9 [2] 9
              INFO(16) (size (in MB) of MUMPS internal data used during numerical factorization): [0] 9 [1] 9 [2] 9
              INFO(23) (num of pivots eliminated on this processor after factorization): [0] 291 [1] 201 [2] 132
              RINFOG(1) (global estimated flops for the elimination after analysis): 1.99982e+07
              RINFOG(2) (global estimated flops for the assembly after factorization): 153549.
              RINFOG(3) (global estimated flops for the elimination after factorization): 1.99982e+07
              (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0)
              INFOG(3) (estimated real workspace for factors on all processors after analysis): 140616
              INFOG(4) (estimated integer workspace for factors on all processors after analysis): 4995
              INFOG(5) (estimated maximum front size in the complete tree): 252
              INFOG(6) (number of nodes in the complete tree): 23
              INFOG(7) (ordering option effectively used after analysis): 2
              INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100
              INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 140616
              INFOG(10) (total integer space to store the matrix factors after factorization): 4995
              INFOG(11) (order of largest frontal matrix after factorization): 252
              INFOG(12) (number of off-diagonal pivots): 0
              INFOG(13) (number of delayed pivots after factorization): 0
              INFOG(14) (number of memory compress after factorization): 0
              INFOG(15) (number of steps of iterative refinement after solution): 0
              INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 9
              INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 27
              INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 9
              INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 27
              INFOG(20) (estimated number of entries in the factors): 140616
              INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 9
              INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 27
              INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0
              INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1
              INFOG(25) (after factorization: number of pivots modified by static pivoting): 0
              INFOG(28) (after factorization: number of null pivots encountered): 0
              INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 140616
              INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 1, 2
              INFOG(32) (after analysis: type of analysis done): 1
              INFOG(33) (value used for ICNTL(8)): 7
              INFOG(34) (exponent of the determinant if determinant is requested): 0
      linear system matrix = precond matrix:
      Mat Object: (fieldsplit_0_) 3 MPI processes
        type: mpiaij
        rows=624, cols=624
        total: nonzeros=68940, allocated nonzeros=68940
        total number of mallocs used during MatSetValues calls =0
        using I-node (on process 0) routines: found 58 nodes, limit used is 5
  KSP solver for S = A11 - A10 inv(A00) A01
    KSP Object: (fieldsplit_1_) 3 MPI processes
      type: fgmres
        GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement
        GMRES: happy breakdown tolerance 1e-30
      maximum iterations=10000, initial guess is zero
      tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
      right preconditioning
      using UNPRECONDITIONED norm type for convergence test
    PC Object: (fieldsplit_1_) 3 MPI processes
      type: lu
        LU: out-of-place factorization
        tolerance for zero pivot 2.22045e-14
        matrix ordering: natural
        factor fill ratio given 0., needed 0.
        Factored matrix follows:
          Mat Object: 3 MPI processes
            type: mpiaij
            rows=64, cols=64
            package used to perform factorization: mumps
            total: nonzeros=3584, allocated nonzeros=3584
            total number of mallocs used during MatSetValues calls =0
            MUMPS run parameters:
              SYM (matrix type): 0
              PAR (host participation): 1
              ICNTL(1) (output for error): 6
              ICNTL(2) (output of diagnostic msg): 0
              ICNTL(3) (output for global info): 0
              ICNTL(4) (level of printing): 0
              ICNTL(5) (input mat struct): 0
              ICNTL(6) (matrix prescaling): 7
              ICNTL(7) (sequential matrix ordering): 7
              ICNTL(8) (scaling strategy): 77
              ICNTL(10) (max num of refinements): 0
              ICNTL(11) (error analysis): 0
              ICNTL(12) (efficiency control): 1
              ICNTL(13) (efficiency control): 0
              ICNTL(14) (percentage of estimated workspace increase): 20
              ICNTL(18) (input mat struct): 3
              ICNTL(19) (Schur complement info): 0
              ICNTL(20) (rhs sparse pattern): 0
              ICNTL(21) (solution struct): 1
              ICNTL(22) (in-core/out-of-core facility): 0
              ICNTL(23) (max size of memory that can be allocated locally): 0
              ICNTL(24) (detection of null pivot rows): 0
              ICNTL(25) (computation of a null space basis): 0
              ICNTL(26) (Schur options for rhs or solution): 0
              ICNTL(27) (experimental parameter): -32
              ICNTL(28) (use parallel or sequential ordering): 1
              ICNTL(29) (parallel ordering): 0
              ICNTL(30) (user-specified set of entries in inv(A)): 0
              ICNTL(31) (factors are discarded in the solve phase): 0
              ICNTL(33) (compute determinant): 0
              CNTL(1) (relative pivoting threshold): 0.01
              CNTL(2) (stopping criterion of refinement): 1.49012e-08
              CNTL(3) (absolute pivoting threshold): 0.
              CNTL(4) (value of static pivoting): -1.
              CNTL(5) (fixation for null pivots): 0.
              RINFO(1) (local estimated flops for the elimination after analysis): [0] 0. [1] 0. [2] 123808.
              RINFO(2) (local estimated flops for the assembly after factorization): [0] 0. [1] 0. [2] 1024.
              RINFO(3) (local estimated flops for the elimination after factorization): [0] 0. [1] 0. [2] 123808.
              INFO(15) (estimated size (in MB) of MUMPS internal data for running numerical factorization): [0] 1 [1] 1 [2] 1
              INFO(16) (size (in MB) of MUMPS internal data used during numerical factorization): [0] 1 [1] 1 [2] 1
              INFO(23) (num of pivots eliminated on this processor after factorization): [0] 0 [1] 0 [2] 64
              RINFOG(1) (global estimated flops for the elimination after analysis): 123808.
              RINFOG(2) (global estimated flops for the assembly after factorization): 1024.
              RINFOG(3) (global estimated flops for the elimination after factorization): 123808.
              (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0)
              INFOG(3) (estimated real workspace for factors on all processors after analysis): 3584
              INFOG(4) (estimated integer workspace for factors on all processors after analysis): 222
              INFOG(5) (estimated maximum front size in the complete tree): 48
              INFOG(6) (number of nodes in the complete tree): 2
              INFOG(7) (ordering option effectively used after analysis): 2
              INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100
              INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3584
              INFOG(10) (total integer space to store the matrix factors after factorization): 222
              INFOG(11) (order of largest frontal matrix after factorization): 48
              INFOG(12) (number of off-diagonal pivots): 0
              INFOG(13) (number of delayed pivots after factorization): 0
              INFOG(14) (number of memory compress after factorization): 0
              INFOG(15) (number of steps of iterative refinement after solution): 0
              INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1
              INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 3
              INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1
              INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over
all processors): 3 INFOG(20) (estimated number of entries in the factors): 3584 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 3 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3584 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): 7 INFOG(34) (exponent of the determinant if determinant is requested): 0 linear system matrix followed by preconditioner matrix: Mat Object: (fieldsplit_1_) 3 MPI processes type: schurcomplement rows=64, cols=64 Schur complement A11 - A10 inv(A00) A01 A11 Mat Object: (fieldsplit_1_) 3 MPI processes type: mpiaij rows=64, cols=64 total: nonzeros=1000, allocated nonzeros=1000 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines A10 Mat Object: 3 MPI processes type: mpiaij rows=64, cols=624 total: nonzeros=8400, allocated nonzeros=8400 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines KSP of A00 KSP Object: (fieldsplit_0_) 3 MPI processes type: fgmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
right preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: (fieldsplit_0_) 3 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 0., needed 0. Factored matrix follows: Mat Object: 3 MPI processes type: mpiaij rows=624, cols=624 package used to perform factorization: mumps total: nonzeros=140616, allocated nonzeros=140616 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 0 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 1 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 3 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 1 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -32 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. 
RINFO(1) (local estimated flops for the elimination after analysis): [0] 7.17743e+06 [1] 9.83155e+06 [2] 2.98924e+06 RINFO(2) (local estimated flops for the assembly after factorization): [0] 65835. [1] 54225. [2] 33489. RINFO(3) (local estimated flops for the elimination after factorization): [0] 7.17743e+06 [1] 9.83155e+06 [2] 2.98924e+06 INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 9 [1] 9 [2] 9 INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 9 [1] 9 [2] 9 INFO(23) (num of pivots eliminated on this processor after factorization): [0] 291 [1] 201 [2] 132 RINFOG(1) (global estimated flops for the elimination after analysis): 1.99982e+07 RINFOG(2) (global estimated flops for the assembly after factorization): 153549. RINFOG(3) (global estimated flops for the elimination after factorization): 1.99982e+07 (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) INFOG(3) (estimated real workspace for factors on all processors after analysis): 140616 INFOG(4) (estimated integer workspace for factors on all processors after analysis): 4995 INFOG(5) (estimated maximum front size in the complete tree): 252 INFOG(6) (number of nodes in the complete tree): 23 INFOG(7) (ordering option effectively use after analysis): 2 INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 140616 INFOG(10) (total integer space store the matrix factors after factorization): 4995 INFOG(11) (order of largest frontal matrix after factorization): 252 INFOG(12) (number of off-diagonal pivots): 0 INFOG(13) (number of delayed pivots after factorization): 0 INFOG(14) (number of memory compress after factorization): 0 INFOG(15) (number of steps of iterative refinement after solution): 0 INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: 
value on the most memory consuming processor): 9 INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 27 INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 9 INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 27 INFOG(20) (estimated number of entries in the factors): 140616 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 9 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 27 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 140616 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 1, 2 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): 7 INFOG(34) (exponent of the determinant if determinant is requested): 0 linear system matrix = precond matrix: Mat Object: (fieldsplit_0_) 3 MPI processes type: mpiaij rows=624, cols=624 total: nonzeros=68940, allocated nonzeros=68940 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 58 nodes, limit used is 5 A01 Mat Object: 3 MPI processes type: mpiaij rows=624, cols=64 total: nonzeros=8400, allocated nonzeros=8400 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 58 nodes, limit used is 5 Mat Object: 3 MPI processes type: mpiaij rows=64, cols=64 total: nonzeros=2744, allocated nonzeros=2744 total number of mallocs used during 
MatSetValues calls =0 using I-node (on process 0) routines: found 27 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 3 MPI processes type: mpiaij rows=688, cols=688 total: nonzeros=86740, allocated nonzeros=86740 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 58 nodes, limit used is 5 -------------- next part -------------- 0 KSP unpreconditioned resid norm 2.375592557658e+10 true resid norm 2.375592557658e+10 ||r(i)||/||b|| 1.000000000000e+00 Residual norms for fieldsplit_1_ solve. 0 KSP Residual norm 0.000000000000e+00 Linear fieldsplit_1_ solve converged due to CONVERGED_ATOL iterations 0 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.000000000000e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 1.000000000000e+00 1 KSP Residual norm 4.482205480004e-15 1 KSP Residual norm 4.482205480004e-15 1 KSP unpreconditioned resid norm 6.190344827565e+04 true resid norm 6.190344827565e+04 ||r(i)||/||b|| 2.605810835536e-06 Residual norms for fieldsplit_1_ solve. 0 KSP Residual norm 1.000000000000e+00 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.407021980813e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.407021980813e+05 1 KSP Residual norm 1.535392735729e-09 1 KSP Residual norm 1.535392735729e-09 1 KSP Residual norm 1.810321861046e-01 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.443756559170e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.443756559170e+05 1 KSP Residual norm 8.820499188125e-10 1 KSP Residual norm 8.820499188125e-10 2 KSP Residual norm 4.110160641015e-02 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.459077029344e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.459077029344e+05 1 KSP Residual norm 3.847344827509e-10 1 KSP Residual norm 3.847344827509e-10 3 KSP Residual norm 9.399363055282e-03 Residual norms for fieldsplit_0_ solve. 
0 KSP Residual norm 4.045754196688e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 4.045754196688e+05 1 KSP Residual norm 4.971843515516e-10 1 KSP Residual norm 4.971843515516e-10 4 KSP Residual norm 1.571092856159e-03 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.162336928973e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.162336928973e+05 1 KSP Residual norm 3.320224974767e-10 1 KSP Residual norm 3.320224974767e-10 5 KSP Residual norm 1.963417150656e-04 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.145763546914e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 3.145763546914e+05 1 KSP Residual norm 2.070077616994e-10 1 KSP Residual norm 2.070077616994e-10 6 KSP Residual norm 2.086077021964e-05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.507749963991e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.507749963991e+05 1 KSP Residual norm 2.082157055179e-10 1 KSP Residual norm 2.082157055179e-10 7 KSP Residual norm 2.638900162679e-06 Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 7 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.087534725346e+05 Residual norms for fieldsplit_0_ solve. 0 KSP Residual norm 2.087534725346e+05 1 KSP Residual norm 9.092777244907e-10 1 KSP Residual norm 9.092777244907e-10 2 KSP unpreconditioned resid norm 1.633570323877e-01 true resid norm 1.633570547236e-01 ||r(i)||/||b|| 6.876476111066e-12 KSP Object: 4 MPI processes type: fgmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=1000, initial guess is zero tolerances: relative=1e-06, absolute=1e-50, divergence=10000. 
right preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: 4 MPI processes type: fieldsplit FieldSplit with Schur preconditioner, factorization UPPER Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse Split info: Split number 0 Defined by IS Split number 1 Defined by IS KSP solver for A00 block KSP Object: (fieldsplit_0_) 4 MPI processes type: fgmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. right preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: (fieldsplit_0_) 4 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 0., needed 0. Factored matrix follows: Mat Object: 4 MPI processes type: mpiaij rows=624, cols=624 package used to perform factorization: mumps total: nonzeros=140616, allocated nonzeros=140616 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 0 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 1 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 3 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 1 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated 
locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -32 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. RINFO(1) (local estimated flops for the elimination after analysis): [0] 5.99914e+06 [1] 2.55926e+06 [2] 2.03824e+06 [3] 9.40158e+06 RINFO(2) (local estimated flops for the assembly after factorization): [0] 44154. [1] 28881. [2] 30897. [3] 49617. RINFO(3) (local estimated flops for the elimination after factorization): [0] 5.99914e+06 [1] 2.55926e+06 [2] 2.03824e+06 [3] 9.40158e+06 INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 11 [1] 11 [2] 11 [3] 12 INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 11 [1] 11 [2] 11 [3] 12 INFO(23) (num of pivots eliminated on this processor after factorization): [0] 315 [1] 84 [2] 72 [3] 153 RINFOG(1) (global estimated flops for the elimination after analysis): 1.99982e+07 RINFOG(2) (global estimated flops for the assembly after factorization): 153549. 
RINFOG(3) (global estimated flops for the elimination after factorization): 1.99982e+07 (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) INFOG(3) (estimated real workspace for factors on all processors after analysis): 140616 INFOG(4) (estimated integer workspace for factors on all processors after analysis): 4995 INFOG(5) (estimated maximum front size in the complete tree): 252 INFOG(6) (number of nodes in the complete tree): 23 INFOG(7) (ordering option effectively use after analysis): 2 INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 140616 INFOG(10) (total integer space store the matrix factors after factorization): 4995 INFOG(11) (order of largest frontal matrix after factorization): 252 INFOG(12) (number of off-diagonal pivots): 0 INFOG(13) (number of delayed pivots after factorization): 0 INFOG(14) (number of memory compress after factorization): 0 INFOG(15) (number of steps of iterative refinement after solution): 0 INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 12 INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 45 INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 12 INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 45 INFOG(20) (estimated number of entries in the factors): 140616 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 11 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 44 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) 
(after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 140616 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 1, 2 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): 7 INFOG(34) (exponent of the determinant if determinant is requested): 0 linear system matrix = precond matrix: Mat Object: (fieldsplit_0_) 4 MPI processes type: mpiaij rows=624, cols=624 total: nonzeros=68940, allocated nonzeros=68940 total number of mallocs used during MatSetValues calls =0 using I-node (on process 0) routines: found 43 nodes, limit used is 5 KSP solver for S = A11 - A10 inv(A00) A01 KSP Object: (fieldsplit_1_) 4 MPI processes type: fgmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. right preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: (fieldsplit_1_) 4 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 0., needed 0. 
Factored matrix follows: Mat Object: 4 MPI processes type: mpiaij rows=64, cols=64 package used to perform factorization: mumps total: nonzeros=3584, allocated nonzeros=3584 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 0 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 1 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 3 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 1 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -32 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. RINFO(1) (local estimated flops for the elimination after analysis): [0] 0. [1] 0. [2] 0. [3] 123808. RINFO(2) (local estimated flops for the assembly after factorization): [0] 0. [1] 0. [2] 0. [3] 1024. RINFO(3) (local estimated flops for the elimination after factorization): [0] 0. [1] 0. [2] 0. [3] 123808. 
INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 1 [1] 1 [2] 1 [3] 1 INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 1 [1] 1 [2] 1 [3] 1 INFO(23) (num of pivots eliminated on this processor after factorization): [0] 0 [1] 0 [2] 0 [3] 64 RINFOG(1) (global estimated flops for the elimination after analysis): 123808. RINFOG(2) (global estimated flops for the assembly after factorization): 1024. RINFOG(3) (global estimated flops for the elimination after factorization): 123808. (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) INFOG(3) (estimated real workspace for factors on all processors after analysis): 3584 INFOG(4) (estimated integer workspace for factors on all processors after analysis): 222 INFOG(5) (estimated maximum front size in the complete tree): 48 INFOG(6) (number of nodes in the complete tree): 2 INFOG(7) (ordering option effectively use after analysis): 2 INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3584 INFOG(10) (total integer space store the matrix factors after factorization): 222 INFOG(11) (order of largest frontal matrix after factorization): 48 INFOG(12) (number of off-diagonal pivots): 0 INFOG(13) (number of delayed pivots after factorization): 0 INFOG(14) (number of memory compress after factorization): 0 INFOG(15) (number of steps of iterative refinement after solution): 0 INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 4 INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 INFOG(19) (size of all MUMPS internal data allocated during 
factorization: sum over all processors): 4 INFOG(20) (estimated number of entries in the factors): 3584 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 4 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3584 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): 7 INFOG(34) (exponent of the determinant if determinant is requested): 0 linear system matrix followed by preconditioner matrix: Mat Object: (fieldsplit_1_) 4 MPI processes type: schurcomplement rows=64, cols=64 Schur complement A11 - A10 inv(A00) A01 A11 Mat Object: (fieldsplit_1_) 4 MPI processes type: mpiaij rows=64, cols=64 total: nonzeros=1000, allocated nonzeros=1000 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines A10 Mat Object: 4 MPI processes type: mpiaij rows=64, cols=624 total: nonzeros=8400, allocated nonzeros=8400 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines KSP of A00 KSP Object: (fieldsplit_0_) 4 MPI processes type: fgmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
right preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: (fieldsplit_0_) 4 MPI processes type: lu LU: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 0., needed 0. Factored matrix follows: Mat Object: 4 MPI processes type: mpiaij rows=624, cols=624 package used to perform factorization: mumps total: nonzeros=140616, allocated nonzeros=140616 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 0 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 1 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 3 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 1 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -32 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. 
RINFO(1) (local estimated flops for the elimination after analysis): [0] 5.99914e+06 [1] 2.55926e+06 [2] 2.03824e+06 [3] 9.40158e+06
RINFO(2) (local estimated flops for the assembly after factorization): [0] 44154. [1] 28881. [2] 30897. [3] 49617.
RINFO(3) (local estimated flops for the elimination after factorization): [0] 5.99914e+06 [1] 2.55926e+06 [2] 2.03824e+06 [3] 9.40158e+06
INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 11 [1] 11 [2] 11 [3] 12
INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 11 [1] 11 [2] 11 [3] 12
INFO(23) (num of pivots eliminated on this processor after factorization): [0] 315 [1] 84 [2] 72 [3] 153
RINFOG(1) (global estimated flops for the elimination after analysis): 1.99982e+07
RINFOG(2) (global estimated flops for the assembly after factorization): 153549.
RINFOG(3) (global estimated flops for the elimination after factorization): 1.99982e+07
(RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0)
INFOG(3) (estimated real workspace for factors on all processors after analysis): 140616
INFOG(4) (estimated integer workspace for factors on all processors after analysis): 4995
INFOG(5) (estimated maximum front size in the complete tree): 252
INFOG(6) (number of nodes in the complete tree): 23
INFOG(7) (ordering option effectively use after analysis): 2
INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100
INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 140616
INFOG(10) (total integer space store the matrix factors after factorization): 4995
INFOG(11) (order of largest frontal matrix after factorization): 252
INFOG(12) (number of off-diagonal pivots): 0
INFOG(13) (number of delayed pivots after factorization): 0
INFOG(14) (number of memory compress after factorization): 0
INFOG(15) (number of steps of iterative refinement after solution): 0
INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 12
INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 45
INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 12
INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 45
INFOG(20) (estimated number of entries in the factors): 140616
INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 11
INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 44
INFOG(23) (after analysis: value of ICNTL(6) effectively used): 0
INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1
INFOG(25) (after factorization: number of pivots modified by static pivoting): 0
INFOG(28) (after factorization: number of null pivots encountered): 0
INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 140616
INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 1, 2
INFOG(32) (after analysis: type of analysis done): 1
INFOG(33) (value used for ICNTL(8)): 7
INFOG(34) (exponent of the determinant if determinant is requested): 0
linear system matrix = precond matrix:
Mat Object: (fieldsplit_0_) 4 MPI processes
  type: mpiaij
  rows=624, cols=624
  total: nonzeros=68940, allocated nonzeros=68940
  total number of mallocs used during MatSetValues calls =0
  using I-node (on process 0) routines: found 43 nodes, limit used is 5
A01
Mat Object: 4 MPI processes
  type: mpiaij
  rows=624, cols=64
  total: nonzeros=8400, allocated nonzeros=8400
  total number of mallocs used during MatSetValues calls =0
  using I-node (on process 0) routines: found 43 nodes, limit used is 5
Mat Object: 4 MPI processes
  type: mpiaij
  rows=64, cols=64
  total: nonzeros=2744, allocated nonzeros=2744
  total number of mallocs used during MatSetValues calls =0
  using I-node (on process 0) routines: found 23 nodes, limit used is 5
linear system matrix = precond matrix:
Mat Object: 4 MPI processes
  type: mpiaij
  rows=688, cols=688
  total: nonzeros=86740, allocated nonzeros=86740
  total number of mallocs used during MatSetValues calls =0
  using I-node (on process 0) routines: found 43 nodes, limit used is 5
From knepley at gmail.com Fri Jan 6 07:36:30 2017 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 6 Jan 2017 07:36:30 -0600 Subject: [petsc-users] make test freeze In-Reply-To: <586F57CB.2030708@legi.grenoble-inp.fr> References: <586E3CBF.9020605@legi.grenoble-inp.fr> <586F57CB.2030708@legi.grenoble-inp.fr> Message-ID: On Fri, Jan 6, 2017 at 2:39 AM, Patrick Begou < Patrick.Begou at legi.grenoble-inp.fr> wrote: > Hi Matthew, > > Launching manualy ex19 shows only one process consuming cpu time, after > 952mn I've killed the job this morning. > > [begou at kareline tutorials]$ make ex19 > mpicc -o ex19.o -c -Wall -Wwrite-strings -Wno-strict-aliasing > -Wno-unknown-pragmas -fvisibility=hidden -g3 > -I/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git/include > -I/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git/GCC48/include > `pwd`/ex19.c > mpicc -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas > -fvisibility=hidden -g3 -o ex19 ex19.o -L/kareline/data/begou/YALES2_ > 1.0.0/PREREQUIS/petsc-git/GCC48/lib -lpetsc -llapack -lblas -lX11 > -lhwloc -lssl -lcrypto -L/opt/openmpi173-GCC48-node/lib > -L/opt/GCC48c/lib/gcc/x86_64-unknown-linux-gnu/4.8.1 -L/opt/GCC48c/lib64 > -L/opt/GCC48c/lib -lmpi_usempi -lmpi_mpifh -lgfortran -lm -lgfortran -lm > -lquadmath -lm -lmpi_cxx -lstdc++ -L/opt/openmpi173-GCC48-node/lib > -L/opt/GCC48c/lib/gcc/x86_64-unknown-linux-gnu/4.8.1 -L/opt/GCC48c/lib64 > -L/opt/GCC48c/lib -ldl -lmpi -lgcc_s -lpthread -ldl > /bin/rm -f ex19.o > [begou at kareline tutorials]$ mpiexec -n 2 ./ex19
-snes_monitor
>
> top command shows:
>
> PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
> 32184 begou 20 0 249m 7152 5132 R 99.8 0.0 952:15.97 ex19
> 32183 begou 20 0 71676 3508 2264 S 0.0 0.0 0:00.04 mpiexec
> 32185 begou 20 0 185m 7132 5124 S 0.0 0.0 0:00.04 ex19
>
> looks like the first process waiting for something that never occur in MPI communication....

1000s of people run this every day, so I am skeptical of that explanation. However, this could happen if the 'mpiexec' in your path does not match the MPI libraries that PETSc is linked to. Matt

> Patrick > > Matthew Knepley wrote: > On Thu, Jan 5, 2017 at 6:31 AM, Patrick Begou < > Patrick.Begou at legi.grenoble-inp.fr> > wrote: > >> I am unable to run any test on petsc. It looks like if the ex19 run >> freeze on the server as it do not use any cpu time and pstree shows
>> sshd---bash-+-gedit
>>             `-make---sh-+-gmake---sh---gmake---sh---mpiexec---ex19
>>                         `-tee
>> I've tested petsc-3.7.5.tar.gz and the latest sources on the Git >> repository. >> > > All make is doing is running ex19, which you can do by hand.
What do you > get for > > cd $PETSC_DIR > cd src/snes/examples/tutorials > make ex19 > mpiexec -n 2 ./ex19 -snes_monitor > > Thanks, > > Matt > > >> Setup from the Git repo: >> ./configure --prefix=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git-binaries >> \ >> --PETSC_ARCH=GCC48 \ >> --PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git \ >> --with-shared-libraries=0 \ >> --with-fortran-interfaces=1 \ >> --with-fortran-kernels=1 \ >> --with-cc=mpicc \ >> --with-fc=mpif90 \ >> --with-cxx=mpicxx >> >> make PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git >> PETSC_ARCH=GCC48 all >> >> make PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git >> PETSC_ARCH=GCC48 install >> >> make PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git-binaries >> PETSC_ARCH="" test >> >> >> In the log file I've just: >> >> Running test examples to verify correct installation >> Using PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git-binaries >> and PETSC_ARCH= >> >> I'm using: >> gcc version 4.8.1 >> Open MPI: 1.7.3 (build with gcc 4.8.1) >> (This environment is in production for a while for many local software >> and works fine) >> >> Any suggestion is welcome >> >> Patrick >> >> -- >> =================================================================== >> | Equipe M.O.S.T. | | >> | Patrick BEGOU | mailto:Patrick.Begou at grenoble-inp.fr | >> | LEGI | | >> | BP 53 X | Tel 04 76 82 51 35 | >> | 38041 GRENOBLE CEDEX | Fax 04 76 82 52 71 | >> =================================================================== >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > > > > -- > =================================================================== > | Equipe M.O.S.T. 
| | > | Patrick BEGOU | mailto:Patrick.Begou at grenoble-inp.fr | > | LEGI | | > | BP 53 X | Tel 04 76 82 51 35 | > | 38041 GRENOBLE CEDEX | Fax 04 76 82 52 71 | > =================================================================== > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From Patrick.Begou at legi.grenoble-inp.fr Fri Jan 6 07:52:45 2017 From: Patrick.Begou at legi.grenoble-inp.fr (Patrick Begou) Date: Fri, 6 Jan 2017 14:52:45 +0100 Subject: [petsc-users] make test freeze In-Reply-To: References: <586E3CBF.9020605@legi.grenoble-inp.fr> <586F57CB.2030708@legi.grenoble-inp.fr> Message-ID: <586FA12D.1050506@legi.grenoble-inp.fr> This is not the first time I have had this problem, and my aim now is to solve it instead of ignoring the tests. The environment seems consistent (see below). I'll try to run in debug mode to investigate where the code hangs.
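Matt's hint above — a hanging ex19 often means the mpiexec on PATH and the MPI libraries the binary actually loads come from different installs — can be checked mechanically against `which mpiexec` and `ldd ./ex19` output like the transcript that follows. A small sketch (the helper names are invented for illustration; the sample data is an excerpt of the ldd output in this thread):

```python
import os.path
import re

# Excerpt of the `ldd ./ex19` output posted in this thread (sample data).
LDD_OUTPUT = """
liblapack.so.3 => /usr/lib64/atlas/liblapack.so.3 (0x00007f5ac8596000)
libmpi_usempi.so.1 => /opt/openmpi173-GCC48-node/lib/libmpi_usempi.so.1 (0x00007f5ac80a4000)
libmpi_mpifh.so.2 => /opt/openmpi173-GCC48-node/lib/libmpi_mpifh.so.2 (0x00007f5ac7e5a000)
libmpi_cxx.so.1 => /opt/openmpi173-GCC48-node/lib/libmpi_cxx.so.1 (0x00007f5ac76ed000)
libmpi.so.1 => /opt/openmpi173-GCC48-node/lib/libmpi.so.1 (0x00007f5ac7115000)
"""

MPIEXEC = "/opt/openmpi173-GCC48-node/bin/mpiexec"  # from `which mpiexec`

def mpi_lib_prefixes(ldd_text):
    """Return the set of install prefixes that the libmpi* libraries resolve to."""
    prefixes = set()
    for name, path in re.findall(r"(\S+)\s+=>\s+(\S+)", ldd_text):
        if name.startswith("libmpi"):
            # strip the trailing lib/libfoo.so components to get the prefix
            prefixes.add(os.path.dirname(os.path.dirname(path)))
    return prefixes

def launcher_matches_libs(mpiexec_path, ldd_text):
    """True when mpiexec and every libmpi* come from the same install prefix."""
    launcher_prefix = os.path.dirname(os.path.dirname(mpiexec_path))
    return mpi_lib_prefixes(ldd_text) == {launcher_prefix}

print(launcher_matches_libs(MPIEXEC, LDD_OUTPUT))  # True: same OpenMPI tree
```

In this thread the check passes (launcher and libraries both come from /opt/openmpi173-GCC48-node), which is consistent with the hang ultimately being traced elsewhere.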
Patrick

[begou at kareline tutorials]$ make ex19
*mpicc* -o ex19.o -c -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 -I/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git/include -I/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git/GCC48/include `pwd`/ex19.c
mpicc -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 -o ex19 ex19.o -L/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git/GCC48/lib -lpetsc -llapack -lblas -lX11 -lhwloc -lssl -lcrypto -L/opt/openmpi173-GCC48-node/lib -L/opt/GCC48c/lib/gcc/x86_64-unknown-linux-gnu/4.8.1 -L/opt/GCC48c/lib64 -L/opt/GCC48c/lib -lmpi_usempi -lmpi_mpifh -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -L/opt/openmpi173-GCC48-node/lib -L/opt/GCC48c/lib/gcc/x86_64-unknown-linux-gnu/4.8.1 -L/opt/GCC48c/lib64 -L/opt/GCC48c/lib -ldl -lmpi -lgcc_s -lpthread -ldl
/bin/rm -f ex19.o
[begou at kareline tutorials]$ *which mpiexec*
*/opt/openmpi173-GCC48-node/bin/mpiexec*
[begou at kareline tutorials]$ *mpicc --showme*
*gcc -I/opt/openmpi173-GCC48-node/include -pthread -L/opt/openmpi173-GCC48-node/lib -lmpi*
[begou at kareline tutorials]$ *ldd ./ex19*
 linux-vdso.so.1 => (0x00007ffe771ea000)
 liblapack.so.3 => /usr/lib64/atlas/liblapack.so.3 (0x00007f5ac8596000)
 libblas.so.3 => /usr/lib64/libblas.so.3 (0x00007f5ac833e000)
 libX11.so.6 => /usr/lib64/libX11.so.6 (0x0000003c66600000)
 libhwloc.so.5 => /usr/lib64/libhwloc.so.5 (0x0000003dde600000)
 libssl.so.10 => /usr/lib64/libssl.so.10 (0x0000003c71600000)
 libcrypto.so.10 => /usr/lib64/libcrypto.so.10 (0x0000003c69a00000)
 libmpi_usempi.so.1 => */opt/openmpi173-GCC48-node/lib/libmpi_usempi.so.1* (0x00007f5ac80a4000)
 libmpi_mpifh.so.2 => */opt/openmpi173-GCC48-node/lib/libmpi_mpifh.so.2* (0x00007f5ac7e5a000)
 libgfortran.so.3 => /opt/GCC48c/lib64/libgfortran.so.3 (0x00007f5ac7b43000)
 libm.so.6 => /lib64/libm.so.6 (0x0000003c63200000)
 libquadmath.so.0 => /opt/GCC48c/lib64/libquadmath.so.0 (0x00007f5ac7907000)
 libmpi_cxx.so.1 => */opt/openmpi173-GCC48-node/lib/libmpi_cxx.so.1* (0x00007f5ac76ed000)
 libstdc++.so.6 => /opt/GCC48c/lib64/libstdc++.so.6 (0x00007f5ac73e4000)
 libdl.so.2 => /lib64/libdl.so.2 (0x0000003c63a00000)
 libmpi.so.1 => */opt/openmpi173-GCC48-node/lib/libmpi.so.1* (0x00007f5ac7115000)
 libgcc_s.so.1 => /opt/GCC48c/lib64/libgcc_s.so.1 (0x00007f5ac6eff000)
 libpthread.so.0 => /lib64/libpthread.so.0 (0x0000003c63600000)
 libc.so.6 => /lib64/libc.so.6 (0x0000003c62e00000)
 libf77blas.so.3 => /usr/lib64/atlas/libf77blas.so.3 (0x00007f5ac6cdf000)
 libcblas.so.3 => /usr/lib64/atlas/libcblas.so.3 (0x00007f5ac6abe000)
 libxcb.so.1 => /usr/lib64/libxcb.so.1 (0x0000003c66200000)
 libnuma.so.1 => /usr/lib64/libnuma.so.1 (0x0000003ddee00000)
 libpci.so.3 => /lib64/libpci.so.3 (0x0000003ddea00000)
 libxml2.so.2 => /usr/lib64/libxml2.so.2 (0x0000003c6e200000)
 libgssapi_krb5.so.2 => /lib64/libgssapi_krb5.so.2 (0x0000003c70a00000)
 libkrb5.so.3 => /lib64/libkrb5.so.3 (0x0000003c70e00000)
 libcom_err.so.2 => /lib64/libcom_err.so.2 (0x0000003c69e00000)
 libk5crypto.so.3 => /lib64/libk5crypto.so.3 (0x0000003c6f200000)
 libz.so.1 => /lib64/libz.so.1 (0x0000003c64200000)
 libopen-rte.so.6 => */opt/openmpi173-GCC48-node/lib/libopen-rte.so.6* (0x00007f5ac684a000)
 libopen-pal.so.6 => */opt/openmpi173-GCC48-node/lib/libopen-pal.so.6* (0x00007f5ac6575000)
 librt.so.1 => /lib64/librt.so.1 (0x0000003c63e00000)
 libnsl.so.1 => /lib64/libnsl.so.1 (0x0000003c71a00000)
 libutil.so.1 => /lib64/libutil.so.1 (0x0000003c6ba00000)
 /lib64/ld-linux-x86-64.so.2 (0x0000003c62a00000)
 libatlas.so.3 => /usr/lib64/atlas/libatlas.so.3 (0x00007f5ac5f18000)
 libXau.so.6 => /usr/lib64/libXau.so.6 (0x0000003c66a00000)
 libresolv.so.2 => /lib64/libresolv.so.2 (0x0000003c64e00000)
 libkrb5support.so.0 => /lib64/libkrb5support.so.0 (0x0000003c6fe00000)
 libkeyutils.so.1 => /lib64/libkeyutils.so.1 (0x0000003c6f600000)
 libselinux.so.1 => /usr/lib64/libselinux.so.1 (0x0000003c64600000)

Matthew Knepley wrote: > On
Fri, Jan 6, 2017 at 2:39 AM, Patrick Begou > > wrote: > > Hi Matthew, > > Launching manualy ex19 shows only one process consuming cpu time, after > 952mn I've killed the job this morning. > > [begou at kareline tutorials]$ make ex19 > mpicc -o ex19.o -c -Wall -Wwrite-strings -Wno-strict-aliasing > -Wno-unknown-pragmas -fvisibility=hidden -g3 > -I/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git/include > -I/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git/GCC48/include > `pwd`/ex19.c > mpicc -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas > -fvisibility=hidden -g3 -o ex19 ex19.o > -L/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git/GCC48/lib -lpetsc > -llapack -lblas -lX11 -lhwloc -lssl -lcrypto > -L/opt/openmpi173-GCC48-node/lib > -L/opt/GCC48c/lib/gcc/x86_64-unknown-linux-gnu/4.8.1 -L/opt/GCC48c/lib64 > -L/opt/GCC48c/lib -lmpi_usempi -lmpi_mpifh -lgfortran -lm -lgfortran -lm > -lquadmath -lm -lmpi_cxx -lstdc++ -L/opt/openmpi173-GCC48-node/lib > -L/opt/GCC48c/lib/gcc/x86_64-unknown-linux-gnu/4.8.1 -L/opt/GCC48c/lib64 > -L/opt/GCC48c/lib -ldl -lmpi -lgcc_s -lpthread -ldl > /bin/rm -f ex19.o > [begou at kareline tutorials]$ mpiexec -n 2 ./ex19 -snes_monitor > > top command shows: > > PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND > 32184 begou 20 0 249m 7152 5132 R 99.8 0.0 952:15.97 ex19 > 32183 begou 20 0 71676 3508 2264 S 0.0 0.0 0:00.04 mpiexec > 32185 begou 20 0 185m 7132 5124 S 0.0 0.0 0:00.04 ex19 > > looks like the first process waiting for something that never occur in MPI > communication.... > > > 1000s of people run this every day, so I am skeptical of that explanation. > > However, this could happen if the 'mpiexec' in your path does not match the > MPI libraries that PETSc is linked to. > > Matt > > Patrick > > Matthew Knepley a ?crit : >> On Thu, Jan 5, 2017 at 6:31 AM, Patrick Begou >> > > wrote: >> >> I am unable to run any test on petsc. 
It looks like if the ex19 run >> freeze on the server as it do not use any cpu time and pstree shows >> >> sshd---bash-+-gedit >> `-make---sh-+-gmake---sh---gmake---sh---mpiexec---ex19 >> `-tee >> I've tested petsc-3.7.5.tar.gz and the latest sources on the Git >> repository. >> >> >> All make is doing is running ex19, which you can do by hand. What do you >> get for >> >> cd $PETSC_DIR >> cd src/snes/examples/tutorials >> make ex19 >> mpiexec -n 2 ./ex19 -snes_monitor >> >> Thanks, >> >> Matt >> >> Setup from the Git repo: >> ./configure >> --prefix=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git-binaries \ >> --PETSC_ARCH=GCC48 \ >> --PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git \ >> --with-shared-libraries=0 \ >> --with-fortran-interfaces=1 \ >> --with-fortran-kernels=1 \ >> --with-cc=mpicc \ >> --with-fc=mpif90 \ >> --with-cxx=mpicxx >> >> make PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git >> PETSC_ARCH=GCC48 all >> >> make PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git >> PETSC_ARCH=GCC48 install >> >> make >> PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git-binaries >> PETSC_ARCH="" test >> >> >> In the log file I've just: >> >> Running test examples to verify correct installation >> Using >> PETSC_DIR=/kareline/data/begou/YALES2_1.0.0/PREREQUIS/petsc-git-binaries >> and PETSC_ARCH= >> >> I'm using: >> gcc version 4.8.1 >> Open MPI: 1.7.3 (build with gcc 4.8.1) >> (This environment is in production for a while for many local >> software and works fine) >> >> Any suggestion is welcome >> >> Patrick >> >> -- >> =================================================================== >> | Equipe M.O.S.T. 
| | >> | Patrick BEGOU | mailto:Patrick.Begou at grenoble-inp.fr >> | >> | LEGI | | >> | BP 53 X | Tel 04 76 82 51 35 | >> | 38041 GRENOBLE CEDEX | Fax 04 76 82 52 71 | >> =================================================================== >> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which >> their experiments lead. >> -- Norbert Wiener > > > -- > =================================================================== > | Equipe M.O.S.T. | | > | Patrick BEGOU |mailto:Patrick.Begou at grenoble-inp.fr | > | LEGI | | > | BP 53 X | Tel 04 76 82 51 35 | > | 38041 GRENOBLE CEDEX | Fax 04 76 82 52 71 | > =================================================================== > > -- > What most experimenters take for granted before they begin their experiments > is infinitely more interesting than any results to which their experiments > lead. -- Norbert Wiener -- =================================================================== | Equipe M.O.S.T. | | | Patrick BEGOU | mailto:Patrick.Begou at grenoble-inp.fr | | LEGI | | | BP 53 X | Tel 04 76 82 51 35 | | 38041 GRENOBLE CEDEX | Fax 04 76 82 52 71 | =================================================================== -------------- next part -------------- An HTML attachment was scrubbed... URL: From u.rochan at gmail.com Fri Jan 6 08:52:05 2017 From: u.rochan at gmail.com (Rochan Upadhyay) Date: Fri, 6 Jan 2017 08:52:05 -0600 Subject: [petsc-users] a question on DMPlexSetAnchors In-Reply-To: References: Message-ID: Constraints come from so-called cohomology conditions. In practical applications, they arise when you couple field models (e.g. Maxwell's equations) with lumped models (e.g. circuit equations). They are described in this paper : http://gmsh.info/doc/preprints/gmsh_homology_preprint.pdf In their matrix in page 12 all rows and columns involving the terms , , <*,E1> and, <*,E2> are non-local. 
That is because the "cohomology" basis functions E1 and E2, are sums of basis functions defined on all points contained in a group of cells. I guess this structure will kill the performance of most existing preconditioners but I would like to initially look at smallish problems. On Thu, Jan 5, 2017 at 8:40 PM, Matthew Knepley wrote: > On Thu, Jan 5, 2017 at 6:35 PM, Rochan Upadhyay > wrote: > >> Thanks for prompt reply. I don't need hanging nodes or Dirichlet >> conditions which can >> be easily done by adding constraint DoFs in the Section as you mention. >> My requirement is the following: >> >>> Constraints among Fields: >> >>> I would recommend just putting the constraint in as an equation. In >> your case the effect can >> >>> be non-local, so this seems like the best strategy. >> The constraint dof is described by an equation. In fact I have easily >> set up residuals for the system. My (perceived) difficulties are in the >> Jacobian. My additional >> Dof is a scalar quantity that is not physically tied to any specific >> point but needs to be solved tightly coupled >> to a FEM system. In order to use the global section (default section for >> the FEM system) >> to fill up the Mats and Vecs, I have artificially appended this extra dof >> to a particular point. >> Now in the Jacobian matrix there will be one extra row and column that, >> once filled, should be dense >> (rather block dense) due to the non-local dependence of this extra Dof on >> field values at some other points. >> > > Now, if you want good performance, you have to describe the constraint in > terms of the topology. All our DMs > are setup for local equations. Nonlocal equations are not correctly > preallocated. 
You can > > a) Just turn off checking for proper preallocation, MatSetOption(A, > MAT_NEW_NONZERO_LOCATION_ERR, PETSC_FALSE) > > b) Do the preallocation yourself > > If instead, the pattern "fits inside" a common pattern described by these > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/ > DMPlexGetAdjacencyUseClosure.html > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/DM/ > DMPlexSetAdjacencyUseCone.html > > you can just use that. > > What creates your constraints? > > Matt > > My question is once the DM has allocated non-zeros for the matrix (based >> on the given section) would it be >> possible to add non-zeros in non-standard locations (namely a few dense >> sub-rows and sub-columns) in a way >> that does not destroy performance. Does using the built in routine >> DMSetDefaultConstraint (or for that >> matter the DMPlexSetAnchors) create another (separate) constraint matrix >> that presumably does an efficient job >> of incorporating these additional non-zeros ? Or does this Constraint >> matrix only come in during the DMLocalToGLobal >> (& vice versa) calls as mentioned in the documentation ? >> I appreciate your reading through my rather verbose mail, especially >> considering the numerous other queries that >> you receive each day. >> Thanks. >> >> On Wed, Jan 4, 2017 at 5:59 PM, Matthew Knepley >> wrote: >> >>> On Tue, Jan 3, 2017 at 4:02 PM, Rochan Upadhyay >>> wrote: >>> >>>> I think I sent my previous question (on Dec 28th) to the wrong place >>>> (petsc-users-request at mcs.anl.gov). >>>> >>> >>> Yes, this is the correct mailing list. >>> >>> >>>> To repeat, >>>> >>>> I am having bit of a difficulty in understanding the introduction of >>>> constraints in DMPlex. From a quick study of the User Manual I gather >>>> that it is easiest done using DMPlexSetAnchors ? The description of >>>> this >>>> routine says that there is an anchorIS that specifies the anchor points >>>> (rows in the >>>> matrix). 
This is okay and easily understood. >>>> >>> >>> I think this is not the right mechanism for you. >>> >>> Anchors: >>> >>> This is intended for constraints in the discretization, such as hanging >>> nodes, which are >>> purely local, and intended to take place across the entire domain. That >>> determines the >>> interface. >>> >>> Dirichlet Boundary Conditions: >>> >>> For these, I would recommend using the Constraint interface in >>> PetscSection, which >>> eliminates these unknowns from the global system, but includes the >>> values in the local >>> vectors used in assembly. >>> >>> You can also just alter your equations for constrained unknowns. >>> >>> Constraints among Fields: >>> >>> I would recommend just putting the constraint in as an equation. In your >>> case the effect can >>> be non-local, so this seems like the best strategy. >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> There is also an anchorSection which is described as a map from >>>> constraint points >>>> (columns ?) to the anchor points listed in the anchorIS. Should this >>>> not be a map between >>>> solution indices (i.e. indices appearing in the vectors and matrices) ? >>>> >>>> For example I am completely unable to set up a simple constraint matrix >>>> for the following (say): >>>> >>>> Point 1, Field A, B >>>> Point 2-10 Field A >>>> At point 1, Field B depends on Field A at points 1-10 >>>> >>>> When I set it up it appears to create a matrix where field A depends on >>>> field A values at points 1-10. >>>> >>>> How does the mapping work in this case ? Will the DMPlexSetAnchors() >>>> routine work >>>> for this simple scenario ? >>>> >>>> If not, is the only recourse to create the constraint matrix oneself >>>> using DMSetDefaultConstraints ? >>>> >>>> Also documentation for DMSetDefaultConstraints is incomplete. >>>> The function accepts three arguments (dm, section and Mat) but >>>> what the section is is not described at all. >>>> >>>> I don't know if my question makes any sense. 
If it does not then it is >>>> only a reflection of my utter confusion regarding the routine >>>> DMPlexSetAnchors :-( >>>> >>>> Regards, >>>> Rochan >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From arne.morten.kvarving at sintef.no Fri Jan 6 09:03:20 2017 From: arne.morten.kvarving at sintef.no (Arne Morten Kvarving) Date: Fri, 6 Jan 2017 16:03:20 +0100 Subject: [petsc-users] malconfigured gamg Message-ID: <735d76e6-3875-05f1-4f2d-1ab0158d2846@sintef.no> hi, first, this was a user error and i totally acknowledge this, but i wonder if this might be an oversight in your error checking: if you configure gamg with ilu/asm smoothing, and are stupid enough to have set the number of smoother cycles to 0, your program churns along and apparently converges just fine (towards garbage, but apparently 'sane' garbage (not 0, not nan, not inf)) once i set sor as smoother, i got the error message 'PETSC ERROR: Relaxation requires global its 0 positive' which pointed me to my mistake. fixing this made both asm and sor work fine. it's all wrapped up in a schur/fieldsplit (it's P2/P1 navier-stokes), constructed by hand due to "surrounding" reasons. but i don't think that's relevant as such. i've used 3.6.4 as the oldest and 3.7.4 as the newest version and behavior was the same. if you want logs et al don't hesitate to ask for them, but i do not think they would add much.
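The "number of smoother cycles" in a GAMG hierarchy is the per-level KSP iteration limit. A hedged sketch of the relevant command line (the `./mysolver` binary is a placeholder, and option names follow the usual `-mg_levels_` prefix convention; check them against your PETSc version with `-help`):

```sh
# Hypothetical sketch: GAMG with an ASM/ILU smoother.
# -mg_levels_ksp_max_it 0 silently disables the smoother (the failure
# mode described above); keep it positive, e.g. the common default 2.
./mysolver -pc_type gamg \
           -mg_levels_ksp_type richardson -mg_levels_ksp_max_it 2 \
           -mg_levels_pc_type asm -mg_levels_sub_pc_type ilu
```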
cheers arnem From jed at jedbrown.org Wed Jan 4 13:46:11 2017 From: jed at jedbrown.org (Jed Brown) Date: Wed, 04 Jan 2017 12:46:11 -0700 Subject: [petsc-users] TSPseudo overriding SNES iterations In-Reply-To: References: Message-ID: <87inpu8v7g.fsf@jedbrown.org> "Mark W. Lohry" writes: > I have an unsteady problem I'm trying to solve for steady state. The regular time-accurate stepping works fine (uses around 5 Newton iterations with 100 krylov iterations each per time step) with beuler stepping. > > > But when changing only TSType to pseudo it looks like SNES max iterations is getting set to 1, and each pseduo time step then only does a single Newton step and then throws SNES CONVERGED_ITS 1 despite setting snessettolerances to allow 50 Newton steps. Pseudotransient continuation (as described in the cited papers and implemented in TSPseudo) does only one Newton step. In practice, converging the Newton solver on each pseudo-time step usually costs more. You can change -snes_type (TSPseudo defaults it to KSPONLY). See -ts_view for details. > I'm trying to use all the same configuration here that works for backward Euler, but just continually increase the step size each time step. What am I missing here? > > Thanks, > Mark -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 800 bytes Desc: not available URL: From Patrick.Begou at legi.grenoble-inp.fr Fri Jan 6 10:08:55 2017 From: Patrick.Begou at legi.grenoble-inp.fr (Patrick Begou) Date: Fri, 6 Jan 2017 17:08:55 +0100 Subject: [petsc-users] [SOLVED] make test freeze In-Reply-To: References: <586E3CBF.9020605@legi.grenoble-inp.fr> <586F57CB.2030708@legi.grenoble-inp.fr> Message-ID: <586FC117.1030106@legi.grenoble-inp.fr> Hi Matthew, Using the debugger I finally found the problem. It is related to MPI.
In src/sys/objects/pinit.c line 779, petsc tests the availability of PETSC_HAVE_MPI_INIT_THREAD, and this is set to True because my OpenMPI version is compiled with --enable-mpi-thread-multiple. However the call to MPI_Init_thread(argc,args,MPI_THREAD_FUNNELED,&provided) hangs and freezes the application. Unsetting PETSC_HAVE_MPI_INIT_THREAD in petsc solves the problem. I remember an HPC seminar on BULLX systems where they gave us some information about known problems with the MPI_Init_thread call in openMPI. Maybe I should use a more recent version of OpenMPI. This also explains why I had this problem with previous versions of petsc (same OpenMPI environment). None of our codes mixes OpenMP and MPI.... so I never ran into this situation in production and petsc always behaved fine (except for make test). Is there a way to turn off PETSC_HAVE_MPI_INIT_THREAD at configure time for Petsc ? I've manually removed it in the generated petscconf.h file before compiling Petsc in debug mode, but I don't think that is the best way to do this..... Thanks for your help in running the test codes separately. Patrick -- =================================================================== | Equipe M.O.S.T.
| | | Patrick BEGOU | mailto:Patrick.Begou at grenoble-inp.fr | | LEGI | | | BP 53 X | Tel 04 76 82 51 35 | | 38041 GRENOBLE CEDEX | Fax 04 76 82 52 71 | =================================================================== From balay at mcs.anl.gov Fri Jan 6 10:11:37 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Fri, 6 Jan 2017 10:11:37 -0600 Subject: [petsc-users] problems after glibc upgrade to 2.17-157 In-Reply-To: <1483691323190.12133@marin.nl> References: <1483455899357.86673@marin.nl> <1483525971870.90369@marin.nl> <1483537021972.79609@marin.nl> <1483541610596.23482@marin.nl> <1483542319358.30478@marin.nl> <1483543177535.33715@marin.nl> <1483605469330.79182@marin.nl> , <1483691323190.12133@marin.nl> Message-ID: On Fri, 6 Jan 2017, Klaij, Christiaan wrote: > Satish, > > Our sysadmin is not keen on downgrading glibc. sure > I'll stick with "--with-shared-libraries=0" for now that's fine. > and wait for SL7.3 with intel 17. Well, they are not related, so if you can, you should upgrade to intel-17 [irrespective of SL7.2 or 7.3] > Thanks for filing the bug report at RHEL, very curious to see their response. There is a response with reference to a previous report https://bugzilla.redhat.com/show_bug.cgi?id=1377895 [with reference to libintlc.so.5 as the cause - it's using some weird/invalid format] A workaround for existing intel-16 precompiled binaries is mentioned: LD_PRELOAD=/lib64/libc.so.6:/opt/intel/16.0/compiler/lib/intel64/libintlc.so.5 icc-compiled-binary This does not help with the PETSc build - which uses non-icc binaries during the build. [they break with this LD_PRELOAD] If intel-16 is really required - I found that replacing libintlc.so.5 in intel-16 with the one from intel-17 lets me use the intel-16 compilers [for ex: building PETSc]. I haven't tested this extensively to see if there are any breakages with this approach.
Satish From knepley at gmail.com Fri Jan 6 10:15:47 2017 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 6 Jan 2017 10:15:47 -0600 Subject: [petsc-users] a question on DMPlexSetAnchors In-Reply-To: References: Message-ID: On Fri, Jan 6, 2017 at 8:52 AM, Rochan Upadhyay wrote: > Constraints come from so-called cohomology conditions. In practical > applications, > they arise when you couple field models (e.g. Maxwell's equations) with > lumped > models (e.g. circuit equations). They are described in this paper : > http://gmsh.info/doc/preprints/gmsh_homology_preprint.pdf > This looks interesting, but I wish they were more explicit about what was actually being solved. > In their matrix in page 12 all rows and columns involving the terms > , , > <*,E1> and, <*,E2> are non-local. That is because the "cohomology" basis > functions > E1 and E2, are sums of basis functions defined on all points contained in > a group of cells. >

Okay, then to me it looks like you have

    M + L = / A 0 \
            \ 0 I /

where M is a sparse, block diagonal matrix (maybe you do not have the I), and L is low-rank. You can certainly lay this out with a Section by having 4 fields. It will almost certainly be that the Jacobian layout is wrong due to the non-locality, but you can turn off checking so that you can insert new nonzeros using MatSetOption(M, MAT_NEW_NONZERO_LOCATION_ERR, PETSC_FALSE). Does this make sense? I guess this structure will kill the performance of most existing > preconditioners but I > would like to initially look at smallish problems. > Yes, it will kill performance unless we treat the matrix as M + L, which you can do using the MatLRC type. However, we can postpone that until you have everything working and want to get bigger. Also, integral constraints can sometimes be handled using fast methods for integral equations.
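The M + L observation is what keeps the dense constraint rows/columns cheap: the nonlocal coupling L has rank equal to the number of constraint dofs, so its action can be applied as two skinny products instead of storing dense rows. A sketch of the idea behind a low-rank correction in plain NumPy (toy sizes; names here are illustrative, not the PETSc API):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 100, 2            # n mesh dofs, k nonlocal constraint dofs

# M: the usual sparse/local operator (dense here only for illustration)
M = np.diag(rng.uniform(1.0, 2.0, n))
# L = U @ V.T: the dense rows/columns coming from the nonlocal constraints
U = rng.standard_normal((n, k))
V = rng.standard_normal((n, k))

def apply_m_plus_l(x):
    """Matrix-free action of (M + U V^T) x; never forms the dense L."""
    return M @ x + U @ (V.T @ x)   # O(n) + O(nk) instead of O(n^2)

x = rng.standard_normal(n)
dense = (M + U @ V.T) @ x          # reference: explicit dense operator
assert np.allclose(apply_m_plus_l(x), dense)
```

This is the same matvec structure a low-rank-correction matrix type exploits; a Krylov solver only ever needs the action of the operator, so the dense coupling never has to be assembled.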
Thanks, Matt > On Thu, Jan 5, 2017 at 8:40 PM, Matthew Knepley wrote: > >> On Thu, Jan 5, 2017 at 6:35 PM, Rochan Upadhyay >> wrote: >> >>> Thanks for prompt reply. I don't need hanging nodes or Dirichlet >>> conditions which can >>> be easily done by adding constraint DoFs in the Section as you mention. >>> My requirement is the following: >>> >>> Constraints among Fields: >>> >>> I would recommend just putting the constraint in as an equation. In >>> your case the effect can >>> >>> be non-local, so this seems like the best strategy. >>> The constraint dof is described by an equation. In fact I have easily >>> set up residuals for the system. My (perceived) difficulties are in the >>> Jacobian. My additional >>> Dof is a scalar quantity that is not physically tied to any specific >>> point but needs to be solved tightly coupled >>> to a FEM system. In order to use the global section (default section for >>> the FEM system) >>> to fill up the Mats and Vecs, I have artificially appended this extra >>> dof to a particular point. >>> Now in the Jacobian matrix there will be one extra row and column that, >>> once filled, should be dense >>> (rather block dense) due to the non-local dependence of this extra Dof >>> on field values at some other points. >>> >> >> Now, if you want good performance, you have to describe the constraint in >> terms of the topology. All our DMs >> are setup for local equations. Nonlocal equations are not correctly >> preallocated. You can >> >> a) Just turn off checking for proper preallocation, MatSetOption(A, >> MAT_NEW_NONZERO_LOCATION_ERR, PETSC_FALSE) >> >> b) Do the preallocation yourself >> >> If instead, the pattern "fits inside" a common pattern described by these >> >> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpage >> s/DM/DMPlexGetAdjacencyUseClosure.html >> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpage >> s/DM/DMPlexSetAdjacencyUseCone.html >> >> you can just use that. 
>> >> What creates your constraints? >> >> Matt >> >> My question is once the DM has allocated non-zeros for the matrix (based >>> on the given section) would it be >>> possible to add non-zeros in non-standard locations (namely a few dense >>> sub-rows and sub-columns) in a way >>> that does not destroy performance. Does using the built in routine >>> DMSetDefaultConstraint (or for that >>> matter the DMPlexSetAnchors) create another (separate) constraint matrix >>> that presumably does an efficient job >>> of incorporating these additional non-zeros ? Or does this Constraint >>> matrix only come in during the DMLocalToGLobal >>> (& vice versa) calls as mentioned in the documentation ? >>> I appreciate your reading through my rather verbose mail, especially >>> considering the numerous other queries that >>> you receive each day. >>> Thanks. >>> >>> On Wed, Jan 4, 2017 at 5:59 PM, Matthew Knepley >>> wrote: >>> >>>> On Tue, Jan 3, 2017 at 4:02 PM, Rochan Upadhyay >>>> wrote: >>>> >>>>> I think I sent my previous question (on Dec 28th) to the wrong place >>>>> (petsc-users-request at mcs.anl.gov). >>>>> >>>> >>>> Yes, this is the correct mailing list. >>>> >>>> >>>>> To repeat, >>>>> >>>>> I am having bit of a difficulty in understanding the introduction of >>>>> constraints in DMPlex. From a quick study of the User Manual I gather >>>>> that it is easiest done using DMPlexSetAnchors ? The description of >>>>> this >>>>> routine says that there is an anchorIS that specifies the anchor >>>>> points (rows in the >>>>> matrix). This is okay and easily understood. >>>>> >>>> >>>> I think this is not the right mechanism for you. >>>> >>>> Anchors: >>>> >>>> This is intended for constraints in the discretization, such as hanging >>>> nodes, which are >>>> purely local, and intended to take place across the entire domain. That >>>> determines the >>>> interface. 
>>>> >>>> Dirichlet Boundary Conditions: >>>> >>>> For these, I would recommend using the Constraint interface in >>>> PetscSection, which >>>> eliminates these unknowns from the global system, but includes the >>>> values in the local >>>> vectors used in assembly. >>>> >>>> You can also just alter your equations for constrained unknowns. >>>> >>>> Constraints among Fields: >>>> >>>> I would recommend just putting the constraint in as an equation. In >>>> your case the effect can >>>> be non-local, so this seems like the best strategy. >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> There is also an anchorSection which is described as a map from >>>>> constraint points >>>>> (columns ?) to the anchor points listed in the anchorIS. Should this >>>>> not be a map between >>>>> solution indices (i.e. indices appearing in the vectors and matrices) ? >>>>> >>>>> For example I am completely unable to set up a simple constraint >>>>> matrix for the following (say): >>>>> >>>>> Point 1, Field A, B >>>>> Point 2-10 Field A >>>>> At point 1, Field B depends on Field A at points 1-10 >>>>> >>>>> When I set it up it appears to create a matrix where field A depends >>>>> on field A values at points 1-10. >>>>> >>>>> How does the mapping work in this case ? Will the DMPlexSetAnchors() >>>>> routine work >>>>> for this simple scenario ? >>>>> >>>>> If not, is the only recourse to create the constraint matrix oneself >>>>> using DMSetDefaultConstraints ? >>>>> >>>>> Also documentation for DMSetDefaultConstraints is incomplete. >>>>> The function accepts three arguments (dm, section and Mat) but >>>>> what the section is is not described at all. >>>>> >>>>> I don't know if my question makes any sense. 
If it does not then it is >>>>> only a reflection of my utter confusion regarding the routine >>>>> DMPlexSetAnchors :-( >>>>> >>>>> Regards, >>>>> Rochan >>>>> >>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Fri Jan 6 10:24:24 2017 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 6 Jan 2017 10:24:24 -0600 Subject: [petsc-users] [SOLVED] make test freeze In-Reply-To: <586FC117.1030106@legi.grenoble-inp.fr> References: <586E3CBF.9020605@legi.grenoble-inp.fr> <586F57CB.2030708@legi.grenoble-inp.fr> <586FC117.1030106@legi.grenoble-inp.fr> Message-ID: On Fri, Jan 6, 2017 at 10:08 AM, Patrick Begou < Patrick.Begou at legi.grenoble-inp.fr> wrote: > Hi Matthew, > > Using the debuguer I finaly found the problem. It is related to MPI. In > src/sys/objects/pinit.c line 779, petsc test the availability of > PETSC_HAVE_MPI_INIT_THREAD and this is set to True beccause my OpenMPI > version is compiled with --enable-mpi-thread-multiple. > However the call to MPI_Init_thread(argc,args,MPI_THREAD_FUNNELED,&provided) > hangs and freeze the application. > > Unsetting PETSC_HAVE_MPI_INIT_THREAD in petsc solves the problem. > > I remember a HPC seminar on BULLX systems where they give us some > informations about known problems whith the MPI_Init_thread call in > openMPI. 
Maybe I should use a more recent version of OpenMPI. This also > explains why I had this problem with previous versions of PETSc (same > OpenMPI environment). > None of our codes mixes OpenMP and MPI, so I have never hit this > situation in production and PETSc has always behaved fine (except for make > test). > Thanks for finding this. > Is there a way to turn off PETSC_HAVE_MPI_INIT_THREAD at configure time > for PETSc ? I've manually removed it in the generated petscconf.h file > before compiling PETSc in debug mode, but I don't think it is the best way > to do this. > Right now, no. I will talk to everyone to figure out the best solution and update our release tarball. Thanks, Matt > Thanks for your help in running the test codes separately. > > Patrick > > -- > =================================================================== > | Equipe M.O.S.T. | | > | Patrick BEGOU | mailto:Patrick.Begou at grenoble-inp.fr | > | LEGI | | > | BP 53 X | Tel 04 76 82 51 35 | > | 38041 GRENOBLE CEDEX | Fax 04 76 82 52 71 | > =================================================================== > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Fri Jan 6 10:52:11 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 6 Jan 2017 10:52:11 -0600 Subject: [petsc-users] Fieldsplit with sub pc MUMPS in parallel In-Reply-To: References: <38F933FD-80B5-401A-8DD3-7C14E0B301F7@mcs.anl.gov> <77441AAA-D70B-449F-A330-B417B6DA64A6@mcs.anl.gov> Message-ID: <16EDEDE2-AD95-4A86-8391-0CAF798474C4@mcs.anl.gov> Great, you should now be able to remove the extra options I had you add.
> -fieldsplit_0_ksp_type gmres -fieldsplit_0_ksp_pc_side right -fieldsplit_1_ksp_type gmres -fieldsplit_1_ksp_pc_side right) > On Jan 6, 2017, at 5:17 AM, Karin&NiKo wrote: > > Barry, > > you are goddamn right - there was something wrong with the numbering. I fixed it, and look what I get. The residuals of the outer iterations are exactly the same. > > Thanks again for your insight and perseverance. > > Nicolas > > 2017-01-05 20:17 GMT+01:00 Barry Smith : > > This is not good. Something is out of whack. > > First run 1 and 2 processes with -ksp_view_mat binary -ksp_view_rhs binary; in each case this will generate a file called binaryoutput. Send both files to petsc-maint at mcs.anl.gov; I want to confirm that the matrices are the same in both cases. > > Barry > > > On Jan 5, 2017, at 10:36 AM, Karin&NiKo wrote: > > > > Dave, > > > > Indeed the residual histories differ. Concerning the IS's, I have checked them on small cases, so I am quite sure they are OK. > > What could I do with PETSc to evaluate the ill-conditioning of the system or of the sub-systems? > > > > Thanks again for your help, > > Nicolas > > > > 2017-01-05 15:46 GMT+01:00 Barry Smith : > > > > > On Jan 5, 2017, at 5:58 AM, Dave May wrote: > > > > > > Do you now see identical residual histories for a job using 1 rank and 4 ranks? > > > > Please send the residual histories with the extra options; I'm curious too, because a Krylov method should not be needed in the inner solve, I just asked for it so we can see what the residuals look like. > > > > Barry > > > > > > > > If not, I am inclined to believe that the IS's you are defining for the splits in the parallel case are incorrect. The operator created to approximate the Schur complement with selfp should not depend on the number of ranks. > > > > > > Or possibly your problem is horribly ill-conditioned.
If it is, then this could result in slightly different residual histories when using different numbers of ranks - even if the operators are in fact identical > > > > > > > > > Thanks, > > > Dave > > > > > > > > > > > > > > > On Thu, 5 Jan 2017 at 12:14, Karin&NiKo wrote: > > > Dear Barry, dear Dave, > > > > > > THANK YOU! > > > You two pointed out the right problem.By using the options you provided (-fieldsplit_0_ksp_type gmres -fieldsplit_0_ksp_pc_side right -fieldsplit_1_ksp_type gmres -fieldsplit_1_ksp_pc_side right), the solver converges in 3 iterations whatever the size of the communicator. > > > All the trick is in the precise resolution of the Schur complement, by using a Krylov method (and not only preonly) *and* applying the preconditioner on the right (so evaluating the convergence on the unpreconditioned residual). > > > > > > @Barry : the difference you see on the nonzero allocations for the different runs is just an artefact : when using more than one proc, we slighly over-estimate the number of non-zero terms. If I run the same problem with the -info option, I get extra information : > > > [2] MatAssemblyEnd_SeqAIJ(): Matrix size: 110 X 110; storage space: 0 unneeded,5048 used > > > [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 271 X 271; storage space: 4249 unneeded,26167 used > > > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 307 X 307; storage space: 7988 unneeded,31093 used > > > [2] MatAssemblyEnd_SeqAIJ(): Matrix size: 110 X 244; storage space: 0 unneeded,6194 used > > > [1] MatAssemblyEnd_SeqAIJ(): Matrix size: 271 X 233; storage space: 823 unneeded,9975 used > > > [0] MatAssemblyEnd_SeqAIJ(): Matrix size: 307 X 197; storage space: 823 unneeded,8263 used > > > And 5048+26167+31093+6194+9975+8263=86740 which is the number of exactly estimated nonzero terms for 1 proc. > > > > > > > > > Thank you again! 
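[Editor's aside: collecting the options quoted across this thread, the full set Nicolas reports as converging in 3 iterations on any number of ranks would look roughly as follows. This is reconstructed from the messages above; the inner gmres / pc_side lines are the diagnostic additions Barry suggested, which he later says can be dropped again once the numbering bug is fixed.]

```shell
# Outer solver: flexible GMRES with a full Schur-factorization fieldsplit
-ksp_rtol 1.0e-5
-ksp_type fgmres
-pc_type fieldsplit
-pc_fieldsplit_type schur
-pc_fieldsplit_schur_factorization_type full
-pc_fieldsplit_schur_precondition selfp
# Split 0: exact solve via MUMPS LU, with an inner Krylov wrapper
-fieldsplit_0_pc_type lu
-fieldsplit_0_pc_factor_mat_solver_package mumps
-fieldsplit_0_ksp_type gmres
-fieldsplit_0_ksp_pc_side right
-fieldsplit_0_ksp_converged_reason
# Split 1 (Schur complement): exact solve via MUMPS LU, same wrapper
-fieldsplit_1_pc_type lu
-fieldsplit_1_pc_factor_mat_solver_package mumps
-fieldsplit_1_ksp_type gmres
-fieldsplit_1_ksp_pc_side right
-fieldsplit_1_ksp_converged_reason
```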
> > > > > > Best regards, > > > Nicolas > > > > > > > > > 2017-01-05 1:36 GMT+01:00 Barry Smith : > > > > > > > > > > > > There is something wrong with your set up. > > > > > > > > > > > > > > > > > > 1 process > > > > > > > > > > > > > > > > > > total: nonzeros=140616, allocated nonzeros=140616 > > > > > > > > > total: nonzeros=68940, allocated nonzeros=68940 > > > > > > > > > total: nonzeros=3584, allocated nonzeros=3584 > > > > > > > > > total: nonzeros=1000, allocated nonzeros=1000 > > > > > > > > > total: nonzeros=8400, allocated nonzeros=8400 > > > > > > > > > > > > > > > > > > 2 processes > > > > > > > > > total: nonzeros=146498, allocated nonzeros=146498 > > > > > > > > > total: nonzeros=73470, allocated nonzeros=73470 > > > > > > > > > total: nonzeros=3038, allocated nonzeros=3038 > > > > > > > > > total: nonzeros=1110, allocated nonzeros=1110 > > > > > > > > > total: nonzeros=6080, allocated nonzeros=6080 > > > > > > > > > total: nonzeros=146498, allocated nonzeros=146498 > > > > > > > > > total: nonzeros=73470, allocated nonzeros=73470 > > > > > > > > > total: nonzeros=6080, allocated nonzeros=6080 > > > > > > > > > total: nonzeros=2846, allocated nonzeros=2846 > > > > > > > > > total: nonzeros=86740, allocated nonzeros=94187 > > > > > > > > > > > > > > > > > > It looks like you are setting up the problem differently in parallel and seq. If it is suppose to be an identical problem then the number nonzeros should be the same in at least the first two matrices. > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > On Jan 4, 2017, at 3:39 PM, Karin&NiKo wrote: > > > > > > > > > > > > > > > > > > > > Dear Petsc team, > > > > > > > > > > > > > > > > > > > > I am (still) trying to solve Biot's poroelasticity problem : > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > I am using a mixed P2-P1 finite element discretization. The matrix of the discretized system in binary format is attached to this email. 
> > > > > > > > > > > > > > > > > > > > I am using the fieldsplit framework to solve the linear system. Since I am facing some troubles, I have decided to go back to simple things. Here are the options I am using : > > > > > > > > > > > > > > > > > > > > -ksp_rtol 1.0e-5 > > > > > > > > > > -ksp_type fgmres > > > > > > > > > > -pc_type fieldsplit > > > > > > > > > > -pc_fieldsplit_schur_factorization_type full > > > > > > > > > > -pc_fieldsplit_type schur > > > > > > > > > > -pc_fieldsplit_schur_precondition selfp > > > > > > > > > > -fieldsplit_0_pc_type lu > > > > > > > > > > -fieldsplit_0_pc_factor_mat_solver_package mumps > > > > > > > > > > -fieldsplit_0_ksp_type preonly > > > > > > > > > > -fieldsplit_0_ksp_converged_reason > > > > > > > > > > -fieldsplit_1_pc_type lu > > > > > > > > > > -fieldsplit_1_pc_factor_mat_solver_package mumps > > > > > > > > > > -fieldsplit_1_ksp_type preonly > > > > > > > > > > -fieldsplit_1_ksp_converged_reason > > > > > > > > > > > > > > > > > > > > On a single proc, everything runs fine : the solver converges in 3 iterations, according to the theory (see Run-1-proc.txt [contains -log_view]). > > > > > > > > > > > > > > > > > > > > On 2 procs, the solver converges in 28 iterations (see Run-2-proc.txt). > > > > > > > > > > > > > > > > > > > > On 3 procs, the solver converges in 91 iterations (see Run-3-proc.txt). > > > > > > > > > > > > > > > > > > > > I do not understand this behavior : since MUMPS is a parallel direct solver, shouldn't the solver converge in max 3 iterations whatever the number of procs? 
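[Editor's aside: the expectation of rank-independent convergence rests on the algebra of the full Schur factorization. When both the A-block solve and the Schur-complement solve are exact, the factorization is a direct method, and the outer Krylov iteration only has to mop up rounding error. A toy sketch of that algebra in plain Python with exact rational arithmetic; this is not PETSc code, and the block sizes and entries are made up for illustration:]

```python
from fractions import Fraction as F

def solve2(M, b):
    """Exact 2x2 solve by Cramer's rule (stands in for the exact MUMPS LU inner solves)."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(b[0] * M[1][1] - M[0][1] * b[1]) / det,
            (M[0][0] * b[1] - b[0] * M[1][0]) / det]

# Saddle-point system K = [[A, B], [B^T, C]]: A is 2x2 SPD, C is a 1x1 pressure block
A = [[F(4), F(1)], [F(1), F(3)]]
B = [F(1), F(2)]            # coupling column
C = F(-1)
f = [F(5), F(6)]            # rhs for the first split
g = F(7)                    # rhs for the second split

# Full Schur factorization (what -pc_fieldsplit_schur_factorization_type full applies):
y = solve2(A, f)                                   # 1) y = A^{-1} f
gp = g - (B[0] * y[0] + B[1] * y[1])               # 2) reduced rhs g - B^T y
Ainv_B = solve2(A, B)
S = C - (B[0] * Ainv_B[0] + B[1] * Ainv_B[1])      # 3) exact Schur complement S = C - B^T A^{-1} B
p = gp / S                                         #    ... and exact Schur solve
u = solve2(A, [f[0] - B[0] * p, f[1] - B[1] * p])  # 4) back-substitution for the first split

# With exact inner solves the factorization is a direct method: K [u; p] reproduces [f; g]
r0 = A[0][0] * u[0] + A[0][1] * u[1] + B[0] * p
r1 = A[1][0] * u[0] + A[1][1] * u[1] + B[1] * p
r2 = B[0] * u[0] + B[1] * u[1] + C * p
print(r0 == f[0] and r1 == f[1] and r2 == g)  # True
```

[In floating point the inner solves are only exact to machine precision, which is why FGMRES still takes a handful of outer iterations rather than exactly one.]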
> > > > > > > > > > > > > > > > > > > > > > > > > > > > > > Thanks for your precious help, > > > > > > > > > > Nicolas > > > > > > > > > > > > > > > > > > > > <1_Warning.txt> > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > From u.rochan at gmail.com Fri Jan 6 12:22:29 2017 From: u.rochan at gmail.com (Rochan Upadhyay) Date: Fri, 6 Jan 2017 12:22:29 -0600 Subject: [petsc-users] a question on DMPlexSetAnchors In-Reply-To: References: Message-ID: Yes, the option MatSetOption(M, MAT_NEW_NONZERO_LOCATION_ERR, PETSC_FALSE) seems to be the path of least resistance. Especially as it is something I am doing out of my own curiosity and not part of anything larger. I might have to bug you again very soon on how to optimize or move forward based on how things turn out after a first run. Thanks. Regards, Rochan On Fri, Jan 6, 2017 at 10:15 AM, Matthew Knepley wrote: > On Fri, Jan 6, 2017 at 8:52 AM, Rochan Upadhyay > wrote: > >> Constraints come from so-called cohomology conditions. In practical >> applications, >> they arise when you couple field models (e.g. Maxwell's equations) with >> lumped >> models (e.g. circuit equations). They are described in this paper : >> http://gmsh.info/doc/preprints/gmsh_homology_preprint.pdf >> > > This looks interesting, but I wish they were more explicit about what was > actually being solved. > > >> In their matrix in page 12 all rows and columns involving the terms >> , , >> <*,E1> and, <*,E2> are non-local. That is because the "cohomology" basis >> functions >> E1 and E2, are sums of basis functions defined on all points contained in >> a group of cells. >> > > Okay, then to me it looks like you have > > M + L = / A 0 \ > \ 0 I / > > where M is a sparse, block diagonal matrix (maybe you do not have the I), > and L is low-rank. You > can certainly lay this out with a Section by having 4 fields. 
It will > almost certainly be that the > Jacobian layout is wrong due to the non-locality, but you can turn off > checking so that you can insert > new nonzeros using MatSetOption(M, MAT_NEW_NONZERO_LOCATION_ERR, > PETSC_FALSE). > Does this make sense? > > I guess this structure will kill the performance of most existing >> preconditioners but I >> would like to initially look at smallish problems. >> > > Yes, it will kill performance unless we treat the matrix as M + L, which > you can do using the MatLRC > type. However, we can postpone that until you have everything working and > want to get bigger. Also, > integral constraints can sometimes be handled using fast methods for > integral equations. > > Thanks, > > Matt > > >> On Thu, Jan 5, 2017 at 8:40 PM, Matthew Knepley >> wrote: >> >>> On Thu, Jan 5, 2017 at 6:35 PM, Rochan Upadhyay >>> wrote: >>> >>>> Thanks for prompt reply. I don't need hanging nodes or Dirichlet >>>> conditions which can >>>> be easily done by adding constraint DoFs in the Section as you mention. >>>> My requirement is the following: >>>> >>> Constraints among Fields: >>>> >>> I would recommend just putting the constraint in as an equation. In >>>> your case the effect can >>>> >>> be non-local, so this seems like the best strategy. >>>> The constraint dof is described by an equation. In fact I have easily >>>> set up residuals for the system. My (perceived) difficulties are in the >>>> Jacobian. My additional >>>> Dof is a scalar quantity that is not physically tied to any specific >>>> point but needs to be solved tightly coupled >>>> to a FEM system. In order to use the global section (default section >>>> for the FEM system) >>>> to fill up the Mats and Vecs, I have artificially appended this extra >>>> dof to a particular point. 
>>>> Now in the Jacobian matrix there will be one extra row and column that, >>>> once filled, should be dense >>>> (rather block dense) due to the non-local dependence of this extra Dof >>>> on field values at some other points. >>>> >>> >>> Now, if you want good performance, you have to describe the constraint >>> in terms of the topology. All our DMs >>> are setup for local equations. Nonlocal equations are not correctly >>> preallocated. You can >>> >>> a) Just turn off checking for proper preallocation, MatSetOption(A, >>> MAT_NEW_NONZERO_LOCATION_ERR, PETSC_FALSE) >>> >>> b) Do the preallocation yourself >>> >>> If instead, the pattern "fits inside" a common pattern described by these >>> >>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpage >>> s/DM/DMPlexGetAdjacencyUseClosure.html >>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpage >>> s/DM/DMPlexSetAdjacencyUseCone.html >>> >>> you can just use that. >>> >>> What creates your constraints? >>> >>> Matt >>> >>> My question is once the DM has allocated non-zeros for the matrix (based >>>> on the given section) would it be >>>> possible to add non-zeros in non-standard locations (namely a few dense >>>> sub-rows and sub-columns) in a way >>>> that does not destroy performance. Does using the built in routine >>>> DMSetDefaultConstraint (or for that >>>> matter the DMPlexSetAnchors) create another (separate) constraint >>>> matrix that presumably does an efficient job >>>> of incorporating these additional non-zeros ? Or does this Constraint >>>> matrix only come in during the DMLocalToGLobal >>>> (& vice versa) calls as mentioned in the documentation ? >>>> I appreciate your reading through my rather verbose mail, especially >>>> considering the numerous other queries that >>>> you receive each day. >>>> Thanks. 
>>>> >>>> On Wed, Jan 4, 2017 at 5:59 PM, Matthew Knepley >>>> wrote: >>>> >>>>> On Tue, Jan 3, 2017 at 4:02 PM, Rochan Upadhyay >>>>> wrote: >>>>> >>>>>> I think I sent my previous question (on Dec 28th) to the wrong place >>>>>> (petsc-users-request at mcs.anl.gov). >>>>>> >>>>> >>>>> Yes, this is the correct mailing list. >>>>> >>>>> >>>>>> To repeat, >>>>>> >>>>>> I am having bit of a difficulty in understanding the introduction of >>>>>> constraints in DMPlex. From a quick study of the User Manual I gather >>>>>> that it is easiest done using DMPlexSetAnchors ? The description of >>>>>> this >>>>>> routine says that there is an anchorIS that specifies the anchor >>>>>> points (rows in the >>>>>> matrix). This is okay and easily understood. >>>>>> >>>>> >>>>> I think this is not the right mechanism for you. >>>>> >>>>> Anchors: >>>>> >>>>> This is intended for constraints in the discretization, such as >>>>> hanging nodes, which are >>>>> purely local, and intended to take place across the entire domain. >>>>> That determines the >>>>> interface. >>>>> >>>>> Dirichlet Boundary Conditions: >>>>> >>>>> For these, I would recommend using the Constraint interface in >>>>> PetscSection, which >>>>> eliminates these unknowns from the global system, but includes the >>>>> values in the local >>>>> vectors used in assembly. >>>>> >>>>> You can also just alter your equations for constrained unknowns. >>>>> >>>>> Constraints among Fields: >>>>> >>>>> I would recommend just putting the constraint in as an equation. In >>>>> your case the effect can >>>>> be non-local, so this seems like the best strategy. >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> There is also an anchorSection which is described as a map from >>>>>> constraint points >>>>>> (columns ?) to the anchor points listed in the anchorIS. Should this >>>>>> not be a map between >>>>>> solution indices (i.e. indices appearing in the vectors and matrices) >>>>>> ? 
>>>>>> >>>>>> For example I am completely unable to set up a simple constraint >>>>>> matrix for the following (say): >>>>>> >>>>>> Point 1, Field A, B >>>>>> Point 2-10 Field A >>>>>> At point 1, Field B depends on Field A at points 1-10 >>>>>> >>>>>> When I set it up it appears to create a matrix where field A depends >>>>>> on field A values at points 1-10. >>>>>> >>>>>> How does the mapping work in this case ? Will the DMPlexSetAnchors() >>>>>> routine work >>>>>> for this simple scenario ? >>>>>> >>>>>> If not, is the only recourse to create the constraint matrix oneself >>>>>> using DMSetDefaultConstraints ? >>>>>> >>>>>> Also documentation for DMSetDefaultConstraints is incomplete. >>>>>> The function accepts three arguments (dm, section and Mat) but >>>>>> what the section is is not described at all. >>>>>> >>>>>> I don't know if my question makes any sense. If it does not then it is >>>>>> only a reflection of my utter confusion regarding the routine >>>>>> DMPlexSetAnchors :-( >>>>>> >>>>>> Regards, >>>>>> Rochan >>>>>> >>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mvalera at mail.sdsu.edu Fri Jan 6 14:24:00 2017 From: mvalera at mail.sdsu.edu (Manuel Valera) Date: Fri, 6 Jan 2017 12:24:00 -0800 Subject: [petsc-users] Best way to scatter a Seq vector ? 
In-Reply-To: <7C6E6D1D-AA28-4889-A647-40AB8FA4ED3C@mcs.anl.gov> References: <7C6E6D1D-AA28-4889-A647-40AB8FA4ED3C@mcs.anl.gov> Message-ID: Great help Barry, i totally had overlooked that option (it is explicit in the vecscatterbegin call help page but not in vecscattercreatetozero, as i read later) So i used that and it works partially, it scatters te values assigned in root but not the rest, if i call vecscatterbegin from outside root it hangs, the code currently look as this: call VecScatterCreateToZero(bp2,ctr,bp0,ierr); CHKERRQ(ierr) call PetscObjectSetName(bp0, 'bp0:',ierr) if(rankl==0)then call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) print*,"done! " CHKERRQ(ierr) endif ! call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) ! call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) call VecView(bp2,PETSC_VIEWER_STDOUT_WORLD,ierr) call PetscBarrier(PETSC_NULL_OBJECT,ierr) call exit() And the output is: (with bp the right answer) Vec Object:bp: 2 MPI processes type: mpi Process [0] 1. 2. Process [1] 4. 3. Vec Object:bp2: 2 MPI processes *(before scatter)* type: mpi Process [0] 0. 0. Process [1] 0. 0. Vec Object:bp0: 1 MPI processes type: seq 1. 2. 4. 3. done! Vec Object:bp2: 2 MPI processes *(after scatter)* type: mpi Process [0] 1. 2. *Process [1]* *0.* *0.* Thanks inmensely for your help, Manuel On Thu, Jan 5, 2017 at 4:39 PM, Barry Smith wrote: > > > On Jan 5, 2017, at 6:21 PM, Manuel Valera wrote: > > > > Hello Devs is me again, > > > > I'm trying to distribute a vector to all called processes, the vector > would be originally in root as a sequential vector and i would like to > scatter it, what would the best call to do this ? 
> > > > I already know how to gather a distributed vector to root with > VecScatterCreateToZero, this would be the inverse operation, > > Use the same VecScatter object but with SCATTER_REVERSE, not you need > to reverse the two vector arguments as well. > > > > i'm currently trying with VecScatterCreate() and as of now im doing the > following: > > > > > > if(rank==0)then > > > > > > call VecCreate(PETSC_COMM_SELF,bp0,ierr); CHKERRQ(ierr) !if i use > WORLD > > !freezes in > SetSizes > > call VecSetSizes(bp0,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) > > call VecSetType(bp0,VECSEQ,ierr) > > call VecSetFromOptions(bp0,ierr); CHKERRQ(ierr) > > > > > > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) > > > > !call VecSet(bp0,5.0D0,ierr); CHKERRQ(ierr) > > > > > > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) > > > > call VecAssemblyBegin(bp0,ierr) ; call VecAssemblyEnd(bp0,ierr) !rhs > > > > do i=0,nbdp-1,1 > > ind(i+1) = i > > enddo > > > > call ISCreateGeneral(PETSC_COMM_SELF,nbdp,ind,PETSC_COPY_ > VALUES,locis,ierr) > > > > !call VecScatterCreate(bp0,PETSC_NULL_OBJECT,bp2,is,ctr,ierr) !if i > use SELF > > > !freezes here. > > > > call VecScatterCreate(bp0,locis,bp2,PETSC_NULL_OBJECT,ctr,ierr) > > > > endif > > > > bp2 being the receptor MPI vector to scatter to > > > > But it freezes in VecScatterCreate when trying to use more than one > processor, what would be a better approach ? 
> > > > > > Thanks once again, > > > > Manuel > > > > > > > > > > > > > > > > > > > > > > On Wed, Jan 4, 2017 at 3:30 PM, Manuel Valera > wrote: > > Thanks i had no idea how to debug and read those logs, that solved this > issue at least (i was sending a message from root to everyone else, but > trying to catch from everyone else including root) > > > > Until next time, many thanks, > > > > Manuel > > > > On Wed, Jan 4, 2017 at 3:23 PM, Matthew Knepley > wrote: > > On Wed, Jan 4, 2017 at 5:21 PM, Manuel Valera > wrote: > > I did a PetscBarrier just before calling the vicariate routine and im > pretty sure im calling it from every processor, code looks like this: > > > > From the gdb trace. > > > > Proc 0: Is in some MPI routine you call yourself, line 113 > > > > Proc 1: Is in VecCreate(), line 130 > > > > You need to fix your communication code. > > > > Matt > > > > call PetscBarrier(PETSC_NULL_OBJECT,ierr) > > > > print*,'entering POInit from',rank > > !call exit() > > > > call PetscObjsInit() > > > > > > And output gives: > > > > entering POInit from 0 > > entering POInit from 1 > > entering POInit from 2 > > entering POInit from 3 > > > > > > Still hangs in the same way, > > > > Thanks, > > > > Manuel > > > > > > > > On Wed, Jan 4, 2017 at 2:55 PM, Manuel Valera > wrote: > > Thanks for the answers ! > > > > heres the screenshot of what i got from bt in gdb (great hint in how to > debug in petsc, didn't know that) > > > > I don't really know what to look at here, > > > > Thanks, > > > > Manuel > > > > On Wed, Jan 4, 2017 at 2:39 PM, Dave May > wrote: > > Are you certain ALL ranks in PETSC_COMM_WORLD call these function(s). > These functions cannot be inside if statements like > > if (rank == 0){ > > VecCreateMPI(...) > > } > > > > > > On Wed, 4 Jan 2017 at 23:34, Manuel Valera > wrote: > > Thanks Dave for the quick answer, appreciate it, > > > > I just tried that and it didn't make a difference, any other suggestions > ? 
> > > > Thanks, > > Manuel > > > > On Wed, Jan 4, 2017 at 2:29 PM, Dave May > wrote: > > You need to swap the order of your function calls. > > Call VecSetSizes() before VecSetType() > > > > Thanks, > > Dave > > > > > > On Wed, 4 Jan 2017 at 23:21, Manuel Valera > wrote: > > Hello all, happy new year, > > > > I'm working on parallelizing my code, it worked and provided some > results when i just called more than one processor, but created artifacts > because i didn't need one image of the whole program in each processor, > conflicting with each other. > > > > Since the pressure solver is the main part i need in parallel im chosing > mpi to run everything in root processor until its time to solve for > pressure, at this point im trying to create a distributed vector using > either > > > > call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr) > > or > > call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) > > call VecSetType(xp,VECMPI,ierr) > > call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) > > > > > > In both cases program hangs at this point, something it never happened > on the naive way i described before. I've made sure the global size, nbdp, > is the same in every processor. What can be wrong? > > > > Thanks for your kind help, > > > > Manuel. > > > > > > > > > > > > > > > > > > > > > > > > > > -- > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > > -- Norbert Wiener > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From dave.mayhem23 at gmail.com Fri Jan 6 14:44:54 2017 From: dave.mayhem23 at gmail.com (Dave May) Date: Fri, 6 Jan 2017 20:44:54 +0000 Subject: [petsc-users] Best way to scatter a Seq vector ? 
In-Reply-To: References: <7C6E6D1D-AA28-4889-A647-40AB8FA4ED3C@mcs.anl.gov> Message-ID: On 6 January 2017 at 20:24, Manuel Valera wrote: > Great help Barry, i totally had overlooked that option (it is explicit in > the vecscatterbegin call help page but not in vecscattercreatetozero, as i > read later) > > So i used that and it works partially, it scatters te values assigned in > root but not the rest, if i call vecscatterbegin from outside root it > hangs, the code currently look as this: > > call VecScatterCreateToZero(bp2,ctr,bp0,ierr); CHKERRQ(ierr) > > > call PetscObjectSetName(bp0, 'bp0:',ierr) > > > if(rankl==0)then > > > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) > > > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) > > > You need to call VecAssemblyBegin(bp0); VecAssemblyEnd(bp0); after your last call to VecSetValues() before you can do any operations with bp0. With your current code, the call to VecView should produce an error if you used the error checking macro CHKERRQ(ierr) (as should VecScatter{Begin,End} Thanks, Dave > call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) > > call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) > > print*,"done! " > > CHKERRQ(ierr) > > > endif > > > ! call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) > > ! call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) > > > call VecView(bp2,PETSC_VIEWER_STDOUT_WORLD,ierr) > > call PetscBarrier(PETSC_NULL_OBJECT,ierr) > > > > call exit() > > > > > And the output is: (with bp the right answer) > > > Vec Object:bp: 2 MPI processes > > type: mpi > > Process [0] > > 1. > > 2. > > Process [1] > > 4. > > 3. > > Vec Object:bp2: 2 MPI processes *(before scatter)* > > type: mpi > > Process [0] > > 0. > > 0. > > Process [1] > > 0. > > 0. > > Vec Object:bp0: 1 MPI processes > > type: seq > > 1. > > 2. > > 4. > > 3. > > done! > > Vec Object:bp2: 2 MPI processes *(after scatter)* > > type: mpi > > Process [0] > > 1. 
> > 2. > > *Process [1]* > > *0.* > > *0.* > > > > > > Thanks inmensely for your help, > > > Manuel > > > > On Thu, Jan 5, 2017 at 4:39 PM, Barry Smith wrote: > >> >> > On Jan 5, 2017, at 6:21 PM, Manuel Valera >> wrote: >> > >> > Hello Devs is me again, >> > >> > I'm trying to distribute a vector to all called processes, the vector >> would be originally in root as a sequential vector and i would like to >> scatter it, what would the best call to do this ? >> > >> > I already know how to gather a distributed vector to root with >> VecScatterCreateToZero, this would be the inverse operation, >> >> Use the same VecScatter object but with SCATTER_REVERSE, not you need >> to reverse the two vector arguments as well. >> >> >> > i'm currently trying with VecScatterCreate() and as of now im doing the >> following: >> > >> > >> > if(rank==0)then >> > >> > >> > call VecCreate(PETSC_COMM_SELF,bp0,ierr); CHKERRQ(ierr) !if i use >> WORLD >> > !freezes >> in SetSizes >> > call VecSetSizes(bp0,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) >> > call VecSetType(bp0,VECSEQ,ierr) >> > call VecSetFromOptions(bp0,ierr); CHKERRQ(ierr) >> > >> > >> > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >> > >> > !call VecSet(bp0,5.0D0,ierr); CHKERRQ(ierr) >> > >> > >> > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) >> > >> > call VecAssemblyBegin(bp0,ierr) ; call VecAssemblyEnd(bp0,ierr) >> !rhs >> > >> > do i=0,nbdp-1,1 >> > ind(i+1) = i >> > enddo >> > >> > call ISCreateGeneral(PETSC_COMM_SELF,nbdp,ind,PETSC_COPY_VALUES,l >> ocis,ierr) >> > >> > !call VecScatterCreate(bp0,PETSC_NULL_OBJECT,bp2,is,ctr,ierr) !if >> i use SELF >> > >> !freezes here. >> > >> > call VecScatterCreate(bp0,locis,bp2,PETSC_NULL_OBJECT,ctr,ierr) >> > >> > endif >> > >> > bp2 being the receptor MPI vector to scatter to >> > >> > But it freezes in VecScatterCreate when trying to use more than one >> processor, what would be a better approach ? 
>> > >> > >> > Thanks once again, >> > >> > Manuel >> > >> > >> > >> > >> > >> > >> > >> > >> > >> > >> > On Wed, Jan 4, 2017 at 3:30 PM, Manuel Valera >> wrote: >> > Thanks i had no idea how to debug and read those logs, that solved this >> issue at least (i was sending a message from root to everyone else, but >> trying to catch from everyone else including root) >> > >> > Until next time, many thanks, >> > >> > Manuel >> > >> > On Wed, Jan 4, 2017 at 3:23 PM, Matthew Knepley >> wrote: >> > On Wed, Jan 4, 2017 at 5:21 PM, Manuel Valera >> wrote: >> > I did a PetscBarrier just before calling the vicariate routine and im >> pretty sure im calling it from every processor, code looks like this: >> > >> > From the gdb trace. >> > >> > Proc 0: Is in some MPI routine you call yourself, line 113 >> > >> > Proc 1: Is in VecCreate(), line 130 >> > >> > You need to fix your communication code. >> > >> > Matt >> > >> > call PetscBarrier(PETSC_NULL_OBJECT,ierr) >> > >> > print*,'entering POInit from',rank >> > !call exit() >> > >> > call PetscObjsInit() >> > >> > >> > And output gives: >> > >> > entering POInit from 0 >> > entering POInit from 1 >> > entering POInit from 2 >> > entering POInit from 3 >> > >> > >> > Still hangs in the same way, >> > >> > Thanks, >> > >> > Manuel >> > >> > >> > >> > On Wed, Jan 4, 2017 at 2:55 PM, Manuel Valera >> wrote: >> > Thanks for the answers ! >> > >> > heres the screenshot of what i got from bt in gdb (great hint in how to >> debug in petsc, didn't know that) >> > >> > I don't really know what to look at here, >> > >> > Thanks, >> > >> > Manuel >> > >> > On Wed, Jan 4, 2017 at 2:39 PM, Dave May >> wrote: >> > Are you certain ALL ranks in PETSC_COMM_WORLD call these function(s). >> These functions cannot be inside if statements like >> > if (rank == 0){ >> > VecCreateMPI(...) 
>> > } >> > >> > >> > On Wed, 4 Jan 2017 at 23:34, Manuel Valera >> wrote: >> > Thanks Dave for the quick answer, appreciate it, >> > >> > I just tried that and it didn't make a difference, any other >> suggestions ? >> > >> > Thanks, >> > Manuel >> > >> > On Wed, Jan 4, 2017 at 2:29 PM, Dave May >> wrote: >> > You need to swap the order of your function calls. >> > Call VecSetSizes() before VecSetType() >> > >> > Thanks, >> > Dave >> > >> > >> > On Wed, 4 Jan 2017 at 23:21, Manuel Valera >> wrote: >> > Hello all, happy new year, >> > >> > I'm working on parallelizing my code, it worked and provided some >> results when i just called more than one processor, but created artifacts >> because i didn't need one image of the whole program in each processor, >> conflicting with each other. >> > >> > Since the pressure solver is the main part i need in parallel im >> chosing mpi to run everything in root processor until its time to solve for >> pressure, at this point im trying to create a distributed vector using >> either >> > >> > call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr) >> > or >> > call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) >> > call VecSetType(xp,VECMPI,ierr) >> > call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) >> > >> > >> > In both cases program hangs at this point, something it never happened >> on the naive way i described before. I've made sure the global size, nbdp, >> is the same in every processor. What can be wrong? >> > >> > Thanks for your kind help, >> > >> > Manuel. >> > >> > >> > >> > >> > >> > >> > >> > >> > >> > >> > >> > >> > -- >> > What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> > -- Norbert Wiener >> > >> > >> >> > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From mvalera at mail.sdsu.edu Fri Jan 6 15:29:01 2017 From: mvalera at mail.sdsu.edu (Manuel Valera) Date: Fri, 6 Jan 2017 13:29:01 -0800 Subject: [petsc-users] Best way to scatter a Seq vector ? In-Reply-To: References: <7C6E6D1D-AA28-4889-A647-40AB8FA4ED3C@mcs.anl.gov> Message-ID: Thanks Dave, I think is interesting it never gave an error on this, after adding the vecassembly calls it still shows the same behavior, without complaining, i did: if(rankl==0)then call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) call VecAssemblyBegin(bp0,ierr) ; call VecAssemblyEnd(bp0,ierr); CHKERRQ(ierr) call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) CHKERRQ(ierr) call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) print*,"done! " CHKERRQ(ierr) endif CHKERRQ(ierr) Thanks. On Fri, Jan 6, 2017 at 12:44 PM, Dave May wrote: > > > On 6 January 2017 at 20:24, Manuel Valera wrote: > >> Great help Barry, i totally had overlooked that option (it is explicit in >> the vecscatterbegin call help page but not in vecscattercreatetozero, as i >> read later) >> >> So i used that and it works partially, it scatters te values assigned in >> root but not the rest, if i call vecscatterbegin from outside root it >> hangs, the code currently look as this: >> >> call VecScatterCreateToZero(bp2,ctr,bp0,ierr); CHKERRQ(ierr) >> >> >> call PetscObjectSetName(bp0, 'bp0:',ierr) >> >> >> if(rankl==0)then >> >> >> call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >> >> >> call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) >> >> >> > You need to call > > VecAssemblyBegin(bp0); > VecAssemblyEnd(bp0); > after your last call to VecSetValues() before you can do any operations > with bp0. 
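Dave's requirement has a simple mental model: VecSetValues() only stages insertions, and the assembly calls are what commit them — until then the vector is in an inconsistent state and PETSc refuses to operate on it. A toy sketch of that state machine (plain Python, no PETSc; the class and error text are illustrative, not PETSc's actual implementation):

```python
# Minimal model of the VecSetValues -> VecAssemblyBegin/End protocol.
# Insertions are cached; any operation before assembly is an error,
# which is roughly the failure CHKERRQ(ierr) would have reported.

class ToyVec:
    def __init__(self, n):
        self._data = [0.0] * n
        self._staged = []          # pending (index, value) pairs

    def set_values(self, indices, values):
        self._staged += list(zip(indices, values))   # staged, not yet visible

    def assembly(self):            # stands in for VecAssemblyBegin + End
        for i, v in self._staged:
            self._data[i] = v
        self._staged.clear()

    def view(self):                # stands in for VecView / any operation
        if self._staged:
            raise RuntimeError("vector not assembled: missing assembly call")
        return list(self._data)

bp0 = ToyVec(4)
bp0.set_values([0, 1, 2, 3], [1.0, 2.0, 4.0, 3.0])
bp0.assembly()                     # without this line, view() raises
assert bp0.view() == [1.0, 2.0, 4.0, 3.0]
```

The values mirror the bp0 vector from the thread; the point is only the ordering constraint, not the storage details.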
> > With your current code, the call to VecView should produce an error if you > used the error checking macro CHKERRQ(ierr) (as should VecScatter{Begin,End} > > Thanks, > Dave > > >> call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) >> >> call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) >> >> print*,"done! " >> >> CHKERRQ(ierr) >> >> >> endif >> >> >> ! call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) >> >> ! call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) >> >> >> call VecView(bp2,PETSC_VIEWER_STDOUT_WORLD,ierr) >> >> call PetscBarrier(PETSC_NULL_OBJECT,ierr) >> >> >> >> call exit() >> >> >> >> >> And the output is: (with bp the right answer) >> >> >> Vec Object:bp: 2 MPI processes >> >> type: mpi >> >> Process [0] >> >> 1. >> >> 2. >> >> Process [1] >> >> 4. >> >> 3. >> >> Vec Object:bp2: 2 MPI processes *(before scatter)* >> >> type: mpi >> >> Process [0] >> >> 0. >> >> 0. >> >> Process [1] >> >> 0. >> >> 0. >> >> Vec Object:bp0: 1 MPI processes >> >> type: seq >> >> 1. >> >> 2. >> >> 4. >> >> 3. >> >> done! >> >> Vec Object:bp2: 2 MPI processes *(after scatter)* >> >> type: mpi >> >> Process [0] >> >> 1. >> >> 2. >> >> *Process [1]* >> >> *0.* >> >> *0.* >> >> >> >> >> >> Thanks inmensely for your help, >> >> >> Manuel >> >> >> >> On Thu, Jan 5, 2017 at 4:39 PM, Barry Smith wrote: >> >>> >>> > On Jan 5, 2017, at 6:21 PM, Manuel Valera >>> wrote: >>> > >>> > Hello Devs is me again, >>> > >>> > I'm trying to distribute a vector to all called processes, the vector >>> would be originally in root as a sequential vector and i would like to >>> scatter it, what would the best call to do this ? >>> > >>> > I already know how to gather a distributed vector to root with >>> VecScatterCreateToZero, this would be the inverse operation, >>> >>> Use the same VecScatter object but with SCATTER_REVERSE, not you need >>> to reverse the two vector arguments as well. 
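The forward/reverse semantics Barry describes can be mimicked in a toy model (plain Python, no PETSc or MPI; all names are illustrative): a scatter created as gather-to-zero maps each rank's local chunk into the sequential root vector, and running it with SCATTER_REVERSE applies the inverse map, pushing the root vector back out to the per-rank chunks.

```python
# Toy model of VecScatterCreateToZero + SCATTER_REVERSE (illustrative only;
# real PETSc scatters are MPI-parallel objects tied to a communicator).

def ownership_ranges(global_size, nranks):
    """Split like PETSC_DECIDE: the first (global_size % nranks) ranks
    each get one extra entry."""
    base, extra = divmod(global_size, nranks)
    sizes = [base + (1 if r < extra else 0) for r in range(nranks)]
    starts = [sum(sizes[:r]) for r in range(nranks)]
    return list(zip(starts, sizes))

def scatter_forward(local_chunks):
    """Gather every rank's local chunk into a sequential vector on rank 0."""
    return [x for chunk in local_chunks for x in chunk]

def scatter_reverse(root_vec, nranks):
    """Inverse map: distribute the sequential root vector back to chunks."""
    return [root_vec[s:s + n] for s, n in ownership_ranges(len(root_vec), nranks)]

chunks = [[1.0, 2.0], [4.0, 3.0]]         # two "ranks", two entries each
bp0 = scatter_forward(chunks)              # sequential vector on root: 1. 2. 4. 3.
assert scatter_reverse(bp0, 2) == chunks   # reverse scatter restores the layout
```

The 1. 2. 4. 3. values match the bp0 printed earlier in the thread; swapping the two vector arguments in the real VecScatterBegin/End call corresponds to choosing which direction of this map is applied.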
>>> >>> >>> > i'm currently trying with VecScatterCreate() and as of now im doing >>> the following: >>> > >>> > >>> > if(rank==0)then >>> > >>> > >>> > call VecCreate(PETSC_COMM_SELF,bp0,ierr); CHKERRQ(ierr) !if i >>> use WORLD >>> > !freezes >>> in SetSizes >>> > call VecSetSizes(bp0,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) >>> > call VecSetType(bp0,VECSEQ,ierr) >>> > call VecSetFromOptions(bp0,ierr); CHKERRQ(ierr) >>> > >>> > >>> > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>> > >>> > !call VecSet(bp0,5.0D0,ierr); CHKERRQ(ierr) >>> > >>> > >>> > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) >>> > >>> > call VecAssemblyBegin(bp0,ierr) ; call VecAssemblyEnd(bp0,ierr) >>> !rhs >>> > >>> > do i=0,nbdp-1,1 >>> > ind(i+1) = i >>> > enddo >>> > >>> > call ISCreateGeneral(PETSC_COMM_SELF,nbdp,ind,PETSC_COPY_VALUES,l >>> ocis,ierr) >>> > >>> > !call VecScatterCreate(bp0,PETSC_NULL_OBJECT,bp2,is,ctr,ierr) !if >>> i use SELF >>> > >>> !freezes here. >>> > >>> > call VecScatterCreate(bp0,locis,bp2,PETSC_NULL_OBJECT,ctr,ierr) >>> > >>> > endif >>> > >>> > bp2 being the receptor MPI vector to scatter to >>> > >>> > But it freezes in VecScatterCreate when trying to use more than one >>> processor, what would be a better approach ? 
>>> > >>> > >>> > Thanks once again, >>> > >>> > Manuel >>> > >>> > >>> > >>> > >>> > >>> > >>> > >>> > >>> > >>> > >>> > On Wed, Jan 4, 2017 at 3:30 PM, Manuel Valera >>> wrote: >>> > Thanks i had no idea how to debug and read those logs, that solved >>> this issue at least (i was sending a message from root to everyone else, >>> but trying to catch from everyone else including root) >>> > >>> > Until next time, many thanks, >>> > >>> > Manuel >>> > >>> > On Wed, Jan 4, 2017 at 3:23 PM, Matthew Knepley >>> wrote: >>> > On Wed, Jan 4, 2017 at 5:21 PM, Manuel Valera >>> wrote: >>> > I did a PetscBarrier just before calling the vicariate routine and im >>> pretty sure im calling it from every processor, code looks like this: >>> > >>> > From the gdb trace. >>> > >>> > Proc 0: Is in some MPI routine you call yourself, line 113 >>> > >>> > Proc 1: Is in VecCreate(), line 130 >>> > >>> > You need to fix your communication code. >>> > >>> > Matt >>> > >>> > call PetscBarrier(PETSC_NULL_OBJECT,ierr) >>> > >>> > print*,'entering POInit from',rank >>> > !call exit() >>> > >>> > call PetscObjsInit() >>> > >>> > >>> > And output gives: >>> > >>> > entering POInit from 0 >>> > entering POInit from 1 >>> > entering POInit from 2 >>> > entering POInit from 3 >>> > >>> > >>> > Still hangs in the same way, >>> > >>> > Thanks, >>> > >>> > Manuel >>> > >>> > >>> > >>> > On Wed, Jan 4, 2017 at 2:55 PM, Manuel Valera >>> wrote: >>> > Thanks for the answers ! >>> > >>> > heres the screenshot of what i got from bt in gdb (great hint in how >>> to debug in petsc, didn't know that) >>> > >>> > I don't really know what to look at here, >>> > >>> > Thanks, >>> > >>> > Manuel >>> > >>> > On Wed, Jan 4, 2017 at 2:39 PM, Dave May >>> wrote: >>> > Are you certain ALL ranks in PETSC_COMM_WORLD call these function(s). >>> These functions cannot be inside if statements like >>> > if (rank == 0){ >>> > VecCreateMPI(...) 
>>> > } >>> > >>> > >>> > On Wed, 4 Jan 2017 at 23:34, Manuel Valera >>> wrote: >>> > Thanks Dave for the quick answer, appreciate it, >>> > >>> > I just tried that and it didn't make a difference, any other >>> suggestions ? >>> > >>> > Thanks, >>> > Manuel >>> > >>> > On Wed, Jan 4, 2017 at 2:29 PM, Dave May >>> wrote: >>> > You need to swap the order of your function calls. >>> > Call VecSetSizes() before VecSetType() >>> > >>> > Thanks, >>> > Dave >>> > >>> > >>> > On Wed, 4 Jan 2017 at 23:21, Manuel Valera >>> wrote: >>> > Hello all, happy new year, >>> > >>> > I'm working on parallelizing my code, it worked and provided some >>> results when i just called more than one processor, but created artifacts >>> because i didn't need one image of the whole program in each processor, >>> conflicting with each other. >>> > >>> > Since the pressure solver is the main part i need in parallel im >>> chosing mpi to run everything in root processor until its time to solve for >>> pressure, at this point im trying to create a distributed vector using >>> either >>> > >>> > call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr) >>> > or >>> > call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) >>> > call VecSetType(xp,VECMPI,ierr) >>> > call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) >>> > >>> > >>> > In both cases program hangs at this point, something it never happened >>> on the naive way i described before. I've made sure the global size, nbdp, >>> is the same in every processor. What can be wrong? >>> > >>> > Thanks for your kind help, >>> > >>> > Manuel. >>> > >>> > >>> > >>> > >>> > >>> > >>> > >>> > >>> > >>> > >>> > >>> > >>> > -- >>> > What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> > -- Norbert Wiener >>> > >>> > >>> >>> >> > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From bsmith at mcs.anl.gov Fri Jan 6 15:53:36 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 6 Jan 2017 15:53:36 -0600 Subject: [petsc-users] Best way to scatter a Seq vector ? In-Reply-To: References: <7C6E6D1D-AA28-4889-A647-40AB8FA4ED3C@mcs.anl.gov> Message-ID: <6947F807-9843-4656-8898-039CFBC21646@mcs.anl.gov> Take the scatter out of the if () since everyone does it and get rid of the VecView(). Does this work? If not where is it hanging? > On Jan 6, 2017, at 3:29 PM, Manuel Valera wrote: > > Thanks Dave, > > I think is interesting it never gave an error on this, after adding the vecassembly calls it still shows the same behavior, without complaining, i did: > > if(rankl==0)then > > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) > call VecAssemblyBegin(bp0,ierr) ; call VecAssemblyEnd(bp0,ierr); > CHKERRQ(ierr) > endif > > > call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) > call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) > print*,"done! " > CHKERRQ(ierr) > > > CHKERRQ(ierr) > > > Thanks. > > On Fri, Jan 6, 2017 at 12:44 PM, Dave May wrote: > > > On 6 January 2017 at 20:24, Manuel Valera wrote: > Great help Barry, i totally had overlooked that option (it is explicit in the vecscatterbegin call help page but not in vecscattercreatetozero, as i read later) > > So i used that and it works partially, it scatters te values assigned in root but not the rest, if i call vecscatterbegin from outside root it hangs, the code currently look as this: > > call VecScatterCreateToZero(bp2,ctr,bp0,ierr); CHKERRQ(ierr) > > call PetscObjectSetName(bp0, 'bp0:',ierr) > > if(rankl==0)then > > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) > > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) > > > You need to call > > VecAssemblyBegin(bp0); > VecAssemblyEnd(bp0); > after your last call to VecSetValues() before you can do any operations with bp0. 
> > With your current code, the call to VecView should produce an error if you used the error checking macro CHKERRQ(ierr) (as should VecScatter{Begin,End} > > Thanks, > Dave > > > call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) > call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) > print*,"done! " > CHKERRQ(ierr) > > endif > > ! call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) > ! call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) > > call VecView(bp2,PETSC_VIEWER_STDOUT_WORLD,ierr) > > call PetscBarrier(PETSC_NULL_OBJECT,ierr) > > call exit() > > > > And the output is: (with bp the right answer) > > Vec Object:bp: 2 MPI processes > type: mpi > Process [0] > 1. > 2. > Process [1] > 4. > 3. > Vec Object:bp2: 2 MPI processes (before scatter) > type: mpi > Process [0] > 0. > 0. > Process [1] > 0. > 0. > Vec Object:bp0: 1 MPI processes > type: seq > 1. > 2. > 4. > 3. > done! > Vec Object:bp2: 2 MPI processes (after scatter) > type: mpi > Process [0] > 1. > 2. > Process [1] > 0. > 0. > > > > > Thanks inmensely for your help, > > Manuel > > > On Thu, Jan 5, 2017 at 4:39 PM, Barry Smith wrote: > > > On Jan 5, 2017, at 6:21 PM, Manuel Valera wrote: > > > > Hello Devs is me again, > > > > I'm trying to distribute a vector to all called processes, the vector would be originally in root as a sequential vector and i would like to scatter it, what would the best call to do this ? > > > > I already know how to gather a distributed vector to root with VecScatterCreateToZero, this would be the inverse operation, > > Use the same VecScatter object but with SCATTER_REVERSE, not you need to reverse the two vector arguments as well. 
> > > > i'm currently trying with VecScatterCreate() and as of now im doing the following: > > > > > > if(rank==0)then > > > > > > call VecCreate(PETSC_COMM_SELF,bp0,ierr); CHKERRQ(ierr) !if i use WORLD > > !freezes in SetSizes > > call VecSetSizes(bp0,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) > > call VecSetType(bp0,VECSEQ,ierr) > > call VecSetFromOptions(bp0,ierr); CHKERRQ(ierr) > > > > > > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) > > > > !call VecSet(bp0,5.0D0,ierr); CHKERRQ(ierr) > > > > > > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) > > > > call VecAssemblyBegin(bp0,ierr) ; call VecAssemblyEnd(bp0,ierr) !rhs > > > > do i=0,nbdp-1,1 > > ind(i+1) = i > > enddo > > > > call ISCreateGeneral(PETSC_COMM_SELF,nbdp,ind,PETSC_COPY_VALUES,locis,ierr) > > > > !call VecScatterCreate(bp0,PETSC_NULL_OBJECT,bp2,is,ctr,ierr) !if i use SELF > > !freezes here. > > > > call VecScatterCreate(bp0,locis,bp2,PETSC_NULL_OBJECT,ctr,ierr) > > > > endif > > > > bp2 being the receptor MPI vector to scatter to > > > > But it freezes in VecScatterCreate when trying to use more than one processor, what would be a better approach ? > > > > > > Thanks once again, > > > > Manuel > > > > > > > > > > > > > > > > > > > > > > On Wed, Jan 4, 2017 at 3:30 PM, Manuel Valera wrote: > > Thanks i had no idea how to debug and read those logs, that solved this issue at least (i was sending a message from root to everyone else, but trying to catch from everyone else including root) > > > > Until next time, many thanks, > > > > Manuel > > > > On Wed, Jan 4, 2017 at 3:23 PM, Matthew Knepley wrote: > > On Wed, Jan 4, 2017 at 5:21 PM, Manuel Valera wrote: > > I did a PetscBarrier just before calling the vicariate routine and im pretty sure im calling it from every processor, code looks like this: > > > > From the gdb trace. > > > > Proc 0: Is in some MPI routine you call yourself, line 113 > > > > Proc 1: Is in VecCreate(), line 130 > > > > You need to fix your communication code. 
> > > > Matt > > > > call PetscBarrier(PETSC_NULL_OBJECT,ierr) > > > > print*,'entering POInit from',rank > > !call exit() > > > > call PetscObjsInit() > > > > > > And output gives: > > > > entering POInit from 0 > > entering POInit from 1 > > entering POInit from 2 > > entering POInit from 3 > > > > > > Still hangs in the same way, > > > > Thanks, > > > > Manuel > > > > > > > > On Wed, Jan 4, 2017 at 2:55 PM, Manuel Valera wrote: > > Thanks for the answers ! > > > > heres the screenshot of what i got from bt in gdb (great hint in how to debug in petsc, didn't know that) > > > > I don't really know what to look at here, > > > > Thanks, > > > > Manuel > > > > On Wed, Jan 4, 2017 at 2:39 PM, Dave May wrote: > > Are you certain ALL ranks in PETSC_COMM_WORLD call these function(s). These functions cannot be inside if statements like > > if (rank == 0){ > > VecCreateMPI(...) > > } > > > > > > On Wed, 4 Jan 2017 at 23:34, Manuel Valera wrote: > > Thanks Dave for the quick answer, appreciate it, > > > > I just tried that and it didn't make a difference, any other suggestions ? > > > > Thanks, > > Manuel > > > > On Wed, Jan 4, 2017 at 2:29 PM, Dave May wrote: > > You need to swap the order of your function calls. > > Call VecSetSizes() before VecSetType() > > > > Thanks, > > Dave > > > > > > On Wed, 4 Jan 2017 at 23:21, Manuel Valera wrote: > > Hello all, happy new year, > > > > I'm working on parallelizing my code, it worked and provided some results when i just called more than one processor, but created artifacts because i didn't need one image of the whole program in each processor, conflicting with each other. 
> > > > Since the pressure solver is the main part i need in parallel im chosing mpi to run everything in root processor until its time to solve for pressure, at this point im trying to create a distributed vector using either > > > > call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr) > > or > > call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) > > call VecSetType(xp,VECMPI,ierr) > > call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) > > > > > > In both cases program hangs at this point, something it never happened on the naive way i described before. I've made sure the global size, nbdp, is the same in every processor. What can be wrong? > > > > Thanks for your kind help, > > > > Manuel. > > > > > > > > > > > > > > > > > > > > > > > > > > -- > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > > -- Norbert Wiener > > > > > > > > From mvalera at mail.sdsu.edu Fri Jan 6 15:58:36 2017 From: mvalera at mail.sdsu.edu (Manuel Valera) Date: Fri, 6 Jan 2017 13:58:36 -0800 Subject: [petsc-users] Best way to scatter a Seq vector ? In-Reply-To: <6947F807-9843-4656-8898-039CFBC21646@mcs.anl.gov> References: <7C6E6D1D-AA28-4889-A647-40AB8FA4ED3C@mcs.anl.gov> <6947F807-9843-4656-8898-039CFBC21646@mcs.anl.gov> Message-ID: Awesome, that did it, thanks once again. On Fri, Jan 6, 2017 at 1:53 PM, Barry Smith wrote: > > Take the scatter out of the if () since everyone does it and get rid of > the VecView(). > > Does this work? If not where is it hanging? 
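Barry's one-line fix works because VecScatterBegin/End are collective operations: every rank in the communicator must enter the call, so guarding the scatter with `if (rank == 0)` leaves rank 0 waiting forever for the others. A pure-Python stand-in (no MPI; the function and error message are illustrative) that checks participation instead of blocking:

```python
# Why the guarded scatter hangs: a collective call blocks until every rank
# in the communicator has entered it. Here we raise instead of deadlocking.

def collective_scatter(calling_ranks, nranks):
    """Return True if all ranks entered the collective; otherwise report
    the ranks that would leave the others blocked forever."""
    missing = [r for r in range(nranks) if r not in calling_ranks]
    if missing:
        raise RuntimeError(f"deadlock: ranks {missing} never called the scatter")
    return True

# Wrong pattern from the thread (scatter inside `if rank == 0`):
try:
    collective_scatter({0}, 2)
except RuntimeError as e:
    print(e)            # deadlock: ranks [1] never called the scatter

# Barry's fix: move the scatter outside the rank guard so both ranks call it.
assert collective_scatter({0, 1}, 2)
```

Only the rank-local work (VecSetValues on the sequential bp0, and its assembly) belongs inside the rank-0 branch; the scatter, and any collective viewer, must sit outside it.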
> > > > On Jan 6, 2017, at 3:29 PM, Manuel Valera wrote: > > > > Thanks Dave, > > > > I think is interesting it never gave an error on this, after adding the > vecassembly calls it still shows the same behavior, without complaining, i > did: > > > > if(rankl==0)then > > > > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) > > call VecAssemblyBegin(bp0,ierr) ; call VecAssemblyEnd(bp0,ierr); > > CHKERRQ(ierr) > > > endif > > > > > > call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE, > ierr) > > call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) > > print*,"done! " > > CHKERRQ(ierr) > > > > > > CHKERRQ(ierr) > > > > > > Thanks. > > > > On Fri, Jan 6, 2017 at 12:44 PM, Dave May > wrote: > > > > > > On 6 January 2017 at 20:24, Manuel Valera wrote: > > Great help Barry, i totally had overlooked that option (it is explicit > in the vecscatterbegin call help page but not in vecscattercreatetozero, as > i read later) > > > > So i used that and it works partially, it scatters te values assigned in > root but not the rest, if i call vecscatterbegin from outside root it > hangs, the code currently look as this: > > > > call VecScatterCreateToZero(bp2,ctr,bp0,ierr); CHKERRQ(ierr) > > > > call PetscObjectSetName(bp0, 'bp0:',ierr) > > > > if(rankl==0)then > > > > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) > > > > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) > > > > > > You need to call > > > > VecAssemblyBegin(bp0); > > VecAssemblyEnd(bp0); > > after your last call to VecSetValues() before you can do any operations > with bp0. > > > > With your current code, the call to VecView should produce an error if > you used the error checking macro CHKERRQ(ierr) (as should > VecScatter{Begin,End} > > > > Thanks, > > Dave > > > > > > call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE, > ierr) > > call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) > > print*,"done! " > > CHKERRQ(ierr) > > > > endif > > > > ! 
call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE, > ierr) > > ! call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) > > > > call VecView(bp2,PETSC_VIEWER_STDOUT_WORLD,ierr) > > > > call PetscBarrier(PETSC_NULL_OBJECT,ierr) > > > > call exit() > > > > > > > > And the output is: (with bp the right answer) > > > > Vec Object:bp: 2 MPI processes > > type: mpi > > Process [0] > > 1. > > 2. > > Process [1] > > 4. > > 3. > > Vec Object:bp2: 2 MPI processes (before scatter) > > type: mpi > > Process [0] > > 0. > > 0. > > Process [1] > > 0. > > 0. > > Vec Object:bp0: 1 MPI processes > > type: seq > > 1. > > 2. > > 4. > > 3. > > done! > > Vec Object:bp2: 2 MPI processes (after scatter) > > type: mpi > > Process [0] > > 1. > > 2. > > Process [1] > > 0. > > 0. > > > > > > > > > > Thanks inmensely for your help, > > > > Manuel > > > > > > On Thu, Jan 5, 2017 at 4:39 PM, Barry Smith wrote: > > > > > On Jan 5, 2017, at 6:21 PM, Manuel Valera > wrote: > > > > > > Hello Devs is me again, > > > > > > I'm trying to distribute a vector to all called processes, the vector > would be originally in root as a sequential vector and i would like to > scatter it, what would the best call to do this ? > > > > > > I already know how to gather a distributed vector to root with > VecScatterCreateToZero, this would be the inverse operation, > > > > Use the same VecScatter object but with SCATTER_REVERSE, not you need > to reverse the two vector arguments as well. 
> > > > > > > i'm currently trying with VecScatterCreate() and as of now im doing > the following: > > > > > > > > > if(rank==0)then > > > > > > > > > call VecCreate(PETSC_COMM_SELF,bp0,ierr); CHKERRQ(ierr) !if i > use WORLD > > > !freezes > in SetSizes > > > call VecSetSizes(bp0,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) > > > call VecSetType(bp0,VECSEQ,ierr) > > > call VecSetFromOptions(bp0,ierr); CHKERRQ(ierr) > > > > > > > > > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) > > > > > > !call VecSet(bp0,5.0D0,ierr); CHKERRQ(ierr) > > > > > > > > > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) > > > > > > call VecAssemblyBegin(bp0,ierr) ; call VecAssemblyEnd(bp0,ierr) > !rhs > > > > > > do i=0,nbdp-1,1 > > > ind(i+1) = i > > > enddo > > > > > > call ISCreateGeneral(PETSC_COMM_SELF,nbdp,ind,PETSC_COPY_ > VALUES,locis,ierr) > > > > > > !call VecScatterCreate(bp0,PETSC_NULL_OBJECT,bp2,is,ctr,ierr) !if > i use SELF > > > > !freezes here. > > > > > > call VecScatterCreate(bp0,locis,bp2,PETSC_NULL_OBJECT,ctr,ierr) > > > > > > endif > > > > > > bp2 being the receptor MPI vector to scatter to > > > > > > But it freezes in VecScatterCreate when trying to use more than one > processor, what would be a better approach ? > > > > > > > > > Thanks once again, > > > > > > Manuel > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > On Wed, Jan 4, 2017 at 3:30 PM, Manuel Valera > wrote: > > > Thanks i had no idea how to debug and read those logs, that solved > this issue at least (i was sending a message from root to everyone else, > but trying to catch from everyone else including root) > > > > > > Until next time, many thanks, > > > > > > Manuel > > > > > > On Wed, Jan 4, 2017 at 3:23 PM, Matthew Knepley > wrote: > > > On Wed, Jan 4, 2017 at 5:21 PM, Manuel Valera > wrote: > > > I did a PetscBarrier just before calling the vicariate routine and im > pretty sure im calling it from every processor, code looks like this: > > > > > > From the gdb trace. 
> > > > > > Proc 0: Is in some MPI routine you call yourself, line 113 > > > > > > Proc 1: Is in VecCreate(), line 130 > > > > > > You need to fix your communication code. > > > > > > Matt > > > > > > call PetscBarrier(PETSC_NULL_OBJECT,ierr) > > > > > > print*,'entering POInit from',rank > > > !call exit() > > > > > > call PetscObjsInit() > > > > > > > > > And output gives: > > > > > > entering POInit from 0 > > > entering POInit from 1 > > > entering POInit from 2 > > > entering POInit from 3 > > > > > > > > > Still hangs in the same way, > > > > > > Thanks, > > > > > > Manuel > > > > > > > > > > > > On Wed, Jan 4, 2017 at 2:55 PM, Manuel Valera > wrote: > > > Thanks for the answers ! > > > > > > heres the screenshot of what i got from bt in gdb (great hint in how > to debug in petsc, didn't know that) > > > > > > I don't really know what to look at here, > > > > > > Thanks, > > > > > > Manuel > > > > > > On Wed, Jan 4, 2017 at 2:39 PM, Dave May > wrote: > > > Are you certain ALL ranks in PETSC_COMM_WORLD call these function(s). > These functions cannot be inside if statements like > > > if (rank == 0){ > > > VecCreateMPI(...) > > > } > > > > > > > > > On Wed, 4 Jan 2017 at 23:34, Manuel Valera > wrote: > > > Thanks Dave for the quick answer, appreciate it, > > > > > > I just tried that and it didn't make a difference, any other > suggestions ? > > > > > > Thanks, > > > Manuel > > > > > > On Wed, Jan 4, 2017 at 2:29 PM, Dave May > wrote: > > > You need to swap the order of your function calls. > > > Call VecSetSizes() before VecSetType() > > > > > > Thanks, > > > Dave > > > > > > > > > On Wed, 4 Jan 2017 at 23:21, Manuel Valera > wrote: > > > Hello all, happy new year, > > > > > > I'm working on parallelizing my code, it worked and provided some > results when i just called more than one processor, but created artifacts > because i didn't need one image of the whole program in each processor, > conflicting with each other. 
> > > > > > Since the pressure solver is the main part i need in parallel im > chosing mpi to run everything in root processor until its time to solve for > pressure, at this point im trying to create a distributed vector using > either > > > > > > call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr) > > > or > > > call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) > > > call VecSetType(xp,VECMPI,ierr) > > > call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) > > > > > > > > > In both cases program hangs at this point, something it never happened > on the naive way i described before. I've made sure the global size, nbdp, > is the same in every processor. What can be wrong? > > > > > > Thanks for your kind help, > > > > > > Manuel. > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > -- > > > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > > > -- Norbert Wiener > > > > > > > > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From rpgwars at wp.pl Fri Jan 6 16:31:49 2017 From: rpgwars at wp.pl (=?ISO-8859-2?Q?=A3ukasz_Kasza?=) Date: Fri, 06 Jan 2017 23:31:49 +0100 Subject: [petsc-users] Suspicious long call to VecAXPY Message-ID: <58701ad557d9a2.65409440@wp.pl> Dear PETSc Users, Please consider the following 2 snippets which do exactly the same (calculate a sum of two vectors): 1. VecAXPY(amg_level_x[level],1.0,amg_level_residuals[level]); 2. VecGetArray(amg_level_residuals[level], &values); VecSetValues(amg_level_x[level],size,indices,values,ADD_VALUES); VecRestoreArray(amg_level_residuals[level], &values); VecAssemblyBegin(amg_level_x[level]); VecAssemblyEnd(amg_level_x[level]); In my program I have both of the snippets executed in a loop. The problem with the first one is that the longer the program goes the longer it takes to execute it. 
At the same time the execution time of the second snippet is more or less constant. I don't know why but after a few hundreds of iterations VecAXPY takes more than MatMult on the matrix and vector of the same size and after that it still grows! Always returning a correct value though. I am using 4.5.3 version, the vectors are sequential. VecAXPY in such case is just a wrapper for blas, do you have any idea why the execution time of this function constantly grows? Best regards. From knepley at gmail.com Fri Jan 6 16:37:19 2017 From: knepley at gmail.com (Matthew Knepley) Date: Fri, 6 Jan 2017 16:37:19 -0600 Subject: [petsc-users] Suspicious long call to VecAXPY In-Reply-To: <58701ad557d9a2.65409440@wp.pl> References: <58701ad557d9a2.65409440@wp.pl> Message-ID: On Fri, Jan 6, 2017 at 4:31 PM, ?ukasz Kasza wrote: > > > Dear PETSc Users, > > Please consider the following 2 snippets which do exactly the same > (calculate a sum of two vectors): > 1. > VecAXPY(amg_level_x[level],1.0,amg_level_residuals[level]); > > 2. > VecGetArray(amg_level_residuals[level], &values); > VecSetValues(amg_level_x[level],size,indices,values,ADD_VALUES); > VecRestoreArray(amg_level_residuals[level], &values); > VecAssemblyBegin(amg_level_x[level]); > VecAssemblyEnd(amg_level_x[level]); > > In my program I have both of the snippets executed in a loop. The problem > with the first one is that the longer the program goes the longer it takes > to execute it. At the same time the execution time of the second snippet is > more or less constant. I don't know why but after a few hundreds of > iterations VecAXPY takes more than MatMult on the matrix and vector of the > same size and after that it still grows! Always returning a correct value > though. I am using 4.5.3 version, the vectors are > sequential. VecAXPY in such case is just a wrapper for blas, do you have > any idea why the execution time of this function constantly grows? > 2 should be MUCH slower than 1. Version 4.5.3 of what? 
I cannot understand what would make this happen. Can you send the output of -log_view for two different run lengths? Matt > Best regards. > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From dave.mayhem23 at gmail.com Fri Jan 6 16:38:37 2017 From: dave.mayhem23 at gmail.com (Dave May) Date: Fri, 6 Jan 2017 22:38:37 +0000 Subject: [petsc-users] Suspicious long call to VecAXPY In-Reply-To: <58701ad557d9a2.65409440@wp.pl> References: <58701ad557d9a2.65409440@wp.pl> Message-ID: On 6 January 2017 at 22:31, ?ukasz Kasza wrote: > > > Dear PETSc Users, > > Please consider the following 2 snippets which do exactly the same > (calculate a sum of two vectors): > 1. > VecAXPY(amg_level_x[level],1.0,amg_level_residuals[level]); > > 2. > VecGetArray(amg_level_residuals[level], &values); > VecSetValues(amg_level_x[level],size,indices,values,ADD_VALUES); > VecRestoreArray(amg_level_residuals[level], &values); > VecAssemblyBegin(amg_level_x[level]); > VecAssemblyEnd(amg_level_x[level]); > > In my program I have both of the snippets executed in a loop. The problem > with the first one is that the longer the program goes the longer it takes > to execute it. At the same time the execution time of the second snippet is > more or less constant. I don't know why but after a few hundreds of > iterations VecAXPY takes more than MatMult on the matrix and vector of the > same size and after that it still grows! How did you profile this? > Always returning a correct value though. I am using 4.5.3 version, Which version of PETSc are you using?? Current release is 3.7.5 > the vectors are > sequential. VecAXPY in such case is just a wrapper for blas, do you have > any idea why the execution time of this function constantly grows? 
> Maybe your code is leaking memory and ultimately your OS starts swapping? Please send the code. Thanks, Dave > > Best regards. > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Fri Jan 6 16:40:22 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 6 Jan 2017 16:40:22 -0600 Subject: [petsc-users] Suspicious long call to VecAXPY In-Reply-To: <58701ad557d9a2.65409440@wp.pl> References: <58701ad557d9a2.65409440@wp.pl> Message-ID: The second one should absolutely be slower than the first (because it actually iterations through the indices you pass in with an indirection) and the first should not get slower the more you run it. Depending on your environment I recommend you using a profiling tool on the code and look at where it is spending its time within VecAXPY. The basic Linux/Unix profiling tool is gprof, but you can use Instruments on macOS (part of Xcode) or Intel's vtune if you have that. You can also try a different BLAS to see if that matters. For example --download-fblaslapack or don't use MKL if you are using it. Barry > On Jan 6, 2017, at 4:31 PM, ?ukasz Kasza wrote: > > > > Dear PETSc Users, > > Please consider the following 2 snippets which do exactly the same (calculate a sum of two vectors): > 1. > VecAXPY(amg_level_x[level],1.0,amg_level_residuals[level]); > > 2. > VecGetArray(amg_level_residuals[level], &values); > VecSetValues(amg_level_x[level],size,indices,values,ADD_VALUES); > VecRestoreArray(amg_level_residuals[level], &values); > VecAssemblyBegin(amg_level_x[level]); > VecAssemblyEnd(amg_level_x[level]); > > In my program I have both of the snippets executed in a loop. The problem with the first one is that the longer the program goes the longer it takes to execute it. At the same time the execution time of the second snippet is more or less constant. 
I don't know why but after a few hundreds of iterations VecAXPY takes more than MatMult on the matrix and vector of the same size and after that it still grows! Always returning a correct value though. I am using 4.5.3 version, the vectors are > sequential. VecAXPY in such case is just a wrapper for blas, do you have any idea why the execution time of this function constantly grows? > > Best regards. > > From bsmith at mcs.anl.gov Sat Jan 7 14:50:19 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sat, 7 Jan 2017 14:50:19 -0600 Subject: [petsc-users] Suspicious long call to VecAXPY In-Reply-To: <5871514a42b654.34314177@wp.pl> References: <5871514a42b654.34314177@wp.pl> Message-ID: <2226B8CF-58F5-48B9-8D03-21594AC04DD4@mcs.anl.gov> > On Jan 7, 2017, at 2:36 PM, ?ukasz Kasza wrote: > > I am unable to locate the source of the issue. It is the same for problem for the newest petsc version also. Gprof does not profile shared libraries (petsc) Thats odd. ./configure with --with-shared-libraries=0 to get non-shared library version of PETSc. > and there is nothing suspicious in the profile of my code. Sprof does not work due to known issue. When I run my code in callgrind this issue does not occur i.e. VexAXPY takes approximately the same time on every call! Nothing meaningful in the petsc log also. I will have to find a workaround or try another blas as you mentioned. > > Dnia Pi?tek, 6 Stycznia 2017 23:40 Barry Smith napisa?(a) >> >> The second one should absolutely be slower than the first (because it actually iterations through the indices you pass in with an indirection) and the first should not get slower the more you run it. >> >> Depending on your environment I recommend you using a profiling tool on the code and look at where it is spending its time within VecAXPY. The basic Linux/Unix profiling tool is gprof, but you can use Instruments on macOS (part of Xcode) or Intel's vtune if you have that. 
>> >> >> You can also try a different BLAS to see if that matters. For example --download-fblaslapack or don't use MKL if you are using it. >> >> Barry >> >>> On Jan 6, 2017, at 4:31 PM, ?ukasz Kasza wrote: >>> >>> >>> >>> Dear PETSc Users, >>> >>> Please consider the following 2 snippets which do exactly the same (calculate a sum of two vectors): >>> 1. >>> VecAXPY(amg_level_x[level],1.0,amg_level_residuals[level]); >>> >>> 2. >>> VecGetArray(amg_level_residuals[level], &values); >>> VecSetValues(amg_level_x[level],size,indices,values,ADD_VALUES); >>> VecRestoreArray(amg_level_residuals[level], &values); >>> VecAssemblyBegin(amg_level_x[level]); >>> VecAssemblyEnd(amg_level_x[level]); >>> >>> In my program I have both of the snippets executed in a loop. The problem with the first one is that the longer the program goes the longer it takes to execute it. At the same time the execution time of the second snippet is more or less constant. I don't know why but after a few hundreds of iterations VecAXPY takes more than MatMult on the matrix and vector of the same size and after that it still grows! Always returning a correct value though. I am using 4.5.3 version, the vectors are >>> sequential. VecAXPY in such case is just a wrapper for blas, do you have any idea why the execution time of this function constantly grows? >>> >>> Best regards. 
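[Editor's note: the relative behaviour of the two snippets in this thread can be modelled outside PETSc. The following is a plain-Python sketch, not PETSc code; the names `axpy` and `add_values` are illustrative. Snippet 1 is a single direct pass over both arrays (the operation VecAXPY hands to BLAS daxpy), while snippet 2 adds through an index indirection before an assembly step, which is why Barry expects snippet 2 to be the slower of the two.]

```python
def axpy(y, alpha, x):
    # Snippet 1: y += alpha*x, one direct pass over both arrays
    # (the operation VecAXPY delegates to BLAS daxpy).
    for i in range(len(y)):
        y[i] += alpha * x[i]

def add_values(y, indices, values):
    # Snippet 2: add through an index indirection, as
    # VecSetValues(..., ADD_VALUES) does before VecAssemblyBegin/End.
    for i, v in zip(indices, values):
        y[i] += v

x = [1.0, 2.0, 3.0]
y1 = [10.0, 20.0, 30.0]
y2 = list(y1)  # same starting vector for both paths

axpy(y1, 1.0, x)
add_values(y2, range(len(x)), x)

# both paths compute the same sum
assert y1 == y2 == [11.0, 22.0, 33.0]
```

Since both paths compute the same result and the indexed path does strictly more work per element, snippet 1 turning out slower in practice points away from the arithmetic itself — which is why the replies above suspect the BLAS build (MKL) or memory behaviour rather than the algorithm.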
>>> >>> > > > From mvalera at mail.sdsu.edu Sat Jan 7 15:32:57 2017 From: mvalera at mail.sdsu.edu (Manuel Valera) Date: Sat, 7 Jan 2017 13:32:57 -0800 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve Message-ID: Hi Devs, hope you are having a great weekend, I could finally parallelize my linear solver and implement it into the rest of the code in a way that only the linear system is solved in parallel, great news for my team, but there is a catch and is that i don't see any speedup in the linear system, i don't know if its the MPI in the cluster we are using, but im not sure on how to debug it, On the other hand and because of this issue i was trying to do -log_summary or -log_view and i noticed the program in this context hangs when is time of producing the log, if i debug this for 2 cores, process 0 exits normally but process 1 hangs in the vectorscatterbegin() with scatter_reverse way back in the code, and even after destroying all associated objects and calling petscfinalize(), so im really clueless on why is this, as it only happens for -log_* or -ksp_view options. my -ksp_view shows this: KSP Object: 2 MPI processes type: gcr GCR: restart = 30 GCR: restarts performed = 20 maximum iterations=10000, initial guess is zero tolerances: relative=1e-14, absolute=1e-50, divergence=10000. right preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: 2 MPI processes type: bjacobi block Jacobi: number of blocks = 2 Local solve is same for all blocks, in the following KSP and PC objects: KSP Object: (sub_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (sub_) 1 MPI processes type: ilu ILU: out-of-place factorization 0 levels of fill tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 1., needed 1. 
Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=100000, cols=100000 package used to perform factorization: petsc total: nonzeros=1675180, allocated nonzeros=1675180 total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=100000, cols=100000 total: nonzeros=1675180, allocated nonzeros=1675180 total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system matrix = precond matrix: Mat Object: 2 MPI processes type: mpiaij rows=200000, cols=200000 total: nonzeros=3373340, allocated nonzeros=3373340 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines And i configured my PC object as: call PCSetType(mg,PCHYPRE,ierr) call PCHYPRESetType(mg,'boomeramg',ierr) call PetscOptionsSetValue(PETSC_NULL_OBJECT, 'pc_hypre_boomeramg_nodal_coarsen','1',ierr) call PetscOptionsSetValue(PETSC_NULL_OBJECT, 'pc_hypre_boomeramg_vec_interp_variant','1',ierr) What are your thoughts ? Thanks, Manuel On Fri, Jan 6, 2017 at 1:58 PM, Manuel Valera wrote: > Awesome, that did it, thanks once again. > > > On Fri, Jan 6, 2017 at 1:53 PM, Barry Smith wrote: > >> >> Take the scatter out of the if () since everyone does it and get rid >> of the VecView(). >> >> Does this work? If not where is it hanging? 
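[Editor's note: the two PetscOptionsSetValue calls in the PCHYPRE configuration quoted earlier in this message correspond to runtime options that could equally be passed on the command line. The BoomerAMG option names below are taken verbatim from that code; the `-pc_type`/`-pc_hypre_type` pair is the standard way to select the same preconditioner from the command line.]

```
-pc_type hypre -pc_hypre_type boomeramg
-pc_hypre_boomeramg_nodal_coarsen 1
-pc_hypre_boomeramg_vec_interp_variant 1
```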
>> >> >> > On Jan 6, 2017, at 3:29 PM, Manuel Valera >> wrote: >> > >> > Thanks Dave, >> > >> > I think is interesting it never gave an error on this, after adding the >> vecassembly calls it still shows the same behavior, without complaining, i >> did: >> > >> > if(rankl==0)then >> > >> > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >> > call VecAssemblyBegin(bp0,ierr) ; call VecAssemblyEnd(bp0,ierr); >> > CHKERRQ(ierr) >> > >> endif >> > >> > >> > call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ie >> rr) >> > call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) >> > print*,"done! " >> > CHKERRQ(ierr) >> > >> > >> > CHKERRQ(ierr) >> > >> > >> > Thanks. >> > >> > On Fri, Jan 6, 2017 at 12:44 PM, Dave May >> wrote: >> > >> > >> > On 6 January 2017 at 20:24, Manuel Valera >> wrote: >> > Great help Barry, i totally had overlooked that option (it is explicit >> in the vecscatterbegin call help page but not in vecscattercreatetozero, as >> i read later) >> > >> > So i used that and it works partially, it scatters te values assigned >> in root but not the rest, if i call vecscatterbegin from outside root it >> hangs, the code currently look as this: >> > >> > call VecScatterCreateToZero(bp2,ctr,bp0,ierr); CHKERRQ(ierr) >> > >> > call PetscObjectSetName(bp0, 'bp0:',ierr) >> > >> > if(rankl==0)then >> > >> > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >> > >> > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) >> > >> > >> > You need to call >> > >> > VecAssemblyBegin(bp0); >> > VecAssemblyEnd(bp0); >> > after your last call to VecSetValues() before you can do any operations >> with bp0. 
>> > >> > With your current code, the call to VecView should produce an error if >> you used the error checking macro CHKERRQ(ierr) (as should >> VecScatter{Begin,End} >> > >> > Thanks, >> > Dave >> > >> > >> > call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ie >> rr) >> > call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) >> > print*,"done! " >> > CHKERRQ(ierr) >> > >> > endif >> > >> > ! call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ie >> rr) >> > ! call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr >> ) >> > >> > call VecView(bp2,PETSC_VIEWER_STDOUT_WORLD,ierr) >> > >> > call PetscBarrier(PETSC_NULL_OBJECT,ierr) >> > >> > call exit() >> > >> > >> > >> > And the output is: (with bp the right answer) >> > >> > Vec Object:bp: 2 MPI processes >> > type: mpi >> > Process [0] >> > 1. >> > 2. >> > Process [1] >> > 4. >> > 3. >> > Vec Object:bp2: 2 MPI processes (before scatter) >> > type: mpi >> > Process [0] >> > 0. >> > 0. >> > Process [1] >> > 0. >> > 0. >> > Vec Object:bp0: 1 MPI processes >> > type: seq >> > 1. >> > 2. >> > 4. >> > 3. >> > done! >> > Vec Object:bp2: 2 MPI processes (after scatter) >> > type: mpi >> > Process [0] >> > 1. >> > 2. >> > Process [1] >> > 0. >> > 0. >> > >> > >> > >> > >> > Thanks inmensely for your help, >> > >> > Manuel >> > >> > >> > On Thu, Jan 5, 2017 at 4:39 PM, Barry Smith wrote: >> > >> > > On Jan 5, 2017, at 6:21 PM, Manuel Valera >> wrote: >> > > >> > > Hello Devs is me again, >> > > >> > > I'm trying to distribute a vector to all called processes, the vector >> would be originally in root as a sequential vector and i would like to >> scatter it, what would the best call to do this ? >> > > >> > > I already know how to gather a distributed vector to root with >> VecScatterCreateToZero, this would be the inverse operation, >> > >> > Use the same VecScatter object but with SCATTER_REVERSE, not you >> need to reverse the two vector arguments as well. 
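[Editor's note: putting Dave's and Barry's corrections together, the intended shape of the code is roughly the following. This is an untested Fortran sketch assembled from the fragments quoted in this thread, reusing the same variable names; the key point is that the scatter is collective, so every rank must reach it, while only rank 0 fills the full-length sequential vector.]

```fortran
call VecScatterCreateToZero(bp2, ctr, bp0, ierr); CHKERRQ(ierr)

if (rankl == 0) then
   ! only rank 0 holds the full-length sequential vector, so only it
   ! sets values; assembly completes the insertion before any use of bp0
   call VecSetValues(bp0, nbdp, ind, Rhs, INSERT_VALUES, ierr)
   call VecAssemblyBegin(bp0, ierr); call VecAssemblyEnd(bp0, ierr)
endif

! the scatter is collective: every rank calls it, with the two vector
! arguments swapped relative to the forward direction because of
! SCATTER_REVERSE
call VecScatterBegin(ctr, bp0, bp2, INSERT_VALUES, SCATTER_REVERSE, ierr)
call VecScatterEnd(ctr, bp0, bp2, INSERT_VALUES, SCATTER_REVERSE, ierr)
CHKERRQ(ierr)
```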
>> > >> > >> > > i'm currently trying with VecScatterCreate() and as of now im doing >> the following: >> > > >> > > >> > > if(rank==0)then >> > > >> > > >> > > call VecCreate(PETSC_COMM_SELF,bp0,ierr); CHKERRQ(ierr) !if i >> use WORLD >> > > !freezes >> in SetSizes >> > > call VecSetSizes(bp0,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) >> > > call VecSetType(bp0,VECSEQ,ierr) >> > > call VecSetFromOptions(bp0,ierr); CHKERRQ(ierr) >> > > >> > > >> > > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >> > > >> > > !call VecSet(bp0,5.0D0,ierr); CHKERRQ(ierr) >> > > >> > > >> > > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) >> > > >> > > call VecAssemblyBegin(bp0,ierr) ; call VecAssemblyEnd(bp0,ierr) >> !rhs >> > > >> > > do i=0,nbdp-1,1 >> > > ind(i+1) = i >> > > enddo >> > > >> > > call ISCreateGeneral(PETSC_COMM_SELF,nbdp,ind,PETSC_COPY_VALUES, >> locis,ierr) >> > > >> > > !call VecScatterCreate(bp0,PETSC_NULL_OBJECT,bp2,is,ctr,ierr) >> !if i use SELF >> > > >> !freezes here. >> > > >> > > call VecScatterCreate(bp0,locis,bp2,PETSC_NULL_OBJECT,ctr,ierr) >> > > >> > > endif >> > > >> > > bp2 being the receptor MPI vector to scatter to >> > > >> > > But it freezes in VecScatterCreate when trying to use more than one >> processor, what would be a better approach ? 
>> > > >> > > >> > > Thanks once again, >> > > >> > > Manuel >> > > >> > > >> > > >> > > >> > > >> > > >> > > >> > > >> > > >> > > >> > > On Wed, Jan 4, 2017 at 3:30 PM, Manuel Valera >> wrote: >> > > Thanks i had no idea how to debug and read those logs, that solved >> this issue at least (i was sending a message from root to everyone else, >> but trying to catch from everyone else including root) >> > > >> > > Until next time, many thanks, >> > > >> > > Manuel >> > > >> > > On Wed, Jan 4, 2017 at 3:23 PM, Matthew Knepley >> wrote: >> > > On Wed, Jan 4, 2017 at 5:21 PM, Manuel Valera >> wrote: >> > > I did a PetscBarrier just before calling the vicariate routine and im >> pretty sure im calling it from every processor, code looks like this: >> > > >> > > From the gdb trace. >> > > >> > > Proc 0: Is in some MPI routine you call yourself, line 113 >> > > >> > > Proc 1: Is in VecCreate(), line 130 >> > > >> > > You need to fix your communication code. >> > > >> > > Matt >> > > >> > > call PetscBarrier(PETSC_NULL_OBJECT,ierr) >> > > >> > > print*,'entering POInit from',rank >> > > !call exit() >> > > >> > > call PetscObjsInit() >> > > >> > > >> > > And output gives: >> > > >> > > entering POInit from 0 >> > > entering POInit from 1 >> > > entering POInit from 2 >> > > entering POInit from 3 >> > > >> > > >> > > Still hangs in the same way, >> > > >> > > Thanks, >> > > >> > > Manuel >> > > >> > > >> > > >> > > On Wed, Jan 4, 2017 at 2:55 PM, Manuel Valera >> wrote: >> > > Thanks for the answers ! >> > > >> > > heres the screenshot of what i got from bt in gdb (great hint in how >> to debug in petsc, didn't know that) >> > > >> > > I don't really know what to look at here, >> > > >> > > Thanks, >> > > >> > > Manuel >> > > >> > > On Wed, Jan 4, 2017 at 2:39 PM, Dave May >> wrote: >> > > Are you certain ALL ranks in PETSC_COMM_WORLD call these function(s). >> These functions cannot be inside if statements like >> > > if (rank == 0){ >> > > VecCreateMPI(...) 
>> > > } >> > > >> > > >> > > On Wed, 4 Jan 2017 at 23:34, Manuel Valera >> wrote: >> > > Thanks Dave for the quick answer, appreciate it, >> > > >> > > I just tried that and it didn't make a difference, any other >> suggestions ? >> > > >> > > Thanks, >> > > Manuel >> > > >> > > On Wed, Jan 4, 2017 at 2:29 PM, Dave May >> wrote: >> > > You need to swap the order of your function calls. >> > > Call VecSetSizes() before VecSetType() >> > > >> > > Thanks, >> > > Dave >> > > >> > > >> > > On Wed, 4 Jan 2017 at 23:21, Manuel Valera >> wrote: >> > > Hello all, happy new year, >> > > >> > > I'm working on parallelizing my code, it worked and provided some >> results when i just called more than one processor, but created artifacts >> because i didn't need one image of the whole program in each processor, >> conflicting with each other. >> > > >> > > Since the pressure solver is the main part i need in parallel im >> chosing mpi to run everything in root processor until its time to solve for >> pressure, at this point im trying to create a distributed vector using >> either >> > > >> > > call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr) >> > > or >> > > call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) >> > > call VecSetType(xp,VECMPI,ierr) >> > > call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) >> > > >> > > >> > > In both cases program hangs at this point, something it never >> happened on the naive way i described before. I've made sure the global >> size, nbdp, is the same in every processor. What can be wrong? >> > > >> > > Thanks for your kind help, >> > > >> > > Manuel. >> > > >> > > >> > > >> > > >> > > >> > > >> > > >> > > >> > > >> > > >> > > >> > > >> > > -- >> > > What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. 
>> > > -- Norbert Wiener >> > > >> > > >> > >> > >> > >> > >> >> > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sat Jan 7 15:49:24 2017 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 7 Jan 2017 15:49:24 -0600 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: References: Message-ID: On Sat, Jan 7, 2017 at 3:32 PM, Manuel Valera wrote: > Hi Devs, hope you are having a great weekend, > > I could finally parallelize my linear solver and implement it into the > rest of the code in a way that only the linear system is solved in > parallel, great news for my team, but there is a catch and is that i don't > see any speedup in the linear system, i don't know if its the MPI in the > cluster we are using, but im not sure on how to debug it, > We need to see -log_view output for any performance question. > On the other hand and because of this issue i was trying to do > -log_summary or -log_view and i noticed the program in this context hangs > when is time of producing the log, if i debug this for 2 cores, process 0 > exits normally but process 1 hangs in the vectorscatterbegin() with > scatter_reverse way back in the code, > You are calling a collective routine from only 1 process. Matt > and even after destroying all associated objects and calling > petscfinalize(), so im really clueless on why is this, as it only happens > for -log_* or -ksp_view options. > > my -ksp_view shows this: > > KSP Object: 2 MPI processes > > type: gcr > > GCR: restart = 30 > > GCR: restarts performed = 20 > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-14, absolute=1e-50, divergence=10000. 
> > right preconditioning > > using UNPRECONDITIONED norm type for convergence test > > PC Object: 2 MPI processes > > type: bjacobi > > block Jacobi: number of blocks = 2 > > Local solve is same for all blocks, in the following KSP and PC > objects: > > KSP Object: (sub_) 1 MPI processes > > type: preonly > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > left preconditioning > > using NONE norm type for convergence test > > PC Object: (sub_) 1 MPI processes > > type: ilu > > ILU: out-of-place factorization > > 0 levels of fill > > tolerance for zero pivot 2.22045e-14 > > matrix ordering: natural > > factor fill ratio given 1., needed 1. > > Factored matrix follows: > > Mat Object: 1 MPI processes > > type: seqaij > > rows=100000, cols=100000 > > package used to perform factorization: petsc > > total: nonzeros=1675180, allocated nonzeros=1675180 > > total number of mallocs used during MatSetValues calls =0 > > not using I-node routines > > linear system matrix = precond matrix: > > Mat Object: 1 MPI processes > > type: seqaij > > rows=100000, cols=100000 > > total: nonzeros=1675180, allocated nonzeros=1675180 > > total number of mallocs used during MatSetValues calls =0 > > not using I-node routines > > linear system matrix = precond matrix: > > Mat Object: 2 MPI processes > > type: mpiaij > > rows=200000, cols=200000 > > total: nonzeros=3373340, allocated nonzeros=3373340 > > total number of mallocs used during MatSetValues calls =0 > > not using I-node (on process 0) routines > > > > And i configured my PC object as: > > > call PCSetType(mg,PCHYPRE,ierr) > > call PCHYPRESetType(mg,'boomeramg',ierr) > > > call PetscOptionsSetValue(PETSC_NULL_OBJECT,'pc_hypre_ > boomeramg_nodal_coarsen','1',ierr) > > call PetscOptionsSetValue(PETSC_NULL_OBJECT,'pc_hypre_ > boomeramg_vec_interp_variant','1',ierr) > > > > What are your thoughts ? 
> > Thanks, > > Manuel > > > > On Fri, Jan 6, 2017 at 1:58 PM, Manuel Valera > wrote: > >> Awesome, that did it, thanks once again. >> >> >> On Fri, Jan 6, 2017 at 1:53 PM, Barry Smith wrote: >> >>> >>> Take the scatter out of the if () since everyone does it and get rid >>> of the VecView(). >>> >>> Does this work? If not where is it hanging? >>> >>> >>> > On Jan 6, 2017, at 3:29 PM, Manuel Valera >>> wrote: >>> > >>> > Thanks Dave, >>> > >>> > I think is interesting it never gave an error on this, after adding >>> the vecassembly calls it still shows the same behavior, without >>> complaining, i did: >>> > >>> > if(rankl==0)then >>> > >>> > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>> > call VecAssemblyBegin(bp0,ierr) ; call VecAssemblyEnd(bp0,ierr); >>> > CHKERRQ(ierr) >>> > >>> endif >>> > >>> > >>> > call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ie >>> rr) >>> > call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr >>> ) >>> > print*,"done! " >>> > CHKERRQ(ierr) >>> > >>> > >>> > CHKERRQ(ierr) >>> > >>> > >>> > Thanks. 
>>> > >>> > On Fri, Jan 6, 2017 at 12:44 PM, Dave May >>> wrote: >>> > >>> > >>> > On 6 January 2017 at 20:24, Manuel Valera >>> wrote: >>> > Great help Barry, i totally had overlooked that option (it is explicit >>> in the vecscatterbegin call help page but not in vecscattercreatetozero, as >>> i read later) >>> > >>> > So i used that and it works partially, it scatters te values assigned >>> in root but not the rest, if i call vecscatterbegin from outside root it >>> hangs, the code currently look as this: >>> > >>> > call VecScatterCreateToZero(bp2,ctr,bp0,ierr); CHKERRQ(ierr) >>> > >>> > call PetscObjectSetName(bp0, 'bp0:',ierr) >>> > >>> > if(rankl==0)then >>> > >>> > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>> > >>> > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) >>> > >>> > >>> > You need to call >>> > >>> > VecAssemblyBegin(bp0); >>> > VecAssemblyEnd(bp0); >>> > after your last call to VecSetValues() before you can do any >>> operations with bp0. >>> > >>> > With your current code, the call to VecView should produce an error if >>> you used the error checking macro CHKERRQ(ierr) (as should >>> VecScatter{Begin,End} >>> > >>> > Thanks, >>> > Dave >>> > >>> > >>> > call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ie >>> rr) >>> > call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr >>> ) >>> > print*,"done! " >>> > CHKERRQ(ierr) >>> > >>> > endif >>> > >>> > ! call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ie >>> rr) >>> > ! call VecScatterEnd(ctr,bp0,bp2,INSE >>> RT_VALUES,SCATTER_REVERSE,ierr) >>> > >>> > call VecView(bp2,PETSC_VIEWER_STDOUT_WORLD,ierr) >>> > >>> > call PetscBarrier(PETSC_NULL_OBJECT,ierr) >>> > >>> > call exit() >>> > >>> > >>> > >>> > And the output is: (with bp the right answer) >>> > >>> > Vec Object:bp: 2 MPI processes >>> > type: mpi >>> > Process [0] >>> > 1. >>> > 2. >>> > Process [1] >>> > 4. >>> > 3. 
>>> > Vec Object:bp2: 2 MPI processes (before scatter) >>> > type: mpi >>> > Process [0] >>> > 0. >>> > 0. >>> > Process [1] >>> > 0. >>> > 0. >>> > Vec Object:bp0: 1 MPI processes >>> > type: seq >>> > 1. >>> > 2. >>> > 4. >>> > 3. >>> > done! >>> > Vec Object:bp2: 2 MPI processes (after scatter) >>> > type: mpi >>> > Process [0] >>> > 1. >>> > 2. >>> > Process [1] >>> > 0. >>> > 0. >>> > >>> > >>> > >>> > >>> > Thanks inmensely for your help, >>> > >>> > Manuel >>> > >>> > >>> > On Thu, Jan 5, 2017 at 4:39 PM, Barry Smith >>> wrote: >>> > >>> > > On Jan 5, 2017, at 6:21 PM, Manuel Valera >>> wrote: >>> > > >>> > > Hello Devs is me again, >>> > > >>> > > I'm trying to distribute a vector to all called processes, the >>> vector would be originally in root as a sequential vector and i would like >>> to scatter it, what would the best call to do this ? >>> > > >>> > > I already know how to gather a distributed vector to root with >>> VecScatterCreateToZero, this would be the inverse operation, >>> > >>> > Use the same VecScatter object but with SCATTER_REVERSE, not you >>> need to reverse the two vector arguments as well. 
>>> > >>> > >>> > > i'm currently trying with VecScatterCreate() and as of now im doing >>> the following: >>> > > >>> > > >>> > > if(rank==0)then >>> > > >>> > > >>> > > call VecCreate(PETSC_COMM_SELF,bp0,ierr); CHKERRQ(ierr) !if i >>> use WORLD >>> > > >>> !freezes in SetSizes >>> > > call VecSetSizes(bp0,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) >>> > > call VecSetType(bp0,VECSEQ,ierr) >>> > > call VecSetFromOptions(bp0,ierr); CHKERRQ(ierr) >>> > > >>> > > >>> > > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>> > > >>> > > !call VecSet(bp0,5.0D0,ierr); CHKERRQ(ierr) >>> > > >>> > > >>> > > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) >>> > > >>> > > call VecAssemblyBegin(bp0,ierr) ; call VecAssemblyEnd(bp0,ierr) >>> !rhs >>> > > >>> > > do i=0,nbdp-1,1 >>> > > ind(i+1) = i >>> > > enddo >>> > > >>> > > call ISCreateGeneral(PETSC_COMM_SEL >>> F,nbdp,ind,PETSC_COPY_VALUES,locis,ierr) >>> > > >>> > > !call VecScatterCreate(bp0,PETSC_NULL_OBJECT,bp2,is,ctr,ierr) >>> !if i use SELF >>> > > >>> !freezes here. >>> > > >>> > > call VecScatterCreate(bp0,locis,bp2,PETSC_NULL_OBJECT,ctr,ierr) >>> > > >>> > > endif >>> > > >>> > > bp2 being the receptor MPI vector to scatter to >>> > > >>> > > But it freezes in VecScatterCreate when trying to use more than one >>> processor, what would be a better approach ? 
>>> > > >>> > > >>> > > Thanks once again, >>> > > >>> > > Manuel >>> > > >>> > > >>> > > >>> > > >>> > > >>> > > >>> > > >>> > > >>> > > >>> > > >>> > > On Wed, Jan 4, 2017 at 3:30 PM, Manuel Valera >>> wrote: >>> > > Thanks i had no idea how to debug and read those logs, that solved >>> this issue at least (i was sending a message from root to everyone else, >>> but trying to catch from everyone else including root) >>> > > >>> > > Until next time, many thanks, >>> > > >>> > > Manuel >>> > > >>> > > On Wed, Jan 4, 2017 at 3:23 PM, Matthew Knepley >>> wrote: >>> > > On Wed, Jan 4, 2017 at 5:21 PM, Manuel Valera >>> wrote: >>> > > I did a PetscBarrier just before calling the vicariate routine and >>> im pretty sure im calling it from every processor, code looks like this: >>> > > >>> > > From the gdb trace. >>> > > >>> > > Proc 0: Is in some MPI routine you call yourself, line 113 >>> > > >>> > > Proc 1: Is in VecCreate(), line 130 >>> > > >>> > > You need to fix your communication code. >>> > > >>> > > Matt >>> > > >>> > > call PetscBarrier(PETSC_NULL_OBJECT,ierr) >>> > > >>> > > print*,'entering POInit from',rank >>> > > !call exit() >>> > > >>> > > call PetscObjsInit() >>> > > >>> > > >>> > > And output gives: >>> > > >>> > > entering POInit from 0 >>> > > entering POInit from 1 >>> > > entering POInit from 2 >>> > > entering POInit from 3 >>> > > >>> > > >>> > > Still hangs in the same way, >>> > > >>> > > Thanks, >>> > > >>> > > Manuel >>> > > >>> > > >>> > > >>> > > On Wed, Jan 4, 2017 at 2:55 PM, Manuel Valera >>> wrote: >>> > > Thanks for the answers ! >>> > > >>> > > heres the screenshot of what i got from bt in gdb (great hint in how >>> to debug in petsc, didn't know that) >>> > > >>> > > I don't really know what to look at here, >>> > > >>> > > Thanks, >>> > > >>> > > Manuel >>> > > >>> > > On Wed, Jan 4, 2017 at 2:39 PM, Dave May >>> wrote: >>> > > Are you certain ALL ranks in PETSC_COMM_WORLD call these >>> function(s). 
These functions cannot be inside if statements like >>> > > if (rank == 0){ >>> > > VecCreateMPI(...) >>> > > } >>> > > >>> > > >>> > > On Wed, 4 Jan 2017 at 23:34, Manuel Valera >>> wrote: >>> > > Thanks Dave for the quick answer, appreciate it, >>> > > >>> > > I just tried that and it didn't make a difference, any other >>> suggestions ? >>> > > >>> > > Thanks, >>> > > Manuel >>> > > >>> > > On Wed, Jan 4, 2017 at 2:29 PM, Dave May >>> wrote: >>> > > You need to swap the order of your function calls. >>> > > Call VecSetSizes() before VecSetType() >>> > > >>> > > Thanks, >>> > > Dave >>> > > >>> > > >>> > > On Wed, 4 Jan 2017 at 23:21, Manuel Valera >>> wrote: >>> > > Hello all, happy new year, >>> > > >>> > > I'm working on parallelizing my code, it worked and provided some >>> results when i just called more than one processor, but created artifacts >>> because i didn't need one image of the whole program in each processor, >>> conflicting with each other. >>> > > >>> > > Since the pressure solver is the main part i need in parallel im >>> chosing mpi to run everything in root processor until its time to solve for >>> pressure, at this point im trying to create a distributed vector using >>> either >>> > > >>> > > call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr) >>> > > or >>> > > call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) >>> > > call VecSetType(xp,VECMPI,ierr) >>> > > call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) >>> > > >>> > > >>> > > In both cases program hangs at this point, something it never >>> happened on the naive way i described before. I've made sure the global >>> size, nbdp, is the same in every processor. What can be wrong? >>> > > >>> > > Thanks for your kind help, >>> > > >>> > > Manuel. 
>>> > > >>> > > >>> > > >>> > > >>> > > >>> > > >>> > > >>> > > >>> > > >>> > > >>> > > >>> > > >>> > > -- >>> > > What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> > > -- Norbert Wiener >>> > > >>> > > >>> > >>> > >>> > >>> > >>> >>> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From mvalera at mail.sdsu.edu Sat Jan 7 16:20:53 2017 From: mvalera at mail.sdsu.edu (Manuel Valera) Date: Sat, 7 Jan 2017 14:20:53 -0800 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: References: Message-ID: Thank you Matthew, On Sat, Jan 7, 2017 at 1:49 PM, Matthew Knepley wrote: > On Sat, Jan 7, 2017 at 3:32 PM, Manuel Valera > wrote: > >> Hi Devs, hope you are having a great weekend, >> >> I could finally parallelize my linear solver and implement it into the >> rest of the code in a way that only the linear system is solved in >> parallel, great news for my team, but there is a catch and is that i don't >> see any speedup in the linear system, i don't know if its the MPI in the >> cluster we are using, but im not sure on how to debug it, >> > > We need to see -log_view output for any performance question. > > >> On the other hand and because of this issue i was trying to do >> -log_summary or -log_view and i noticed the program in this context hangs >> when is time of producing the log, if i debug this for 2 cores, process 0 >> exits normally but process 1 hangs in the vectorscatterbegin() with >> scatter_reverse way back in the code, >> > > You are calling a collective routine from only 1 process. 
> > Matt > I am pretty confident this is not the case, the callings to vecscattercreatetozero and vecscatterbegin are made in all processes, the program goes thru all of the iterations on the linear solver, writes output correctly and even closes all the petsc objects without complaining, the freeze occurs at the very end when the log is to be produced. Thanks, Manuel > > >> and even after destroying all associated objects and calling >> petscfinalize(), so im really clueless on why is this, as it only happens >> for -log_* or -ksp_view options. >> >> my -ksp_view shows this: >> >> KSP Object: 2 MPI processes >> >> type: gcr >> >> GCR: restart = 30 >> >> GCR: restarts performed = 20 >> >> maximum iterations=10000, initial guess is zero >> >> tolerances: relative=1e-14, absolute=1e-50, divergence=10000. >> >> right preconditioning >> >> using UNPRECONDITIONED norm type for convergence test >> >> PC Object: 2 MPI processes >> >> type: bjacobi >> >> block Jacobi: number of blocks = 2 >> >> Local solve is same for all blocks, in the following KSP and PC >> objects: >> >> KSP Object: (sub_) 1 MPI processes >> >> type: preonly >> >> maximum iterations=10000, initial guess is zero >> >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000. >> >> left preconditioning >> >> using NONE norm type for convergence test >> >> PC Object: (sub_) 1 MPI processes >> >> type: ilu >> >> ILU: out-of-place factorization >> >> 0 levels of fill >> >> tolerance for zero pivot 2.22045e-14 >> >> matrix ordering: natural >> >> factor fill ratio given 1., needed 1. 
>> >> Factored matrix follows: >> >> Mat Object: 1 MPI processes >> >> type: seqaij >> >> rows=100000, cols=100000 >> >> package used to perform factorization: petsc >> >> total: nonzeros=1675180, allocated nonzeros=1675180 >> >> total number of mallocs used during MatSetValues calls =0 >> >> not using I-node routines >> >> linear system matrix = precond matrix: >> >> Mat Object: 1 MPI processes >> >> type: seqaij >> >> rows=100000, cols=100000 >> >> total: nonzeros=1675180, allocated nonzeros=1675180 >> >> total number of mallocs used during MatSetValues calls =0 >> >> not using I-node routines >> >> linear system matrix = precond matrix: >> >> Mat Object: 2 MPI processes >> >> type: mpiaij >> >> rows=200000, cols=200000 >> >> total: nonzeros=3373340, allocated nonzeros=3373340 >> >> total number of mallocs used during MatSetValues calls =0 >> >> not using I-node (on process 0) routines >> >> >> >> And i configured my PC object as: >> >> >> call PCSetType(mg,PCHYPRE,ierr) >> >> call PCHYPRESetType(mg,'boomeramg',ierr) >> >> >> call PetscOptionsSetValue(PETSC_NULL_OBJECT,'pc_hypre_boomeramg_ >> nodal_coarsen','1',ierr) >> >> call PetscOptionsSetValue(PETSC_NULL_OBJECT,'pc_hypre_boomeramg_ >> vec_interp_variant','1',ierr) >> >> >> >> What are your thoughts ? >> >> Thanks, >> >> Manuel >> >> >> >> On Fri, Jan 6, 2017 at 1:58 PM, Manuel Valera >> wrote: >> >>> Awesome, that did it, thanks once again. >>> >>> >>> On Fri, Jan 6, 2017 at 1:53 PM, Barry Smith wrote: >>> >>>> >>>> Take the scatter out of the if () since everyone does it and get rid >>>> of the VecView(). >>>> >>>> Does this work? If not where is it hanging? 
>>>> >>>> >>>> > On Jan 6, 2017, at 3:29 PM, Manuel Valera >>>> wrote: >>>> > >>>> > Thanks Dave, >>>> > >>>> > I think is interesting it never gave an error on this, after adding >>>> the vecassembly calls it still shows the same behavior, without >>>> complaining, i did: >>>> > >>>> > if(rankl==0)then >>>> > >>>> > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>>> > call VecAssemblyBegin(bp0,ierr) ; call VecAssemblyEnd(bp0,ierr); >>>> > CHKERRQ(ierr) >>>> > >>>> endif >>>> > >>>> > >>>> > call VecScatterBegin(ctr,bp0,bp2,IN >>>> SERT_VALUES,SCATTER_REVERSE,ierr) >>>> > call VecScatterEnd(ctr,bp0,bp2,INSE >>>> RT_VALUES,SCATTER_REVERSE,ierr) >>>> > print*,"done! " >>>> > CHKERRQ(ierr) >>>> > >>>> > >>>> > CHKERRQ(ierr) >>>> > >>>> > >>>> > Thanks. >>>> > >>>> > On Fri, Jan 6, 2017 at 12:44 PM, Dave May >>>> wrote: >>>> > >>>> > >>>> > On 6 January 2017 at 20:24, Manuel Valera >>>> wrote: >>>> > Great help Barry, i totally had overlooked that option (it is >>>> explicit in the vecscatterbegin call help page but not in >>>> vecscattercreatetozero, as i read later) >>>> > >>>> > So i used that and it works partially, it scatters te values assigned >>>> in root but not the rest, if i call vecscatterbegin from outside root it >>>> hangs, the code currently look as this: >>>> > >>>> > call VecScatterCreateToZero(bp2,ctr,bp0,ierr); CHKERRQ(ierr) >>>> > >>>> > call PetscObjectSetName(bp0, 'bp0:',ierr) >>>> > >>>> > if(rankl==0)then >>>> > >>>> > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>>> > >>>> > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) >>>> > >>>> > >>>> > You need to call >>>> > >>>> > VecAssemblyBegin(bp0); >>>> > VecAssemblyEnd(bp0); >>>> > after your last call to VecSetValues() before you can do any >>>> operations with bp0. 
>>>> > >>>> > With your current code, the call to VecView should produce an error >>>> if you used the error checking macro CHKERRQ(ierr) (as should >>>> VecScatter{Begin,End} >>>> > >>>> > Thanks, >>>> > Dave >>>> > >>>> > >>>> > call VecScatterBegin(ctr,bp0,bp2,IN >>>> SERT_VALUES,SCATTER_REVERSE,ierr) >>>> > call VecScatterEnd(ctr,bp0,bp2,INSE >>>> RT_VALUES,SCATTER_REVERSE,ierr) >>>> > print*,"done! " >>>> > CHKERRQ(ierr) >>>> > >>>> > endif >>>> > >>>> > ! call VecScatterBegin(ctr,bp0,bp2,IN >>>> SERT_VALUES,SCATTER_REVERSE,ierr) >>>> > ! call VecScatterEnd(ctr,bp0,bp2,INSE >>>> RT_VALUES,SCATTER_REVERSE,ierr) >>>> > >>>> > call VecView(bp2,PETSC_VIEWER_STDOUT_WORLD,ierr) >>>> > >>>> > call PetscBarrier(PETSC_NULL_OBJECT,ierr) >>>> > >>>> > call exit() >>>> > >>>> > >>>> > >>>> > And the output is: (with bp the right answer) >>>> > >>>> > Vec Object:bp: 2 MPI processes >>>> > type: mpi >>>> > Process [0] >>>> > 1. >>>> > 2. >>>> > Process [1] >>>> > 4. >>>> > 3. >>>> > Vec Object:bp2: 2 MPI processes (before scatter) >>>> > type: mpi >>>> > Process [0] >>>> > 0. >>>> > 0. >>>> > Process [1] >>>> > 0. >>>> > 0. >>>> > Vec Object:bp0: 1 MPI processes >>>> > type: seq >>>> > 1. >>>> > 2. >>>> > 4. >>>> > 3. >>>> > done! >>>> > Vec Object:bp2: 2 MPI processes (after scatter) >>>> > type: mpi >>>> > Process [0] >>>> > 1. >>>> > 2. >>>> > Process [1] >>>> > 0. >>>> > 0. >>>> > >>>> > >>>> > >>>> > >>>> > Thanks inmensely for your help, >>>> > >>>> > Manuel >>>> > >>>> > >>>> > On Thu, Jan 5, 2017 at 4:39 PM, Barry Smith >>>> wrote: >>>> > >>>> > > On Jan 5, 2017, at 6:21 PM, Manuel Valera >>>> wrote: >>>> > > >>>> > > Hello Devs is me again, >>>> > > >>>> > > I'm trying to distribute a vector to all called processes, the >>>> vector would be originally in root as a sequential vector and i would like >>>> to scatter it, what would the best call to do this ? 
>>>> > > >>>> > > I already know how to gather a distributed vector to root with >>>> VecScatterCreateToZero, this would be the inverse operation, >>>> > >>>> > Use the same VecScatter object but with SCATTER_REVERSE; note you >>>> need to reverse the two vector arguments as well. >>>> > >>>> > >>>> > > i'm currently trying with VecScatterCreate() and as of now im doing >>>> the following: >>>> > > >>>> > > >>>> > > if(rank==0)then >>>> > > >>>> > > >>>> > > call VecCreate(PETSC_COMM_SELF,bp0,ierr); CHKERRQ(ierr) !if i >>>> use WORLD >>>> > > >>>> !freezes in SetSizes >>>> > > call VecSetSizes(bp0,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) >>>> > > call VecSetType(bp0,VECSEQ,ierr) >>>> > > call VecSetFromOptions(bp0,ierr); CHKERRQ(ierr) >>>> > > >>>> > > >>>> > > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>>> > > >>>> > > !call VecSet(bp0,5.0D0,ierr); CHKERRQ(ierr) >>>> > > >>>> > > >>>> > > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) >>>> > > >>>> > > call VecAssemblyBegin(bp0,ierr) ; call VecAssemblyEnd(bp0,ierr) !rhs >>>> > > >>>> > > do i=0,nbdp-1,1 >>>> > > ind(i+1) = i >>>> > > enddo >>>> > > >>>> > > call ISCreateGeneral(PETSC_COMM_SELF,nbdp,ind,PETSC_COPY_VALUES,locis,ierr) >>>> > > >>>> > > !call VecScatterCreate(bp0,PETSC_NULL_OBJECT,bp2,is,ctr,ierr) >>>> !if i use SELF >>>> > > >>>> !freezes here. >>>> > > >>>> > > call VecScatterCreate(bp0,locis,bp2,PETSC_NULL_OBJECT,ctr,ierr) >>>> > > >>>> > > endif >>>> > > >>>> > > bp2 being the receptor MPI vector to scatter to >>>> > > But it freezes in VecScatterCreate when trying to use more than one >>>> processor, what would be a better approach ?
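Combining Barry's SCATTER_REVERSE suggestion with the collective-call rule from earlier in the thread, the root-to-all broadcast could look like the sketch below. This is a hypothetical fragment built from the names used in the thread (bp0, bp2, ctr, nbdp, ind, Rhs), not the model's actual code; only the fill of bp0 is rank-local, while the scatter creation and the scatter itself are collective:

```fortran
! Collective: every rank participates in creating the scatter context ctr
! and the sequential vector bp0 (length nbdp on rank 0, length 0 elsewhere).
call VecScatterCreateToZero(bp2,ctr,bp0,ierr); CHKERRQ(ierr)

if (rank == 0) then
   ! Filling and assembling bp0 is local to rank 0, so it may stay
   ! inside the rank test.
   call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr); CHKERRQ(ierr)
   call VecAssemblyBegin(bp0,ierr); CHKERRQ(ierr)
   call VecAssemblyEnd(bp0,ierr); CHKERRQ(ierr)
end if

! Collective: the reverse scatter (rank 0 -> distributed bp2) must be
! called by every rank, with the vector arguments swapped relative to
! the forward gather.
call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr); CHKERRQ(ierr)
call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr); CHKERRQ(ierr)
```

This mirrors the version that eventually worked later in the thread ("take the scatter out of the if"), with the assembly placed as Dave advised.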
>>>> > > >>>> > > >>>> > > Thanks once again, >>>> > > >>>> > > Manuel >>>> > > >>>> >>> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed...
URL: From knepley at gmail.com Sat Jan 7 16:24:47 2017 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 7 Jan 2017 16:24:47 -0600 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: References: Message-ID: On Sat, Jan 7, 2017 at 4:20 PM, Manuel Valera wrote: > Thank you Matthew, > > On Sat, Jan 7, 2017 at 1:49 PM, Matthew Knepley wrote: > >> On Sat, Jan 7, 2017 at 3:32 PM, Manuel Valera >> wrote: >> >>> Hi Devs, hope you are having a great weekend, >>> >>> I could finally parallelize my linear solver and implement it into the >>> rest of the code in a way that only the linear system is solved in >>> parallel, great news for my team, but there is a catch and is that i don't >>> see any speedup in the linear system, i don't know if its the MPI in the >>> cluster we are using, but im not sure on how to debug it, >>> >> >> We need to see -log_view output for any performance question. >> >> >>> On the other hand and because of this issue i was trying to do >>> -log_summary or -log_view and i noticed the program in this context hangs >>> when is time of producing the log, if i debug this for 2 cores, process 0 >>> exits normally but process 1 hangs in the vectorscatterbegin() with >>> scatter_reverse way back in the code, >>> >> >> You are calling a collective routine from only 1 process. >> >> > Matt >> > > I am pretty confident this is not the case, > This is still the simplest explanation. Can you send the stack trace for the 2 process run? > the callings to vecscattercreatetozero and vecscatterbegin are made in all > processes, the program goes thru all of the iterations on the linear > solver, writes output correctly and even closes all the petsc objects > without complaining, the freeze occurs at the very end when the log is to > be produced. > If you can send us a code to run, we can likely find the error. 
Thanks, Matt > Thanks, > > Manuel -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed...
URL: From mvalera at mail.sdsu.edu Sat Jan 7 16:59:22 2017 From: mvalera at mail.sdsu.edu (Manuel Valera) Date: Sat, 7 Jan 2017 14:59:22 -0800 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: References: Message-ID: I would have to think and code a MWE for this problem before sending it since the model is much bigger than the petsc solver. Attached here is a screenshot of the debugger as barry taught me, is that the stack trace you need ? the ucmsMain.f90:522 that shows is the call (from all processes) to the routine that updates the rhs vector (from root) and scatters it (from all processes). This routine is itself inside a double loop that occurs in all processes but the only call from all processes to the solver is this one, the rest of the loop which involves correcting for velocities, pressure and temperature, all happens in root node. Sorry for the convoluted program design, this is the first beta version of the model working on parallel and was the best i could come with, i suppose it makes more sense in serial, Thanks On Sat, Jan 7, 2017 at 2:24 PM, Matthew Knepley wrote: > On Sat, Jan 7, 2017 at 4:20 PM, Manuel Valera > wrote: > >> Thank you Matthew, >> >> On Sat, Jan 7, 2017 at 1:49 PM, Matthew Knepley >> wrote: >> >>> On Sat, Jan 7, 2017 at 3:32 PM, Manuel Valera >>> wrote: >>> >>>> Hi Devs, hope you are having a great weekend, >>>> >>>> I could finally parallelize my linear solver and implement it into the >>>> rest of the code in a way that only the linear system is solved in >>>> parallel, great news for my team, but there is a catch and is that i don't >>>> see any speedup in the linear system, i don't know if its the MPI in the >>>> cluster we are using, but im not sure on how to debug it, >>>> >>> >>> We need to see -log_view output for any performance question. 
>>> >>> >>>> On the other hand and because of this issue i was trying to do >>>> -log_summary or -log_view and i noticed the program in this context hangs >>>> when is time of producing the log, if i debug this for 2 cores, process 0 >>>> exits normally but process 1 hangs in the vectorscatterbegin() with >>>> scatter_reverse way back in the code, >>>> >>> >>> You are calling a collective routine from only 1 process. >>> >>> >> Matt >>> >> >> I am pretty confident this is not the case, >> > > This is still the simplest explanation. Can you send the stack trace for > the 2 process run? > > >> the callings to vecscattercreatetozero and vecscatterbegin are made in >> all processes, the program goes thru all of the iterations on the linear >> solver, writes output correctly and even closes all the petsc objects >> without complaining, the freeze occurs at the very end when the log is to >> be produced. >> > > If you can send us a code to run, we can likely find the error. > > Thanks, > > Matt > > >> Thanks, >> >> Manuel >> >> >> >>> >>> >>>> and even after destroying all associated objects and calling >>>> petscfinalize(), so im really clueless on why is this, as it only happens >>>> for -log_* or -ksp_view options. >>>> >>>> my -ksp_view shows this: >>>> >>>> KSP Object: 2 MPI processes >>>> >>>> type: gcr >>>> >>>> GCR: restart = 30 >>>> >>>> GCR: restarts performed = 20 >>>> >>>> maximum iterations=10000, initial guess is zero >>>> >>>> tolerances: relative=1e-14, absolute=1e-50, divergence=10000. 
>>>> >>>> right preconditioning >>>> >>>> using UNPRECONDITIONED norm type for convergence test >>>> >>>> PC Object: 2 MPI processes >>>> >>>> type: bjacobi >>>> >>>> block Jacobi: number of blocks = 2 >>>> >>>> Local solve is same for all blocks, in the following KSP and PC >>>> objects: >>>> >>>> KSP Object: (sub_) 1 MPI processes >>>> >>>> type: preonly >>>> >>>> maximum iterations=10000, initial guess is zero >>>> >>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000. >>>> >>>> left preconditioning >>>> >>>> using NONE norm type for convergence test >>>> >>>> PC Object: (sub_) 1 MPI processes >>>> >>>> type: ilu >>>> >>>> ILU: out-of-place factorization >>>> >>>> 0 levels of fill >>>> >>>> tolerance for zero pivot 2.22045e-14 >>>> >>>> matrix ordering: natural >>>> >>>> factor fill ratio given 1., needed 1. >>>> >>>> Factored matrix follows: >>>> >>>> Mat Object: 1 MPI processes >>>> >>>> type: seqaij >>>> >>>> rows=100000, cols=100000 >>>> >>>> package used to perform factorization: petsc >>>> >>>> total: nonzeros=1675180, allocated nonzeros=1675180 >>>> >>>> total number of mallocs used during MatSetValues calls =0 >>>> >>>> not using I-node routines >>>> >>>> linear system matrix = precond matrix: >>>> >>>> Mat Object: 1 MPI processes >>>> >>>> type: seqaij >>>> >>>> rows=100000, cols=100000 >>>> >>>> total: nonzeros=1675180, allocated nonzeros=1675180 >>>> >>>> total number of mallocs used during MatSetValues calls =0 >>>> >>>> not using I-node routines >>>> >>>> linear system matrix = precond matrix: >>>> >>>> Mat Object: 2 MPI processes >>>> >>>> type: mpiaij >>>> >>>> rows=200000, cols=200000 >>>> >>>> total: nonzeros=3373340, allocated nonzeros=3373340 >>>> >>>> total number of mallocs used during MatSetValues calls =0 >>>> >>>> not using I-node (on process 0) routines >>>> >>>> >>>> >>>> And i configured my PC object as: >>>> >>>> >>>> call PCSetType(mg,PCHYPRE,ierr) >>>> >>>> call PCHYPRESetType(mg,'boomeramg',ierr) >>>> >>>> >>>> call 
PetscOptionsSetValue(PETSC_NULL_OBJECT,'pc_hypre_boomeramg_n >>>> odal_coarsen','1',ierr) >>>> >>>> call PetscOptionsSetValue(PETSC_NULL_OBJECT,'pc_hypre_boomeramg_v >>>> ec_interp_variant','1',ierr) >>>> >>>> >>>> >>>> What are your thoughts ? >>>> >>>> Thanks, >>>> >>>> Manuel >>>> >>>> >>>> >>>> On Fri, Jan 6, 2017 at 1:58 PM, Manuel Valera >>>> wrote: >>>> >>>>> Awesome, that did it, thanks once again. >>>>> >>>>> >>>>> On Fri, Jan 6, 2017 at 1:53 PM, Barry Smith >>>>> wrote: >>>>> >>>>>> >>>>>> Take the scatter out of the if () since everyone does it and get >>>>>> rid of the VecView(). >>>>>> >>>>>> Does this work? If not where is it hanging? >>>>>> >>>>>> >>>>>> > On Jan 6, 2017, at 3:29 PM, Manuel Valera >>>>>> wrote: >>>>>> > >>>>>> > Thanks Dave, >>>>>> > >>>>>> > I think is interesting it never gave an error on this, after adding >>>>>> the vecassembly calls it still shows the same behavior, without >>>>>> complaining, i did: >>>>>> > >>>>>> > if(rankl==0)then >>>>>> > >>>>>> > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>>>>> > call VecAssemblyBegin(bp0,ierr) ; call >>>>>> VecAssemblyEnd(bp0,ierr); >>>>>> > CHKERRQ(ierr) >>>>>> > >>>>>> endif >>>>>> > >>>>>> > >>>>>> > call VecScatterBegin(ctr,bp0,bp2,IN >>>>>> SERT_VALUES,SCATTER_REVERSE,ierr) >>>>>> > call VecScatterEnd(ctr,bp0,bp2,INSE >>>>>> RT_VALUES,SCATTER_REVERSE,ierr) >>>>>> > print*,"done! " >>>>>> > CHKERRQ(ierr) >>>>>> > >>>>>> > >>>>>> > CHKERRQ(ierr) >>>>>> > >>>>>> > >>>>>> > Thanks. 
>>>>>> > >>>>>> > On Fri, Jan 6, 2017 at 12:44 PM, Dave May >>>>>> wrote: >>>>>> > >>>>>> > >>>>>> > On 6 January 2017 at 20:24, Manuel Valera >>>>>> wrote: >>>>>> > Great help Barry, i totally had overlooked that option (it is >>>>>> explicit in the vecscatterbegin call help page but not in >>>>>> vecscattercreatetozero, as i read later) >>>>>> > >>>>>> > So i used that and it works partially, it scatters te values >>>>>> assigned in root but not the rest, if i call vecscatterbegin from outside >>>>>> root it hangs, the code currently look as this: >>>>>> > >>>>>> > call VecScatterCreateToZero(bp2,ctr,bp0,ierr); CHKERRQ(ierr) >>>>>> > >>>>>> > call PetscObjectSetName(bp0, 'bp0:',ierr) >>>>>> > >>>>>> > if(rankl==0)then >>>>>> > >>>>>> > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>>>>> > >>>>>> > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) >>>>>> > >>>>>> > >>>>>> > You need to call >>>>>> > >>>>>> > VecAssemblyBegin(bp0); >>>>>> > VecAssemblyEnd(bp0); >>>>>> > after your last call to VecSetValues() before you can do any >>>>>> operations with bp0. >>>>>> > >>>>>> > With your current code, the call to VecView should produce an error >>>>>> if you used the error checking macro CHKERRQ(ierr) (as should >>>>>> VecScatter{Begin,End} >>>>>> > >>>>>> > Thanks, >>>>>> > Dave >>>>>> > >>>>>> > >>>>>> > call VecScatterBegin(ctr,bp0,bp2,IN >>>>>> SERT_VALUES,SCATTER_REVERSE,ierr) >>>>>> > call VecScatterEnd(ctr,bp0,bp2,INSE >>>>>> RT_VALUES,SCATTER_REVERSE,ierr) >>>>>> > print*,"done! " >>>>>> > CHKERRQ(ierr) >>>>>> > >>>>>> > endif >>>>>> > >>>>>> > ! call VecScatterBegin(ctr,bp0,bp2,IN >>>>>> SERT_VALUES,SCATTER_REVERSE,ierr) >>>>>> > ! 
call VecScatterEnd(ctr,bp0,bp2,INSE >>>>>> RT_VALUES,SCATTER_REVERSE,ierr) >>>>>> > >>>>>> > call VecView(bp2,PETSC_VIEWER_STDOUT_WORLD,ierr) >>>>>> > >>>>>> > call PetscBarrier(PETSC_NULL_OBJECT,ierr) >>>>>> > >>>>>> > call exit() >>>>>> > >>>>>> > >>>>>> > >>>>>> > And the output is: (with bp the right answer) >>>>>> > >>>>>> > Vec Object:bp: 2 MPI processes >>>>>> > type: mpi >>>>>> > Process [0] >>>>>> > 1. >>>>>> > 2. >>>>>> > Process [1] >>>>>> > 4. >>>>>> > 3. >>>>>> > Vec Object:bp2: 2 MPI processes (before scatter) >>>>>> > type: mpi >>>>>> > Process [0] >>>>>> > 0. >>>>>> > 0. >>>>>> > Process [1] >>>>>> > 0. >>>>>> > 0. >>>>>> > Vec Object:bp0: 1 MPI processes >>>>>> > type: seq >>>>>> > 1. >>>>>> > 2. >>>>>> > 4. >>>>>> > 3. >>>>>> > done! >>>>>> > Vec Object:bp2: 2 MPI processes (after scatter) >>>>>> > type: mpi >>>>>> > Process [0] >>>>>> > 1. >>>>>> > 2. >>>>>> > Process [1] >>>>>> > 0. >>>>>> > 0. >>>>>> > >>>>>> > >>>>>> > >>>>>> > >>>>>> > Thanks immensely for your help, >>>>>> > >>>>>> > Manuel >>>>>> > >>>>>> > >>>>>> > On Thu, Jan 5, 2017 at 4:39 PM, Barry Smith >>>>>> wrote: >>>>>> > >>>>>> > > On Jan 5, 2017, at 6:21 PM, Manuel Valera >>>>>> wrote: >>>>>> > > >>>>>> > > Hello Devs is me again, >>>>>> > > >>>>>> > > I'm trying to distribute a vector to all called processes, the >>>>>> vector would be originally in root as a sequential vector and i would like >>>>>> to scatter it, what would the best call to do this ? >>>>>> > > >>>>>> > > I already know how to gather a distributed vector to root with >>>>>> VecScatterCreateToZero, this would be the inverse operation, >>>>>> > >>>>>> > Use the same VecScatter object but with SCATTER_REVERSE, note you >>>>>> need to reverse the two vector arguments as well. 
>>>>>> > >>>>>> > >>>>>> > > i'm currently trying with VecScatterCreate() and as of now im >>>>>> doing the following: >>>>>> > > >>>>>> > > >>>>>> > > if(rank==0)then >>>>>> > > >>>>>> > > >>>>>> > > call VecCreate(PETSC_COMM_SELF,bp0,ierr); CHKERRQ(ierr) !if >>>>>> i use WORLD >>>>>> > > >>>>>> !freezes in SetSizes >>>>>> > > call VecSetSizes(bp0,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) >>>>>> > > call VecSetType(bp0,VECSEQ,ierr) >>>>>> > > call VecSetFromOptions(bp0,ierr); CHKERRQ(ierr) >>>>>> > > >>>>>> > > >>>>>> > > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>>>>> > > >>>>>> > > !call VecSet(bp0,5.0D0,ierr); CHKERRQ(ierr) >>>>>> > > >>>>>> > > >>>>>> > > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) >>>>>> > > >>>>>> > > call VecAssemblyBegin(bp0,ierr) ; call >>>>>> VecAssemblyEnd(bp0,ierr) !rhs >>>>>> > > >>>>>> > > do i=0,nbdp-1,1 >>>>>> > > ind(i+1) = i >>>>>> > > enddo >>>>>> > > >>>>>> > > call ISCreateGeneral(PETSC_COMM_SEL >>>>>> F,nbdp,ind,PETSC_COPY_VALUES,locis,ierr) >>>>>> > > >>>>>> > > !call VecScatterCreate(bp0,PETSC_NULL_OBJECT,bp2,is,ctr,ierr) >>>>>> !if i use SELF >>>>>> > > >>>>>> !freezes here. >>>>>> > > >>>>>> > > call VecScatterCreate(bp0,locis,bp2 >>>>>> ,PETSC_NULL_OBJECT,ctr,ierr) >>>>>> > > >>>>>> > > endif >>>>>> > > >>>>>> > > bp2 being the receptor MPI vector to scatter to >>>>>> > > >>>>>> > > But it freezes in VecScatterCreate when trying to use more than >>>>>> one processor, what would be a better approach ? 
>>>>>> > > >>>>>> > > >>>>>> > > Thanks once again, >>>>>> > > >>>>>> > > Manuel >>>>>> > > >>>>>> > > >>>>>> > > >>>>>> > > >>>>>> > > >>>>>> > > >>>>>> > > >>>>>> > > >>>>>> > > >>>>>> > > >>>>>> > > On Wed, Jan 4, 2017 at 3:30 PM, Manuel Valera < >>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>> > > Thanks i had no idea how to debug and read those logs, that >>>>>> solved this issue at least (i was sending a message from root to everyone >>>>>> else, but trying to catch from everyone else including root) >>>>>> > > >>>>>> > > Until next time, many thanks, >>>>>> > > >>>>>> > > Manuel >>>>>> > > >>>>>> > > On Wed, Jan 4, 2017 at 3:23 PM, Matthew Knepley < >>>>>> knepley at gmail.com> wrote: >>>>>> > > On Wed, Jan 4, 2017 at 5:21 PM, Manuel Valera < >>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>> > > I did a PetscBarrier just before calling the vicariate routine >>>>>> and im pretty sure im calling it from every processor, code looks like this: >>>>>> > > >>>>>> > > From the gdb trace. >>>>>> > > >>>>>> > > Proc 0: Is in some MPI routine you call yourself, line 113 >>>>>> > > >>>>>> > > Proc 1: Is in VecCreate(), line 130 >>>>>> > > >>>>>> > > You need to fix your communication code. >>>>>> > > >>>>>> > > Matt >>>>>> > > >>>>>> > > call PetscBarrier(PETSC_NULL_OBJECT,ierr) >>>>>> > > >>>>>> > > print*,'entering POInit from',rank >>>>>> > > !call exit() >>>>>> > > >>>>>> > > call PetscObjsInit() >>>>>> > > >>>>>> > > >>>>>> > > And output gives: >>>>>> > > >>>>>> > > entering POInit from 0 >>>>>> > > entering POInit from 1 >>>>>> > > entering POInit from 2 >>>>>> > > entering POInit from 3 >>>>>> > > >>>>>> > > >>>>>> > > Still hangs in the same way, >>>>>> > > >>>>>> > > Thanks, >>>>>> > > >>>>>> > > Manuel >>>>>> > > >>>>>> > > >>>>>> > > >>>>>> > > On Wed, Jan 4, 2017 at 2:55 PM, Manuel Valera < >>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>> > > Thanks for the answers ! 
>>>>>> > > >>>>>> > > heres the screenshot of what i got from bt in gdb (great hint in >>>>>> how to debug in petsc, didn't know that) >>>>>> > > >>>>>> > > I don't really know what to look at here, >>>>>> > > >>>>>> > > Thanks, >>>>>> > > >>>>>> > > Manuel >>>>>> > > >>>>>> > > On Wed, Jan 4, 2017 at 2:39 PM, Dave May >>>>>> wrote: >>>>>> > > Are you certain ALL ranks in PETSC_COMM_WORLD call these >>>>>> function(s). These functions cannot be inside if statements like >>>>>> > > if (rank == 0){ >>>>>> > > VecCreateMPI(...) >>>>>> > > } >>>>>> > > >>>>>> > > >>>>>> > > On Wed, 4 Jan 2017 at 23:34, Manuel Valera >>>>>> wrote: >>>>>> > > Thanks Dave for the quick answer, appreciate it, >>>>>> > > >>>>>> > > I just tried that and it didn't make a difference, any other >>>>>> suggestions ? >>>>>> > > >>>>>> > > Thanks, >>>>>> > > Manuel >>>>>> > > >>>>>> > > On Wed, Jan 4, 2017 at 2:29 PM, Dave May >>>>>> wrote: >>>>>> > > You need to swap the order of your function calls. >>>>>> > > Call VecSetSizes() before VecSetType() >>>>>> > > >>>>>> > > Thanks, >>>>>> > > Dave >>>>>> > > >>>>>> > > >>>>>> > > On Wed, 4 Jan 2017 at 23:21, Manuel Valera >>>>>> wrote: >>>>>> > > Hello all, happy new year, >>>>>> > > >>>>>> > > I'm working on parallelizing my code, it worked and provided some >>>>>> results when i just called more than one processor, but created artifacts >>>>>> because i didn't need one image of the whole program in each processor, >>>>>> conflicting with each other. 
>>>>>> > > >>>>>> > > Since the pressure solver is the main part i need in parallel im >>>>>> chosing mpi to run everything in root processor until its time to solve for >>>>>> pressure, at this point im trying to create a distributed vector using >>>>>> either >>>>>> > > >>>>>> > > call VecCreateMPI(PETSC_COMM_WORLD, >>>>>> PETSC_DECIDE,nbdp,xp,ierr) >>>>>> > > or >>>>>> > > call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) >>>>>> > > call VecSetType(xp,VECMPI,ierr) >>>>>> > > call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) >>>>>> > > >>>>>> > > >>>>>> > > In both cases program hangs at this point, something it never >>>>>> happened on the naive way i described before. I've made sure the global >>>>>> size, nbdp, is the same in every processor. What can be wrong? >>>>>> > > >>>>>> > > Thanks for your kind help, >>>>>> > > >>>>>> > > Manuel. >>>>>> > > >>>>>> > > >>>>>> > > >>>>>> > > >>>>>> > > >>>>>> > > >>>>>> > > >>>>>> > > >>>>>> > > >>>>>> > > >>>>>> > > >>>>>> > > >>>>>> > > -- >>>>>> > > What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> > > -- Norbert Wiener >>>>>> > > >>>>>> > > >>>>>> > >>>>>> > >>>>>> > >>>>>> > >>>>>> >>>>>> >>>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
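Pulling together the advice quoted in the thread above — Barry's (use the same VecScatter with SCATTER_REVERSE and swapped vector arguments, and keep the scatter outside the `if`) and Dave's (assemble after VecSetValues) — the root-to-all scatter could look like the sketch below. It reuses the thread's names bp0, bp2, ctr, nbdp, ind, Rhs and requires a PETSc build; it is an illustration, not the poster's exact code:

```fortran
! Create scatter context ctr and the rank-0 sequential vector bp0
! (collective over the communicator of bp2).
call VecScatterCreateToZero(bp2, ctr, bp0, ierr); CHKERRQ(ierr)

! Only root fills bp0 ...
if (rankl == 0) then
   call VecSetValues(bp0, nbdp, ind, Rhs, INSERT_VALUES, ierr); CHKERRQ(ierr)
end if

! ... but assembly and the scatter are executed by EVERY rank.
call VecAssemblyBegin(bp0, ierr); CHKERRQ(ierr)
call VecAssemblyEnd(bp0, ierr); CHKERRQ(ierr)

! SCATTER_REVERSE pushes root's values out to the distributed bp2;
! note the vector arguments are reversed relative to the gather.
call VecScatterBegin(ctr, bp0, bp2, INSERT_VALUES, SCATTER_REVERSE, ierr); CHKERRQ(ierr)
call VecScatterEnd(ctr, bp0, bp2, INSERT_VALUES, SCATTER_REVERSE, ierr); CHKERRQ(ierr)
```

Calling the scatter from only rank 0, as in the quoted code, is exactly what leaves the other ranks' halves of bp2 untouched and makes the program hang when the other ranks do join in.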
Name: Screen Shot 2017-01-07 at 2.47.26 PM.png Type: image/png Size: 393404 bytes Desc: not available URL: From bsmith at mcs.anl.gov Sat Jan 7 17:11:21 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sat, 7 Jan 2017 17:11:21 -0600 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: References: Message-ID:

Put a break point in PetscFinalize(). Do both processes get to it? Or does one get there while the other is "somewhere in the solver"? My guess is that your program on process 0 has decided to end solving while the other processes think there are more solves to do, hence they are waiting for the next solve to start up while process 0 has gotten to PetscFinalize().

> On Jan 7, 2017, at 4:59 PM, Manuel Valera wrote: > > I would have to think and code a MWE for this problem before sending it since the model is much bigger than the petsc solver. Attached here is a screenshot of the debugger as barry taught me, is that the stack trace you need ? > > the ucmsMain.f90:522 that shows is the call (from all processes) to the > routine that updates the rhs vector (from root) and scatters it (from all > processes). > > This routine is itself inside a double loop that occurs in all processes but the only call from all processes to the solver is this one, the rest of the loop which involves correcting for velocities, pressure and temperature, all happens in root node. 
> > Sorry for the convoluted program design, this is the first beta version of the model working on parallel and was the best i could come with, i suppose it makes more sense in serial, > > Thanks > > On Sat, Jan 7, 2017 at 2:24 PM, Matthew Knepley wrote: > On Sat, Jan 7, 2017 at 4:20 PM, Manuel Valera wrote: > Thank you Matthew, > > On Sat, Jan 7, 2017 at 1:49 PM, Matthew Knepley wrote: > On Sat, Jan 7, 2017 at 3:32 PM, Manuel Valera wrote: > Hi Devs, hope you are having a great weekend, > > I could finally parallelize my linear solver and implement it into the rest of the code in a way that only the linear system is solved in parallel, great news for my team, but there is a catch and is that i don't see any speedup in the linear system, i don't know if its the MPI in the cluster we are using, but im not sure on how to debug it, > > We need to see -log_view output for any performance question. > > On the other hand and because of this issue i was trying to do -log_summary or -log_view and i noticed the program in this context hangs when is time of producing the log, if i debug this for 2 cores, process 0 exits normally but process 1 hangs in the vectorscatterbegin() with scatter_reverse way back in the code, > > You are calling a collective routine from only 1 process. > > Matt > > I am pretty confident this is not the case, > > This is still the simplest explanation. Can you send the stack trace for the 2 process run? > > the callings to vecscattercreatetozero and vecscatterbegin are made in all processes, the program goes thru all of the iterations on the linear solver, writes output correctly and even closes all the petsc objects without complaining, the freeze occurs at the very end when the log is to be produced. > > If you can send us a code to run, we can likely find the error. 
> > Thanks, > > Matt > > Thanks, > > Manuel > > > > and even after destroying all associated objects and calling petscfinalize(), so im really clueless on why is this, as it only happens for -log_* or -ksp_view options. > > my -ksp_view shows this: > > KSP Object: 2 MPI processes > type: gcr > GCR: restart = 30 > GCR: restarts performed = 20 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-14, absolute=1e-50, divergence=10000. > right preconditioning > using UNPRECONDITIONED norm type for convergence test > PC Object: 2 MPI processes > type: bjacobi > block Jacobi: number of blocks = 2 > Local solve is same for all blocks, in the following KSP and PC objects: > KSP Object: (sub_) 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using NONE norm type for convergence test > PC Object: (sub_) 1 MPI processes > type: ilu > ILU: out-of-place factorization > 0 levels of fill > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 1., needed 1. 
> Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=100000, cols=100000 > package used to perform factorization: petsc > total: nonzeros=1675180, allocated nonzeros=1675180 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix = precond matrix: > Mat Object: 1 MPI processes > type: seqaij > rows=100000, cols=100000 > total: nonzeros=1675180, allocated nonzeros=1675180 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix = precond matrix: > Mat Object: 2 MPI processes > type: mpiaij > rows=200000, cols=200000 > total: nonzeros=3373340, allocated nonzeros=3373340 > total number of mallocs used during MatSetValues calls =0 > not using I-node (on process 0) routines > > > And i configured my PC object as: > > call PCSetType(mg,PCHYPRE,ierr) > call PCHYPRESetType(mg,'boomeramg',ierr) > > call PetscOptionsSetValue(PETSC_NULL_OBJECT,'pc_hypre_boomeramg_nodal_coarsen','1',ierr) > call PetscOptionsSetValue(PETSC_NULL_OBJECT,'pc_hypre_boomeramg_vec_interp_variant','1',ierr) > > > What are your thoughts ? > > Thanks, > > Manuel > > > > On Fri, Jan 6, 2017 at 1:58 PM, Manuel Valera wrote: > Awesome, that did it, thanks once again. > > > On Fri, Jan 6, 2017 at 1:53 PM, Barry Smith wrote: > > Take the scatter out of the if () since everyone does it and get rid of the VecView(). > > Does this work? If not where is it hanging? 
> > > > On Jan 6, 2017, at 3:29 PM, Manuel Valera wrote: > > > > Thanks Dave, > > > > I think is interesting it never gave an error on this, after adding the vecassembly calls it still shows the same behavior, without complaining, i did: > > > > if(rankl==0)then > > > > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) > > call VecAssemblyBegin(bp0,ierr) ; call VecAssemblyEnd(bp0,ierr); > > CHKERRQ(ierr) > > > endif > > > > > > call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) > > call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) > > print*,"done! " > > CHKERRQ(ierr) > > > > > > CHKERRQ(ierr) > > > > > > Thanks. > > > > On Fri, Jan 6, 2017 at 12:44 PM, Dave May wrote: > > > > > > On 6 January 2017 at 20:24, Manuel Valera wrote: > > Great help Barry, i totally had overlooked that option (it is explicit in the vecscatterbegin call help page but not in vecscattercreatetozero, as i read later) > > > > So i used that and it works partially, it scatters te values assigned in root but not the rest, if i call vecscatterbegin from outside root it hangs, the code currently look as this: > > > > call VecScatterCreateToZero(bp2,ctr,bp0,ierr); CHKERRQ(ierr) > > > > call PetscObjectSetName(bp0, 'bp0:',ierr) > > > > if(rankl==0)then > > > > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) > > > > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) > > > > > > You need to call > > > > VecAssemblyBegin(bp0); > > VecAssemblyEnd(bp0); > > after your last call to VecSetValues() before you can do any operations with bp0. > > > > With your current code, the call to VecView should produce an error if you used the error checking macro CHKERRQ(ierr) (as should VecScatter{Begin,End} > > > > Thanks, > > Dave > > > > > > call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) > > call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) > > print*,"done! " > > CHKERRQ(ierr) > > > > endif > > > > ! 
call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) > > ! call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr) > > > > call VecView(bp2,PETSC_VIEWER_STDOUT_WORLD,ierr) > > > > call PetscBarrier(PETSC_NULL_OBJECT,ierr) > > > > call exit() > > > > > > > > And the output is: (with bp the right answer) > > > > Vec Object:bp: 2 MPI processes > > type: mpi > > Process [0] > > 1. > > 2. > > Process [1] > > 4. > > 3. > > Vec Object:bp2: 2 MPI processes (before scatter) > > type: mpi > > Process [0] > > 0. > > 0. > > Process [1] > > 0. > > 0. > > Vec Object:bp0: 1 MPI processes > > type: seq > > 1. > > 2. > > 4. > > 3. > > done! > > Vec Object:bp2: 2 MPI processes (after scatter) > > type: mpi > > Process [0] > > 1. > > 2. > > Process [1] > > 0. > > 0. > > > > > > > > > > Thanks inmensely for your help, > > > > Manuel > > > > > > On Thu, Jan 5, 2017 at 4:39 PM, Barry Smith wrote: > > > > > On Jan 5, 2017, at 6:21 PM, Manuel Valera wrote: > > > > > > Hello Devs is me again, > > > > > > I'm trying to distribute a vector to all called processes, the vector would be originally in root as a sequential vector and i would like to scatter it, what would the best call to do this ? > > > > > > I already know how to gather a distributed vector to root with VecScatterCreateToZero, this would be the inverse operation, > > > > Use the same VecScatter object but with SCATTER_REVERSE, not you need to reverse the two vector arguments as well. 
> > > > > > > i'm currently trying with VecScatterCreate() and as of now im doing the following: > > > > > > > > > if(rank==0)then > > > > > > > > > call VecCreate(PETSC_COMM_SELF,bp0,ierr); CHKERRQ(ierr) !if i use WORLD > > > !freezes in SetSizes > > > call VecSetSizes(bp0,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) > > > call VecSetType(bp0,VECSEQ,ierr) > > > call VecSetFromOptions(bp0,ierr); CHKERRQ(ierr) > > > > > > > > > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) > > > > > > !call VecSet(bp0,5.0D0,ierr); CHKERRQ(ierr) > > > > > > > > > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) > > > > > > call VecAssemblyBegin(bp0,ierr) ; call VecAssemblyEnd(bp0,ierr) !rhs > > > > > > do i=0,nbdp-1,1 > > > ind(i+1) = i > > > enddo > > > > > > call ISCreateGeneral(PETSC_COMM_SELF,nbdp,ind,PETSC_COPY_VALUES,locis,ierr) > > > > > > !call VecScatterCreate(bp0,PETSC_NULL_OBJECT,bp2,is,ctr,ierr) !if i use SELF > > > !freezes here. > > > > > > call VecScatterCreate(bp0,locis,bp2,PETSC_NULL_OBJECT,ctr,ierr) > > > > > > endif > > > > > > bp2 being the receptor MPI vector to scatter to > > > > > > But it freezes in VecScatterCreate when trying to use more than one processor, what would be a better approach ? > > > > > > > > > Thanks once again, > > > > > > Manuel > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > On Wed, Jan 4, 2017 at 3:30 PM, Manuel Valera wrote: > > > Thanks i had no idea how to debug and read those logs, that solved this issue at least (i was sending a message from root to everyone else, but trying to catch from everyone else including root) > > > > > > Until next time, many thanks, > > > > > > Manuel > > > > > > On Wed, Jan 4, 2017 at 3:23 PM, Matthew Knepley wrote: > > > On Wed, Jan 4, 2017 at 5:21 PM, Manuel Valera wrote: > > > I did a PetscBarrier just before calling the vicariate routine and im pretty sure im calling it from every processor, code looks like this: > > > > > > From the gdb trace. 
> > > > > > Proc 0: Is in some MPI routine you call yourself, line 113 > > > > > > Proc 1: Is in VecCreate(), line 130 > > > > > > You need to fix your communication code. > > > > > > Matt > > > > > > call PetscBarrier(PETSC_NULL_OBJECT,ierr) > > > > > > print*,'entering POInit from',rank > > > !call exit() > > > > > > call PetscObjsInit() > > > > > > > > > And output gives: > > > > > > entering POInit from 0 > > > entering POInit from 1 > > > entering POInit from 2 > > > entering POInit from 3 > > > > > > > > > Still hangs in the same way, > > > > > > Thanks, > > > > > > Manuel > > > > > > > > > > > > On Wed, Jan 4, 2017 at 2:55 PM, Manuel Valera wrote: > > > Thanks for the answers ! > > > > > > heres the screenshot of what i got from bt in gdb (great hint in how to debug in petsc, didn't know that) > > > > > > I don't really know what to look at here, > > > > > > Thanks, > > > > > > Manuel > > > > > > On Wed, Jan 4, 2017 at 2:39 PM, Dave May wrote: > > > Are you certain ALL ranks in PETSC_COMM_WORLD call these function(s). These functions cannot be inside if statements like > > > if (rank == 0){ > > > VecCreateMPI(...) > > > } > > > > > > > > > On Wed, 4 Jan 2017 at 23:34, Manuel Valera wrote: > > > Thanks Dave for the quick answer, appreciate it, > > > > > > I just tried that and it didn't make a difference, any other suggestions ? > > > > > > Thanks, > > > Manuel > > > > > > On Wed, Jan 4, 2017 at 2:29 PM, Dave May wrote: > > > You need to swap the order of your function calls. > > > Call VecSetSizes() before VecSetType() > > > > > > Thanks, > > > Dave > > > > > > > > > On Wed, 4 Jan 2017 at 23:21, Manuel Valera wrote: > > > Hello all, happy new year, > > > > > > I'm working on parallelizing my code, it worked and provided some results when i just called more than one processor, but created artifacts because i didn't need one image of the whole program in each processor, conflicting with each other. 
> > > > > > Since the pressure solver is the main part i need in parallel im chosing mpi to run everything in root processor until its time to solve for pressure, at this point im trying to create a distributed vector using either > > > > > > call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr) > > > or > > > call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) > > > call VecSetType(xp,VECMPI,ierr) > > > call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) > > > > > > > > > In both cases program hangs at this point, something it never happened on the naive way i described before. I've made sure the global size, nbdp, is the same in every processor. What can be wrong? > > > > > > Thanks for your kind help, > > > > > > Manuel. > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > -- > > > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > > > -- Norbert Wiener > > > > > > > > > > > > > > > > > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > > From knepley at gmail.com Sat Jan 7 17:21:33 2017 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 7 Jan 2017 17:21:33 -0600 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: References: Message-ID: On Sat, Jan 7, 2017 at 4:59 PM, Manuel Valera wrote: > I would have to think and code a MWE for this problem before sending it > since the model is much bigger than the petsc solver. Attached here is a > screenshot of the debugger as barry taught me, is that the stack trace you > need ? 
> > the ucmsMain.f90:522 that shows is the call (from all processes) to the > routine that updates the rhs vector (from root) and scatters it (from all > processes). > Yes, so one process is here and the other has moved on, so there is a mismatch in calls. You could do what Barry suggests, but I think it would be better to just step through your main routine once (it's slow going), and see where the divergence happens. Matt > This routine is itself inside a double loop that occurs in all processes > but the only call from all processes to the solver is this one, the rest of > the loop which involves correcting for velocities, pressure and > temperature, all happens in root node. > > Sorry for the convoluted program design, this is the first beta version of > the model working on parallel and was the best i could come with, i suppose > it makes more sense in serial, > > Thanks > > On Sat, Jan 7, 2017 at 2:24 PM, Matthew Knepley wrote: > >> On Sat, Jan 7, 2017 at 4:20 PM, Manuel Valera >> wrote: >> >>> Thank you Matthew, >>> >>> On Sat, Jan 7, 2017 at 1:49 PM, Matthew Knepley >>> wrote: >>> >>>> On Sat, Jan 7, 2017 at 3:32 PM, Manuel Valera >>>> wrote: >>>> >>>>> Hi Devs, hope you are having a great weekend, >>>>> >>>>> I could finally parallelize my linear solver and implement it into the >>>>> rest of the code in a way that only the linear system is solved in >>>>> parallel, great news for my team, but there is a catch and is that i don't >>>>> see any speedup in the linear system, i don't know if its the MPI in the >>>>> cluster we are using, but im not sure on how to debug it, >>>>> >>>> >>>> We need to see -log_view output for any performance question. 
>>>> >>>> >>>>> On the other hand and because of this issue i was trying to do >>>>> -log_summary or -log_view and i noticed the program in this context hangs >>>>> when is time of producing the log, if i debug this for 2 cores, process 0 >>>>> exits normally but process 1 hangs in the vectorscatterbegin() with >>>>> scatter_reverse way back in the code, >>>>> >>>> >>>> You are calling a collective routine from only 1 process. >>>> >>>> >>> Matt >>>> >>> >>> I am pretty confident this is not the case, >>> >> >> This is still the simplest explanation. Can you send the stack trace for >> the 2 process run? >> >> >>> the callings to vecscattercreatetozero and vecscatterbegin are made in >>> all processes, the program goes thru all of the iterations on the linear >>> solver, writes output correctly and even closes all the petsc objects >>> without complaining, the freeze occurs at the very end when the log is to >>> be produced. >>> >> >> If you can send us a code to run, we can likely find the error. >> >> Thanks, >> >> Matt >> >> >>> Thanks, >>> >>> Manuel >>> >>> >>> >>>> >>>> >>>>> and even after destroying all associated objects and calling >>>>> petscfinalize(), so im really clueless on why is this, as it only happens >>>>> for -log_* or -ksp_view options. >>>>> >>>>> my -ksp_view shows this: >>>>> >>>>> KSP Object: 2 MPI processes >>>>> >>>>> type: gcr >>>>> >>>>> GCR: restart = 30 >>>>> >>>>> GCR: restarts performed = 20 >>>>> >>>>> maximum iterations=10000, initial guess is zero >>>>> >>>>> tolerances: relative=1e-14, absolute=1e-50, divergence=10000. 
>>>>> >>>>> right preconditioning >>>>> >>>>> using UNPRECONDITIONED norm type for convergence test >>>>> >>>>> PC Object: 2 MPI processes >>>>> >>>>> type: bjacobi >>>>> >>>>> block Jacobi: number of blocks = 2 >>>>> >>>>> Local solve is same for all blocks, in the following KSP and PC >>>>> objects: >>>>> >>>>> KSP Object: (sub_) 1 MPI processes >>>>> >>>>> type: preonly >>>>> >>>>> maximum iterations=10000, initial guess is zero >>>>> >>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000. >>>>> >>>>> left preconditioning >>>>> >>>>> using NONE norm type for convergence test >>>>> >>>>> PC Object: (sub_) 1 MPI processes >>>>> >>>>> type: ilu >>>>> >>>>> ILU: out-of-place factorization >>>>> >>>>> 0 levels of fill >>>>> >>>>> tolerance for zero pivot 2.22045e-14 >>>>> >>>>> matrix ordering: natural >>>>> >>>>> factor fill ratio given 1., needed 1. >>>>> >>>>> Factored matrix follows: >>>>> >>>>> Mat Object: 1 MPI processes >>>>> >>>>> type: seqaij >>>>> >>>>> rows=100000, cols=100000 >>>>> >>>>> package used to perform factorization: petsc >>>>> >>>>> total: nonzeros=1675180, allocated nonzeros=1675180 >>>>> >>>>> total number of mallocs used during MatSetValues calls =0 >>>>> >>>>> not using I-node routines >>>>> >>>>> linear system matrix = precond matrix: >>>>> >>>>> Mat Object: 1 MPI processes >>>>> >>>>> type: seqaij >>>>> >>>>> rows=100000, cols=100000 >>>>> >>>>> total: nonzeros=1675180, allocated nonzeros=1675180 >>>>> >>>>> total number of mallocs used during MatSetValues calls =0 >>>>> >>>>> not using I-node routines >>>>> >>>>> linear system matrix = precond matrix: >>>>> >>>>> Mat Object: 2 MPI processes >>>>> >>>>> type: mpiaij >>>>> >>>>> rows=200000, cols=200000 >>>>> >>>>> total: nonzeros=3373340, allocated nonzeros=3373340 >>>>> >>>>> total number of mallocs used during MatSetValues calls =0 >>>>> >>>>> not using I-node (on process 0) routines >>>>> >>>>> >>>>> >>>>> And i configured my PC object as: >>>>> >>>>> >>>>> call 
PCSetType(mg,PCHYPRE,ierr) >>>>> >>>>> call PCHYPRESetType(mg,'boomeramg',ierr) >>>>> >>>>> >>>>> call PetscOptionsSetValue(PETSC_NULL_OBJECT,'pc_hypre_boomeramg_n >>>>> odal_coarsen','1',ierr) >>>>> >>>>> call PetscOptionsSetValue(PETSC_NULL_OBJECT,'pc_hypre_boomeramg_v >>>>> ec_interp_variant','1',ierr) >>>>> >>>>> >>>>> >>>>> What are your thoughts ? >>>>> >>>>> Thanks, >>>>> >>>>> Manuel >>>>> >>>>> >>>>> >>>>> On Fri, Jan 6, 2017 at 1:58 PM, Manuel Valera >>>>> wrote: >>>>> >>>>>> Awesome, that did it, thanks once again. >>>>>> >>>>>> >>>>>> On Fri, Jan 6, 2017 at 1:53 PM, Barry Smith >>>>>> wrote: >>>>>> >>>>>>> >>>>>>> Take the scatter out of the if () since everyone does it and get >>>>>>> rid of the VecView(). >>>>>>> >>>>>>> Does this work? If not where is it hanging? >>>>>>> >>>>>>> >>>>>>> > On Jan 6, 2017, at 3:29 PM, Manuel Valera >>>>>>> wrote: >>>>>>> > >>>>>>> > Thanks Dave, >>>>>>> > >>>>>>> > I think is interesting it never gave an error on this, after >>>>>>> adding the vecassembly calls it still shows the same behavior, without >>>>>>> complaining, i did: >>>>>>> > >>>>>>> > if(rankl==0)then >>>>>>> > >>>>>>> > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>>>>>> > call VecAssemblyBegin(bp0,ierr) ; call >>>>>>> VecAssemblyEnd(bp0,ierr); >>>>>>> > CHKERRQ(ierr) >>>>>>> > >>>>>>> endif >>>>>>> > >>>>>>> > >>>>>>> > call VecScatterBegin(ctr,bp0,bp2,IN >>>>>>> SERT_VALUES,SCATTER_REVERSE,ierr) >>>>>>> > call VecScatterEnd(ctr,bp0,bp2,INSE >>>>>>> RT_VALUES,SCATTER_REVERSE,ierr) >>>>>>> > print*,"done! " >>>>>>> > CHKERRQ(ierr) >>>>>>> > >>>>>>> > >>>>>>> > CHKERRQ(ierr) >>>>>>> > >>>>>>> > >>>>>>> > Thanks. 
>>>>>>> > >>>>>>> > On Fri, Jan 6, 2017 at 12:44 PM, Dave May >>>>>>> wrote: >>>>>>> > >>>>>>> > >>>>>>> > On 6 January 2017 at 20:24, Manuel Valera >>>>>>> wrote: >>>>>>> > Great help Barry, i totally had overlooked that option (it is >>>>>>> explicit in the vecscatterbegin call help page but not in >>>>>>> vecscattercreatetozero, as i read later) >>>>>>> > >>>>>>> > So i used that and it works partially, it scatters te values >>>>>>> assigned in root but not the rest, if i call vecscatterbegin from outside >>>>>>> root it hangs, the code currently look as this: >>>>>>> > >>>>>>> > call VecScatterCreateToZero(bp2,ctr,bp0,ierr); CHKERRQ(ierr) >>>>>>> > >>>>>>> > call PetscObjectSetName(bp0, 'bp0:',ierr) >>>>>>> > >>>>>>> > if(rankl==0)then >>>>>>> > >>>>>>> > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>>>>>> > >>>>>>> > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) >>>>>>> > >>>>>>> > >>>>>>> > You need to call >>>>>>> > >>>>>>> > VecAssemblyBegin(bp0); >>>>>>> > VecAssemblyEnd(bp0); >>>>>>> > after your last call to VecSetValues() before you can do any >>>>>>> operations with bp0. >>>>>>> > >>>>>>> > With your current code, the call to VecView should produce an >>>>>>> error if you used the error checking macro CHKERRQ(ierr) (as should >>>>>>> VecScatter{Begin,End} >>>>>>> > >>>>>>> > Thanks, >>>>>>> > Dave >>>>>>> > >>>>>>> > >>>>>>> > call VecScatterBegin(ctr,bp0,bp2,IN >>>>>>> SERT_VALUES,SCATTER_REVERSE,ierr) >>>>>>> > call VecScatterEnd(ctr,bp0,bp2,INSE >>>>>>> RT_VALUES,SCATTER_REVERSE,ierr) >>>>>>> > print*,"done! " >>>>>>> > CHKERRQ(ierr) >>>>>>> > >>>>>>> > endif >>>>>>> > >>>>>>> > ! call VecScatterBegin(ctr,bp0,bp2,IN >>>>>>> SERT_VALUES,SCATTER_REVERSE,ierr) >>>>>>> > ! 
call VecScatterEnd(ctr,bp0,bp2,INSE >>>>>>> RT_VALUES,SCATTER_REVERSE,ierr) >>>>>>> > >>>>>>> > call VecView(bp2,PETSC_VIEWER_STDOUT_WORLD,ierr) >>>>>>> > >>>>>>> > call PetscBarrier(PETSC_NULL_OBJECT,ierr) >>>>>>> > >>>>>>> > call exit() >>>>>>> > >>>>>>> > >>>>>>> > >>>>>>> > And the output is: (with bp the right answer) >>>>>>> > >>>>>>> > Vec Object:bp: 2 MPI processes >>>>>>> > type: mpi >>>>>>> > Process [0] >>>>>>> > 1. >>>>>>> > 2. >>>>>>> > Process [1] >>>>>>> > 4. >>>>>>> > 3. >>>>>>> > Vec Object:bp2: 2 MPI processes (before scatter) >>>>>>> > type: mpi >>>>>>> > Process [0] >>>>>>> > 0. >>>>>>> > 0. >>>>>>> > Process [1] >>>>>>> > 0. >>>>>>> > 0. >>>>>>> > Vec Object:bp0: 1 MPI processes >>>>>>> > type: seq >>>>>>> > 1. >>>>>>> > 2. >>>>>>> > 4. >>>>>>> > 3. >>>>>>> > done! >>>>>>> > Vec Object:bp2: 2 MPI processes (after scatter) >>>>>>> > type: mpi >>>>>>> > Process [0] >>>>>>> > 1. >>>>>>> > 2. >>>>>>> > Process [1] >>>>>>> > 0. >>>>>>> > 0. >>>>>>> > >>>>>>> > >>>>>>> > >>>>>>> > >>>>>>> > Thanks inmensely for your help, >>>>>>> > >>>>>>> > Manuel >>>>>>> > >>>>>>> > >>>>>>> > On Thu, Jan 5, 2017 at 4:39 PM, Barry Smith >>>>>>> wrote: >>>>>>> > >>>>>>> > > On Jan 5, 2017, at 6:21 PM, Manuel Valera >>>>>>> wrote: >>>>>>> > > >>>>>>> > > Hello Devs is me again, >>>>>>> > > >>>>>>> > > I'm trying to distribute a vector to all called processes, the >>>>>>> vector would be originally in root as a sequential vector and i would like >>>>>>> to scatter it, what would the best call to do this ? >>>>>>> > > >>>>>>> > > I already know how to gather a distributed vector to root with >>>>>>> VecScatterCreateToZero, this would be the inverse operation, >>>>>>> > >>>>>>> > Use the same VecScatter object but with SCATTER_REVERSE, not >>>>>>> you need to reverse the two vector arguments as well. 
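[Archive note] Barry's point above is that the scatter object created by VecScatterCreateToZero encodes one mapping that can be driven in both directions. A rough pure-Python analogy of the data flow (no MPI or PETSc involved; the function names and the two-"rank" data are made up for illustration, using the same values 1,2 / 4,3 as in Manuel's output):

```python
# Toy model of VecScatterCreateToZero semantics (NOT the PETSc API).
# Forward: gather every rank's local piece into one vector on rank 0.
# Reverse: push the rank-0 vector back out into per-rank pieces, with
# the two "vector" arguments swapped, as Barry notes.

def scatter_to_zero_forward(local_parts):
    """Gather: list of per-rank local pieces -> single rank-0 vector."""
    gathered = []
    for part in local_parts:
        gathered.extend(part)
    return gathered

def scatter_to_zero_reverse(rank0_vec, sizes):
    """Reverse: rank-0 vector -> per-rank local pieces of the given sizes."""
    parts, offset = [], 0
    for n in sizes:
        parts.append(rank0_vec[offset:offset + n])
        offset += n
    return parts

local_parts = [[1.0, 2.0], [4.0, 3.0]]      # two "ranks"
bp0 = scatter_to_zero_forward(local_parts)   # like SCATTER_FORWARD to rank 0
bp2 = scatter_to_zero_reverse(bp0, [2, 2])   # like SCATTER_REVERSE back out
```

The real VecScatterBegin/End calls are collective: every rank must execute them even though only rank 0 owns the sequential vector, which is why moving the scatter outside the if(rankl==0) block resolved the hang later in this thread.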
>>>>>>> > >>>>>>> > >>>>>>> > > i'm currently trying with VecScatterCreate() and as of now im >>>>>>> doing the following: >>>>>>> > > >>>>>>> > > >>>>>>> > > if(rank==0)then >>>>>>> > > >>>>>>> > > >>>>>>> > > call VecCreate(PETSC_COMM_SELF,bp0,ierr); CHKERRQ(ierr) >>>>>>> !if i use WORLD >>>>>>> > > >>>>>>> !freezes in SetSizes >>>>>>> > > call VecSetSizes(bp0,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) >>>>>>> > > call VecSetType(bp0,VECSEQ,ierr) >>>>>>> > > call VecSetFromOptions(bp0,ierr); CHKERRQ(ierr) >>>>>>> > > >>>>>>> > > >>>>>>> > > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>>>>>> > > >>>>>>> > > !call VecSet(bp0,5.0D0,ierr); CHKERRQ(ierr) >>>>>>> > > >>>>>>> > > >>>>>>> > > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) >>>>>>> > > >>>>>>> > > call VecAssemblyBegin(bp0,ierr) ; call >>>>>>> VecAssemblyEnd(bp0,ierr) !rhs >>>>>>> > > >>>>>>> > > do i=0,nbdp-1,1 >>>>>>> > > ind(i+1) = i >>>>>>> > > enddo >>>>>>> > > >>>>>>> > > call ISCreateGeneral(PETSC_COMM_SEL >>>>>>> F,nbdp,ind,PETSC_COPY_VALUES,locis,ierr) >>>>>>> > > >>>>>>> > > !call VecScatterCreate(bp0,PETSC_NULL_OBJECT,bp2,is,ctr,ierr) >>>>>>> !if i use SELF >>>>>>> > > >>>>>>> !freezes here. >>>>>>> > > >>>>>>> > > call VecScatterCreate(bp0,locis,bp2 >>>>>>> ,PETSC_NULL_OBJECT,ctr,ierr) >>>>>>> > > >>>>>>> > > endif >>>>>>> > > >>>>>>> > > bp2 being the receptor MPI vector to scatter to >>>>>>> > > >>>>>>> > > But it freezes in VecScatterCreate when trying to use more than >>>>>>> one processor, what would be a better approach ? 
>>>>>>> > > >>>>>>> > > >>>>>>> > > Thanks once again, >>>>>>> > > >>>>>>> > > Manuel >>>>>>> > > >>>>>>> > > >>>>>>> > > >>>>>>> > > >>>>>>> > > >>>>>>> > > >>>>>>> > > >>>>>>> > > >>>>>>> > > >>>>>>> > > >>>>>>> > > On Wed, Jan 4, 2017 at 3:30 PM, Manuel Valera < >>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>> > > Thanks i had no idea how to debug and read those logs, that >>>>>>> solved this issue at least (i was sending a message from root to everyone >>>>>>> else, but trying to catch from everyone else including root) >>>>>>> > > >>>>>>> > > Until next time, many thanks, >>>>>>> > > >>>>>>> > > Manuel >>>>>>> > > >>>>>>> > > On Wed, Jan 4, 2017 at 3:23 PM, Matthew Knepley < >>>>>>> knepley at gmail.com> wrote: >>>>>>> > > On Wed, Jan 4, 2017 at 5:21 PM, Manuel Valera < >>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>> > > I did a PetscBarrier just before calling the vicariate routine >>>>>>> and im pretty sure im calling it from every processor, code looks like this: >>>>>>> > > >>>>>>> > > From the gdb trace. >>>>>>> > > >>>>>>> > > Proc 0: Is in some MPI routine you call yourself, line 113 >>>>>>> > > >>>>>>> > > Proc 1: Is in VecCreate(), line 130 >>>>>>> > > >>>>>>> > > You need to fix your communication code. >>>>>>> > > >>>>>>> > > Matt >>>>>>> > > >>>>>>> > > call PetscBarrier(PETSC_NULL_OBJECT,ierr) >>>>>>> > > >>>>>>> > > print*,'entering POInit from',rank >>>>>>> > > !call exit() >>>>>>> > > >>>>>>> > > call PetscObjsInit() >>>>>>> > > >>>>>>> > > >>>>>>> > > And output gives: >>>>>>> > > >>>>>>> > > entering POInit from 0 >>>>>>> > > entering POInit from 1 >>>>>>> > > entering POInit from 2 >>>>>>> > > entering POInit from 3 >>>>>>> > > >>>>>>> > > >>>>>>> > > Still hangs in the same way, >>>>>>> > > >>>>>>> > > Thanks, >>>>>>> > > >>>>>>> > > Manuel >>>>>>> > > >>>>>>> > > >>>>>>> > > >>>>>>> > > On Wed, Jan 4, 2017 at 2:55 PM, Manuel Valera < >>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>> > > Thanks for the answers ! 
>>>>>>> > > >>>>>>> > > heres the screenshot of what i got from bt in gdb (great hint in >>>>>>> how to debug in petsc, didn't know that) >>>>>>> > > >>>>>>> > > I don't really know what to look at here, >>>>>>> > > >>>>>>> > > Thanks, >>>>>>> > > >>>>>>> > > Manuel >>>>>>> > > >>>>>>> > > On Wed, Jan 4, 2017 at 2:39 PM, Dave May < >>>>>>> dave.mayhem23 at gmail.com> wrote: >>>>>>> > > Are you certain ALL ranks in PETSC_COMM_WORLD call these >>>>>>> function(s). These functions cannot be inside if statements like >>>>>>> > > if (rank == 0){ >>>>>>> > > VecCreateMPI(...) >>>>>>> > > } >>>>>>> > > >>>>>>> > > >>>>>>> > > On Wed, 4 Jan 2017 at 23:34, Manuel Valera < >>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>> > > Thanks Dave for the quick answer, appreciate it, >>>>>>> > > >>>>>>> > > I just tried that and it didn't make a difference, any other >>>>>>> suggestions ? >>>>>>> > > >>>>>>> > > Thanks, >>>>>>> > > Manuel >>>>>>> > > >>>>>>> > > On Wed, Jan 4, 2017 at 2:29 PM, Dave May < >>>>>>> dave.mayhem23 at gmail.com> wrote: >>>>>>> > > You need to swap the order of your function calls. >>>>>>> > > Call VecSetSizes() before VecSetType() >>>>>>> > > >>>>>>> > > Thanks, >>>>>>> > > Dave >>>>>>> > > >>>>>>> > > >>>>>>> > > On Wed, 4 Jan 2017 at 23:21, Manuel Valera < >>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>> > > Hello all, happy new year, >>>>>>> > > >>>>>>> > > I'm working on parallelizing my code, it worked and provided >>>>>>> some results when i just called more than one processor, but created >>>>>>> artifacts because i didn't need one image of the whole program in each >>>>>>> processor, conflicting with each other. 
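[Archive note] Dave's warning above, that collective functions cannot sit inside an "if (rank == 0)" block, is the crux of several hangs in this thread. A minimal single-machine sketch of the failure mode, using plain Python threads as stand-ins for MPI ranks (no PETSc; the barrier is given a short timeout only so the example terminates instead of hanging):

```python
import threading

# A collective operation is essentially a rendezvous: a barrier that
# only some participants reach never completes. Here "rank 0" enters
# the collective but "rank 1" skips it, so rank 0 waits forever
# (here: until the 0.5 s timeout breaks the barrier).

collective = threading.Barrier(parties=2, timeout=0.5)
results = {}

def worker(rank):
    try:
        if rank == 0:              # BUG: only rank 0 joins the collective
            collective.wait()
        results[rank] = "ok"
    except threading.BrokenBarrierError:
        results[rank] = "hung (broken barrier)"

threads = [threading.Thread(target=worker, args=(r,)) for r in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# rank 0 stalls in the collective; rank 1 has long since moved on --
# the same asymmetry visible in the gdb stack traces discussed here
```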
>>>>>>> > > >>>>>>> > > Since the pressure solver is the main part i need in parallel im >>>>>>> chosing mpi to run everything in root processor until its time to solve for >>>>>>> pressure, at this point im trying to create a distributed vector using >>>>>>> either >>>>>>> > > >>>>>>> > > call VecCreateMPI(PETSC_COMM_WORLD, >>>>>>> PETSC_DECIDE,nbdp,xp,ierr) >>>>>>> > > or >>>>>>> > > call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) >>>>>>> > > call VecSetType(xp,VECMPI,ierr) >>>>>>> > > call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) >>>>>>> > > >>>>>>> > > >>>>>>> > > In both cases program hangs at this point, something it never >>>>>>> happened on the naive way i described before. I've made sure the global >>>>>>> size, nbdp, is the same in every processor. What can be wrong? >>>>>>> > > >>>>>>> > > Thanks for your kind help, >>>>>>> > > >>>>>>> > > Manuel. >>>>>>> > > >>>>>>> > > >>>>>>> > > >>>>>>> > > >>>>>>> > > >>>>>>> > > >>>>>>> > > >>>>>>> > > >>>>>>> > > >>>>>>> > > >>>>>>> > > >>>>>>> > > >>>>>>> > > -- >>>>>>> > > What most experimenters take for granted before they begin their >>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>> experiments lead. >>>>>>> > > -- Norbert Wiener >>>>>>> > > >>>>>>> > > >>>>>>> > >>>>>>> > >>>>>>> > >>>>>>> > >>>>>>> >>>>>>> >>>>>> >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
-- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From mvalera at mail.sdsu.edu Sat Jan 7 17:33:48 2017 From: mvalera at mail.sdsu.edu (Manuel Valera) Date: Sat, 7 Jan 2017 15:33:48 -0800 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: References: Message-ID: Thanks Barry and Matt, I was able to detect a bug that i just solved, as suggested the loop parameters weren't updated as it should, now it does and the program still freezes but now in the beginning of the loop... ? Im attaching screen so you have an idea. Im thinking about it... Thanks On Sat, Jan 7, 2017 at 3:21 PM, Matthew Knepley wrote: > On Sat, Jan 7, 2017 at 4:59 PM, Manuel Valera > wrote: > >> I would have to think and code a MWE for this problem before sending it >> since the model is much bigger than the petsc solver. Attached here is a >> screenshot of the debugger as barry taught me, is that the stack trace you >> need ? >> >> the ucmsMain.f90:522 that shows is the call (from all processes) to the >> routine that updates the rhs vector (from root) and scatters it (from all >> processes). >> > > Yes, so one process is here and the other has moved on, so there is a > mismatch in calls. > > You could do what Barry suggests, but I think it would be better to just > step through your main routine once (its slow going), and > see where the divergence happens. > > Matt > > >> This routine is itself inside a double loop that occurs in all processes >> but the only call from all processes to the solver is this one, the rest of >> the loop which involves correcting for velocities, pressure and >> temperature, all happens in root node. 
>> >> Sorry for the convoluted program design, this is the first beta version >> of the model working on parallel and was the best i could come with, i >> suppose it makes more sense in serial, >> >> Thanks >> >> On Sat, Jan 7, 2017 at 2:24 PM, Matthew Knepley >> wrote: >> >>> On Sat, Jan 7, 2017 at 4:20 PM, Manuel Valera >>> wrote: >>> >>>> Thank you Matthew, >>>> >>>> On Sat, Jan 7, 2017 at 1:49 PM, Matthew Knepley >>>> wrote: >>>> >>>>> On Sat, Jan 7, 2017 at 3:32 PM, Manuel Valera >>>>> wrote: >>>>> >>>>>> Hi Devs, hope you are having a great weekend, >>>>>> >>>>>> I could finally parallelize my linear solver and implement it into >>>>>> the rest of the code in a way that only the linear system is solved in >>>>>> parallel, great news for my team, but there is a catch and is that i don't >>>>>> see any speedup in the linear system, i don't know if its the MPI in the >>>>>> cluster we are using, but im not sure on how to debug it, >>>>>> >>>>> >>>>> We need to see -log_view output for any performance question. >>>>> >>>>> >>>>>> On the other hand and because of this issue i was trying to do >>>>>> -log_summary or -log_view and i noticed the program in this context hangs >>>>>> when is time of producing the log, if i debug this for 2 cores, process 0 >>>>>> exits normally but process 1 hangs in the vectorscatterbegin() with >>>>>> scatter_reverse way back in the code, >>>>>> >>>>> >>>>> You are calling a collective routine from only 1 process. >>>>> >>>>> >>>> Matt >>>>> >>>> >>>> I am pretty confident this is not the case, >>>> >>> >>> This is still the simplest explanation. Can you send the stack trace for >>> the 2 process run? >>> >>> >>>> the callings to vecscattercreatetozero and vecscatterbegin are made in >>>> all processes, the program goes thru all of the iterations on the linear >>>> solver, writes output correctly and even closes all the petsc objects >>>> without complaining, the freeze occurs at the very end when the log is to >>>> be produced. 
>>>> >>> >>> If you can send us a code to run, we can likely find the error. >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> Thanks, >>>> >>>> Manuel >>>> >>>> >>>> >>>>> >>>>> >>>>>> and even after destroying all associated objects and calling >>>>>> petscfinalize(), so im really clueless on why is this, as it only happens >>>>>> for -log_* or -ksp_view options. >>>>>> >>>>>> my -ksp_view shows this: >>>>>> >>>>>> KSP Object: 2 MPI processes >>>>>> >>>>>> type: gcr >>>>>> >>>>>> GCR: restart = 30 >>>>>> >>>>>> GCR: restarts performed = 20 >>>>>> >>>>>> maximum iterations=10000, initial guess is zero >>>>>> >>>>>> tolerances: relative=1e-14, absolute=1e-50, divergence=10000. >>>>>> >>>>>> right preconditioning >>>>>> >>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>> >>>>>> PC Object: 2 MPI processes >>>>>> >>>>>> type: bjacobi >>>>>> >>>>>> block Jacobi: number of blocks = 2 >>>>>> >>>>>> Local solve is same for all blocks, in the following KSP and PC >>>>>> objects: >>>>>> >>>>>> KSP Object: (sub_) 1 MPI processes >>>>>> >>>>>> type: preonly >>>>>> >>>>>> maximum iterations=10000, initial guess is zero >>>>>> >>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000. >>>>>> >>>>>> left preconditioning >>>>>> >>>>>> using NONE norm type for convergence test >>>>>> >>>>>> PC Object: (sub_) 1 MPI processes >>>>>> >>>>>> type: ilu >>>>>> >>>>>> ILU: out-of-place factorization >>>>>> >>>>>> 0 levels of fill >>>>>> >>>>>> tolerance for zero pivot 2.22045e-14 >>>>>> >>>>>> matrix ordering: natural >>>>>> >>>>>> factor fill ratio given 1., needed 1. 
>>>>>> >>>>>> Factored matrix follows: >>>>>> >>>>>> Mat Object: 1 MPI processes >>>>>> >>>>>> type: seqaij >>>>>> >>>>>> rows=100000, cols=100000 >>>>>> >>>>>> package used to perform factorization: petsc >>>>>> >>>>>> total: nonzeros=1675180, allocated nonzeros=1675180 >>>>>> >>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>> >>>>>> not using I-node routines >>>>>> >>>>>> linear system matrix = precond matrix: >>>>>> >>>>>> Mat Object: 1 MPI processes >>>>>> >>>>>> type: seqaij >>>>>> >>>>>> rows=100000, cols=100000 >>>>>> >>>>>> total: nonzeros=1675180, allocated nonzeros=1675180 >>>>>> >>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>> >>>>>> not using I-node routines >>>>>> >>>>>> linear system matrix = precond matrix: >>>>>> >>>>>> Mat Object: 2 MPI processes >>>>>> >>>>>> type: mpiaij >>>>>> >>>>>> rows=200000, cols=200000 >>>>>> >>>>>> total: nonzeros=3373340, allocated nonzeros=3373340 >>>>>> >>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>> >>>>>> not using I-node (on process 0) routines >>>>>> >>>>>> >>>>>> >>>>>> And i configured my PC object as: >>>>>> >>>>>> >>>>>> call PCSetType(mg,PCHYPRE,ierr) >>>>>> >>>>>> call PCHYPRESetType(mg,'boomeramg',ierr) >>>>>> >>>>>> >>>>>> call PetscOptionsSetValue(PETSC_NULL_OBJECT,'pc_hypre_boomeramg_n >>>>>> odal_coarsen','1',ierr) >>>>>> >>>>>> call PetscOptionsSetValue(PETSC_NULL_OBJECT,'pc_hypre_boomeramg_v >>>>>> ec_interp_variant','1',ierr) >>>>>> >>>>>> >>>>>> >>>>>> What are your thoughts ? >>>>>> >>>>>> Thanks, >>>>>> >>>>>> Manuel >>>>>> >>>>>> >>>>>> >>>>>> On Fri, Jan 6, 2017 at 1:58 PM, Manuel Valera >>>>>> wrote: >>>>>> >>>>>>> Awesome, that did it, thanks once again. >>>>>>> >>>>>>> >>>>>>> On Fri, Jan 6, 2017 at 1:53 PM, Barry Smith >>>>>>> wrote: >>>>>>> >>>>>>>> >>>>>>>> Take the scatter out of the if () since everyone does it and get >>>>>>>> rid of the VecView(). >>>>>>>> >>>>>>>> Does this work? If not where is it hanging? 
>>>>>>>> >>>>>>>> >>>>>>>> > On Jan 6, 2017, at 3:29 PM, Manuel Valera >>>>>>>> wrote: >>>>>>>> > >>>>>>>> > Thanks Dave, >>>>>>>> > >>>>>>>> > I think is interesting it never gave an error on this, after >>>>>>>> adding the vecassembly calls it still shows the same behavior, without >>>>>>>> complaining, i did: >>>>>>>> > >>>>>>>> > if(rankl==0)then >>>>>>>> > >>>>>>>> > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>>>>>>> > call VecAssemblyBegin(bp0,ierr) ; call >>>>>>>> VecAssemblyEnd(bp0,ierr); >>>>>>>> > CHKERRQ(ierr) >>>>>>>> > >>>>>>>> endif >>>>>>>> > >>>>>>>> > >>>>>>>> > call VecScatterBegin(ctr,bp0,bp2,IN >>>>>>>> SERT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>> > call VecScatterEnd(ctr,bp0,bp2,INSE >>>>>>>> RT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>> > print*,"done! " >>>>>>>> > CHKERRQ(ierr) >>>>>>>> > >>>>>>>> > >>>>>>>> > CHKERRQ(ierr) >>>>>>>> > >>>>>>>> > >>>>>>>> > Thanks. >>>>>>>> > >>>>>>>> > On Fri, Jan 6, 2017 at 12:44 PM, Dave May < >>>>>>>> dave.mayhem23 at gmail.com> wrote: >>>>>>>> > >>>>>>>> > >>>>>>>> > On 6 January 2017 at 20:24, Manuel Valera >>>>>>>> wrote: >>>>>>>> > Great help Barry, i totally had overlooked that option (it is >>>>>>>> explicit in the vecscatterbegin call help page but not in >>>>>>>> vecscattercreatetozero, as i read later) >>>>>>>> > >>>>>>>> > So i used that and it works partially, it scatters te values >>>>>>>> assigned in root but not the rest, if i call vecscatterbegin from outside >>>>>>>> root it hangs, the code currently look as this: >>>>>>>> > >>>>>>>> > call VecScatterCreateToZero(bp2,ctr,bp0,ierr); CHKERRQ(ierr) >>>>>>>> > >>>>>>>> > call PetscObjectSetName(bp0, 'bp0:',ierr) >>>>>>>> > >>>>>>>> > if(rankl==0)then >>>>>>>> > >>>>>>>> > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>>>>>>> > >>>>>>>> > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) >>>>>>>> > >>>>>>>> > >>>>>>>> > You need to call >>>>>>>> > >>>>>>>> > VecAssemblyBegin(bp0); >>>>>>>> > VecAssemblyEnd(bp0); >>>>>>>> > after 
your last call to VecSetValues() before you can do any >>>>>>>> operations with bp0. >>>>>>>> > >>>>>>>> > With your current code, the call to VecView should produce an >>>>>>>> error if you used the error checking macro CHKERRQ(ierr) (as should >>>>>>>> VecScatter{Begin,End} >>>>>>>> > >>>>>>>> > Thanks, >>>>>>>> > Dave >>>>>>>> > >>>>>>>> > >>>>>>>> > call VecScatterBegin(ctr,bp0,bp2,IN >>>>>>>> SERT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>> > call VecScatterEnd(ctr,bp0,bp2,INSE >>>>>>>> RT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>> > print*,"done! " >>>>>>>> > CHKERRQ(ierr) >>>>>>>> > >>>>>>>> > endif >>>>>>>> > >>>>>>>> > ! call VecScatterBegin(ctr,bp0,bp2,IN >>>>>>>> SERT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>> > ! call VecScatterEnd(ctr,bp0,bp2,INSE >>>>>>>> RT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>> > >>>>>>>> > call VecView(bp2,PETSC_VIEWER_STDOUT_WORLD,ierr) >>>>>>>> > >>>>>>>> > call PetscBarrier(PETSC_NULL_OBJECT,ierr) >>>>>>>> > >>>>>>>> > call exit() >>>>>>>> > >>>>>>>> > >>>>>>>> > >>>>>>>> > And the output is: (with bp the right answer) >>>>>>>> > >>>>>>>> > Vec Object:bp: 2 MPI processes >>>>>>>> > type: mpi >>>>>>>> > Process [0] >>>>>>>> > 1. >>>>>>>> > 2. >>>>>>>> > Process [1] >>>>>>>> > 4. >>>>>>>> > 3. >>>>>>>> > Vec Object:bp2: 2 MPI processes (before scatter) >>>>>>>> > type: mpi >>>>>>>> > Process [0] >>>>>>>> > 0. >>>>>>>> > 0. >>>>>>>> > Process [1] >>>>>>>> > 0. >>>>>>>> > 0. >>>>>>>> > Vec Object:bp0: 1 MPI processes >>>>>>>> > type: seq >>>>>>>> > 1. >>>>>>>> > 2. >>>>>>>> > 4. >>>>>>>> > 3. >>>>>>>> > done! >>>>>>>> > Vec Object:bp2: 2 MPI processes (after scatter) >>>>>>>> > type: mpi >>>>>>>> > Process [0] >>>>>>>> > 1. >>>>>>>> > 2. >>>>>>>> > Process [1] >>>>>>>> > 0. >>>>>>>> > 0. 
>>>>>>>> > >>>>>>>> > >>>>>>>> > >>>>>>>> > >>>>>>>> > Thanks inmensely for your help, >>>>>>>> > >>>>>>>> > Manuel >>>>>>>> > >>>>>>>> > >>>>>>>> > On Thu, Jan 5, 2017 at 4:39 PM, Barry Smith >>>>>>>> wrote: >>>>>>>> > >>>>>>>> > > On Jan 5, 2017, at 6:21 PM, Manuel Valera < >>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>> > > >>>>>>>> > > Hello Devs is me again, >>>>>>>> > > >>>>>>>> > > I'm trying to distribute a vector to all called processes, the >>>>>>>> vector would be originally in root as a sequential vector and i would like >>>>>>>> to scatter it, what would the best call to do this ? >>>>>>>> > > >>>>>>>> > > I already know how to gather a distributed vector to root with >>>>>>>> VecScatterCreateToZero, this would be the inverse operation, >>>>>>>> > >>>>>>>> > Use the same VecScatter object but with SCATTER_REVERSE, not >>>>>>>> you need to reverse the two vector arguments as well. >>>>>>>> > >>>>>>>> > >>>>>>>> > > i'm currently trying with VecScatterCreate() and as of now im >>>>>>>> doing the following: >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > if(rank==0)then >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > call VecCreate(PETSC_COMM_SELF,bp0,ierr); CHKERRQ(ierr) >>>>>>>> !if i use WORLD >>>>>>>> > > >>>>>>>> !freezes in SetSizes >>>>>>>> > > call VecSetSizes(bp0,PETSC_DECIDE,nbdp,ierr); >>>>>>>> CHKERRQ(ierr) >>>>>>>> > > call VecSetType(bp0,VECSEQ,ierr) >>>>>>>> > > call VecSetFromOptions(bp0,ierr); CHKERRQ(ierr) >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>>>>>>> > > >>>>>>>> > > !call VecSet(bp0,5.0D0,ierr); CHKERRQ(ierr) >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) >>>>>>>> > > >>>>>>>> > > call VecAssemblyBegin(bp0,ierr) ; call >>>>>>>> VecAssemblyEnd(bp0,ierr) !rhs >>>>>>>> > > >>>>>>>> > > do i=0,nbdp-1,1 >>>>>>>> > > ind(i+1) = i >>>>>>>> > > enddo >>>>>>>> > > >>>>>>>> > > call ISCreateGeneral(PETSC_COMM_SEL >>>>>>>> F,nbdp,ind,PETSC_COPY_VALUES,locis,ierr) 
>>>>>>>> > > >>>>>>>> > > !call VecScatterCreate(bp0,PETSC_NULL_OBJECT,bp2,is,ctr,ierr) >>>>>>>> !if i use SELF >>>>>>>> > > >>>>>>>> !freezes here. >>>>>>>> > > >>>>>>>> > > call VecScatterCreate(bp0,locis,bp2 >>>>>>>> ,PETSC_NULL_OBJECT,ctr,ierr) >>>>>>>> > > >>>>>>>> > > endif >>>>>>>> > > >>>>>>>> > > bp2 being the receptor MPI vector to scatter to >>>>>>>> > > >>>>>>>> > > But it freezes in VecScatterCreate when trying to use more than >>>>>>>> one processor, what would be a better approach ? >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > Thanks once again, >>>>>>>> > > >>>>>>>> > > Manuel >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > On Wed, Jan 4, 2017 at 3:30 PM, Manuel Valera < >>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>> > > Thanks i had no idea how to debug and read those logs, that >>>>>>>> solved this issue at least (i was sending a message from root to everyone >>>>>>>> else, but trying to catch from everyone else including root) >>>>>>>> > > >>>>>>>> > > Until next time, many thanks, >>>>>>>> > > >>>>>>>> > > Manuel >>>>>>>> > > >>>>>>>> > > On Wed, Jan 4, 2017 at 3:23 PM, Matthew Knepley < >>>>>>>> knepley at gmail.com> wrote: >>>>>>>> > > On Wed, Jan 4, 2017 at 5:21 PM, Manuel Valera < >>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>> > > I did a PetscBarrier just before calling the vicariate routine >>>>>>>> and im pretty sure im calling it from every processor, code looks like this: >>>>>>>> > > >>>>>>>> > > From the gdb trace. >>>>>>>> > > >>>>>>>> > > Proc 0: Is in some MPI routine you call yourself, line 113 >>>>>>>> > > >>>>>>>> > > Proc 1: Is in VecCreate(), line 130 >>>>>>>> > > >>>>>>>> > > You need to fix your communication code. 
>>>>>>>> > > >>>>>>>> > > Matt >>>>>>>> > > >>>>>>>> > > call PetscBarrier(PETSC_NULL_OBJECT,ierr) >>>>>>>> > > >>>>>>>> > > print*,'entering POInit from',rank >>>>>>>> > > !call exit() >>>>>>>> > > >>>>>>>> > > call PetscObjsInit() >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > And output gives: >>>>>>>> > > >>>>>>>> > > entering POInit from 0 >>>>>>>> > > entering POInit from 1 >>>>>>>> > > entering POInit from 2 >>>>>>>> > > entering POInit from 3 >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > Still hangs in the same way, >>>>>>>> > > >>>>>>>> > > Thanks, >>>>>>>> > > >>>>>>>> > > Manuel >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > On Wed, Jan 4, 2017 at 2:55 PM, Manuel Valera < >>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>> > > Thanks for the answers ! >>>>>>>> > > >>>>>>>> > > heres the screenshot of what i got from bt in gdb (great hint >>>>>>>> in how to debug in petsc, didn't know that) >>>>>>>> > > >>>>>>>> > > I don't really know what to look at here, >>>>>>>> > > >>>>>>>> > > Thanks, >>>>>>>> > > >>>>>>>> > > Manuel >>>>>>>> > > >>>>>>>> > > On Wed, Jan 4, 2017 at 2:39 PM, Dave May < >>>>>>>> dave.mayhem23 at gmail.com> wrote: >>>>>>>> > > Are you certain ALL ranks in PETSC_COMM_WORLD call these >>>>>>>> function(s). These functions cannot be inside if statements like >>>>>>>> > > if (rank == 0){ >>>>>>>> > > VecCreateMPI(...) >>>>>>>> > > } >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > On Wed, 4 Jan 2017 at 23:34, Manuel Valera < >>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>> > > Thanks Dave for the quick answer, appreciate it, >>>>>>>> > > >>>>>>>> > > I just tried that and it didn't make a difference, any other >>>>>>>> suggestions ? >>>>>>>> > > >>>>>>>> > > Thanks, >>>>>>>> > > Manuel >>>>>>>> > > >>>>>>>> > > On Wed, Jan 4, 2017 at 2:29 PM, Dave May < >>>>>>>> dave.mayhem23 at gmail.com> wrote: >>>>>>>> > > You need to swap the order of your function calls. 
>>>>>>>> > > Call VecSetSizes() before VecSetType() >>>>>>>> > > >>>>>>>> > > Thanks, >>>>>>>> > > Dave >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > On Wed, 4 Jan 2017 at 23:21, Manuel Valera < >>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>> > > Hello all, happy new year, >>>>>>>> > > >>>>>>>> > > I'm working on parallelizing my code, it worked and provided >>>>>>>> some results when i just called more than one processor, but created >>>>>>>> artifacts because i didn't need one image of the whole program in each >>>>>>>> processor, conflicting with each other. >>>>>>>> > > >>>>>>>> > > Since the pressure solver is the main part i need in parallel >>>>>>>> im chosing mpi to run everything in root processor until its time to solve >>>>>>>> for pressure, at this point im trying to create a distributed vector using >>>>>>>> either >>>>>>>> > > >>>>>>>> > > call VecCreateMPI(PETSC_COMM_WORLD, >>>>>>>> PETSC_DECIDE,nbdp,xp,ierr) >>>>>>>> > > or >>>>>>>> > > call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) >>>>>>>> > > call VecSetType(xp,VECMPI,ierr) >>>>>>>> > > call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr) >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > In both cases program hangs at this point, something it never >>>>>>>> happened on the naive way i described before. I've made sure the global >>>>>>>> size, nbdp, is the same in every processor. What can be wrong? >>>>>>>> > > >>>>>>>> > > Thanks for your kind help, >>>>>>>> > > >>>>>>>> > > Manuel. >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > >>>>>>>> > > -- >>>>>>>> > > What most experimenters take for granted before they begin >>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>> their experiments lead. 
>>>>>>>> > > -- Norbert Wiener >>>>>>>> > > >>>>>>>> > > >>>>>>>> > >>>>>>>> > >>>>>>>> > >>>>>>>> > >>>>>>>> >>>>>>>> >>>>>>> >>>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: Screen Shot 2017-01-07 at 3.30.48 PM.png Type: image/png Size: 515889 bytes Desc: not available URL: From knepley at gmail.com Sat Jan 7 17:39:39 2017 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 7 Jan 2017 17:39:39 -0600 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: References: Message-ID: On Sat, Jan 7, 2017 at 5:33 PM, Manuel Valera wrote: > Thanks Barry and Matt, > > I was able to detect a bug that i just solved, as suggested the loop > parameters weren't updated as it should, now it does and the program still > freezes but now in the beginning of the loop... ? > You have called a collective function from only one process. Stepping through on both processes in your run will find this easily. Thanks, Matt > Im attaching screen so you have an idea. Im thinking about it... 
> > Thanks > > On Sat, Jan 7, 2017 at 3:21 PM, Matthew Knepley wrote: > >> On Sat, Jan 7, 2017 at 4:59 PM, Manuel Valera >> wrote: >> >>> I would have to think and code a MWE for this problem before sending it >>> since the model is much bigger than the petsc solver. Attached here is a >>> screenshot of the debugger as barry taught me, is that the stack trace you >>> need ? >>> >>> the ucmsMain.f90:522 that shows is the call (from all processes) to the >>> routine that updates the rhs vector (from root) and scatters it (from all >>> processes). >>> >> >> Yes, so one process is here and the other has moved on, so there is a >> mismatch in calls. >> >> You could do what Barry suggests, but I think it would be better to just >> step through your main routine once (its slow going), and >> see where the divergence happens. >> >> Matt >> >> >>> This routine is itself inside a double loop that occurs in all processes >>> but the only call from all processes to the solver is this one, the rest of >>> the loop which involves correcting for velocities, pressure and >>> temperature, all happens in root node. 
>>> >>> Sorry for the convoluted program design, this is the first beta version >>> of the model working on parallel and was the best i could come with, i >>> suppose it makes more sense in serial, >>> >>> Thanks >>> >>> On Sat, Jan 7, 2017 at 2:24 PM, Matthew Knepley >>> wrote: >>> >>>> On Sat, Jan 7, 2017 at 4:20 PM, Manuel Valera >>>> wrote: >>>> >>>>> Thank you Matthew, >>>>> >>>>> On Sat, Jan 7, 2017 at 1:49 PM, Matthew Knepley >>>>> wrote: >>>>> >>>>>> On Sat, Jan 7, 2017 at 3:32 PM, Manuel Valera >>>>>> wrote: >>>>>> >>>>>>> Hi Devs, hope you are having a great weekend, >>>>>>> >>>>>>> I could finally parallelize my linear solver and implement it into >>>>>>> the rest of the code in a way that only the linear system is solved in >>>>>>> parallel, great news for my team, but there is a catch and is that i don't >>>>>>> see any speedup in the linear system, i don't know if its the MPI in the >>>>>>> cluster we are using, but im not sure on how to debug it, >>>>>>> >>>>>> >>>>>> We need to see -log_view output for any performance question. >>>>>> >>>>>> >>>>>>> On the other hand and because of this issue i was trying to do >>>>>>> -log_summary or -log_view and i noticed the program in this context hangs >>>>>>> when is time of producing the log, if i debug this for 2 cores, process 0 >>>>>>> exits normally but process 1 hangs in the vectorscatterbegin() with >>>>>>> scatter_reverse way back in the code, >>>>>>> >>>>>> >>>>>> You are calling a collective routine from only 1 process. >>>>>> >>>>>> >>>>> Matt >>>>>> >>>>> >>>>> I am pretty confident this is not the case, >>>>> >>>> >>>> This is still the simplest explanation. Can you send the stack trace >>>> for the 2 process run? 
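[Annotation] On Matt's request for a stack trace from both ranks: PETSc can launch a debugger per rank itself, or gdb can be attached to each already-running MPI process. These are standard PETSc/gdb invocations, shown as a command sketch with the executable name from the thread:

```shell
# Open one debugger window per rank at startup (needs X forwarding):
mpiexec -n 2 ./ucmsMR -start_in_debugger

# Or, once the run hangs, grab a backtrace from each rank by PID:
gdb -p <pid-of-rank-0> -ex bt -ex detach -ex quit
gdb -p <pid-of-rank-1> -ex bt -ex detach -ex quit
```

Comparing the two backtraces shows directly which rank has moved past the collective call the other rank is still waiting in.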
>>>> >>>> >>>>> the callings to vecscattercreatetozero and vecscatterbegin are made in >>>>> all processes, the program goes thru all of the iterations on the linear >>>>> solver, writes output correctly and even closes all the petsc objects >>>>> without complaining, the freeze occurs at the very end when the log is to >>>>> be produced. >>>>> >>>> >>>> If you can send us a code to run, we can likely find the error. >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> Thanks, >>>>> >>>>> Manuel >>>>> >>>>> >>>>> >>>>>> >>>>>> >>>>>>> and even after destroying all associated objects and calling >>>>>>> petscfinalize(), so im really clueless on why is this, as it only happens >>>>>>> for -log_* or -ksp_view options. >>>>>>> >>>>>>> my -ksp_view shows this: >>>>>>> >>>>>>> KSP Object: 2 MPI processes >>>>>>> >>>>>>> type: gcr >>>>>>> >>>>>>> GCR: restart = 30 >>>>>>> >>>>>>> GCR: restarts performed = 20 >>>>>>> >>>>>>> maximum iterations=10000, initial guess is zero >>>>>>> >>>>>>> tolerances: relative=1e-14, absolute=1e-50, divergence=10000. >>>>>>> >>>>>>> right preconditioning >>>>>>> >>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>> >>>>>>> PC Object: 2 MPI processes >>>>>>> >>>>>>> type: bjacobi >>>>>>> >>>>>>> block Jacobi: number of blocks = 2 >>>>>>> >>>>>>> Local solve is same for all blocks, in the following KSP and PC >>>>>>> objects: >>>>>>> >>>>>>> KSP Object: (sub_) 1 MPI processes >>>>>>> >>>>>>> type: preonly >>>>>>> >>>>>>> maximum iterations=10000, initial guess is zero >>>>>>> >>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
>>>>>>> >>>>>>> left preconditioning >>>>>>> >>>>>>> using NONE norm type for convergence test >>>>>>> >>>>>>> PC Object: (sub_) 1 MPI processes >>>>>>> >>>>>>> type: ilu >>>>>>> >>>>>>> ILU: out-of-place factorization >>>>>>> >>>>>>> 0 levels of fill >>>>>>> >>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>> >>>>>>> matrix ordering: natural >>>>>>> >>>>>>> factor fill ratio given 1., needed 1. >>>>>>> >>>>>>> Factored matrix follows: >>>>>>> >>>>>>> Mat Object: 1 MPI processes >>>>>>> >>>>>>> type: seqaij >>>>>>> >>>>>>> rows=100000, cols=100000 >>>>>>> >>>>>>> package used to perform factorization: petsc >>>>>>> >>>>>>> total: nonzeros=1675180, allocated nonzeros=1675180 >>>>>>> >>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>> >>>>>>> not using I-node routines >>>>>>> >>>>>>> linear system matrix = precond matrix: >>>>>>> >>>>>>> Mat Object: 1 MPI processes >>>>>>> >>>>>>> type: seqaij >>>>>>> >>>>>>> rows=100000, cols=100000 >>>>>>> >>>>>>> total: nonzeros=1675180, allocated nonzeros=1675180 >>>>>>> >>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>> >>>>>>> not using I-node routines >>>>>>> >>>>>>> linear system matrix = precond matrix: >>>>>>> >>>>>>> Mat Object: 2 MPI processes >>>>>>> >>>>>>> type: mpiaij >>>>>>> >>>>>>> rows=200000, cols=200000 >>>>>>> >>>>>>> total: nonzeros=3373340, allocated nonzeros=3373340 >>>>>>> >>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>> >>>>>>> not using I-node (on process 0) routines >>>>>>> >>>>>>> >>>>>>> >>>>>>> And i configured my PC object as: >>>>>>> >>>>>>> >>>>>>> call PCSetType(mg,PCHYPRE,ierr) >>>>>>> >>>>>>> call PCHYPRESetType(mg,'boomeramg',ierr) >>>>>>> >>>>>>> >>>>>>> call PetscOptionsSetValue(PETSC_NULL_OBJECT, >>>>>>> 'pc_hypre_boomeramg_nodal_coarsen','1',ierr) >>>>>>> >>>>>>> call PetscOptionsSetValue(PETSC_NULL_OBJECT, >>>>>>> 'pc_hypre_boomeramg_vec_interp_variant','1',ierr) >>>>>>> >>>>>>> >>>>>>> >>>>>>> What are your thoughts ? 
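[Annotation] Worth noting about the `-ksp_view` quoted above: the solve is using `bjacobi` + `ilu`, not hypre, and the later `-log_view` run in this thread likewise warns that the two boomeramg options were "left" unused. That suggests the PC object `mg` configured in the snippet is not the PC attached to the KSP that actually solves. A sketch of one way to wire it up — `ksp` here is an assumed name for the solver object, not taken from the thread, and note that `PetscOptionsSetValue` option names conventionally carry the leading `-`:

```fortran
! Sketch only: attach hypre/boomeramg to the PC of the KSP actually used
! in KSPSolve, and put options in the database before KSPSetFromOptions
! consumes them.
call KSPGetPC(ksp,pc,ierr); CHKERRQ(ierr)
call PCSetType(pc,PCHYPRE,ierr); CHKERRQ(ierr)
call PCHYPRESetType(pc,'boomeramg',ierr); CHKERRQ(ierr)

call PetscOptionsSetValue(PETSC_NULL_OBJECT, &
     '-pc_hypre_boomeramg_nodal_coarsen','1',ierr)
call PetscOptionsSetValue(PETSC_NULL_OBJECT, &
     '-pc_hypre_boomeramg_vec_interp_variant','1',ierr)

call KSPSetFromOptions(ksp,ierr); CHKERRQ(ierr)
```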
>>>>>>> >>>>>>> Thanks, >>>>>>> >>>>>>> Manuel >>>>>>> >>>>>>> >>>>>>> >>>>>>> On Fri, Jan 6, 2017 at 1:58 PM, Manuel Valera >>>>>> > wrote: >>>>>>> >>>>>>>> Awesome, that did it, thanks once again. >>>>>>>> >>>>>>>> >>>>>>>> On Fri, Jan 6, 2017 at 1:53 PM, Barry Smith >>>>>>>> wrote: >>>>>>>> >>>>>>>>> >>>>>>>>> Take the scatter out of the if () since everyone does it and >>>>>>>>> get rid of the VecView(). >>>>>>>>> >>>>>>>>> Does this work? If not where is it hanging? >>>>>>>>> >>>>>>>>> >>>>>>>>> > On Jan 6, 2017, at 3:29 PM, Manuel Valera >>>>>>>>> wrote: >>>>>>>>> > >>>>>>>>> > Thanks Dave, >>>>>>>>> > >>>>>>>>> > I think is interesting it never gave an error on this, after >>>>>>>>> adding the vecassembly calls it still shows the same behavior, without >>>>>>>>> complaining, i did: >>>>>>>>> > >>>>>>>>> > if(rankl==0)then >>>>>>>>> > >>>>>>>>> > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>>>>>>>> > call VecAssemblyBegin(bp0,ierr) ; call >>>>>>>>> VecAssemblyEnd(bp0,ierr); >>>>>>>>> > CHKERRQ(ierr) >>>>>>>>> > >>>>>>>>> endif >>>>>>>>> > >>>>>>>>> > >>>>>>>>> > call VecScatterBegin(ctr,bp0,bp2,IN >>>>>>>>> SERT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>>> > call VecScatterEnd(ctr,bp0,bp2,INSE >>>>>>>>> RT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>>> > print*,"done! " >>>>>>>>> > CHKERRQ(ierr) >>>>>>>>> > >>>>>>>>> > >>>>>>>>> > CHKERRQ(ierr) >>>>>>>>> > >>>>>>>>> > >>>>>>>>> > Thanks. 
>>>>>>>>> > >>>>>>>>> > On Fri, Jan 6, 2017 at 12:44 PM, Dave May < >>>>>>>>> dave.mayhem23 at gmail.com> wrote: >>>>>>>>> > >>>>>>>>> > >>>>>>>>> > On 6 January 2017 at 20:24, Manuel Valera >>>>>>>>> wrote: >>>>>>>>> > Great help Barry, i totally had overlooked that option (it is >>>>>>>>> explicit in the vecscatterbegin call help page but not in >>>>>>>>> vecscattercreatetozero, as i read later) >>>>>>>>> > >>>>>>>>> > So i used that and it works partially, it scatters te values >>>>>>>>> assigned in root but not the rest, if i call vecscatterbegin from outside >>>>>>>>> root it hangs, the code currently look as this: >>>>>>>>> > >>>>>>>>> > call VecScatterCreateToZero(bp2,ctr,bp0,ierr); CHKERRQ(ierr) >>>>>>>>> > >>>>>>>>> > call PetscObjectSetName(bp0, 'bp0:',ierr) >>>>>>>>> > >>>>>>>>> > if(rankl==0)then >>>>>>>>> > >>>>>>>>> > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>>>>>>>> > >>>>>>>>> > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) >>>>>>>>> > >>>>>>>>> > >>>>>>>>> > You need to call >>>>>>>>> > >>>>>>>>> > VecAssemblyBegin(bp0); >>>>>>>>> > VecAssemblyEnd(bp0); >>>>>>>>> > after your last call to VecSetValues() before you can do any >>>>>>>>> operations with bp0. >>>>>>>>> > >>>>>>>>> > With your current code, the call to VecView should produce an >>>>>>>>> error if you used the error checking macro CHKERRQ(ierr) (as should >>>>>>>>> VecScatter{Begin,End} >>>>>>>>> > >>>>>>>>> > Thanks, >>>>>>>>> > Dave >>>>>>>>> > >>>>>>>>> > >>>>>>>>> > call VecScatterBegin(ctr,bp0,bp2,IN >>>>>>>>> SERT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>>> > call VecScatterEnd(ctr,bp0,bp2,INSE >>>>>>>>> RT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>>> > print*,"done! " >>>>>>>>> > CHKERRQ(ierr) >>>>>>>>> > >>>>>>>>> > endif >>>>>>>>> > >>>>>>>>> > ! call VecScatterBegin(ctr,bp0,bp2,IN >>>>>>>>> SERT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>>> > ! 
call VecScatterEnd(ctr,bp0,bp2,INSE >>>>>>>>> RT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>>> > >>>>>>>>> > call VecView(bp2,PETSC_VIEWER_STDOUT_WORLD,ierr) >>>>>>>>> > >>>>>>>>> > call PetscBarrier(PETSC_NULL_OBJECT,ierr) >>>>>>>>> > >>>>>>>>> > call exit() >>>>>>>>> > >>>>>>>>> > >>>>>>>>> > >>>>>>>>> > And the output is: (with bp the right answer) >>>>>>>>> > >>>>>>>>> > Vec Object:bp: 2 MPI processes >>>>>>>>> > type: mpi >>>>>>>>> > Process [0] >>>>>>>>> > 1. >>>>>>>>> > 2. >>>>>>>>> > Process [1] >>>>>>>>> > 4. >>>>>>>>> > 3. >>>>>>>>> > Vec Object:bp2: 2 MPI processes (before scatter) >>>>>>>>> > type: mpi >>>>>>>>> > Process [0] >>>>>>>>> > 0. >>>>>>>>> > 0. >>>>>>>>> > Process [1] >>>>>>>>> > 0. >>>>>>>>> > 0. >>>>>>>>> > Vec Object:bp0: 1 MPI processes >>>>>>>>> > type: seq >>>>>>>>> > 1. >>>>>>>>> > 2. >>>>>>>>> > 4. >>>>>>>>> > 3. >>>>>>>>> > done! >>>>>>>>> > Vec Object:bp2: 2 MPI processes (after scatter) >>>>>>>>> > type: mpi >>>>>>>>> > Process [0] >>>>>>>>> > 1. >>>>>>>>> > 2. >>>>>>>>> > Process [1] >>>>>>>>> > 0. >>>>>>>>> > 0. >>>>>>>>> > >>>>>>>>> > >>>>>>>>> > >>>>>>>>> > >>>>>>>>> > Thanks inmensely for your help, >>>>>>>>> > >>>>>>>>> > Manuel >>>>>>>>> > >>>>>>>>> > >>>>>>>>> > On Thu, Jan 5, 2017 at 4:39 PM, Barry Smith >>>>>>>>> wrote: >>>>>>>>> > >>>>>>>>> > > On Jan 5, 2017, at 6:21 PM, Manuel Valera < >>>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>>> > > >>>>>>>>> > > Hello Devs is me again, >>>>>>>>> > > >>>>>>>>> > > I'm trying to distribute a vector to all called processes, the >>>>>>>>> vector would be originally in root as a sequential vector and i would like >>>>>>>>> to scatter it, what would the best call to do this ? >>>>>>>>> > > >>>>>>>>> > > I already know how to gather a distributed vector to root with >>>>>>>>> VecScatterCreateToZero, this would be the inverse operation, >>>>>>>>> > >>>>>>>>> > Use the same VecScatter object but with SCATTER_REVERSE, not >>>>>>>>> you need to reverse the two vector arguments as well. 
>>>>>>>>> > >>>>>>>>> > >>>>>>>>> > > i'm currently trying with VecScatterCreate() and as of now im >>>>>>>>> doing the following: >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > if(rank==0)then >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > call VecCreate(PETSC_COMM_SELF,bp0,ierr); CHKERRQ(ierr) >>>>>>>>> !if i use WORLD >>>>>>>>> > > >>>>>>>>> !freezes in SetSizes >>>>>>>>> > > call VecSetSizes(bp0,PETSC_DECIDE,nbdp,ierr); >>>>>>>>> CHKERRQ(ierr) >>>>>>>>> > > call VecSetType(bp0,VECSEQ,ierr) >>>>>>>>> > > call VecSetFromOptions(bp0,ierr); CHKERRQ(ierr) >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>>>>>>>> > > >>>>>>>>> > > !call VecSet(bp0,5.0D0,ierr); CHKERRQ(ierr) >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) >>>>>>>>> > > >>>>>>>>> > > call VecAssemblyBegin(bp0,ierr) ; call >>>>>>>>> VecAssemblyEnd(bp0,ierr) !rhs >>>>>>>>> > > >>>>>>>>> > > do i=0,nbdp-1,1 >>>>>>>>> > > ind(i+1) = i >>>>>>>>> > > enddo >>>>>>>>> > > >>>>>>>>> > > call ISCreateGeneral(PETSC_COMM_SEL >>>>>>>>> F,nbdp,ind,PETSC_COPY_VALUES,locis,ierr) >>>>>>>>> > > >>>>>>>>> > > !call VecScatterCreate(bp0,PETSC_NULL_OBJECT,bp2,is,ctr,ierr) >>>>>>>>> !if i use SELF >>>>>>>>> > > >>>>>>>>> !freezes here. >>>>>>>>> > > >>>>>>>>> > > call VecScatterCreate(bp0,locis,bp2 >>>>>>>>> ,PETSC_NULL_OBJECT,ctr,ierr) >>>>>>>>> > > >>>>>>>>> > > endif >>>>>>>>> > > >>>>>>>>> > > bp2 being the receptor MPI vector to scatter to >>>>>>>>> > > >>>>>>>>> > > But it freezes in VecScatterCreate when trying to use more >>>>>>>>> than one processor, what would be a better approach ? 
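[Annotation] Barry's answer above (reuse the to-zero scatter with SCATTER_REVERSE, swapping the two vector arguments) and Dave's point about assembling before use, combined with the eventual fix of moving the scatter out of the rank guard, fit together roughly as below. A sketch using the thread's variable names (`bp0`, `bp2`, `ctr`, `nbdp`, `ind`, `Rhs`, `rank`), PETSc 3.7 Fortran interface assumed:

```fortran
! Collective: every rank creates the scatter; bp0 is the root-side
! sequential vector (length nbdp on rank 0, length 0 elsewhere):
call VecScatterCreateToZero(bp2,ctr,bp0,ierr); CHKERRQ(ierr)

! Only root fills bp0, but the assembly calls are harmless everywhere:
if (rank == 0) then
  call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr); CHKERRQ(ierr)
endif
call VecAssemblyBegin(bp0,ierr); CHKERRQ(ierr)
call VecAssemblyEnd(bp0,ierr); CHKERRQ(ierr)

! Every rank participates in the scatter (NOT inside the rank guard).
! SCATTER_REVERSE runs the to-zero scatter backwards, so the source and
! destination vectors are also swapped: bp0 -> bp2 distributes from root.
call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr)
call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr)
CHKERRQ(ierr)
```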
>>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > Thanks once again, >>>>>>>>> > > >>>>>>>>> > > Manuel >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > On Wed, Jan 4, 2017 at 3:30 PM, Manuel Valera < >>>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>>> > > Thanks i had no idea how to debug and read those logs, that >>>>>>>>> solved this issue at least (i was sending a message from root to everyone >>>>>>>>> else, but trying to catch from everyone else including root) >>>>>>>>> > > >>>>>>>>> > > Until next time, many thanks, >>>>>>>>> > > >>>>>>>>> > > Manuel >>>>>>>>> > > >>>>>>>>> > > On Wed, Jan 4, 2017 at 3:23 PM, Matthew Knepley < >>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>> > > On Wed, Jan 4, 2017 at 5:21 PM, Manuel Valera < >>>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>>> > > I did a PetscBarrier just before calling the vicariate routine >>>>>>>>> and im pretty sure im calling it from every processor, code looks like this: >>>>>>>>> > > >>>>>>>>> > > From the gdb trace. >>>>>>>>> > > >>>>>>>>> > > Proc 0: Is in some MPI routine you call yourself, line 113 >>>>>>>>> > > >>>>>>>>> > > Proc 1: Is in VecCreate(), line 130 >>>>>>>>> > > >>>>>>>>> > > You need to fix your communication code. 
>>>>>>>>> > > >>>>>>>>> > > Matt >>>>>>>>> > > >>>>>>>>> > > call PetscBarrier(PETSC_NULL_OBJECT,ierr) >>>>>>>>> > > >>>>>>>>> > > print*,'entering POInit from',rank >>>>>>>>> > > !call exit() >>>>>>>>> > > >>>>>>>>> > > call PetscObjsInit() >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > And output gives: >>>>>>>>> > > >>>>>>>>> > > entering POInit from 0 >>>>>>>>> > > entering POInit from 1 >>>>>>>>> > > entering POInit from 2 >>>>>>>>> > > entering POInit from 3 >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > Still hangs in the same way, >>>>>>>>> > > >>>>>>>>> > > Thanks, >>>>>>>>> > > >>>>>>>>> > > Manuel >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > On Wed, Jan 4, 2017 at 2:55 PM, Manuel Valera < >>>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>>> > > Thanks for the answers ! >>>>>>>>> > > >>>>>>>>> > > heres the screenshot of what i got from bt in gdb (great hint >>>>>>>>> in how to debug in petsc, didn't know that) >>>>>>>>> > > >>>>>>>>> > > I don't really know what to look at here, >>>>>>>>> > > >>>>>>>>> > > Thanks, >>>>>>>>> > > >>>>>>>>> > > Manuel >>>>>>>>> > > >>>>>>>>> > > On Wed, Jan 4, 2017 at 2:39 PM, Dave May < >>>>>>>>> dave.mayhem23 at gmail.com> wrote: >>>>>>>>> > > Are you certain ALL ranks in PETSC_COMM_WORLD call these >>>>>>>>> function(s). These functions cannot be inside if statements like >>>>>>>>> > > if (rank == 0){ >>>>>>>>> > > VecCreateMPI(...) >>>>>>>>> > > } >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > On Wed, 4 Jan 2017 at 23:34, Manuel Valera < >>>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>>> > > Thanks Dave for the quick answer, appreciate it, >>>>>>>>> > > >>>>>>>>> > > I just tried that and it didn't make a difference, any other >>>>>>>>> suggestions ? >>>>>>>>> > > >>>>>>>>> > > Thanks, >>>>>>>>> > > Manuel >>>>>>>>> > > >>>>>>>>> > > On Wed, Jan 4, 2017 at 2:29 PM, Dave May < >>>>>>>>> dave.mayhem23 at gmail.com> wrote: >>>>>>>>> > > You need to swap the order of your function calls. 
>>>>>>>>> > > Call VecSetSizes() before VecSetType() >>>>>>>>> > > >>>>>>>>> > > Thanks, >>>>>>>>> > > Dave >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > On Wed, 4 Jan 2017 at 23:21, Manuel Valera < >>>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>>> > > Hello all, happy new year, >>>>>>>>> > > >>>>>>>>> > > I'm working on parallelizing my code, it worked and provided >>>>>>>>> some results when i just called more than one processor, but created >>>>>>>>> artifacts because i didn't need one image of the whole program in each >>>>>>>>> processor, conflicting with each other. >>>>>>>>> > > >>>>>>>>> > > Since the pressure solver is the main part i need in parallel >>>>>>>>> im chosing mpi to run everything in root processor until its time to solve >>>>>>>>> for pressure, at this point im trying to create a distributed vector using >>>>>>>>> either >>>>>>>>> > > >>>>>>>>> > > call VecCreateMPI(PETSC_COMM_WORLD, >>>>>>>>> PETSC_DECIDE,nbdp,xp,ierr) >>>>>>>>> > > or >>>>>>>>> > > call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) >>>>>>>>> > > call VecSetType(xp,VECMPI,ierr) >>>>>>>>> > > call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); >>>>>>>>> CHKERRQ(ierr) >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > In both cases program hangs at this point, something it never >>>>>>>>> happened on the naive way i described before. I've made sure the global >>>>>>>>> size, nbdp, is the same in every processor. What can be wrong? >>>>>>>>> > > >>>>>>>>> > > Thanks for your kind help, >>>>>>>>> > > >>>>>>>>> > > Manuel. >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > > -- >>>>>>>>> > > What most experimenters take for granted before they begin >>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>> their experiments lead. 
>>>>>>>>> > > -- Norbert Wiener >>>>>>>>> > > >>>>>>>>> > > >>>>>>>>> > >>>>>>>>> > >>>>>>>>> > >>>>>>>>> > >>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> -- Norbert Wiener >>>>>> >>>>> >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From mvalera at mail.sdsu.edu Sat Jan 7 18:17:45 2017 From: mvalera at mail.sdsu.edu (Manuel Valera) Date: Sat, 7 Jan 2017 16:17:45 -0800 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: References: Message-ID: I was able to find the bug, it was the outer loop bound in the same fashion than before, my -log_view is this : ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- ./ucmsMR on a arch-linux2-c-debug named ocean with 2 processors, by valera Sat Jan 7 16:11:51 2017 Using Petsc Release Version 3.7.4, unknown Max Max/Min Avg Total Time (sec): 2.074e+01 1.00000 2.074e+01 Objects: 9.300e+01 1.00000 9.300e+01 Flops: 8.662e+09 1.00000 8.662e+09 1.732e+10 Flops/sec: 4.177e+08 1.00000 4.177e+08 8.354e+08 Memory: 1.027e+08 1.03217 2.021e+08 MPI Messages: 5.535e+02 1.00000 5.535e+02 1.107e+03 MPI Message Lengths: 2.533e+07 1.00000 4.576e+04 5.066e+07 MPI Reductions: 1.903e+04 1.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flops and VecAXPY() for complex vectors of length N --> 8N flops Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 2.0739e+01 100.0% 1.7325e+10 100.0% 1.107e+03 100.0% 4.576e+04 100.0% 1.903e+04 100.0% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flops: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. 
len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). %T - percent time in this phase %F - percent flops in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ ########################################################## # # # WARNING!!! # # # # This code was compiled with a debugging option, # # To get timing results run ./configure # # using --with-debugging=no, the performance will # # be generally two or three times faster. # # # ########################################################## Event Count Time (sec) Flops --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage VecDotNorm2 545 1.0 4.4925e-01 1.6 2.18e+08 1.0 0.0e+00 0.0e+00 1.1e+03 2 3 0 0 6 2 3 0 0 6 971 VecMDot 525 1.0 1.7089e+00 1.7 1.48e+09 1.0 0.0e+00 0.0e+00 1.0e+03 7 17 0 0 6 7 17 0 0 6 1735 VecNorm 420 1.0 7.6857e-02 1.0 8.40e+07 1.0 0.0e+00 0.0e+00 8.4e+02 0 1 0 0 4 0 1 0 0 4 2186 VecScale 1090 1.0 2.5113e-01 1.1 1.09e+08 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 868 VecSet 555 1.0 7.3570e-02 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 1090 1.0 2.7621e-01 1.1 2.18e+08 1.0 0.0e+00 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 1579 VecAYPX 5 1.0 3.6647e-03 2.1 5.00e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 273 VecMAXPY 1050 1.0 2.3646e+00 1.1 2.97e+09 1.0 0.0e+00 0.0e+00 0.0e+00 11 34 0 0 0 11 34 0 0 0 2508 VecAssemblyBegin 12 1.7 
2.4388e-03 2.2 0.00e+00 0.0 0.0e+00 0.0e+00 2.8e+01 0 0 0 0 0 0 0 0 0 0 0 VecAssemblyEnd 12 1.7 1.0085e-04 2.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecScatterBegin 560 1.0 2.3770e+0071.3 0.00e+00 0.0 1.1e+03 2.7e+04 1.0e+01 6 0 99 59 0 6 0 99 59 0 0 VecScatterEnd 550 1.0 3.7769e-02 1.8 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMult 550 1.0 3.7412e+00 1.1 1.80e+09 1.0 1.1e+03 2.0e+04 0.0e+00 17 21 99 43 0 17 21 99 43 0 962 MatSolve 545 1.0 3.6138e+00 1.1 1.77e+09 1.0 0.0e+00 0.0e+00 0.0e+00 17 20 0 0 0 17 20 0 0 0 980 MatLUFactorNum 1 1.0 1.2530e-01 1.5 1.27e+07 1.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 203 MatILUFactorSym 1 1.0 2.0162e-02 1.8 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatConvert 1 1.0 3.3683e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyBegin 1 1.0 9.5172e-02359.3 0.00e+00 0.0 0.0e+00 0.0e+00 4.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyEnd 1 1.0 2.6907e-02 1.0 0.00e+00 0.0 4.0e+00 5.0e+03 2.3e+01 0 0 0 0 0 0 0 0 0 0 0 MatGetRowIJ 3 1.0 1.2398e-05 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetOrdering 1 1.0 4.4249e-02 2.5 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatLoad 1 1.0 3.8892e-01 1.0 0.00e+00 0.0 7.0e+00 3.0e+06 3.8e+01 2 0 1 41 0 2 0 1 41 0 0 KSPSetUp 2 1.0 2.2634e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 5.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSolve 5 1.0 1.2218e+01 1.0 8.66e+09 1.0 1.1e+03 2.0e+04 1.9e+04 59100 99 43 99 59100 99 43 99 1418 PCSetUp 3 1.0 1.7993e+00 1.0 1.27e+07 1.0 0.0e+00 0.0e+00 1.0e+01 8 0 0 0 0 8 0 0 0 0 14 PCSetUpOnBlocks 5 1.0 1.9013e-01 1.7 1.27e+07 1.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 134 PCApply 546 1.0 3.8320e+00 1.1 1.77e+09 1.0 0.0e+00 0.0e+00 1.0e+00 18 20 0 0 0 18 20 0 0 0 925 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory 
Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Vector 72 6 5609648 0. Vector Scatter 3 2 1312 0. Matrix 4 0 0 0. Viewer 2 0 0 0. Index Set 7 4 13104 0. Krylov Solver 2 0 0 0. Preconditioner 3 1 1384 0. ======================================================================================================================== Average time to get PetscTime(): 7.15256e-08 Average time for MPI_Barrier(): 1.82629e-05 Average time for zero size MPI_Send(): 9.89437e-06 #PETSc Option Table entries: -log_view -matload_block_size 1 -pc_hypre_boomeramg_nodal_coarsen 1 -pc_hypre_boomeramg_vec_interp_variant 1 #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 Configure options: --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack --download-mpich --download-hdf5 --download-netcdf --download-hypre --download-metis --download-parmetis --download-trillinos ----------------------------------------- Libraries compiled on Fri Dec 9 12:45:19 2016 on ocean Machine characteristics: Linux-3.10.0-327.13.1.el7.x86_64-x86_64-with-centos-7.2.1511-Core Using PETSc directory: /home/valera/petsc Using PETSc arch: arch-linux2-c-debug ----------------------------------------- Using C compiler: /home/valera/petsc/arch-linux2-c-debug/bin/mpicc -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 ${COPTFLAGS} ${CFLAGS} Using Fortran compiler: /home/valera/petsc/arch-linux2-c-debug/bin/mpif90 -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g ${FOPTFLAGS} ${FFLAGS} ----------------------------------------- Using include paths: -I/home/valera/petsc/arch-linux2-c-debug/include -I/home/valera/petsc/include -I/home/valera/petsc/include -I/home/valera/petsc/arch-linux2-c-debug/include ----------------------------------------- Using C linker: 
/home/valera/petsc/arch-linux2-c-debug/bin/mpicc Using Fortran linker: /home/valera/petsc/arch-linux2-c-debug/bin/mpif90 Using libraries: -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib -L/home/valera/petsc/arch-linux2-c-debug/lib -lpetsc -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib -L/home/valera/petsc/arch-linux2-c-debug/lib -lparmetis -lmetis -lHYPRE -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -L/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -lmpicxx -lstdc++ -lflapack -lfblas -lnetcdf -lhdf5hl_fortran -lhdf5_fortran -lhdf5_hl -lhdf5 -lpthread -lm -lmpifort -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpicxx -lstdc++ -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib -L/home/valera/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -L/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -ldl -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib -lmpi -lgcc_s -ldl ----------------------------------------- WARNING! There are options you set that were not used! WARNING! could be spelling mistake, etc! Option left: name:-pc_hypre_boomeramg_nodal_coarsen value: 1 Option left: name:-pc_hypre_boomeramg_vec_interp_variant value: 1 [valera at ocean serGCCOM]$ Any suggestions are very much appreciated, Thanks On Sat, Jan 7, 2017 at 3:39 PM, Matthew Knepley wrote: > On Sat, Jan 7, 2017 at 5:33 PM, Manuel Valera > wrote: > >> Thanks Barry and Matt, >> >> I was able to detect a bug that i just solved, as suggested the loop >> parameters weren't updated as it should, now it does and the program still >> freezes but now in the beginning of the loop... ? >> > > You have called a collective function from only one process. Stepping > through on both processes in your run will find this easily. > > Thanks, > > Matt > > >> Im attaching screen so you have an idea. Im thinking about it... 
>> >> Thanks >> >> On Sat, Jan 7, 2017 at 3:21 PM, Matthew Knepley >> wrote: >> >>> On Sat, Jan 7, 2017 at 4:59 PM, Manuel Valera >>> wrote: >>> >>>> I would have to think and code a MWE for this problem before sending it >>>> since the model is much bigger than the petsc solver. Attached here is a >>>> screenshot of the debugger as barry taught me, is that the stack trace you >>>> need ? >>>> >>>> the ucmsMain.f90:522 that shows is the call (from all processes) to the >>>> routine that updates the rhs vector (from root) and scatters it (from all >>>> processes). >>>> >>> >>> Yes, so one process is here and the other has moved on, so there is a >>> mismatch in calls. >>> >>> You could do what Barry suggests, but I think it would be better to just >>> step through your main routine once (its slow going), and >>> see where the divergence happens. >>> >>> Matt >>> >>> >>>> This routine is itself inside a double loop that occurs in all >>>> processes but the only call from all processes to the solver is this one, >>>> the rest of the loop which involves correcting for velocities, pressure and >>>> temperature, all happens in root node. 
>>>> >>>> Sorry for the convoluted program design, this is the first beta version >>>> of the model working on parallel and was the best i could come with, i >>>> suppose it makes more sense in serial, >>>> >>>> Thanks >>>> >>>> On Sat, Jan 7, 2017 at 2:24 PM, Matthew Knepley >>>> wrote: >>>> >>>>> On Sat, Jan 7, 2017 at 4:20 PM, Manuel Valera >>>>> wrote: >>>>> >>>>>> Thank you Matthew, >>>>>> >>>>>> On Sat, Jan 7, 2017 at 1:49 PM, Matthew Knepley >>>>>> wrote: >>>>>> >>>>>>> On Sat, Jan 7, 2017 at 3:32 PM, Manuel Valera >>>>>> > wrote: >>>>>>> >>>>>>>> Hi Devs, hope you are having a great weekend, >>>>>>>> >>>>>>>> I could finally parallelize my linear solver and implement it into >>>>>>>> the rest of the code in a way that only the linear system is solved in >>>>>>>> parallel, great news for my team, but there is a catch and is that i don't >>>>>>>> see any speedup in the linear system, i don't know if its the MPI in the >>>>>>>> cluster we are using, but im not sure on how to debug it, >>>>>>>> >>>>>>> >>>>>>> We need to see -log_view output for any performance question. >>>>>>> >>>>>>> >>>>>>>> On the other hand and because of this issue i was trying to do >>>>>>>> -log_summary or -log_view and i noticed the program in this context hangs >>>>>>>> when is time of producing the log, if i debug this for 2 cores, process 0 >>>>>>>> exits normally but process 1 hangs in the vectorscatterbegin() with >>>>>>>> scatter_reverse way back in the code, >>>>>>>> >>>>>>> >>>>>>> You are calling a collective routine from only 1 process. >>>>>>> >>>>>>> >>>>>> Matt >>>>>>> >>>>>> >>>>>> I am pretty confident this is not the case, >>>>>> >>>>> >>>>> This is still the simplest explanation. Can you send the stack trace >>>>> for the 2 process run? 
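[Editorial sketch] The two-process stepping Matt suggests above can be done with PETSc's built-in debugger hooks rather than attaching gdb by hand. The option names below are standard PETSc run-time options; the executable name is a placeholder:

```shell
# Launch both ranks under a debugger; each rank opens its own gdb session.
mpiexec -n 2 ./model -start_in_debugger

# Or restrict the debugger to a single rank of a hanging run:
# mpiexec -n 2 ./model -start_in_debugger noxterm -debugger_nodes 1

# In each gdb session, 'bt' shows where that rank is blocked; a deadlock
# from a mismatched collective call shows one rank inside the collective
# (e.g. VecScatterBegin) and the other rank already past it.
```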
>>>>> >>>>> >>>>>> the callings to vecscattercreatetozero and vecscatterbegin are made >>>>>> in all processes, the program goes thru all of the iterations on the linear >>>>>> solver, writes output correctly and even closes all the petsc objects >>>>>> without complaining, the freeze occurs at the very end when the log is to >>>>>> be produced. >>>>>> >>>>> >>>>> If you can send us a code to run, we can likely find the error. >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> Thanks, >>>>>> >>>>>> Manuel >>>>>> >>>>>> >>>>>> >>>>>>> >>>>>>> >>>>>>>> and even after destroying all associated objects and calling >>>>>>>> petscfinalize(), so im really clueless on why is this, as it only happens >>>>>>>> for -log_* or -ksp_view options. >>>>>>>> >>>>>>>> my -ksp_view shows this: >>>>>>>> >>>>>>>> KSP Object: 2 MPI processes >>>>>>>> >>>>>>>> type: gcr >>>>>>>> >>>>>>>> GCR: restart = 30 >>>>>>>> >>>>>>>> GCR: restarts performed = 20 >>>>>>>> >>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>> >>>>>>>> tolerances: relative=1e-14, absolute=1e-50, divergence=10000. >>>>>>>> >>>>>>>> right preconditioning >>>>>>>> >>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>> >>>>>>>> PC Object: 2 MPI processes >>>>>>>> >>>>>>>> type: bjacobi >>>>>>>> >>>>>>>> block Jacobi: number of blocks = 2 >>>>>>>> >>>>>>>> Local solve is same for all blocks, in the following KSP and PC >>>>>>>> objects: >>>>>>>> >>>>>>>> KSP Object: (sub_) 1 MPI processes >>>>>>>> >>>>>>>> type: preonly >>>>>>>> >>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>> >>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
>>>>>>>> >>>>>>>> left preconditioning >>>>>>>> >>>>>>>> using NONE norm type for convergence test >>>>>>>> >>>>>>>> PC Object: (sub_) 1 MPI processes >>>>>>>> >>>>>>>> type: ilu >>>>>>>> >>>>>>>> ILU: out-of-place factorization >>>>>>>> >>>>>>>> 0 levels of fill >>>>>>>> >>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>> >>>>>>>> matrix ordering: natural >>>>>>>> >>>>>>>> factor fill ratio given 1., needed 1. >>>>>>>> >>>>>>>> Factored matrix follows: >>>>>>>> >>>>>>>> Mat Object: 1 MPI processes >>>>>>>> >>>>>>>> type: seqaij >>>>>>>> >>>>>>>> rows=100000, cols=100000 >>>>>>>> >>>>>>>> package used to perform factorization: petsc >>>>>>>> >>>>>>>> total: nonzeros=1675180, allocated nonzeros=1675180 >>>>>>>> >>>>>>>> total number of mallocs used during MatSetValues calls >>>>>>>> =0 >>>>>>>> >>>>>>>> not using I-node routines >>>>>>>> >>>>>>>> linear system matrix = precond matrix: >>>>>>>> >>>>>>>> Mat Object: 1 MPI processes >>>>>>>> >>>>>>>> type: seqaij >>>>>>>> >>>>>>>> rows=100000, cols=100000 >>>>>>>> >>>>>>>> total: nonzeros=1675180, allocated nonzeros=1675180 >>>>>>>> >>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>> >>>>>>>> not using I-node routines >>>>>>>> >>>>>>>> linear system matrix = precond matrix: >>>>>>>> >>>>>>>> Mat Object: 2 MPI processes >>>>>>>> >>>>>>>> type: mpiaij >>>>>>>> >>>>>>>> rows=200000, cols=200000 >>>>>>>> >>>>>>>> total: nonzeros=3373340, allocated nonzeros=3373340 >>>>>>>> >>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>> >>>>>>>> not using I-node (on process 0) routines >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> And i configured my PC object as: >>>>>>>> >>>>>>>> >>>>>>>> call PCSetType(mg,PCHYPRE,ierr) >>>>>>>> >>>>>>>> call PCHYPRESetType(mg,'boomeramg',ierr) >>>>>>>> >>>>>>>> >>>>>>>> call PetscOptionsSetValue(PETSC_NULL_OBJECT, >>>>>>>> 'pc_hypre_boomeramg_nodal_coarsen','1',ierr) >>>>>>>> >>>>>>>> call PetscOptionsSetValue(PETSC_NULL_OBJECT, >>>>>>>> 
'pc_hypre_boomeramg_vec_interp_variant','1',ierr) >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> What are your thoughts ? >>>>>>>> >>>>>>>> Thanks, >>>>>>>> >>>>>>>> Manuel >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> On Fri, Jan 6, 2017 at 1:58 PM, Manuel Valera < >>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>> >>>>>>>>> Awesome, that did it, thanks once again. >>>>>>>>> >>>>>>>>> >>>>>>>>> On Fri, Jan 6, 2017 at 1:53 PM, Barry Smith >>>>>>>>> wrote: >>>>>>>>> >>>>>>>>>> >>>>>>>>>> Take the scatter out of the if () since everyone does it and >>>>>>>>>> get rid of the VecView(). >>>>>>>>>> >>>>>>>>>> Does this work? If not where is it hanging? >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> > On Jan 6, 2017, at 3:29 PM, Manuel Valera < >>>>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>>>> > >>>>>>>>>> > Thanks Dave, >>>>>>>>>> > >>>>>>>>>> > I think is interesting it never gave an error on this, after >>>>>>>>>> adding the vecassembly calls it still shows the same behavior, without >>>>>>>>>> complaining, i did: >>>>>>>>>> > >>>>>>>>>> > if(rankl==0)then >>>>>>>>>> > >>>>>>>>>> > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>>>>>>>>> > call VecAssemblyBegin(bp0,ierr) ; call >>>>>>>>>> VecAssemblyEnd(bp0,ierr); >>>>>>>>>> > CHKERRQ(ierr) >>>>>>>>>> > >>>>>>>>>> endif >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> > call VecScatterBegin(ctr,bp0,bp2,IN >>>>>>>>>> SERT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>>>> > call VecScatterEnd(ctr,bp0,bp2,INSE >>>>>>>>>> RT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>>>> > print*,"done! " >>>>>>>>>> > CHKERRQ(ierr) >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> > CHKERRQ(ierr) >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> > Thanks. 
>>>>>>>>>> > >>>>>>>>>> > On Fri, Jan 6, 2017 at 12:44 PM, Dave May < >>>>>>>>>> dave.mayhem23 at gmail.com> wrote: >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> > On 6 January 2017 at 20:24, Manuel Valera < >>>>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>>>> > Great help Barry, i totally had overlooked that option (it is >>>>>>>>>> explicit in the vecscatterbegin call help page but not in >>>>>>>>>> vecscattercreatetozero, as i read later) >>>>>>>>>> > >>>>>>>>>> > So i used that and it works partially, it scatters te values >>>>>>>>>> assigned in root but not the rest, if i call vecscatterbegin from outside >>>>>>>>>> root it hangs, the code currently look as this: >>>>>>>>>> > >>>>>>>>>> > call VecScatterCreateToZero(bp2,ctr,bp0,ierr); CHKERRQ(ierr) >>>>>>>>>> > >>>>>>>>>> > call PetscObjectSetName(bp0, 'bp0:',ierr) >>>>>>>>>> > >>>>>>>>>> > if(rankl==0)then >>>>>>>>>> > >>>>>>>>>> > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>>>>>>>>> > >>>>>>>>>> > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> > You need to call >>>>>>>>>> > >>>>>>>>>> > VecAssemblyBegin(bp0); >>>>>>>>>> > VecAssemblyEnd(bp0); >>>>>>>>>> > after your last call to VecSetValues() before you can do any >>>>>>>>>> operations with bp0. >>>>>>>>>> > >>>>>>>>>> > With your current code, the call to VecView should produce an >>>>>>>>>> error if you used the error checking macro CHKERRQ(ierr) (as should >>>>>>>>>> VecScatter{Begin,End} >>>>>>>>>> > >>>>>>>>>> > Thanks, >>>>>>>>>> > Dave >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> > call VecScatterBegin(ctr,bp0,bp2,IN >>>>>>>>>> SERT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>>>> > call VecScatterEnd(ctr,bp0,bp2,INSE >>>>>>>>>> RT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>>>> > print*,"done! " >>>>>>>>>> > CHKERRQ(ierr) >>>>>>>>>> > >>>>>>>>>> > endif >>>>>>>>>> > >>>>>>>>>> > ! call VecScatterBegin(ctr,bp0,bp2,IN >>>>>>>>>> SERT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>>>> > ! 
call VecScatterEnd(ctr,bp0,bp2,INSE >>>>>>>>>> RT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>>>> > >>>>>>>>>> > call VecView(bp2,PETSC_VIEWER_STDOUT_WORLD,ierr) >>>>>>>>>> > >>>>>>>>>> > call PetscBarrier(PETSC_NULL_OBJECT,ierr) >>>>>>>>>> > >>>>>>>>>> > call exit() >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> > And the output is: (with bp the right answer) >>>>>>>>>> > >>>>>>>>>> > Vec Object:bp: 2 MPI processes >>>>>>>>>> > type: mpi >>>>>>>>>> > Process [0] >>>>>>>>>> > 1. >>>>>>>>>> > 2. >>>>>>>>>> > Process [1] >>>>>>>>>> > 4. >>>>>>>>>> > 3. >>>>>>>>>> > Vec Object:bp2: 2 MPI processes (before scatter) >>>>>>>>>> > type: mpi >>>>>>>>>> > Process [0] >>>>>>>>>> > 0. >>>>>>>>>> > 0. >>>>>>>>>> > Process [1] >>>>>>>>>> > 0. >>>>>>>>>> > 0. >>>>>>>>>> > Vec Object:bp0: 1 MPI processes >>>>>>>>>> > type: seq >>>>>>>>>> > 1. >>>>>>>>>> > 2. >>>>>>>>>> > 4. >>>>>>>>>> > 3. >>>>>>>>>> > done! >>>>>>>>>> > Vec Object:bp2: 2 MPI processes (after scatter) >>>>>>>>>> > type: mpi >>>>>>>>>> > Process [0] >>>>>>>>>> > 1. >>>>>>>>>> > 2. >>>>>>>>>> > Process [1] >>>>>>>>>> > 0. >>>>>>>>>> > 0. >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> > Thanks inmensely for your help, >>>>>>>>>> > >>>>>>>>>> > Manuel >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> > On Thu, Jan 5, 2017 at 4:39 PM, Barry Smith >>>>>>>>>> wrote: >>>>>>>>>> > >>>>>>>>>> > > On Jan 5, 2017, at 6:21 PM, Manuel Valera < >>>>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>>>> > > >>>>>>>>>> > > Hello Devs is me again, >>>>>>>>>> > > >>>>>>>>>> > > I'm trying to distribute a vector to all called processes, >>>>>>>>>> the vector would be originally in root as a sequential vector and i would >>>>>>>>>> like to scatter it, what would the best call to do this ? 
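[Editorial sketch] Putting the pieces of this thread together, the root-to-all broadcast being asked about reuses the gather context from VecScatterCreateToZero() but runs it in SCATTER_REVERSE mode, with the two vector arguments in the same order as the forward scatter's creation. This is a hedged sketch, not the model's actual code; the names (bp0, bp2, ctr, nbdp, ind, Rhs) follow the snippets quoted in this thread:

```fortran
! bp2 is the distributed (MPI) vector; bp0 is the rank-0 sequential copy
! created by the scatter context. All ranks call the creation routine.
call VecScatterCreateToZero(bp2,ctr,bp0,ierr); CHKERRQ(ierr)

! VecSetValues() is not collective, so it may stay inside the rank guard.
if (rank == 0) then
   call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr); CHKERRQ(ierr)
endif

! Assembly must complete before bp0 is used in any operation.
call VecAssemblyBegin(bp0,ierr); CHKERRQ(ierr)
call VecAssemblyEnd(bp0,ierr); CHKERRQ(ierr)

! SCATTER_REVERSE sends bp0 (root copy) into bp2 (distributed vector).
! These calls are collective: every rank must reach them, outside any
! if(rank==0) guard, or the other ranks hang exactly as described above.
call VecScatterBegin(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr)
CHKERRQ(ierr)
call VecScatterEnd(ctr,bp0,bp2,INSERT_VALUES,SCATTER_REVERSE,ierr)
CHKERRQ(ierr)
```

Note that CHKERRQ(ierr) is only meaningful immediately after the call that set ierr; a CHKERRQ placed after an intervening print statement checks a stale value.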
>>>>>>>>>> > > >>>>>>>>>> > > I already know how to gather a distributed vector to root >>>>>>>>>> with VecScatterCreateToZero, this would be the inverse operation, >>>>>>>>>> > >>>>>>>>>> > Use the same VecScatter object but with SCATTER_REVERSE, not >>>>>>>>>> you need to reverse the two vector arguments as well. >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> > > i'm currently trying with VecScatterCreate() and as of now im >>>>>>>>>> doing the following: >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > if(rank==0)then >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > call VecCreate(PETSC_COMM_SELF,bp0,ierr); CHKERRQ(ierr) >>>>>>>>>> !if i use WORLD >>>>>>>>>> > > >>>>>>>>>> !freezes in SetSizes >>>>>>>>>> > > call VecSetSizes(bp0,PETSC_DECIDE,nbdp,ierr); >>>>>>>>>> CHKERRQ(ierr) >>>>>>>>>> > > call VecSetType(bp0,VECSEQ,ierr) >>>>>>>>>> > > call VecSetFromOptions(bp0,ierr); CHKERRQ(ierr) >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>>>>>>>>> > > >>>>>>>>>> > > !call VecSet(bp0,5.0D0,ierr); CHKERRQ(ierr) >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) >>>>>>>>>> > > >>>>>>>>>> > > call VecAssemblyBegin(bp0,ierr) ; call >>>>>>>>>> VecAssemblyEnd(bp0,ierr) !rhs >>>>>>>>>> > > >>>>>>>>>> > > do i=0,nbdp-1,1 >>>>>>>>>> > > ind(i+1) = i >>>>>>>>>> > > enddo >>>>>>>>>> > > >>>>>>>>>> > > call ISCreateGeneral(PETSC_COMM_SEL >>>>>>>>>> F,nbdp,ind,PETSC_COPY_VALUES,locis,ierr) >>>>>>>>>> > > >>>>>>>>>> > > !call VecScatterCreate(bp0,PETSC_NULL_OBJECT,bp2,is,ctr,ierr) >>>>>>>>>> !if i use SELF >>>>>>>>>> > > >>>>>>>>>> !freezes here. 
>>>>>>>>>> > > >>>>>>>>>> > > call VecScatterCreate(bp0,locis,bp2 >>>>>>>>>> ,PETSC_NULL_OBJECT,ctr,ierr) >>>>>>>>>> > > >>>>>>>>>> > > endif >>>>>>>>>> > > >>>>>>>>>> > > bp2 being the receptor MPI vector to scatter to >>>>>>>>>> > > >>>>>>>>>> > > But it freezes in VecScatterCreate when trying to use more >>>>>>>>>> than one processor, what would be a better approach ? >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > Thanks once again, >>>>>>>>>> > > >>>>>>>>>> > > Manuel >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > On Wed, Jan 4, 2017 at 3:30 PM, Manuel Valera < >>>>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>>>> > > Thanks i had no idea how to debug and read those logs, that >>>>>>>>>> solved this issue at least (i was sending a message from root to everyone >>>>>>>>>> else, but trying to catch from everyone else including root) >>>>>>>>>> > > >>>>>>>>>> > > Until next time, many thanks, >>>>>>>>>> > > >>>>>>>>>> > > Manuel >>>>>>>>>> > > >>>>>>>>>> > > On Wed, Jan 4, 2017 at 3:23 PM, Matthew Knepley < >>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>> > > On Wed, Jan 4, 2017 at 5:21 PM, Manuel Valera < >>>>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>>>> > > I did a PetscBarrier just before calling the vicariate >>>>>>>>>> routine and im pretty sure im calling it from every processor, code looks >>>>>>>>>> like this: >>>>>>>>>> > > >>>>>>>>>> > > From the gdb trace. >>>>>>>>>> > > >>>>>>>>>> > > Proc 0: Is in some MPI routine you call yourself, line 113 >>>>>>>>>> > > >>>>>>>>>> > > Proc 1: Is in VecCreate(), line 130 >>>>>>>>>> > > >>>>>>>>>> > > You need to fix your communication code. 
>>>>>>>>>> > > >>>>>>>>>> > > Matt >>>>>>>>>> > > >>>>>>>>>> > > call PetscBarrier(PETSC_NULL_OBJECT,ierr) >>>>>>>>>> > > >>>>>>>>>> > > print*,'entering POInit from',rank >>>>>>>>>> > > !call exit() >>>>>>>>>> > > >>>>>>>>>> > > call PetscObjsInit() >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > And output gives: >>>>>>>>>> > > >>>>>>>>>> > > entering POInit from 0 >>>>>>>>>> > > entering POInit from 1 >>>>>>>>>> > > entering POInit from 2 >>>>>>>>>> > > entering POInit from 3 >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > Still hangs in the same way, >>>>>>>>>> > > >>>>>>>>>> > > Thanks, >>>>>>>>>> > > >>>>>>>>>> > > Manuel >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > On Wed, Jan 4, 2017 at 2:55 PM, Manuel Valera < >>>>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>>>> > > Thanks for the answers ! >>>>>>>>>> > > >>>>>>>>>> > > heres the screenshot of what i got from bt in gdb (great hint >>>>>>>>>> in how to debug in petsc, didn't know that) >>>>>>>>>> > > >>>>>>>>>> > > I don't really know what to look at here, >>>>>>>>>> > > >>>>>>>>>> > > Thanks, >>>>>>>>>> > > >>>>>>>>>> > > Manuel >>>>>>>>>> > > >>>>>>>>>> > > On Wed, Jan 4, 2017 at 2:39 PM, Dave May < >>>>>>>>>> dave.mayhem23 at gmail.com> wrote: >>>>>>>>>> > > Are you certain ALL ranks in PETSC_COMM_WORLD call these >>>>>>>>>> function(s). These functions cannot be inside if statements like >>>>>>>>>> > > if (rank == 0){ >>>>>>>>>> > > VecCreateMPI(...) >>>>>>>>>> > > } >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > On Wed, 4 Jan 2017 at 23:34, Manuel Valera < >>>>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>>>> > > Thanks Dave for the quick answer, appreciate it, >>>>>>>>>> > > >>>>>>>>>> > > I just tried that and it didn't make a difference, any other >>>>>>>>>> suggestions ? 
>>>>>>>>>> > > >>>>>>>>>> > > Thanks, >>>>>>>>>> > > Manuel >>>>>>>>>> > > >>>>>>>>>> > > On Wed, Jan 4, 2017 at 2:29 PM, Dave May < >>>>>>>>>> dave.mayhem23 at gmail.com> wrote: >>>>>>>>>> > > You need to swap the order of your function calls. >>>>>>>>>> > > Call VecSetSizes() before VecSetType() >>>>>>>>>> > > >>>>>>>>>> > > Thanks, >>>>>>>>>> > > Dave >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > On Wed, 4 Jan 2017 at 23:21, Manuel Valera < >>>>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>>>> > > Hello all, happy new year, >>>>>>>>>> > > >>>>>>>>>> > > I'm working on parallelizing my code, it worked and provided >>>>>>>>>> some results when i just called more than one processor, but created >>>>>>>>>> artifacts because i didn't need one image of the whole program in each >>>>>>>>>> processor, conflicting with each other. >>>>>>>>>> > > >>>>>>>>>> > > Since the pressure solver is the main part i need in parallel >>>>>>>>>> im chosing mpi to run everything in root processor until its time to solve >>>>>>>>>> for pressure, at this point im trying to create a distributed vector using >>>>>>>>>> either >>>>>>>>>> > > >>>>>>>>>> > > call VecCreateMPI(PETSC_COMM_WORLD, >>>>>>>>>> PETSC_DECIDE,nbdp,xp,ierr) >>>>>>>>>> > > or >>>>>>>>>> > > call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) >>>>>>>>>> > > call VecSetType(xp,VECMPI,ierr) >>>>>>>>>> > > call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); >>>>>>>>>> CHKERRQ(ierr) >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > In both cases program hangs at this point, something it never >>>>>>>>>> happened on the naive way i described before. I've made sure the global >>>>>>>>>> size, nbdp, is the same in every processor. What can be wrong? >>>>>>>>>> > > >>>>>>>>>> > > Thanks for your kind help, >>>>>>>>>> > > >>>>>>>>>> > > Manuel. 
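[Editorial sketch] Dave's ordering advice above — sizes before type — gives the following creation sequence for the distributed vector (variable names follow the thread; a hedged sketch, not the model's actual code):

```fortran
! Every rank must execute this block with the same global size nbdp;
! VecCreate() and friends are collective on PETSC_COMM_WORLD.
call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr)
call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); CHKERRQ(ierr)
call VecSetType(xp,VECMPI,ierr); CHKERRQ(ierr)

! Equivalently, the convenience routine does all three steps at once:
! call VecCreateMPI(PETSC_COMM_WORLD,PETSC_DECIDE,nbdp,xp,ierr)
```

Either form hangs if any rank skips the call or passes a different nbdp, which is consistent with the symptom reported earlier in this thread.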
>>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > > -- >>>>>>>>>> > > What most experimenters take for granted before they begin >>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>> their experiments lead. >>>>>>>>>> > > -- Norbert Wiener >>>>>>>>>> > > >>>>>>>>>> > > >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> > >>>>>>>>>> >>>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>> >>>>>>> >>>>>>> -- >>>>>>> What most experimenters take for granted before they begin their >>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>> experiments lead. >>>>>>> -- Norbert Wiener >>>>>>> >>>>>> >>>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Sat Jan 7 18:34:52 2017 From: jed at jedbrown.org (Jed Brown) Date: Sat, 07 Jan 2017 17:34:52 -0700 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: References: Message-ID: <87lgum4cer.fsf@jedbrown.org> Manuel Valera writes: > I was able to find the bug, it was the outer loop bound in the same fashion > than before, my -log_view is this : [...] 
> ########################################################## > # # > # WARNING!!! # > # # > # This code was compiled with a debugging option, # > # To get timing results run ./configure # > # using --with-debugging=no, the performance will # > # be generally two or three times faster. # > # # > ########################################################## The above isn't a joke. > VecMDot 525 1.0 1.7089e+00 1.7 1.48e+09 1.0 0.0e+00 0.0e+00 > 1.0e+03 7 17 0 0 6 7 17 0 0 6 1735 > > VecMAXPY 1050 1.0 2.3646e+00 1.1 2.97e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 11 34 0 0 0 11 34 0 0 0 2508 You are spending about a third of the solve time doing vector work. What is your reason for using GCR? > KSPSolve 5 1.0 1.2218e+01 1.0 8.66e+09 1.0 1.1e+03 2.0e+04 > 1.9e+04 59100 99 43 99 59100 99 43 99 1418 > > PCSetUp 3 1.0 1.7993e+00 1.0 1.27e+07 1.0 0.0e+00 0.0e+00 > 1.0e+01 8 0 0 0 0 8 0 0 0 0 14 > > PCSetUpOnBlocks 5 1.0 1.9013e-01 1.7 1.27e+07 1.0 0.0e+00 0.0e+00 > 0.0e+00 1 0 0 0 0 1 0 0 0 0 134 > > PCApply 546 1.0 3.8320e+00 1.1 1.77e+09 1.0 0.0e+00 0.0e+00 > 1.0e+00 18 20 0 0 0 18 20 0 0 0 925 To make a big improvement, you'll need a better preconditioner. What kind of problem is this? -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 800 bytes Desc: not available URL: From knepley at gmail.com Sat Jan 7 18:36:55 2017 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 7 Jan 2017 18:36:55 -0600 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: References: Message-ID: On Sat, Jan 7, 2017 at 6:17 PM, Manuel Valera wrote: > I was able to find the bug, it was the outer loop bound in the same > fashion than before, my -log_view is this : > Good. We also need to see the log from 1 process. I note that you are using GCR/ILU. This solver is different on 1 and 2 processes, so you must check that it is not doing more iterates. 
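[Editorial sketch] The debug-build warning Jed quotes is the first thing to address before trusting any timings, and Matt's point about iteration counts is checked with monitor options. The configure flags below are the standard PETSc ones; optimization levels and the executable name are placeholders:

```shell
# Reconfigure PETSc without debugging before profiling; debug builds are
# typically 2-3x slower, as the -log_view banner warns.
./configure --with-debugging=0 COPTFLAGS="-O3" FOPTFLAGS="-O3"

# Then compare iteration counts and timings on 1 vs 2 processes, since
# bjacobi/ILU is a different preconditioner at each process count:
# mpiexec -n 1 ./model -ksp_monitor -ksp_converged_reason -log_view
# mpiexec -n 2 ./model -ksp_monitor -ksp_converged_reason -log_view
```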
Matt > ---------------------------------------------- PETSc Performance Summary: > ---------------------------------------------- > > > ./ucmsMR on a arch-linux2-c-debug named ocean with 2 processors, by valera > Sat Jan 7 16:11:51 2017 > > Using Petsc Release Version 3.7.4, unknown > > > Max Max/Min Avg Total > > Time (sec): 2.074e+01 1.00000 2.074e+01 > > Objects: 9.300e+01 1.00000 9.300e+01 > > Flops: 8.662e+09 1.00000 8.662e+09 1.732e+10 > > Flops/sec: 4.177e+08 1.00000 4.177e+08 8.354e+08 > > Memory: 1.027e+08 1.03217 2.021e+08 > > MPI Messages: 5.535e+02 1.00000 5.535e+02 1.107e+03 > > MPI Message Lengths: 2.533e+07 1.00000 4.576e+04 5.066e+07 > > MPI Reductions: 1.903e+04 1.00000 > > > Flop counting convention: 1 flop = 1 real number operation of type > (multiply/divide/add/subtract) > > e.g., VecAXPY() for real vectors of length N > --> 2N flops > > and VecAXPY() for complex vectors of length N > --> 8N flops > > > Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages > --- -- Message Lengths -- -- Reductions -- > > Avg %Total Avg %Total counts > %Total Avg %Total counts %Total > > 0: Main Stage: 2.0739e+01 100.0% 1.7325e+10 100.0% 1.107e+03 > 100.0% 4.576e+04 100.0% 1.903e+04 100.0% > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > See the 'Profiling' chapter of the users' manual for details on > interpreting output. > > Phase summary info: > > Count: number of times phase was executed > > Time and Flops: Max - maximum over all processors > > Ratio - ratio of maximum to minimum over all processors > > Mess: number of messages sent > > Avg. len: average message length (bytes) > > Reduct: number of global reductions > > Global: entire computation > > Stage: stages of a computation. Set stages with PetscLogStagePush() and > PetscLogStagePop(). 
> > %T - percent time in this phase %F - percent flops in this > phase > > %M - percent messages in this phase %L - percent message lengths > in this phase > > %R - percent reductions in this phase > > Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time > over all processors) > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > > ########################################################## > > # # > > # WARNING!!! # > > # # > > # This code was compiled with a debugging option, # > > # To get timing results run ./configure # > > # using --with-debugging=no, the performance will # > > # be generally two or three times faster. # > > # # > > ########################################################## > > > > Event Count Time (sec) Flops > --- Global --- --- Stage --- Total > > Max Ratio Max Ratio Max Ratio Mess Avg len > Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > --- Event Stage 0: Main Stage > > > VecDotNorm2 545 1.0 4.4925e-01 1.6 2.18e+08 1.0 0.0e+00 0.0e+00 > 1.1e+03 2 3 0 0 6 2 3 0 0 6 971 > > VecMDot 525 1.0 1.7089e+00 1.7 1.48e+09 1.0 0.0e+00 0.0e+00 > 1.0e+03 7 17 0 0 6 7 17 0 0 6 1735 > > VecNorm 420 1.0 7.6857e-02 1.0 8.40e+07 1.0 0.0e+00 0.0e+00 > 8.4e+02 0 1 0 0 4 0 1 0 0 4 2186 > > VecScale 1090 1.0 2.5113e-01 1.1 1.09e+08 1.0 0.0e+00 0.0e+00 > 0.0e+00 1 1 0 0 0 1 1 0 0 0 868 > > VecSet 555 1.0 7.3570e-02 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > VecAXPY 1090 1.0 2.7621e-01 1.1 2.18e+08 1.0 0.0e+00 0.0e+00 > 0.0e+00 1 3 0 0 0 1 3 0 0 0 1579 > > VecAYPX 5 1.0 3.6647e-03 2.1 5.00e+05 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 273 > > VecMAXPY 1050 1.0 2.3646e+00 1.1 2.97e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 11 34 0 0 0 11 34 0 0 0 2508 > > VecAssemblyBegin 12 1.7 2.4388e-03 2.2 0.00e+00 0.0 0.0e+00 
0.0e+00 > 2.8e+01 0 0 0 0 0 0 0 0 0 0 0 > > VecAssemblyEnd 12 1.7 1.0085e-04 2.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > VecScatterBegin 560 1.0 2.3770e+0071.3 0.00e+00 0.0 1.1e+03 2.7e+04 > 1.0e+01 6 0 99 59 0 6 0 99 59 0 0 > > VecScatterEnd 550 1.0 3.7769e-02 1.8 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatMult 550 1.0 3.7412e+00 1.1 1.80e+09 1.0 1.1e+03 2.0e+04 > 0.0e+00 17 21 99 43 0 17 21 99 43 0 962 > > MatSolve 545 1.0 3.6138e+00 1.1 1.77e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 17 20 0 0 0 17 20 0 0 0 980 > > MatLUFactorNum 1 1.0 1.2530e-01 1.5 1.27e+07 1.0 0.0e+00 0.0e+00 > 0.0e+00 1 0 0 0 0 1 0 0 0 0 203 > > MatILUFactorSym 1 1.0 2.0162e-02 1.8 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatConvert 1 1.0 3.3683e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatAssemblyBegin 1 1.0 9.5172e-02359.3 0.00e+00 0.0 0.0e+00 0.0e+00 > 4.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatAssemblyEnd 1 1.0 2.6907e-02 1.0 0.00e+00 0.0 4.0e+00 5.0e+03 > 2.3e+01 0 0 0 0 0 0 0 0 0 0 0 > > MatGetRowIJ 3 1.0 1.2398e-05 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatGetOrdering 1 1.0 4.4249e-02 2.5 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatLoad 1 1.0 3.8892e-01 1.0 0.00e+00 0.0 7.0e+00 3.0e+06 > 3.8e+01 2 0 1 41 0 2 0 1 41 0 0 > > KSPSetUp 2 1.0 2.2634e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 5.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > KSPSolve 5 1.0 1.2218e+01 1.0 8.66e+09 1.0 1.1e+03 2.0e+04 > 1.9e+04 59100 99 43 99 59100 99 43 99 1418 > > PCSetUp 3 1.0 1.7993e+00 1.0 1.27e+07 1.0 0.0e+00 0.0e+00 > 1.0e+01 8 0 0 0 0 8 0 0 0 0 14 > > PCSetUpOnBlocks 5 1.0 1.9013e-01 1.7 1.27e+07 1.0 0.0e+00 0.0e+00 > 0.0e+00 1 0 0 0 0 1 0 0 0 0 134 > > PCApply 546 1.0 3.8320e+00 1.1 1.77e+09 1.0 0.0e+00 0.0e+00 > 1.0e+00 18 20 0 0 0 18 20 0 0 0 925 > > ------------------------------------------------------------ > ------------------------------------------------------------ > > 
> Memory usage is given in bytes: > > > Object Type Creations Destructions Memory Descendants' Mem. > > Reports information only for process 0. > > > --- Event Stage 0: Main Stage > > > Vector 72 6 5609648 0. > > Vector Scatter 3 2 1312 0. > > Matrix 4 0 0 0. > > Viewer 2 0 0 0. > > Index Set 7 4 13104 0. > > Krylov Solver 2 0 0 0. > > Preconditioner 3 1 1384 0. > > ============================================================ > ============================================================ > > Average time to get PetscTime(): 7.15256e-08 > > Average time for MPI_Barrier(): 1.82629e-05 > > Average time for zero size MPI_Send(): 9.89437e-06 > > #PETSc Option Table entries: > > -log_view > > -matload_block_size 1 > > -pc_hypre_boomeramg_nodal_coarsen 1 > > -pc_hypre_boomeramg_vec_interp_variant 1 > > #End of PETSc Option Table entries > > Compiled without FORTRAN kernels > > Compiled with full precision matrices (default) > > sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 > sizeof(PetscScalar) 8 sizeof(PetscInt) 4 > > Configure options: --with-cc=gcc --with-cxx=g++ --with-fc=gfortran > --download-fblaslapack --download-mpich --download-hdf5 --download-netcdf > --download-hypre --download-metis --download-parmetis --download-trillinos > > ----------------------------------------- > > Libraries compiled on Fri Dec 9 12:45:19 2016 on ocean > > Machine characteristics: Linux-3.10.0-327.13.1.el7.x86_ > 64-x86_64-with-centos-7.2.1511-Core > > Using PETSc directory: /home/valera/petsc > > Using PETSc arch: arch-linux2-c-debug > > ----------------------------------------- > > > Using C compiler: /home/valera/petsc/arch-linux2-c-debug/bin/mpicc > -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas > -fvisibility=hidden -g3 ${COPTFLAGS} ${CFLAGS} > > Using Fortran compiler: /home/valera/petsc/arch-linux2-c-debug/bin/mpif90 > -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g ${FOPTFLAGS} > ${FFLAGS} > > ----------------------------------------- > > 
> Using include paths: -I/home/valera/petsc/arch-linux2-c-debug/include > -I/home/valera/petsc/include -I/home/valera/petsc/include > -I/home/valera/petsc/arch-linux2-c-debug/include > > ----------------------------------------- > > > Using C linker: /home/valera/petsc/arch-linux2-c-debug/bin/mpicc > > Using Fortran linker: /home/valera/petsc/arch-linux2-c-debug/bin/mpif90 > > Using libraries: -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib > -L/home/valera/petsc/arch-linux2-c-debug/lib -lpetsc > -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib > -L/home/valera/petsc/arch-linux2-c-debug/lib -lparmetis -lmetis -lHYPRE > -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.8.5 > -L/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -lmpicxx -lstdc++ -lflapack > -lfblas -lnetcdf -lhdf5hl_fortran -lhdf5_fortran -lhdf5_hl -lhdf5 -lpthread > -lm -lmpifort -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpicxx > -lstdc++ -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib > -L/home/valera/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.8.5 > -L/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -ldl > -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib -lmpi -lgcc_s -ldl > > ----------------------------------------- > > > WARNING! There are options you set that were not used! > > WARNING! could be spelling mistake, etc! > > Option left: name:-pc_hypre_boomeramg_nodal_coarsen value: 1 > > Option left: name:-pc_hypre_boomeramg_vec_interp_variant value: 1 > > [valera at ocean serGCCOM]$ > > > Any suggestions are very much appreciated, > > Thanks > > > > On Sat, Jan 7, 2017 at 3:39 PM, Matthew Knepley wrote: > >> On Sat, Jan 7, 2017 at 5:33 PM, Manuel Valera >> wrote: >> >>> Thanks Barry and Matt, >>> >>> I was able to detect a bug that i just solved, as suggested the loop >>> parameters weren't updated as it should, now it does and the program still >>> freezes but now in the beginning of the loop... ? 
>>> >> >> You have called a collective function from only one process. Stepping >> through on both processes in your run will find this easily. >> >> Thanks, >> >> Matt >> >> >>> Im attaching screen so you have an idea. Im thinking about it... >>> >>> Thanks >>> >>> On Sat, Jan 7, 2017 at 3:21 PM, Matthew Knepley >>> wrote: >>> >>>> On Sat, Jan 7, 2017 at 4:59 PM, Manuel Valera >>>> wrote: >>>> >>>>> I would have to think and code a MWE for this problem before sending >>>>> it since the model is much bigger than the petsc solver. Attached here is a >>>>> screenshot of the debugger as barry taught me, is that the stack trace you >>>>> need ? >>>>> >>>>> the ucmsMain.f90:522 that shows is the call (from all processes) to >>>>> the routine that updates the rhs vector (from root) and scatters it (from >>>>> all processes). >>>>> >>>> >>>> Yes, so one process is here and the other has moved on, so there is a >>>> mismatch in calls. >>>> >>>> You could do what Barry suggests, but I think it would be better to >>>> just step through your main routine once (its slow going), and >>>> see where the divergence happens. >>>> >>>> Matt >>>> >>>> >>>>> This routine is itself inside a double loop that occurs in all >>>>> processes but the only call from all processes to the solver is this one, >>>>> the rest of the loop which involves correcting for velocities, pressure and >>>>> temperature, all happens in root node. 
>>>>> >>>>> Sorry for the convoluted program design, this is the first beta >>>>> version of the model working on parallel and was the best i could come >>>>> with, i suppose it makes more sense in serial, >>>>> >>>>> Thanks >>>>> >>>>> On Sat, Jan 7, 2017 at 2:24 PM, Matthew Knepley >>>>> wrote: >>>>> >>>>>> On Sat, Jan 7, 2017 at 4:20 PM, Manuel Valera >>>>>> wrote: >>>>>> >>>>>>> Thank you Matthew, >>>>>>> >>>>>>> On Sat, Jan 7, 2017 at 1:49 PM, Matthew Knepley >>>>>>> wrote: >>>>>>> >>>>>>>> On Sat, Jan 7, 2017 at 3:32 PM, Manuel Valera < >>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>> >>>>>>>>> Hi Devs, hope you are having a great weekend, >>>>>>>>> >>>>>>>>> I could finally parallelize my linear solver and implement it into >>>>>>>>> the rest of the code in a way that only the linear system is solved in >>>>>>>>> parallel, great news for my team, but there is a catch and is that i don't >>>>>>>>> see any speedup in the linear system, i don't know if its the MPI in the >>>>>>>>> cluster we are using, but im not sure on how to debug it, >>>>>>>>> >>>>>>>> >>>>>>>> We need to see -log_view output for any performance question. >>>>>>>> >>>>>>>> >>>>>>>>> On the other hand and because of this issue i was trying to do >>>>>>>>> -log_summary or -log_view and i noticed the program in this context hangs >>>>>>>>> when is time of producing the log, if i debug this for 2 cores, process 0 >>>>>>>>> exits normally but process 1 hangs in the vectorscatterbegin() with >>>>>>>>> scatter_reverse way back in the code, >>>>>>>>> >>>>>>>> >>>>>>>> You are calling a collective routine from only 1 process. >>>>>>>> >>>>>>>> >>>>>>> Matt >>>>>>>> >>>>>>> >>>>>>> I am pretty confident this is not the case, >>>>>>> >>>>>> >>>>>> This is still the simplest explanation. Can you send the stack trace >>>>>> for the 2 process run? 
>>>>>> >>>>>> >>>>>>> the callings to vecscattercreatetozero and vecscatterbegin are made >>>>>>> in all processes, the program goes thru all of the iterations on the linear >>>>>>> solver, writes output correctly and even closes all the petsc objects >>>>>>> without complaining, the freeze occurs at the very end when the log is to >>>>>>> be produced. >>>>>>> >>>>>> >>>>>> If you can send us a code to run, we can likely find the error. >>>>>> >>>>>> Thanks, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> Thanks, >>>>>>> >>>>>>> Manuel >>>>>>> >>>>>>> >>>>>>> >>>>>>>> >>>>>>>> >>>>>>>>> and even after destroying all associated objects and calling >>>>>>>>> petscfinalize(), so im really clueless on why is this, as it only happens >>>>>>>>> for -log_* or -ksp_view options. >>>>>>>>> >>>>>>>>> my -ksp_view shows this: >>>>>>>>> >>>>>>>>> KSP Object: 2 MPI processes >>>>>>>>> >>>>>>>>> type: gcr >>>>>>>>> >>>>>>>>> GCR: restart = 30 >>>>>>>>> >>>>>>>>> GCR: restarts performed = 20 >>>>>>>>> >>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>> >>>>>>>>> tolerances: relative=1e-14, absolute=1e-50, divergence=10000. >>>>>>>>> >>>>>>>>> right preconditioning >>>>>>>>> >>>>>>>>> using UNPRECONDITIONED norm type for convergence test >>>>>>>>> >>>>>>>>> PC Object: 2 MPI processes >>>>>>>>> >>>>>>>>> type: bjacobi >>>>>>>>> >>>>>>>>> block Jacobi: number of blocks = 2 >>>>>>>>> >>>>>>>>> Local solve is same for all blocks, in the following KSP and >>>>>>>>> PC objects: >>>>>>>>> >>>>>>>>> KSP Object: (sub_) 1 MPI processes >>>>>>>>> >>>>>>>>> type: preonly >>>>>>>>> >>>>>>>>> maximum iterations=10000, initial guess is zero >>>>>>>>> >>>>>>>>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
>>>>>>>>> >>>>>>>>> left preconditioning >>>>>>>>> >>>>>>>>> using NONE norm type for convergence test >>>>>>>>> >>>>>>>>> PC Object: (sub_) 1 MPI processes >>>>>>>>> >>>>>>>>> type: ilu >>>>>>>>> >>>>>>>>> ILU: out-of-place factorization >>>>>>>>> >>>>>>>>> 0 levels of fill >>>>>>>>> >>>>>>>>> tolerance for zero pivot 2.22045e-14 >>>>>>>>> >>>>>>>>> matrix ordering: natural >>>>>>>>> >>>>>>>>> factor fill ratio given 1., needed 1. >>>>>>>>> >>>>>>>>> Factored matrix follows: >>>>>>>>> >>>>>>>>> Mat Object: 1 MPI processes >>>>>>>>> >>>>>>>>> type: seqaij >>>>>>>>> >>>>>>>>> rows=100000, cols=100000 >>>>>>>>> >>>>>>>>> package used to perform factorization: petsc >>>>>>>>> >>>>>>>>> total: nonzeros=1675180, allocated nonzeros=1675180 >>>>>>>>> >>>>>>>>> total number of mallocs used during MatSetValues calls >>>>>>>>> =0 >>>>>>>>> >>>>>>>>> not using I-node routines >>>>>>>>> >>>>>>>>> linear system matrix = precond matrix: >>>>>>>>> >>>>>>>>> Mat Object: 1 MPI processes >>>>>>>>> >>>>>>>>> type: seqaij >>>>>>>>> >>>>>>>>> rows=100000, cols=100000 >>>>>>>>> >>>>>>>>> total: nonzeros=1675180, allocated nonzeros=1675180 >>>>>>>>> >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> >>>>>>>>> not using I-node routines >>>>>>>>> >>>>>>>>> linear system matrix = precond matrix: >>>>>>>>> >>>>>>>>> Mat Object: 2 MPI processes >>>>>>>>> >>>>>>>>> type: mpiaij >>>>>>>>> >>>>>>>>> rows=200000, cols=200000 >>>>>>>>> >>>>>>>>> total: nonzeros=3373340, allocated nonzeros=3373340 >>>>>>>>> >>>>>>>>> total number of mallocs used during MatSetValues calls =0 >>>>>>>>> >>>>>>>>> not using I-node (on process 0) routines >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> And i configured my PC object as: >>>>>>>>> >>>>>>>>> >>>>>>>>> call PCSetType(mg,PCHYPRE,ierr) >>>>>>>>> >>>>>>>>> call PCHYPRESetType(mg,'boomeramg',ierr) >>>>>>>>> >>>>>>>>> >>>>>>>>> call PetscOptionsSetValue(PETSC_NULL_OBJECT, >>>>>>>>> 'pc_hypre_boomeramg_nodal_coarsen','1',ierr) >>>>>>>>> 
>>>>>>>>> call PetscOptionsSetValue(PETSC_NULL_OBJECT, >>>>>>>>> 'pc_hypre_boomeramg_vec_interp_variant','1',ierr) >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> What are your thoughts ? >>>>>>>>> >>>>>>>>> Thanks, >>>>>>>>> >>>>>>>>> Manuel >>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> On Fri, Jan 6, 2017 at 1:58 PM, Manuel Valera < >>>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>>> >>>>>>>>>> Awesome, that did it, thanks once again. >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> On Fri, Jan 6, 2017 at 1:53 PM, Barry Smith >>>>>>>>>> wrote: >>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> Take the scatter out of the if () since everyone does it and >>>>>>>>>>> get rid of the VecView(). >>>>>>>>>>> >>>>>>>>>>> Does this work? If not where is it hanging? >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> > On Jan 6, 2017, at 3:29 PM, Manuel Valera < >>>>>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>>>>> > >>>>>>>>>>> > Thanks Dave, >>>>>>>>>>> > >>>>>>>>>>> > I think is interesting it never gave an error on this, after >>>>>>>>>>> adding the vecassembly calls it still shows the same behavior, without >>>>>>>>>>> complaining, i did: >>>>>>>>>>> > >>>>>>>>>>> > if(rankl==0)then >>>>>>>>>>> > >>>>>>>>>>> > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>>>>>>>>>> > call VecAssemblyBegin(bp0,ierr) ; call >>>>>>>>>>> VecAssemblyEnd(bp0,ierr); >>>>>>>>>>> > CHKERRQ(ierr) >>>>>>>>>>> > >>>>>>>>>>> endif >>>>>>>>>>> > >>>>>>>>>>> > >>>>>>>>>>> > call VecScatterBegin(ctr,bp0,bp2,IN >>>>>>>>>>> SERT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>>>>> > call VecScatterEnd(ctr,bp0,bp2,INSE >>>>>>>>>>> RT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>>>>> > print*,"done! " >>>>>>>>>>> > CHKERRQ(ierr) >>>>>>>>>>> > >>>>>>>>>>> > >>>>>>>>>>> > CHKERRQ(ierr) >>>>>>>>>>> > >>>>>>>>>>> > >>>>>>>>>>> > Thanks. 
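For reference, the pattern that Barry's ("take the scatter out of the if") and Dave's (assemble after VecSetValues) fixes converge on can be sketched as follows, using the thread's names bp0, bp2, ctr, rankl, nbdp. This is an untested sketch of the idea, not the author's exact final code:

```fortran
call VecScatterCreateToZero(bp2, ctr, bp0, ierr); CHKERRQ(ierr)

if (rankl == 0) then
   ! only rank 0 holds the full-length sequential vector bp0
   call VecSetValues(bp0, nbdp, ind, Rhs, INSERT_VALUES, ierr); CHKERRQ(ierr)
   call VecAssemblyBegin(bp0, ierr); call VecAssemblyEnd(bp0, ierr)
   CHKERRQ(ierr)
endif

! the scatter is collective: every rank must call it, outside the rank guard;
! SCATTER_REVERSE distributes from the rank-0 vector bp0 to the MPI vector bp2
call VecScatterBegin(ctr, bp0, bp2, INSERT_VALUES, SCATTER_REVERSE, ierr)
call VecScatterEnd(ctr, bp0, bp2, INSERT_VALUES, SCATTER_REVERSE, ierr)
CHKERRQ(ierr)
```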
>>>>>>>>>>> > >>>>>>>>>>> > On Fri, Jan 6, 2017 at 12:44 PM, Dave May < >>>>>>>>>>> dave.mayhem23 at gmail.com> wrote: >>>>>>>>>>> > >>>>>>>>>>> > >>>>>>>>>>> > On 6 January 2017 at 20:24, Manuel Valera < >>>>>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>>>>> > Great help Barry, i totally had overlooked that option (it is >>>>>>>>>>> explicit in the vecscatterbegin call help page but not in >>>>>>>>>>> vecscattercreatetozero, as i read later) >>>>>>>>>>> > >>>>>>>>>>> > So i used that and it works partially, it scatters te values >>>>>>>>>>> assigned in root but not the rest, if i call vecscatterbegin from outside >>>>>>>>>>> root it hangs, the code currently look as this: >>>>>>>>>>> > >>>>>>>>>>> > call VecScatterCreateToZero(bp2,ctr,bp0,ierr); CHKERRQ(ierr) >>>>>>>>>>> > >>>>>>>>>>> > call PetscObjectSetName(bp0, 'bp0:',ierr) >>>>>>>>>>> > >>>>>>>>>>> > if(rankl==0)then >>>>>>>>>>> > >>>>>>>>>>> > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>>>>>>>>>> > >>>>>>>>>>> > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) >>>>>>>>>>> > >>>>>>>>>>> > >>>>>>>>>>> > You need to call >>>>>>>>>>> > >>>>>>>>>>> > VecAssemblyBegin(bp0); >>>>>>>>>>> > VecAssemblyEnd(bp0); >>>>>>>>>>> > after your last call to VecSetValues() before you can do any >>>>>>>>>>> operations with bp0. >>>>>>>>>>> > >>>>>>>>>>> > With your current code, the call to VecView should produce an >>>>>>>>>>> error if you used the error checking macro CHKERRQ(ierr) (as should >>>>>>>>>>> VecScatter{Begin,End} >>>>>>>>>>> > >>>>>>>>>>> > Thanks, >>>>>>>>>>> > Dave >>>>>>>>>>> > >>>>>>>>>>> > >>>>>>>>>>> > call VecScatterBegin(ctr,bp0,bp2,IN >>>>>>>>>>> SERT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>>>>> > call VecScatterEnd(ctr,bp0,bp2,INSE >>>>>>>>>>> RT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>>>>> > print*,"done! " >>>>>>>>>>> > CHKERRQ(ierr) >>>>>>>>>>> > >>>>>>>>>>> > endif >>>>>>>>>>> > >>>>>>>>>>> > ! call VecScatterBegin(ctr,bp0,bp2,IN >>>>>>>>>>> SERT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>>>>> > ! 
call VecScatterEnd(ctr,bp0,bp2,INSE >>>>>>>>>>> RT_VALUES,SCATTER_REVERSE,ierr) >>>>>>>>>>> > >>>>>>>>>>> > call VecView(bp2,PETSC_VIEWER_STDOUT_WORLD,ierr) >>>>>>>>>>> > >>>>>>>>>>> > call PetscBarrier(PETSC_NULL_OBJECT,ierr) >>>>>>>>>>> > >>>>>>>>>>> > call exit() >>>>>>>>>>> > >>>>>>>>>>> > >>>>>>>>>>> > >>>>>>>>>>> > And the output is: (with bp the right answer) >>>>>>>>>>> > >>>>>>>>>>> > Vec Object:bp: 2 MPI processes >>>>>>>>>>> > type: mpi >>>>>>>>>>> > Process [0] >>>>>>>>>>> > 1. >>>>>>>>>>> > 2. >>>>>>>>>>> > Process [1] >>>>>>>>>>> > 4. >>>>>>>>>>> > 3. >>>>>>>>>>> > Vec Object:bp2: 2 MPI processes (before scatter) >>>>>>>>>>> > type: mpi >>>>>>>>>>> > Process [0] >>>>>>>>>>> > 0. >>>>>>>>>>> > 0. >>>>>>>>>>> > Process [1] >>>>>>>>>>> > 0. >>>>>>>>>>> > 0. >>>>>>>>>>> > Vec Object:bp0: 1 MPI processes >>>>>>>>>>> > type: seq >>>>>>>>>>> > 1. >>>>>>>>>>> > 2. >>>>>>>>>>> > 4. >>>>>>>>>>> > 3. >>>>>>>>>>> > done! >>>>>>>>>>> > Vec Object:bp2: 2 MPI processes (after scatter) >>>>>>>>>>> > type: mpi >>>>>>>>>>> > Process [0] >>>>>>>>>>> > 1. >>>>>>>>>>> > 2. >>>>>>>>>>> > Process [1] >>>>>>>>>>> > 0. >>>>>>>>>>> > 0. >>>>>>>>>>> > >>>>>>>>>>> > >>>>>>>>>>> > >>>>>>>>>>> > >>>>>>>>>>> > Thanks inmensely for your help, >>>>>>>>>>> > >>>>>>>>>>> > Manuel >>>>>>>>>>> > >>>>>>>>>>> > >>>>>>>>>>> > On Thu, Jan 5, 2017 at 4:39 PM, Barry Smith < >>>>>>>>>>> bsmith at mcs.anl.gov> wrote: >>>>>>>>>>> > >>>>>>>>>>> > > On Jan 5, 2017, at 6:21 PM, Manuel Valera < >>>>>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>>>>> > > >>>>>>>>>>> > > Hello Devs is me again, >>>>>>>>>>> > > >>>>>>>>>>> > > I'm trying to distribute a vector to all called processes, >>>>>>>>>>> the vector would be originally in root as a sequential vector and i would >>>>>>>>>>> like to scatter it, what would the best call to do this ? 
>>>>>>>>>>> > > >>>>>>>>>>> > > I already know how to gather a distributed vector to root >>>>>>>>>>> with VecScatterCreateToZero, this would be the inverse operation, >>>>>>>>>>> > >>>>>>>>>>> > Use the same VecScatter object but with SCATTER_REVERSE, >>>>>>>>>>> note you need to reverse the two vector arguments as well. >>>>>>>>>>> > >>>>>>>>>>> > >>>>>>>>>>> > > i'm currently trying with VecScatterCreate() and as of now >>>>>>>>>>> im doing the following: >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > if(rank==0)then >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > call VecCreate(PETSC_COMM_SELF,bp0,ierr); >>>>>>>>>>> CHKERRQ(ierr) !if i use WORLD >>>>>>>>>>> > > >>>>>>>>>>> !freezes in SetSizes >>>>>>>>>>> > > call VecSetSizes(bp0,PETSC_DECIDE,nbdp,ierr); >>>>>>>>>>> CHKERRQ(ierr) >>>>>>>>>>> > > call VecSetType(bp0,VECSEQ,ierr) >>>>>>>>>>> > > call VecSetFromOptions(bp0,ierr); CHKERRQ(ierr) >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > call VecSetValues(bp0,nbdp,ind,Rhs,INSERT_VALUES,ierr) >>>>>>>>>>> > > >>>>>>>>>>> > > !call VecSet(bp0,5.0D0,ierr); CHKERRQ(ierr) >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > call VecView(bp0,PETSC_VIEWER_STDOUT_WORLD,ierr) >>>>>>>>>>> > > >>>>>>>>>>> > > call VecAssemblyBegin(bp0,ierr) ; call >>>>>>>>>>> VecAssemblyEnd(bp0,ierr) !rhs >>>>>>>>>>> > > >>>>>>>>>>> > > do i=0,nbdp-1,1 >>>>>>>>>>> > > ind(i+1) = i >>>>>>>>>>> > > enddo >>>>>>>>>>> > > >>>>>>>>>>> > > call ISCreateGeneral(PETSC_COMM_SEL >>>>>>>>>>> F,nbdp,ind,PETSC_COPY_VALUES,locis,ierr) >>>>>>>>>>> > > >>>>>>>>>>> > > !call VecScatterCreate(bp0,PETSC_NULL_OBJECT,bp2,is,ctr,ierr) >>>>>>>>>>> !if i use SELF >>>>>>>>>>> > > >>>>>>>>>>> !freezes here. 
>>>>>>>>>>> > > >>>>>>>>>>> > > call VecScatterCreate(bp0,locis,bp2 >>>>>>>>>>> ,PETSC_NULL_OBJECT,ctr,ierr) >>>>>>>>>>> > > >>>>>>>>>>> > > endif >>>>>>>>>>> > > >>>>>>>>>>> > > bp2 being the receptor MPI vector to scatter to >>>>>>>>>>> > > >>>>>>>>>>> > > But it freezes in VecScatterCreate when trying to use more >>>>>>>>>>> than one processor, what would be a better approach ? >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > Thanks once again, >>>>>>>>>>> > > >>>>>>>>>>> > > Manuel >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > On Wed, Jan 4, 2017 at 3:30 PM, Manuel Valera < >>>>>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>>>>> > > Thanks i had no idea how to debug and read those logs, that >>>>>>>>>>> solved this issue at least (i was sending a message from root to everyone >>>>>>>>>>> else, but trying to catch from everyone else including root) >>>>>>>>>>> > > >>>>>>>>>>> > > Until next time, many thanks, >>>>>>>>>>> > > >>>>>>>>>>> > > Manuel >>>>>>>>>>> > > >>>>>>>>>>> > > On Wed, Jan 4, 2017 at 3:23 PM, Matthew Knepley < >>>>>>>>>>> knepley at gmail.com> wrote: >>>>>>>>>>> > > On Wed, Jan 4, 2017 at 5:21 PM, Manuel Valera < >>>>>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>>>>> > > I did a PetscBarrier just before calling the vicariate >>>>>>>>>>> routine and im pretty sure im calling it from every processor, code looks >>>>>>>>>>> like this: >>>>>>>>>>> > > >>>>>>>>>>> > > From the gdb trace. >>>>>>>>>>> > > >>>>>>>>>>> > > Proc 0: Is in some MPI routine you call yourself, line 113 >>>>>>>>>>> > > >>>>>>>>>>> > > Proc 1: Is in VecCreate(), line 130 >>>>>>>>>>> > > >>>>>>>>>>> > > You need to fix your communication code. 
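The rule behind Matt's gdb diagnosis above: routines that are collective on a communicator (VecCreate, VecScatterBegin/End, PetscFinalize, producing the -log_view output) must be reached by every rank of that communicator, in the same order. A minimal sketch of the failure mode and its fix, with hypothetical names for illustration only:

```fortran
! WRONG: VecCreate is collective on PETSC_COMM_WORLD, so rank 1 blocks
! inside it while rank 0 never enters -- the two mismatched stacks gdb showed
if (rank == 0) then
   call VecCreate(PETSC_COMM_WORLD, x, ierr); CHKERRQ(ierr)
endif

! RIGHT: every rank in the communicator makes the collective call
call VecCreate(PETSC_COMM_WORLD, x, ierr); CHKERRQ(ierr)
```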
>>>>>>>>>>> > > >>>>>>>>>>> > > Matt >>>>>>>>>>> > > >>>>>>>>>>> > > call PetscBarrier(PETSC_NULL_OBJECT,ierr) >>>>>>>>>>> > > >>>>>>>>>>> > > print*,'entering POInit from',rank >>>>>>>>>>> > > !call exit() >>>>>>>>>>> > > >>>>>>>>>>> > > call PetscObjsInit() >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > And output gives: >>>>>>>>>>> > > >>>>>>>>>>> > > entering POInit from 0 >>>>>>>>>>> > > entering POInit from 1 >>>>>>>>>>> > > entering POInit from 2 >>>>>>>>>>> > > entering POInit from 3 >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > Still hangs in the same way, >>>>>>>>>>> > > >>>>>>>>>>> > > Thanks, >>>>>>>>>>> > > >>>>>>>>>>> > > Manuel >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > On Wed, Jan 4, 2017 at 2:55 PM, Manuel Valera < >>>>>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>>>>> > > Thanks for the answers ! >>>>>>>>>>> > > >>>>>>>>>>> > > heres the screenshot of what i got from bt in gdb (great >>>>>>>>>>> hint in how to debug in petsc, didn't know that) >>>>>>>>>>> > > >>>>>>>>>>> > > I don't really know what to look at here, >>>>>>>>>>> > > >>>>>>>>>>> > > Thanks, >>>>>>>>>>> > > >>>>>>>>>>> > > Manuel >>>>>>>>>>> > > >>>>>>>>>>> > > On Wed, Jan 4, 2017 at 2:39 PM, Dave May < >>>>>>>>>>> dave.mayhem23 at gmail.com> wrote: >>>>>>>>>>> > > Are you certain ALL ranks in PETSC_COMM_WORLD call these >>>>>>>>>>> function(s). These functions cannot be inside if statements like >>>>>>>>>>> > > if (rank == 0){ >>>>>>>>>>> > > VecCreateMPI(...) >>>>>>>>>>> > > } >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > On Wed, 4 Jan 2017 at 23:34, Manuel Valera < >>>>>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>>>>> > > Thanks Dave for the quick answer, appreciate it, >>>>>>>>>>> > > >>>>>>>>>>> > > I just tried that and it didn't make a difference, any other >>>>>>>>>>> suggestions ? 
>>>>>>>>>>> > > >>>>>>>>>>> > > Thanks, >>>>>>>>>>> > > Manuel >>>>>>>>>>> > > >>>>>>>>>>> > > On Wed, Jan 4, 2017 at 2:29 PM, Dave May < >>>>>>>>>>> dave.mayhem23 at gmail.com> wrote: >>>>>>>>>>> > > You need to swap the order of your function calls. >>>>>>>>>>> > > Call VecSetSizes() before VecSetType() >>>>>>>>>>> > > >>>>>>>>>>> > > Thanks, >>>>>>>>>>> > > Dave >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > On Wed, 4 Jan 2017 at 23:21, Manuel Valera < >>>>>>>>>>> mvalera at mail.sdsu.edu> wrote: >>>>>>>>>>> > > Hello all, happy new year, >>>>>>>>>>> > > >>>>>>>>>>> > > I'm working on parallelizing my code, it worked and provided >>>>>>>>>>> some results when i just called more than one processor, but created >>>>>>>>>>> artifacts because i didn't need one image of the whole program in each >>>>>>>>>>> processor, conflicting with each other. >>>>>>>>>>> > > >>>>>>>>>>> > > Since the pressure solver is the main part i need in >>>>>>>>>>> parallel im chosing mpi to run everything in root processor until its time >>>>>>>>>>> to solve for pressure, at this point im trying to create a distributed >>>>>>>>>>> vector using either >>>>>>>>>>> > > >>>>>>>>>>> > > call VecCreateMPI(PETSC_COMM_WORLD, >>>>>>>>>>> PETSC_DECIDE,nbdp,xp,ierr) >>>>>>>>>>> > > or >>>>>>>>>>> > > call VecCreate(PETSC_COMM_WORLD,xp,ierr); CHKERRQ(ierr) >>>>>>>>>>> > > call VecSetType(xp,VECMPI,ierr) >>>>>>>>>>> > > call VecSetSizes(xp,PETSC_DECIDE,nbdp,ierr); >>>>>>>>>>> CHKERRQ(ierr) >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > In both cases program hangs at this point, something it >>>>>>>>>>> never happened on the naive way i described before. I've made sure the >>>>>>>>>>> global size, nbdp, is the same in every processor. What can be wrong? >>>>>>>>>>> > > >>>>>>>>>>> > > Thanks for your kind help, >>>>>>>>>>> > > >>>>>>>>>>> > > Manuel. 
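Dave's ordering fix, written out as a sketch with the thread's names (nbdp is the global size; untested here):

```fortran
call VecCreate(PETSC_COMM_WORLD, xp, ierr); CHKERRQ(ierr)
call VecSetSizes(xp, PETSC_DECIDE, nbdp, ierr); CHKERRQ(ierr)  ! sizes first...
call VecSetType(xp, VECMPI, ierr); CHKERRQ(ierr)               ! ...then the type
call VecSetFromOptions(xp, ierr); CHKERRQ(ierr)
```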
>>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > > -- >>>>>>>>>>> > > What most experimenters take for granted before they begin >>>>>>>>>>> their experiments is infinitely more interesting than any results to which >>>>>>>>>>> their experiments lead. >>>>>>>>>>> > > -- Norbert Wiener >>>>>>>>>>> > > >>>>>>>>>>> > > >>>>>>>>>>> > >>>>>>>>>>> > >>>>>>>>>>> > >>>>>>>>>>> > >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> -- >>>>>>>> What most experimenters take for granted before they begin their >>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>> experiments lead. >>>>>>>> -- Norbert Wiener >>>>>>>> >>>>>>> >>>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> -- Norbert Wiener >>>>>> >>>>> >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From mvalera at mail.sdsu.edu Sat Jan 7 18:47:48 2017 From: mvalera at mail.sdsu.edu (Manuel Valera) Date: Sat, 7 Jan 2017 16:47:48 -0800 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: <87lgum4cer.fsf@jedbrown.org> References: <87lgum4cer.fsf@jedbrown.org> Message-ID: Awesome Matt and Jed,

The GCR is used because the matrix is not invertible and because it was the algorithm the previous library used.

The preconditioner I'm aiming to use is multigrid. I thought I had configured the hypre BoomerAMG solver for this, but I agree that it doesn't show up in the log anywhere; how can I be sure it is being used? I sent the -ksp_view log earlier in this thread.

I had a problem with the matrix block sizes, so I couldn't get the PETSc native multigrid solver to work.

This is a nonhydrostatic pressure solver; it is an elliptic problem, so multigrid is a must.

Regards,

Manuel

On Sat, Jan 7, 2017 at 4:34 PM, Jed Brown wrote: > Manuel Valera writes: > > > I was able to find the bug, it was the outer loop bound in the same > fashion > > than before, my -log_view is this : > [...] > > ########################################################## > > # # > > # WARNING!!! # > > # # > > # This code was compiled with a debugging option, # > > # To get timing results run ./configure # > > # using --with-debugging=no, the performance will # > > # be generally two or three times faster. # > > # # > > ########################################################## > > The above isn't a joke. > > > VecMDot 525 1.0 1.7089e+00 1.7 1.48e+09 1.0 0.0e+00 0.0e+00 > > 1.0e+03 7 17 0 0 6 7 17 0 0 6 1735 > > > > VecMAXPY 1050 1.0 2.3646e+00 1.1 2.97e+09 1.0 0.0e+00 0.0e+00 > > 0.0e+00 11 34 0 0 0 11 34 0 0 0 2508 > > You are spending about a third of the solve time doing vector work. > What is your reason for using GCR? 
> > > KSPSolve 5 1.0 1.2218e+01 1.0 8.66e+09 1.0 1.1e+03 2.0e+04 > > 1.9e+04 59100 99 43 99 59100 99 43 99 1418 > > > > PCSetUp 3 1.0 1.7993e+00 1.0 1.27e+07 1.0 0.0e+00 0.0e+00 > > 1.0e+01 8 0 0 0 0 8 0 0 0 0 14 > > > > PCSetUpOnBlocks 5 1.0 1.9013e-01 1.7 1.27e+07 1.0 0.0e+00 0.0e+00 > > 0.0e+00 1 0 0 0 0 1 0 0 0 0 134 > > > > PCApply 546 1.0 3.8320e+00 1.1 1.77e+09 1.0 0.0e+00 0.0e+00 > > 1.0e+00 18 20 0 0 0 18 20 0 0 0 925 > > To make a big improvement, you'll need a better preconditioner. What > kind of problem is this? > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sat Jan 7 19:27:35 2017 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 7 Jan 2017 19:27:35 -0600 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: References: <87lgum4cer.fsf@jedbrown.org> Message-ID: On Sat, Jan 7, 2017 at 6:47 PM, Manuel Valera wrote: > Awesome Matt and Jed, > > The GCR is used because the matrix is not invertible and because this was > the algorithm that the previous library used, > > The Preconditioned im aiming to use is multigrid, i thought i configured > the hypre-boomerAmg solver for this, but i agree in that it doesn't show in > the log anywhere, how can i be sure is being used ? i sent -ksp_view log > before in this thread > Yes, that says you are using Block Jacobi/ILU. Try running with -pc_type gamg which is algebraic multigrid. That will give good iteration counts, but not be as efficient as geometric MG. Once that works, you can switch over to geometric. > I had a problem with the matrix block sizes so i couldn't make the petsc > native multigrid solver to work, > Please send the error. > This is a nonhidrostatic pressure solver, it is an elliptic problem so > multigrid is a must, > Yes, MG is the way to go. 
Matt > Regards, > > Manuel > > On Sat, Jan 7, 2017 at 4:34 PM, Jed Brown wrote: > >> Manuel Valera writes: >> >> > I was able to find the bug, it was the outer loop bound in the same >> fashion >> > than before, my -log_view is this : >> [...] >> > ########################################################## >> > # # >> > # WARNING!!! # >> > # # >> > # This code was compiled with a debugging option, # >> > # To get timing results run ./configure # >> > # using --with-debugging=no, the performance will # >> > # be generally two or three times faster. # >> > # # >> > ########################################################## >> >> The above isn't a joke. >> >> > VecMDot 525 1.0 1.7089e+00 1.7 1.48e+09 1.0 0.0e+00 0.0e+00 >> > 1.0e+03 7 17 0 0 6 7 17 0 0 6 1735 >> > >> > VecMAXPY 1050 1.0 2.3646e+00 1.1 2.97e+09 1.0 0.0e+00 0.0e+00 >> > 0.0e+00 11 34 0 0 0 11 34 0 0 0 2508 >> >> You are spending about a third of the solve time doing vector work. >> What is your reason for using GCR? >> >> > KSPSolve 5 1.0 1.2218e+01 1.0 8.66e+09 1.0 1.1e+03 2.0e+04 >> > 1.9e+04 59100 99 43 99 59100 99 43 99 1418 >> > >> > PCSetUp 3 1.0 1.7993e+00 1.0 1.27e+07 1.0 0.0e+00 0.0e+00 >> > 1.0e+01 8 0 0 0 0 8 0 0 0 0 14 >> > >> > PCSetUpOnBlocks 5 1.0 1.9013e-01 1.7 1.27e+07 1.0 0.0e+00 0.0e+00 >> > 0.0e+00 1 0 0 0 0 1 0 0 0 0 134 >> > >> > PCApply 546 1.0 3.8320e+00 1.1 1.77e+09 1.0 0.0e+00 0.0e+00 >> > 1.0e+00 18 20 0 0 0 18 20 0 0 0 925 >> >> To make a big improvement, you'll need a better preconditioner. What >> kind of problem is this? >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jed at jedbrown.org Sat Jan 7 19:28:27 2017 From: jed at jedbrown.org (Jed Brown) Date: Sat, 07 Jan 2017 18:28:27 -0700 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: References: <87lgum4cer.fsf@jedbrown.org> Message-ID: <87h95a49xg.fsf@jedbrown.org> Manuel Valera writes: > Awesome Matt and Jed, > > The GCR is used because the matrix is not invertible and because this was > the algorithm that the previous library used, > > The Preconditioned im aiming to use is multigrid, i thought i configured > the hypre-boomerAmg solver for this, but i agree in that it doesn't show in > the log anywhere, how can i be sure is being used ? i sent -ksp_view log > before in this thread Did you run with -pc_type hypre? > I had a problem with the matrix block sizes so i couldn't make the petsc > native multigrid solver to work, What block sizes? If the only variable is pressure, the block size would be 1 (default). > This is a nonhidrostatic pressure solver, it is an elliptic problem so > multigrid is a must, Yes, multigrid should work well. -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 800 bytes Desc: not available URL: From mvalera at mail.sdsu.edu Sat Jan 7 19:38:41 2017 From: mvalera at mail.sdsu.edu (Manuel Valera) Date: Sat, 7 Jan 2017 17:38:41 -0800 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: <87h95a49xg.fsf@jedbrown.org> References: <87lgum4cer.fsf@jedbrown.org> <87h95a49xg.fsf@jedbrown.org> Message-ID: Ok great, i tried those command line args and this is the result: when i use -pc_type gamg: [1]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [1]PETSC ERROR: Petsc has generated inconsistent data [1]PETSC ERROR: Have un-symmetric graph (apparently). 
Use '-pc_gamg_sym_graph true' to symetrize the graph or '-pc_gamg_threshold -1.0' if the matrix is structurally symmetric. [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. [1]PETSC ERROR: Petsc Release Version 3.7.4, unknown [1]PETSC ERROR: ./ucmsMR on a arch-linux2-c-debug named ocean by valera Sat Jan 7 17:35:05 2017 [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack --download-mpich --download-hdf5 --download-netcdf --download-hypre --download-metis --download-parmetis --download-trillinos [1]PETSC ERROR: #1 smoothAggs() line 462 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c [1]PETSC ERROR: #2 PCGAMGCoarsen_AGG() line 998 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c [1]PETSC ERROR: #3 PCSetUp_GAMG() line 571 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c [1]PETSC ERROR: #4 PCSetUp() line 968 in /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c [1]PETSC ERROR: #5 KSPSetUp() line 390 in /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c application called MPI_Abort(comm=0x84000002, 77) - process 1 when i use -pc_type gamg and -pc_gamg_sym_graph true: ------------------------------------------------------------------------ [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point Exception,probably divide by zero [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [1]PETSC ERROR: ------------------------------------------------------------------------ [1]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [1]PETSC ERROR: INSTEAD the line number of the start of the function [1]PETSC ERROR: 
is given. [1]PETSC ERROR: [1] LAPACKgesvd line 42 /usr/dataC/home/valera/petsc/src/ksp/ksp/impls/gmres/gmreig.c [1]PETSC ERROR: [1] KSPComputeExtremeSingularValues_GMRES line 24 /usr/dataC/home/valera/petsc/src/ksp/ksp/impls/gmres/gmreig.c [1]PETSC ERROR: [1] KSPComputeExtremeSingularValues line 51 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c [1]PETSC ERROR: [1] PCGAMGOptProlongator_AGG line 1187 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c [1]PETSC ERROR: [1] PCSetUp_GAMG line 472 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c [1]PETSC ERROR: [1] PCSetUp line 930 /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c [1]PETSC ERROR: [1] KSPSetUp line 305 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c [0] PCGAMGOptProlongator_AGG line 1187 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c [0]PETSC ERROR: [0] PCSetUp_GAMG line 472 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c [0]PETSC ERROR: [0] PCSetUp line 930 /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c [0]PETSC ERROR: [0] KSPSetUp line 305 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- when i use -pc_type hypre it actually shows something different on -ksp_view : KSP Object: 2 MPI processes type: gcr GCR: restart = 30 GCR: restarts performed = 37 maximum iterations=10000, initial guess is zero tolerances: relative=1e-14, absolute=1e-50, divergence=10000. right preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: 2 MPI processes type: hypre HYPRE BoomerAMG preconditioning HYPRE BoomerAMG: Cycle type V HYPRE BoomerAMG: Maximum number of levels 25 HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1 HYPRE BoomerAMG: Convergence tolerance PER hypre call 0. HYPRE BoomerAMG: Threshold for strong coupling 0.25 HYPRE BoomerAMG: Interpolation truncation factor 0. 
HYPRE BoomerAMG: Interpolation: max elements per row 0 HYPRE BoomerAMG: Number of levels of aggressive coarsening 0 HYPRE BoomerAMG: Number of paths for aggressive coarsening 1 HYPRE BoomerAMG: Maximum row sums 0.9 HYPRE BoomerAMG: Sweeps down 1 HYPRE BoomerAMG: Sweeps up 1 HYPRE BoomerAMG: Sweeps on coarse 1 HYPRE BoomerAMG: Relax down symmetric-SOR/Jacobi HYPRE BoomerAMG: Relax up symmetric-SOR/Jacobi HYPRE BoomerAMG: Relax on coarse Gaussian-elimination HYPRE BoomerAMG: Relax weight (all) 1. HYPRE BoomerAMG: Outer relax weight (all) 1. HYPRE BoomerAMG: Using CF-relaxation HYPRE BoomerAMG: Not using more complex smoothers. HYPRE BoomerAMG: Measure type local HYPRE BoomerAMG: Coarsen type Falgout HYPRE BoomerAMG: Interpolation type classical HYPRE BoomerAMG: Using nodal coarsening (with HYPRE_BOOMERAMGSetNodal() 1 HYPRE BoomerAMG: HYPRE_BoomerAMGSetInterpVecVariant() 1 linear system matrix = precond matrix: Mat Object: 2 MPI processes type: mpiaij rows=200000, cols=200000 total: nonzeros=3373340, allocated nonzeros=3373340 total number of mallocs used during MatSetValues calls =0 not using I-node (on process 0) routines but still the timing is terrible. On Sat, Jan 7, 2017 at 5:28 PM, Jed Brown wrote: > Manuel Valera writes: > > > Awesome Matt and Jed, > > > > The GCR is used because the matrix is not invertible and because this was > > the algorithm that the previous library used, > > > > The Preconditioned im aiming to use is multigrid, i thought i configured > > the hypre-boomerAmg solver for this, but i agree in that it doesn't show > in > > the log anywhere, how can i be sure is being used ? i sent -ksp_view log > > before in this thread > > Did you run with -pc_type hypre? > > > I had a problem with the matrix block sizes so i couldn't make the petsc > > native multigrid solver to work, > > What block sizes? If the only variable is pressure, the block size > would be 1 (default). 
> > > This is a nonhidrostatic pressure solver, it is an elliptic problem so > > multigrid is a must, > > Yes, multigrid should work well. > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Sat Jan 7 19:52:19 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sat, 7 Jan 2017 19:52:19 -0600 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: References: <87lgum4cer.fsf@jedbrown.org> <87h95a49xg.fsf@jedbrown.org> Message-ID: We need to see the -log_summary with hypre on 1 and 2 processes (with debugging turned off), and we also need to see the output from make streams NPMAX=4 run in the PETSc directory. Also make sure that you have at an absolute minimum at least 10,000 degrees of freedom per process with your code (in this case 20,000). Smaller problems will not scale. Barry Parallel computing is not as simple as it should be. > On Jan 7, 2017, at 7:38 PM, Manuel Valera wrote: > > Ok great, i tried those command line args and this is the result: > > when i use -pc_type gamg: > > [1]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [1]PETSC ERROR: Petsc has generated inconsistent data > [1]PETSC ERROR: Have un-symmetric graph (apparently). Use '-pc_gamg_sym_graph true' to symetrize the graph or '-pc_gamg_threshold -1.0' if the matrix is structurally symmetric. > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. 
> [1]PETSC ERROR: Petsc Release Version 3.7.4, unknown > [1]PETSC ERROR: ./ucmsMR on a arch-linux2-c-debug named ocean by valera Sat Jan 7 17:35:05 2017 > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack --download-mpich --download-hdf5 --download-netcdf --download-hypre --download-metis --download-parmetis --download-trillinos > [1]PETSC ERROR: #1 smoothAggs() line 462 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c > [1]PETSC ERROR: #2 PCGAMGCoarsen_AGG() line 998 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c > [1]PETSC ERROR: #3 PCSetUp_GAMG() line 571 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c > [1]PETSC ERROR: #4 PCSetUp() line 968 in /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c > [1]PETSC ERROR: #5 KSPSetUp() line 390 in /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c > application called MPI_Abort(comm=0x84000002, 77) - process 1 > > > when i use -pc_type gamg and -pc_gamg_sym_graph true: > > ------------------------------------------------------------------------ > [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point Exception,probably divide by zero > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger > [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind > [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors > [1]PETSC ERROR: ------------------------------------------------------------------------ > [1]PETSC ERROR: --------------------- Stack Frames ------------------------------------ > [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, > [1]PETSC ERROR: INSTEAD the line number of the start of the function > [1]PETSC ERROR: is given. 
> [1]PETSC ERROR: [1] LAPACKgesvd line 42 /usr/dataC/home/valera/petsc/src/ksp/ksp/impls/gmres/gmreig.c > [1]PETSC ERROR: [1] KSPComputeExtremeSingularValues_GMRES line 24 /usr/dataC/home/valera/petsc/src/ksp/ksp/impls/gmres/gmreig.c > [1]PETSC ERROR: [1] KSPComputeExtremeSingularValues line 51 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c > [1]PETSC ERROR: [1] PCGAMGOptProlongator_AGG line 1187 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c > [1]PETSC ERROR: [1] PCSetUp_GAMG line 472 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c > [1]PETSC ERROR: [1] PCSetUp line 930 /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c > [1]PETSC ERROR: [1] KSPSetUp line 305 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c > [0] PCGAMGOptProlongator_AGG line 1187 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c > [0]PETSC ERROR: [0] PCSetUp_GAMG line 472 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c > [0]PETSC ERROR: [0] PCSetUp line 930 /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c > [0]PETSC ERROR: [0] KSPSetUp line 305 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > > when i use -pc_type hypre it actually shows something different on -ksp_view : > > KSP Object: 2 MPI processes > type: gcr > GCR: restart = 30 > GCR: restarts performed = 37 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-14, absolute=1e-50, divergence=10000. > right preconditioning > using UNPRECONDITIONED norm type for convergence test > PC Object: 2 MPI processes > type: hypre > HYPRE BoomerAMG preconditioning > HYPRE BoomerAMG: Cycle type V > HYPRE BoomerAMG: Maximum number of levels 25 > HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1 > HYPRE BoomerAMG: Convergence tolerance PER hypre call 0. 
> HYPRE BoomerAMG: Threshold for strong coupling 0.25 > HYPRE BoomerAMG: Interpolation truncation factor 0. > HYPRE BoomerAMG: Interpolation: max elements per row 0 > HYPRE BoomerAMG: Number of levels of aggressive coarsening 0 > HYPRE BoomerAMG: Number of paths for aggressive coarsening 1 > HYPRE BoomerAMG: Maximum row sums 0.9 > HYPRE BoomerAMG: Sweeps down 1 > HYPRE BoomerAMG: Sweeps up 1 > HYPRE BoomerAMG: Sweeps on coarse 1 > HYPRE BoomerAMG: Relax down symmetric-SOR/Jacobi > HYPRE BoomerAMG: Relax up symmetric-SOR/Jacobi > HYPRE BoomerAMG: Relax on coarse Gaussian-elimination > HYPRE BoomerAMG: Relax weight (all) 1. > HYPRE BoomerAMG: Outer relax weight (all) 1. > HYPRE BoomerAMG: Using CF-relaxation > HYPRE BoomerAMG: Not using more complex smoothers. > HYPRE BoomerAMG: Measure type local > HYPRE BoomerAMG: Coarsen type Falgout > HYPRE BoomerAMG: Interpolation type classical > HYPRE BoomerAMG: Using nodal coarsening (with HYPRE_BOOMERAMGSetNodal() 1 > HYPRE BoomerAMG: HYPRE_BoomerAMGSetInterpVecVariant() 1 > linear system matrix = precond matrix: > Mat Object: 2 MPI processes > type: mpiaij > rows=200000, cols=200000 > total: nonzeros=3373340, allocated nonzeros=3373340 > total number of mallocs used during MatSetValues calls =0 > not using I-node (on process 0) routines > > > but still the timing is terrible. > > > > > On Sat, Jan 7, 2017 at 5:28 PM, Jed Brown wrote: > Manuel Valera writes: > > > Awesome Matt and Jed, > > > > The GCR is used because the matrix is not invertible and because this was > > the algorithm that the previous library used, > > > > The Preconditioned im aiming to use is multigrid, i thought i configured > > the hypre-boomerAmg solver for this, but i agree in that it doesn't show in > > the log anywhere, how can i be sure is being used ? i sent -ksp_view log > > before in this thread > > Did you run with -pc_type hypre? 
> > > I had a problem with the matrix block sizes so i couldn't make the petsc > > native multigrid solver to work, > > What block sizes? If the only variable is pressure, the block size > would be 1 (default). > > > This is a nonhidrostatic pressure solver, it is an elliptic problem so > > multigrid is a must, > > Yes, multigrid should work well. > From knepley at gmail.com Sat Jan 7 21:23:02 2017 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 7 Jan 2017 21:23:02 -0600 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: References: <87lgum4cer.fsf@jedbrown.org> <87h95a49xg.fsf@jedbrown.org> Message-ID: On Sat, Jan 7, 2017 at 7:38 PM, Manuel Valera wrote: > Ok great, i tried those command line args and this is the result: > > when i use -pc_type gamg: > > [1]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > > [1]PETSC ERROR: Petsc has generated inconsistent data > > [1]PETSC ERROR: Have un-symmetric graph (apparently). Use > '-pc_gamg_sym_graph true' to symetrize the graph or '-pc_gamg_threshold > -1.0' if the matrix is structurally symmetric. > > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html > for trouble shooting. 
> > [1]PETSC ERROR: Petsc Release Version 3.7.4, unknown > > [1]PETSC ERROR: ./ucmsMR on a arch-linux2-c-debug named ocean by valera > Sat Jan 7 17:35:05 2017 > > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ > --with-fc=gfortran --download-fblaslapack --download-mpich --download-hdf5 > --download-netcdf --download-hypre --download-metis --download-parmetis > --download-trillinos > > [1]PETSC ERROR: #1 smoothAggs() line 462 in /usr/dataC/home/valera/petsc/ > src/ksp/pc/impls/gamg/agg.c > > [1]PETSC ERROR: #2 PCGAMGCoarsen_AGG() line 998 in > /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c > > [1]PETSC ERROR: #3 PCSetUp_GAMG() line 571 in /usr/dataC/home/valera/petsc/ > src/ksp/pc/impls/gamg/gamg.c > > [1]PETSC ERROR: #4 PCSetUp() line 968 in /usr/dataC/home/valera/petsc/ > src/ksp/pc/interface/precon.c > > [1]PETSC ERROR: #5 KSPSetUp() line 390 in /usr/dataC/home/valera/petsc/ > src/ksp/ksp/interface/itfunc.c > > application called MPI_Abort(comm=0x84000002, 77) - process 1 > > > when i use -pc_type gamg and -pc_gamg_sym_graph true: > Do everything Barry said. However, I would like to track down this error. It could be a bug in our code. However, it appears to happen in the call to LAPACK, so it could also be a problem with that library on your machine. Could you run this case in the debugger and give the stack trace? 
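Matt's request (run the failing case in the debugger and capture the stack trace) can be done with PETSc's built-in debugger hooks. The following is a sketch only — the binary name is taken from the error log above, and the launcher and process count should be adjusted to the actual setup:

```shell
# Option 1: open a debugger on each rank at startup (one xterm per rank)
mpirun -n 2 ./ucmsMR -pc_type gamg -pc_gamg_sym_graph true -start_in_debugger

# Option 2: attach the debugger only when the error (here the FPE) is raised
mpirun -n 2 ./ucmsMR -pc_type gamg -pc_gamg_sym_graph true -on_error_attach_debugger

# In the gdb session that appears, print the stack trace to send to the list:
# (gdb) bt
```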
Thanks, Matt > ------------------------------------------------------------------------ > > [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point > Exception,probably divide by zero > > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger > > [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/ > documentation/faq.html#valgrind > > [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS > X to find memory corruption errors > > [1]PETSC ERROR: ------------------------------ > ------------------------------------------ > > [1]PETSC ERROR: --------------------- Stack Frames > ------------------------------------ > > [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not > available, > > [1]PETSC ERROR: INSTEAD the line number of the start of the function > > [1]PETSC ERROR: is given. > > [1]PETSC ERROR: [1] LAPACKgesvd line 42 /usr/dataC/home/valera/petsc/ > src/ksp/ksp/impls/gmres/gmreig.c > > [1]PETSC ERROR: [1] KSPComputeExtremeSingularValues_GMRES line 24 > /usr/dataC/home/valera/petsc/src/ksp/ksp/impls/gmres/gmreig.c > > [1]PETSC ERROR: [1] KSPComputeExtremeSingularValues line 51 > /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c > > [1]PETSC ERROR: [1] PCGAMGOptProlongator_AGG line 1187 > /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c > > [1]PETSC ERROR: [1] PCSetUp_GAMG line 472 /usr/dataC/home/valera/petsc/ > src/ksp/pc/impls/gamg/gamg.c > > [1]PETSC ERROR: [1] PCSetUp line 930 /usr/dataC/home/valera/petsc/ > src/ksp/pc/interface/precon.c > > [1]PETSC ERROR: [1] KSPSetUp line 305 /usr/dataC/home/valera/petsc/ > src/ksp/ksp/interface/itfunc.c > > [0] PCGAMGOptProlongator_AGG line 1187 /usr/dataC/home/valera/petsc/ > src/ksp/pc/impls/gamg/agg.c > > [0]PETSC ERROR: [0] PCSetUp_GAMG line 472 /usr/dataC/home/valera/petsc/ > src/ksp/pc/impls/gamg/gamg.c > > [0]PETSC ERROR: [0] PCSetUp line 930 /usr/dataC/home/valera/petsc/ > src/ksp/pc/interface/precon.c > > [0]PETSC ERROR: [0] KSPSetUp line 
305 /usr/dataC/home/valera/petsc/ > src/ksp/ksp/interface/itfunc.c > > [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > > > when i use -pc_type hypre it actually shows something different on > -ksp_view : > > > KSP Object: 2 MPI processes > > type: gcr > > GCR: restart = 30 > > GCR: restarts performed = 37 > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-14, absolute=1e-50, divergence=10000. > > right preconditioning > > using UNPRECONDITIONED norm type for convergence test > > PC Object: 2 MPI processes > > type: hypre > > HYPRE BoomerAMG preconditioning > > HYPRE BoomerAMG: Cycle type V > > HYPRE BoomerAMG: Maximum number of levels 25 > > HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1 > > HYPRE BoomerAMG: Convergence tolerance PER hypre call 0. > > HYPRE BoomerAMG: Threshold for strong coupling 0.25 > > HYPRE BoomerAMG: Interpolation truncation factor 0. > > HYPRE BoomerAMG: Interpolation: max elements per row 0 > > HYPRE BoomerAMG: Number of levels of aggressive coarsening 0 > > HYPRE BoomerAMG: Number of paths for aggressive coarsening 1 > > HYPRE BoomerAMG: Maximum row sums 0.9 > > HYPRE BoomerAMG: Sweeps down 1 > > HYPRE BoomerAMG: Sweeps up 1 > > HYPRE BoomerAMG: Sweeps on coarse 1 > > HYPRE BoomerAMG: Relax down symmetric-SOR/Jacobi > > HYPRE BoomerAMG: Relax up symmetric-SOR/Jacobi > > HYPRE BoomerAMG: Relax on coarse Gaussian-elimination > > HYPRE BoomerAMG: Relax weight (all) 1. > > HYPRE BoomerAMG: Outer relax weight (all) 1. > > HYPRE BoomerAMG: Using CF-relaxation > > HYPRE BoomerAMG: Not using more complex smoothers. 
> > HYPRE BoomerAMG: Measure type local > > HYPRE BoomerAMG: Coarsen type Falgout > > HYPRE BoomerAMG: Interpolation type classical > > HYPRE BoomerAMG: Using nodal coarsening (with > HYPRE_BOOMERAMGSetNodal() 1 > > HYPRE BoomerAMG: HYPRE_BoomerAMGSetInterpVecVariant() 1 > > linear system matrix = precond matrix: > > Mat Object: 2 MPI processes > > type: mpiaij > > rows=200000, cols=200000 > > total: nonzeros=3373340, allocated nonzeros=3373340 > > total number of mallocs used during MatSetValues calls =0 > > not using I-node (on process 0) routines > > > > but still the timing is terrible. > > > > > > On Sat, Jan 7, 2017 at 5:28 PM, Jed Brown wrote: > >> Manuel Valera writes: >> >> > Awesome Matt and Jed, >> > >> > The GCR is used because the matrix is not invertible and because this >> was >> > the algorithm that the previous library used, >> > >> > The Preconditioned im aiming to use is multigrid, i thought i configured >> > the hypre-boomerAmg solver for this, but i agree in that it doesn't >> show in >> > the log anywhere, how can i be sure is being used ? i sent -ksp_view log >> > before in this thread >> >> Did you run with -pc_type hypre? >> >> > I had a problem with the matrix block sizes so i couldn't make the petsc >> > native multigrid solver to work, >> >> What block sizes? If the only variable is pressure, the block size >> would be 1 (default). >> >> > This is a nonhidrostatic pressure solver, it is an elliptic problem so >> > multigrid is a must, >> >> Yes, multigrid should work well. >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From mfadams at lbl.gov Sat Jan 7 21:48:14 2017 From: mfadams at lbl.gov (Mark Adams) Date: Sat, 7 Jan 2017 22:48:14 -0500 Subject: [petsc-users] pc_gamg_threshol In-Reply-To: <1483626187.2370.11.camel@seamplex.com> References: <1483564436.1134.3.camel@seamplex.com> <1483564604.1134.4.camel@seamplex.com> <1483626187.2370.11.camel@seamplex.com> Message-ID: Thresholding is a heuristic and a crude algorithm (just chop weak edges with the simplest measure of "weak"). I don't know of any abstract analysis of it. The original smoothed aggregation papers by Vanek, Mandel, Brezina were by fairly mathy folks. Ray Tuminaro and Luke Olsen have done some nice work to develop more robust coarsening strategies that do some non-trivial analysis of the matrix to try to identify genuinely strong connections. Crude methods are not bad for M matrices but can get fooled by higher order discretizations. On Thu, Jan 5, 2017 at 9:23 AM, Jeremy Theler wrote: > Yes, I read that page and it was that paragraph that made me want to > learn more. > > For example, that page says: > > "-pc_gamg_threshold 0.0 is the most robust option (the reason for this > is not obvious) ..." > > > Where can I find more math-based background on this subject? I mean, > some text that describes the methods and not just the implementation as > the source code at gamg/util.c so I can better understand what is going > on. > > > Thanks > > > > -- > Jeremy Theler > www.seamplex.com > > > > On Thu, 2017-01-05 at 09:18 -0500, Mark Adams wrote: > > You want the bottom of page 84 in the manual. 
> > > > On Wed, Jan 4, 2017 at 4:33 PM, Barry Smith > > wrote: > > > > The manual page gives a high-level description > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/ > PCGAMGSetThreshold.html the exact details can be found in the code here > http://www.mcs.anl.gov/petsc/petsc-dev/src/ksp/pc/impls/gamg/util.c.html# > PCGAMGFilterGraph I'm adding a link from the former to the later in the > documentation. > > > > Barry > > > > > > > > > On Jan 4, 2017, at 3:16 PM, Jeremy Theler > > wrote: > > > > > > * Any reference to what pc_gamg_treshold means and/or does? > > > > > > > > > > > > On Wed, 2017-01-04 at 18:13 -0300, Jeremy Theler wrote: > > >> Hi! Any reference to what does -pc_gamg_threshold mean > > and/or? > > >> > > > > > > > > > > > > > -- *The secret to doing good research is always be a little underemployed. You waste years by not being able to waste hours* -- Amos Tversky -------------- next part -------------- An HTML attachment was scrubbed... URL: From mfadams at lbl.gov Sat Jan 7 21:57:10 2017 From: mfadams at lbl.gov (Mark Adams) Date: Sat, 7 Jan 2017 22:57:10 -0500 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: References: <87lgum4cer.fsf@jedbrown.org> <87h95a49xg.fsf@jedbrown.org> Message-ID: This error seems to be coming from the computation of the extreme eigenvalues of the matrix for smoothing in smoothed aggregation. Are you getting good solutions with hypre? This error looks like it might just be the first place where a messed up matrix fails in GAMG. 
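Mark's earlier description of thresholding ("just chop weak edges with the simplest measure of 'weak'") can be made concrete with a toy filter. This is an illustrative sketch only, not PETSc's actual PCGAMGFilterGraph code: the scaled measure |a_ij| / sqrt(|a_ii| * |a_jj|) is one common strength-of-connection heuristic, and the function name and example matrix are made up for the illustration.

```python
import math

def filter_weak_edges(A, threshold):
    """Toy graph filter: zero out off-diagonal entries whose scaled
    magnitude |a_ij| / sqrt(|a_ii| * |a_jj|) falls below the threshold.
    A is a dense matrix as a list of lists; returns a filtered copy."""
    n = len(A)
    F = [row[:] for row in A]
    for i in range(n):
        for j in range(n):
            if i == j or A[i][j] == 0.0:
                continue
            scale = math.sqrt(abs(A[i][i]) * abs(A[j][j]))
            if scale == 0.0 or abs(A[i][j]) / scale < threshold:
                F[i][j] = 0.0  # treated as a weak edge: dropped from the graph
    return F

# A 3x3 example: the (0,2)/(2,0) coupling is weak relative to the diagonal.
A = [[4.0, -1.0, 0.1],
     [-1.0, 4.0, -1.0],
     [0.1, -1.0, 4.0]]
print(filter_weak_edges(A, 0.08))
```

Note that a threshold of 0.0 drops nothing in this sketch, which is at least consistent in spirit with the manual's remark quoted above that 0.0 is the most robust setting.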
On Sat, Jan 7, 2017 at 10:23 PM, Matthew Knepley wrote: > On Sat, Jan 7, 2017 at 7:38 PM, Manuel Valera > wrote: > >> Ok great, i tried those command line args and this is the result: >> >> when i use -pc_type gamg: >> >> [1]PETSC ERROR: --------------------- Error Message >> -------------------------------------------------------------- >> >> [1]PETSC ERROR: Petsc has generated inconsistent data >> >> [1]PETSC ERROR: Have un-symmetric graph (apparently). Use >> '-pc_gamg_sym_graph true' to symetrize the graph or '-pc_gamg_threshold >> -1.0' if the matrix is structurally symmetric. >> >> [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html >> for trouble shooting. >> >> [1]PETSC ERROR: Petsc Release Version 3.7.4, unknown >> >> [1]PETSC ERROR: ./ucmsMR on a arch-linux2-c-debug named ocean by valera >> Sat Jan 7 17:35:05 2017 >> >> [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ >> --with-fc=gfortran --download-fblaslapack --download-mpich --download-hdf5 >> --download-netcdf --download-hypre --download-metis --download-parmetis >> --download-trillinos >> >> [1]PETSC ERROR: #1 smoothAggs() line 462 in /usr/dataC/home/valera/petsc/s >> rc/ksp/pc/impls/gamg/agg.c >> >> [1]PETSC ERROR: #2 PCGAMGCoarsen_AGG() line 998 in >> /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c >> >> [1]PETSC ERROR: #3 PCSetUp_GAMG() line 571 in >> /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c >> >> [1]PETSC ERROR: #4 PCSetUp() line 968 in /usr/dataC/home/valera/petsc/s >> rc/ksp/pc/interface/precon.c >> >> [1]PETSC ERROR: #5 KSPSetUp() line 390 in /usr/dataC/home/valera/petsc/s >> rc/ksp/ksp/interface/itfunc.c >> >> application called MPI_Abort(comm=0x84000002, 77) - process 1 >> >> >> when i use -pc_type gamg and -pc_gamg_sym_graph true: >> > > Do everything Barry said. > > However, I would like to track down this error. It could be a bug in our > code. 
However, it appears to happen in the call > to LAPACK, so it could also be a problem with that library on your > machine. Could you run this case in the debugger > and give the stack trace? > > Thanks, > > Matt > > >> ------------------------------------------------------------------------ >> >> [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point >> Exception,probably divide by zero >> >> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger >> >> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/d >> ocumentation/faq.html#valgrind >> >> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS >> X to find memory corruption errors >> >> [1]PETSC ERROR: ------------------------------ >> ------------------------------------------ >> >> [1]PETSC ERROR: --------------------- Stack Frames >> ------------------------------------ >> >> [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not >> available, >> >> [1]PETSC ERROR: INSTEAD the line number of the start of the function >> >> [1]PETSC ERROR: is given. 
>> >> [1]PETSC ERROR: [1] LAPACKgesvd line 42 /usr/dataC/home/valera/petsc/s >> rc/ksp/ksp/impls/gmres/gmreig.c >> >> [1]PETSC ERROR: [1] KSPComputeExtremeSingularValues_GMRES line 24 >> /usr/dataC/home/valera/petsc/src/ksp/ksp/impls/gmres/gmreig.c >> >> [1]PETSC ERROR: [1] KSPComputeExtremeSingularValues line 51 >> /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c >> >> [1]PETSC ERROR: [1] PCGAMGOptProlongator_AGG line 1187 >> /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c >> >> [1]PETSC ERROR: [1] PCSetUp_GAMG line 472 /usr/dataC/home/valera/petsc/s >> rc/ksp/pc/impls/gamg/gamg.c >> >> [1]PETSC ERROR: [1] PCSetUp line 930 /usr/dataC/home/valera/petsc/s >> rc/ksp/pc/interface/precon.c >> >> [1]PETSC ERROR: [1] KSPSetUp line 305 /usr/dataC/home/valera/petsc/s >> rc/ksp/ksp/interface/itfunc.c >> >> [0] PCGAMGOptProlongator_AGG line 1187 /usr/dataC/home/valera/petsc/s >> rc/ksp/pc/impls/gamg/agg.c >> >> [0]PETSC ERROR: [0] PCSetUp_GAMG line 472 /usr/dataC/home/valera/petsc/s >> rc/ksp/pc/impls/gamg/gamg.c >> >> [0]PETSC ERROR: [0] PCSetUp line 930 /usr/dataC/home/valera/petsc/s >> rc/ksp/pc/interface/precon.c >> >> [0]PETSC ERROR: [0] KSPSetUp line 305 /usr/dataC/home/valera/petsc/s >> rc/ksp/ksp/interface/itfunc.c >> >> [0]PETSC ERROR: --------------------- Error Message >> -------------------------------------------------------------- >> >> >> when i use -pc_type hypre it actually shows something different on >> -ksp_view : >> >> >> KSP Object: 2 MPI processes >> >> type: gcr >> >> GCR: restart = 30 >> >> GCR: restarts performed = 37 >> >> maximum iterations=10000, initial guess is zero >> >> tolerances: relative=1e-14, absolute=1e-50, divergence=10000. 
>> >> right preconditioning >> >> using UNPRECONDITIONED norm type for convergence test >> >> PC Object: 2 MPI processes >> >> type: hypre >> >> HYPRE BoomerAMG preconditioning >> >> HYPRE BoomerAMG: Cycle type V >> >> HYPRE BoomerAMG: Maximum number of levels 25 >> >> HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1 >> >> HYPRE BoomerAMG: Convergence tolerance PER hypre call 0. >> >> HYPRE BoomerAMG: Threshold for strong coupling 0.25 >> >> HYPRE BoomerAMG: Interpolation truncation factor 0. >> >> HYPRE BoomerAMG: Interpolation: max elements per row 0 >> >> HYPRE BoomerAMG: Number of levels of aggressive coarsening 0 >> >> HYPRE BoomerAMG: Number of paths for aggressive coarsening 1 >> >> HYPRE BoomerAMG: Maximum row sums 0.9 >> >> HYPRE BoomerAMG: Sweeps down 1 >> >> HYPRE BoomerAMG: Sweeps up 1 >> >> HYPRE BoomerAMG: Sweeps on coarse 1 >> >> HYPRE BoomerAMG: Relax down symmetric-SOR/Jacobi >> >> HYPRE BoomerAMG: Relax up symmetric-SOR/Jacobi >> >> HYPRE BoomerAMG: Relax on coarse Gaussian-elimination >> >> HYPRE BoomerAMG: Relax weight (all) 1. >> >> HYPRE BoomerAMG: Outer relax weight (all) 1. >> >> HYPRE BoomerAMG: Using CF-relaxation >> >> HYPRE BoomerAMG: Not using more complex smoothers. >> >> HYPRE BoomerAMG: Measure type local >> >> HYPRE BoomerAMG: Coarsen type Falgout >> >> HYPRE BoomerAMG: Interpolation type classical >> >> HYPRE BoomerAMG: Using nodal coarsening (with >> HYPRE_BOOMERAMGSetNodal() 1 >> >> HYPRE BoomerAMG: HYPRE_BoomerAMGSetInterpVecVariant() 1 >> >> linear system matrix = precond matrix: >> >> Mat Object: 2 MPI processes >> >> type: mpiaij >> >> rows=200000, cols=200000 >> >> total: nonzeros=3373340, allocated nonzeros=3373340 >> >> total number of mallocs used during MatSetValues calls =0 >> >> not using I-node (on process 0) routines >> >> >> >> but still the timing is terrible. 
>> >> >> >> >> >> On Sat, Jan 7, 2017 at 5:28 PM, Jed Brown wrote: >> >>> Manuel Valera writes: >>> >>> > Awesome Matt and Jed, >>> > >>> > The GCR is used because the matrix is not invertible and because this >>> was >>> > the algorithm that the previous library used, >>> > >>> > The Preconditioned im aiming to use is multigrid, i thought i >>> configured >>> > the hypre-boomerAmg solver for this, but i agree in that it doesn't >>> show in >>> > the log anywhere, how can i be sure is being used ? i sent -ksp_view >>> log >>> > before in this thread >>> >>> Did you run with -pc_type hypre? >>> >>> > I had a problem with the matrix block sizes so i couldn't make the >>> petsc >>> > native multigrid solver to work, >>> >>> What block sizes? If the only variable is pressure, the block size >>> would be 1 (default). >>> >>> > This is a nonhidrostatic pressure solver, it is an elliptic problem so >>> > multigrid is a must, >>> >>> Yes, multigrid should work well. >>> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -- *The secret to doing good research is always be a little underemployed. You waste years by not being able to waste hours* -- Amos Tversky -------------- next part -------------- An HTML attachment was scrubbed... URL: From dave.mayhem23 at gmail.com Sun Jan 8 02:20:24 2017 From: dave.mayhem23 at gmail.com (Dave May) Date: Sun, 08 Jan 2017 08:20:24 +0000 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: References: <87lgum4cer.fsf@jedbrown.org> <87h95a49xg.fsf@jedbrown.org> Message-ID: I suggest you check the code is valgrind clean. See the petsc FAQ page for details of how to do this. 
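Dave's valgrind suggestion typically looks like the following. This is a sketch based on the PETSc FAQ recipe; the binary name is taken from this thread's logs, and the launcher should match the MPI used to build PETSc:

```shell
# Run every MPI rank under valgrind; -malloc off disables PETSc's own
# malloc wrapper so valgrind sees the raw allocations.
mpirun -n 2 valgrind --tool=memcheck -q ./ucmsMR -malloc off \
    -pc_type gamg -pc_gamg_sym_graph true
```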
Thanks, Dave On Sun, 8 Jan 2017 at 04:57, Mark Adams wrote: > This error seems to be coming from the computation of the extreme > eigenvalues of the matrix for smoothing in smoothed aggregation. > > Are you getting good solutions with hypre? This error looks like it might > just be the first place where a messed up matrix fails in GAMG. > > On Sat, Jan 7, 2017 at 10:23 PM, Matthew Knepley > wrote: > > On Sat, Jan 7, 2017 at 7:38 PM, Manuel Valera > wrote: > > Ok great, i tried those command line args and this is the result: > > when i use -pc_type gamg: > > [1]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > > > [1]PETSC ERROR: Petsc has generated inconsistent data > > > [1]PETSC ERROR: Have un-symmetric graph (apparently). Use > '-pc_gamg_sym_graph true' to symetrize the graph or '-pc_gamg_threshold > -1.0' if the matrix is structurally symmetric. > > > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html > for trouble shooting. 
> > > [1]PETSC ERROR: Petsc Release Version 3.7.4, unknown > > > [1]PETSC ERROR: ./ucmsMR on a arch-linux2-c-debug named ocean by valera > Sat Jan 7 17:35:05 2017 > > > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ > --with-fc=gfortran --download-fblaslapack --download-mpich --download-hdf5 > --download-netcdf --download-hypre --download-metis --download-parmetis > --download-trillinos > > > [1]PETSC ERROR: #1 smoothAggs() line 462 in > /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c > > > [1]PETSC ERROR: #2 PCGAMGCoarsen_AGG() line 998 in > /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c > > > [1]PETSC ERROR: #3 PCSetUp_GAMG() line 571 in > /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c > > > [1]PETSC ERROR: #4 PCSetUp() line 968 in > /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c > > > [1]PETSC ERROR: #5 KSPSetUp() line 390 in > /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c > > > application called MPI_Abort(comm=0x84000002, 77) - process 1 > > > when i use -pc_type gamg and -pc_gamg_sym_graph true: > > > Do everything Barry said. > > However, I would like to track down this error. It could be a bug in our > code. However, it appears to happen in the call > to LAPACK, so it could also be a problem with that library on your > machine. Could you run this case in the debugger > and give the stack trace? 
> > Thanks, > > Matt > > > ------------------------------------------------------------------------ > > > [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point > Exception,probably divide by zero > > > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger > > > [0]PETSC ERROR: or see > http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind > > > [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS > X to find memory corruption errors > > > [1]PETSC ERROR: > ------------------------------------------------------------------------ > > [1]PETSC ERROR: --------------------- Stack Frames > ------------------------------------ > > > [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not > available, > > > [1]PETSC ERROR: INSTEAD the line number of the start of the function > > > [1]PETSC ERROR: is given. > > > [1]PETSC ERROR: [1] LAPACKgesvd line 42 > /usr/dataC/home/valera/petsc/src/ksp/ksp/impls/gmres/gmreig.c > > > [1]PETSC ERROR: [1] KSPComputeExtremeSingularValues_GMRES line 24 > /usr/dataC/home/valera/petsc/src/ksp/ksp/impls/gmres/gmreig.c > > > [1]PETSC ERROR: [1] KSPComputeExtremeSingularValues line 51 > /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c > > > [1]PETSC ERROR: [1] PCGAMGOptProlongator_AGG line 1187 > /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c > > > [1]PETSC ERROR: [1] PCSetUp_GAMG line 472 > /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c > > > [1]PETSC ERROR: [1] PCSetUp line 930 > /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c > > > [1]PETSC ERROR: [1] KSPSetUp line 305 > /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c > > > [0] PCGAMGOptProlongator_AGG line 1187 > /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c > > > [0]PETSC ERROR: [0] PCSetUp_GAMG line 472 > /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c > > > [0]PETSC ERROR: [0] PCSetUp line 930 > 
/usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c > > > [0]PETSC ERROR: [0] KSPSetUp line 305 > /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c > > > [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > > > when i use -pc_type hypre it actually shows something different on > -ksp_view : > > > KSP Object: 2 MPI processes > > type: gcr > > GCR: restart = 30 > > GCR: restarts performed = 37 > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-14, absolute=1e-50, divergence=10000. > > right preconditioning > > using UNPRECONDITIONED norm type for convergence test > > PC Object: 2 MPI processes > > type: hypre > > HYPRE BoomerAMG preconditioning > > HYPRE BoomerAMG: Cycle type V > > HYPRE BoomerAMG: Maximum number of levels 25 > > HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1 > > HYPRE BoomerAMG: Convergence tolerance PER hypre call 0. > > HYPRE BoomerAMG: Threshold for strong coupling 0.25 > > HYPRE BoomerAMG: Interpolation truncation factor 0. > > HYPRE BoomerAMG: Interpolation: max elements per row 0 > > HYPRE BoomerAMG: Number of levels of aggressive coarsening 0 > > HYPRE BoomerAMG: Number of paths for aggressive coarsening 1 > > HYPRE BoomerAMG: Maximum row sums 0.9 > > HYPRE BoomerAMG: Sweeps down 1 > > HYPRE BoomerAMG: Sweeps up 1 > > HYPRE BoomerAMG: Sweeps on coarse 1 > > HYPRE BoomerAMG: Relax down symmetric-SOR/Jacobi > > HYPRE BoomerAMG: Relax up symmetric-SOR/Jacobi > > HYPRE BoomerAMG: Relax on coarse Gaussian-elimination > > HYPRE BoomerAMG: Relax weight (all) 1. > > HYPRE BoomerAMG: Outer relax weight (all) 1. > > HYPRE BoomerAMG: Using CF-relaxation > > HYPRE BoomerAMG: Not using more complex smoothers. 
> > HYPRE BoomerAMG: Measure type local > > HYPRE BoomerAMG: Coarsen type Falgout > > HYPRE BoomerAMG: Interpolation type classical > > HYPRE BoomerAMG: Using nodal coarsening (with > HYPRE_BOOMERAMGSetNodal() 1 > > HYPRE BoomerAMG: HYPRE_BoomerAMGSetInterpVecVariant() 1 > > linear system matrix = precond matrix: > > Mat Object: 2 MPI processes > > type: mpiaij > > rows=200000, cols=200000 > > total: nonzeros=3373340, allocated nonzeros=3373340 > > total number of mallocs used during MatSetValues calls =0 > > not using I-node (on process 0) routines > > > > but still the timing is terrible. > > > > > > On Sat, Jan 7, 2017 at 5:28 PM, Jed Brown wrote: > > Manuel Valera writes: > > > > > > > Awesome Matt and Jed, > > > > > > > > The GCR is used because the matrix is not invertible and because this was > > > > the algorithm that the previous library used, > > > > > > > > The Preconditioned im aiming to use is multigrid, i thought i configured > > > > the hypre-boomerAmg solver for this, but i agree in that it doesn't show > in > > > > the log anywhere, how can i be sure is being used ? i sent -ksp_view log > > > > before in this thread > > > > > > Did you run with -pc_type hypre? > > > > > > > I had a problem with the matrix block sizes so i couldn't make the petsc > > > > native multigrid solver to work, > > > > > > What block sizes? If the only variable is pressure, the block size > > > would be 1 (default). > > > > > > > This is a nonhidrostatic pressure solver, it is an elliptic problem so > > > > multigrid is a must, > > > > > > Yes, multigrid should work well. > > > > > > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. 
> -- Norbert Wiener > > -- > *The secret to doing good research is always be a little underemployed. > You waste years by not being able to waste hours* -- Amos Tversky > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Sun Jan 8 11:48:38 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sun, 8 Jan 2017 11:48:38 -0600 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: References: <87lgum4cer.fsf@jedbrown.org> <87h95a49xg.fsf@jedbrown.org> Message-ID: <0CF3AB08-71F9-42D0-B2C6-0EDFEE4E9635@mcs.anl.gov> We need to see the -log_summary with hypre on 1 and 2 processes (with debugging turned off). We also need to see the output from make stream NPMAX=4 run in the PETSc directory. > On Jan 7, 2017, at 7:38 PM, Manuel Valera wrote: > > Ok great, i tried those command line args and this is the result: > > when i use -pc_type gamg: > > [1]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [1]PETSC ERROR: Petsc has generated inconsistent data > [1]PETSC ERROR: Have un-symmetric graph (apparently). Use '-pc_gamg_sym_graph true' to symetrize the graph or '-pc_gamg_threshold -1.0' if the matrix is structurally symmetric. > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. 
> [1]PETSC ERROR: Petsc Release Version 3.7.4, unknown > [1]PETSC ERROR: ./ucmsMR on a arch-linux2-c-debug named ocean by valera Sat Jan 7 17:35:05 2017 > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack --download-mpich --download-hdf5 --download-netcdf --download-hypre --download-metis --download-parmetis --download-trillinos > [1]PETSC ERROR: #1 smoothAggs() line 462 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c > [1]PETSC ERROR: #2 PCGAMGCoarsen_AGG() line 998 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c > [1]PETSC ERROR: #3 PCSetUp_GAMG() line 571 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c > [1]PETSC ERROR: #4 PCSetUp() line 968 in /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c > [1]PETSC ERROR: #5 KSPSetUp() line 390 in /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c > application called MPI_Abort(comm=0x84000002, 77) - process 1 > > > when i use -pc_type gamg and -pc_gamg_sym_graph true: > > ------------------------------------------------------------------------ > [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point Exception,probably divide by zero > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger > [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind > [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors > [1]PETSC ERROR: ------------------------------------------------------------------------ > [1]PETSC ERROR: --------------------- Stack Frames ------------------------------------ > [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, > [1]PETSC ERROR: INSTEAD the line number of the start of the function > [1]PETSC ERROR: is given. 
> [1]PETSC ERROR: [1] LAPACKgesvd line 42 /usr/dataC/home/valera/petsc/src/ksp/ksp/impls/gmres/gmreig.c > [1]PETSC ERROR: [1] KSPComputeExtremeSingularValues_GMRES line 24 /usr/dataC/home/valera/petsc/src/ksp/ksp/impls/gmres/gmreig.c > [1]PETSC ERROR: [1] KSPComputeExtremeSingularValues line 51 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c > [1]PETSC ERROR: [1] PCGAMGOptProlongator_AGG line 1187 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c > [1]PETSC ERROR: [1] PCSetUp_GAMG line 472 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c > [1]PETSC ERROR: [1] PCSetUp line 930 /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c > [1]PETSC ERROR: [1] KSPSetUp line 305 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c > [0] PCGAMGOptProlongator_AGG line 1187 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c > [0]PETSC ERROR: [0] PCSetUp_GAMG line 472 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c > [0]PETSC ERROR: [0] PCSetUp line 930 /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c > [0]PETSC ERROR: [0] KSPSetUp line 305 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > > when i use -pc_type hypre it actually shows something different on -ksp_view : > > KSP Object: 2 MPI processes > type: gcr > GCR: restart = 30 > GCR: restarts performed = 37 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-14, absolute=1e-50, divergence=10000. > right preconditioning > using UNPRECONDITIONED norm type for convergence test > PC Object: 2 MPI processes > type: hypre > HYPRE BoomerAMG preconditioning > HYPRE BoomerAMG: Cycle type V > HYPRE BoomerAMG: Maximum number of levels 25 > HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1 > HYPRE BoomerAMG: Convergence tolerance PER hypre call 0. 
> HYPRE BoomerAMG: Threshold for strong coupling 0.25 > HYPRE BoomerAMG: Interpolation truncation factor 0. > HYPRE BoomerAMG: Interpolation: max elements per row 0 > HYPRE BoomerAMG: Number of levels of aggressive coarsening 0 > HYPRE BoomerAMG: Number of paths for aggressive coarsening 1 > HYPRE BoomerAMG: Maximum row sums 0.9 > HYPRE BoomerAMG: Sweeps down 1 > HYPRE BoomerAMG: Sweeps up 1 > HYPRE BoomerAMG: Sweeps on coarse 1 > HYPRE BoomerAMG: Relax down symmetric-SOR/Jacobi > HYPRE BoomerAMG: Relax up symmetric-SOR/Jacobi > HYPRE BoomerAMG: Relax on coarse Gaussian-elimination > HYPRE BoomerAMG: Relax weight (all) 1. > HYPRE BoomerAMG: Outer relax weight (all) 1. > HYPRE BoomerAMG: Using CF-relaxation > HYPRE BoomerAMG: Not using more complex smoothers. > HYPRE BoomerAMG: Measure type local > HYPRE BoomerAMG: Coarsen type Falgout > HYPRE BoomerAMG: Interpolation type classical > HYPRE BoomerAMG: Using nodal coarsening (with HYPRE_BOOMERAMGSetNodal() 1 > HYPRE BoomerAMG: HYPRE_BoomerAMGSetInterpVecVariant() 1 > linear system matrix = precond matrix: > Mat Object: 2 MPI processes > type: mpiaij > rows=200000, cols=200000 > total: nonzeros=3373340, allocated nonzeros=3373340 > total number of mallocs used during MatSetValues calls =0 > not using I-node (on process 0) routines > > > but still the timing is terrible. > > > > > On Sat, Jan 7, 2017 at 5:28 PM, Jed Brown wrote: > Manuel Valera writes: > > > Awesome Matt and Jed, > > > > The GCR is used because the matrix is not invertible and because this was > > the algorithm that the previous library used, > > > > The Preconditioned im aiming to use is multigrid, i thought i configured > > the hypre-boomerAmg solver for this, but i agree in that it doesn't show in > > the log anywhere, how can i be sure is being used ? i sent -ksp_view log > > before in this thread > > Did you run with -pc_type hypre? 
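A hedged sketch of how Jed's question ("Did you run with -pc_type hypre?") is usually answered from the program output itself. The executable name comes from the logs and the options are standard PETSc runtime options; the mpiexec launcher is an assumption.

```shell
# Sketch: confirm which preconditioner PETSc actually constructed.
# -ksp_view prints the PC type ("type: hypre" in the output above), and
# -options_left lists any option that was set but never consumed, so a
# misspelled or ignored option is easy to spot. ./ucmsMR is the
# executable named in the logs; the launcher is an assumption.
RUN="mpiexec -n 2 ./ucmsMR -pc_type hypre -ksp_view -options_left"
echo "$RUN"
```

If hypre were not actually used, -ksp_view would show a different PC type and -options_left would report -pc_type as unused.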
> > > I had a problem with the matrix block sizes so i couldn't make the petsc > > native multigrid solver to work, > > What block sizes? If the only variable is pressure, the block size > would be 1 (default). > > > This is a nonhidrostatic pressure solver, it is an elliptic problem so > > multigrid is a must, > > Yes, multigrid should work well. > From mvalera at mail.sdsu.edu Sun Jan 8 16:41:37 2017 From: mvalera at mail.sdsu.edu (Manuel Valera) Date: Sun, 8 Jan 2017 14:41:37 -0800 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: <0CF3AB08-71F9-42D0-B2C6-0EDFEE4E9635@mcs.anl.gov> References: <87lgum4cer.fsf@jedbrown.org> <87h95a49xg.fsf@jedbrown.org> <0CF3AB08-71F9-42D0-B2C6-0EDFEE4E9635@mcs.anl.gov> Message-ID: Ok, i just did the streams and log_summary tests, im attaching the output for each run, with NPMAX=4 and NPMAX=32, also -log_summary runs with -pc_type hypre and without it, with 1 and 2 cores, all of this with debugging turned off. The matrix is 200,000x200,000, full curvilinear 3d meshes, non-hydrostatic pressure solver. Thanks a lot for your insight, Manuel On Sun, Jan 8, 2017 at 9:48 AM, Barry Smith wrote: > > we need to see the -log_summary with hypre on 1 and 2 processes (with > debugging tuned off) also we need to see the output from > > make stream NPMAX=4 > > run in the PETSc directory. > > > > > On Jan 7, 2017, at 7:38 PM, Manuel Valera wrote: > > > > Ok great, i tried those command line args and this is the result: > > > > when i use -pc_type gamg: > > > > [1]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > > [1]PETSC ERROR: Petsc has generated inconsistent data > > [1]PETSC ERROR: Have un-symmetric graph (apparently). Use > '-pc_gamg_sym_graph true' to symetrize the graph or '-pc_gamg_threshold > -1.0' if the matrix is structurally symmetric. 
-------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- WARNING: -log_summary is being deprecated; switch to -log_view ************************************************************************************************************************ *** WIDEN YOUR WINDOW TO 120 CHARACTERS. 
Use 'enscript -r -fCourier9' to print this document *** ************************************************************************************************************************ ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- ./ucmsMR on a arch-linux2-c-debug named ocean with 1 processor, by valera Sun Jan 8 14:24:49 2017 Using Petsc Release Version 3.7.4, unknown Max Max/Min Avg Total Time (sec): 3.386e+01 1.00000 3.386e+01 Objects: 8.100e+01 1.00000 8.100e+01 Flops: 3.820e+10 1.00000 3.820e+10 3.820e+10 Flops/sec: 1.128e+09 1.00000 1.128e+09 1.128e+09 MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Reductions: 0.000e+00 0.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flops and VecAXPY() for complex vectors of length N --> 8N flops Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 3.3859e+01 100.0% 3.8199e+10 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flops: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %F - percent flops in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flops --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage VecDotNorm2 1462 1.0 8.0159e-01 1.0 1.17e+09 1.0 0.0e+00 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 1459 VecMDot 1411 1.0 2.3061e+00 1.0 8.38e+09 1.0 0.0e+00 0.0e+00 0.0e+00 7 22 0 0 0 7 22 0 0 0 3633 VecNorm 1337 1.0 2.3786e-01 1.0 5.35e+08 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2248 VecScale 2924 1.0 3.3265e-01 1.0 5.85e+08 1.0 0.0e+00 0.0e+00 0.0e+00 1 2 0 0 0 1 2 0 0 0 1758 VecSet 1538 1.0 2.1733e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 VecAXPY 2924 1.0 5.2265e-01 1.0 1.17e+09 1.0 0.0e+00 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2238 VecAYPX 5 1.0 1.2798e-03 1.0 1.00e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 781 VecMAXPY 2822 1.0 4.9800e+00 1.0 1.68e+10 1.0 0.0e+00 0.0e+00 0.0e+00 15 44 0 0 0 15 44 0 0 0 3365 VecAssemblyBegin 7 1.0 1.9073e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAssemblyEnd 7 1.0 7.1526e-07 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecScatterBegin 5 1.0 1.6944e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMult 1467 1.0 4.9397e+00 1.0 9.60e+09 1.0 0.0e+00 0.0e+00 0.0e+00 15 25 0 0 0 15 25 0 0 0 1944 MatConvert 2 1.0 6.1032e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyBegin 3 1.0 1.9073e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 
0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyEnd 3 1.0 1.5736e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRowIJ 2 1.0 2.3842e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatLoad 1 1.0 6.9497e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatView 1 1.0 1.2220e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSetUp 1 1.0 5.7235e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSolve 5 1.0 2.6259e+01 1.0 3.82e+10 1.0 0.0e+00 0.0e+00 0.0e+00 78100 0 0 0 78100 0 0 0 1455 PCSetUp 2 1.0 3.8668e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 PCApply 1463 1.0 1.2196e+01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 36 0 0 0 0 36 0 0 0 0 0 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Vector 69 6 9609168 0. Vector Scatter 1 1 656 0. Matrix 3 1 803112 0. Matrix Null Space 1 1 592 0. Viewer 3 1 816 0. Krylov Solver 1 0 0 0. Preconditioner 2 1 1384 0. Index Set 1 1 776 0. 
======================================================================================================================== Average time to get PetscTime(): 2.38419e-08 #PETSc Option Table entries: -log_summary -matload_block_size 1 -pc_hypre_boomeramg_nodal_coarsen 1 -pc_hypre_boomeramg_vec_interp_variant 1 -pc_type hypre #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 Configure options: --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack --download-mpich --download-hdf5 --download-netcdf --download-hypre --download-metis --download-parmetis --download-trillinos --with-debugging=no ----------------------------------------- Libraries compiled on Sun Jan 8 14:06:45 2017 on ocean Machine characteristics: Linux-3.10.0-327.36.3.el7.x86_64-x86_64-with-centos-7.2.1511-Core Using PETSc directory: /home/valera/petsc Using PETSc arch: arch-linux2-c-debug ----------------------------------------- Using C compiler: /home/valera/petsc/arch-linux2-c-debug/bin/mpicc -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O ${COPTFLAGS} ${CFLAGS} Using Fortran compiler: /home/valera/petsc/arch-linux2-c-debug/bin/mpif90 -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} ----------------------------------------- Using include paths: -I/home/valera/petsc/arch-linux2-c-debug/include -I/home/valera/petsc/include -I/home/valera/petsc/include -I/home/valera/petsc/arch-linux2-c-debug/include ----------------------------------------- Using C linker: /home/valera/petsc/arch-linux2-c-debug/bin/mpicc Using Fortran linker: /home/valera/petsc/arch-linux2-c-debug/bin/mpif90 Using libraries: -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib -L/home/valera/petsc/arch-linux2-c-debug/lib -lpetsc -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib 
-L/home/valera/petsc/arch-linux2-c-debug/lib -lparmetis -lmetis -lHYPRE -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -L/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -lmpicxx -lstdc++ -lflapack -lfblas -lnetcdf -lhdf5hl_fortran -lhdf5_fortran -lhdf5_hl -lhdf5 -lpthread -lm -lmpifort -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpicxx -lstdc++ -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib -L/home/valera/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -L/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -ldl -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib -lmpi -lgcc_s -ldl ----------------------------------------- -------------- next part -------------- WARNING: -log_summary is being deprecated; switch to -log_view ************************************************************************************************************************ *** WIDEN YOUR WINDOW TO 120 CHARACTERS. Use 'enscript -r -fCourier9' to print this document *** ************************************************************************************************************************ ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- ./ucmsMR on a arch-linux2-c-debug named ocean with 1 processor, by valera Sun Jan 8 14:33:19 2017 Using Petsc Release Version 3.7.4, unknown Max Max/Min Avg Total Time (sec): 9.016e+00 1.00000 9.016e+00 Objects: 8.300e+01 1.00000 8.300e+01 Flops: 5.021e+09 1.00000 5.021e+09 5.021e+09 Flops/sec: 5.569e+08 1.00000 5.569e+08 5.569e+08 MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Reductions: 0.000e+00 0.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flops and VecAXPY() for complex vectors of length N --> 8N flops Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message 
Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 9.0160e+00 100.0% 5.0209e+09 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flops: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). %T - percent time in this phase %F - percent flops in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flops --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage VecDotNorm2 155 1.0 8.4427e-02 1.0 1.24e+08 1.0 0.0e+00 0.0e+00 0.0e+00 1 2 0 0 0 1 2 0 0 0 1469 VecMDot 145 1.0 2.3651e-01 1.0 8.70e+08 1.0 0.0e+00 0.0e+00 0.0e+00 3 17 0 0 0 3 17 0 0 0 3679 VecNorm 30 1.0 5.3427e-03 1.0 1.20e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 2246 VecScale 310 1.0 3.4916e-02 1.0 6.20e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 1776 VecSet 74 1.0 4.6141e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 VecAXPY 310 1.0 5.4786e-02 1.0 1.24e+08 1.0 0.0e+00 
0.0e+00 0.0e+00 1 2 0 0 0 1 2 0 0 0 2263 VecAYPX 5 1.0 1.1373e-03 1.0 1.00e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 879 VecMAXPY 290 1.0 5.1323e-01 1.0 1.74e+09 1.0 0.0e+00 0.0e+00 0.0e+00 6 35 0 0 0 6 35 0 0 0 3390 VecAssemblyBegin 7 1.0 2.3842e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAssemblyEnd 7 1.0 7.1526e-07 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecScatterBegin 5 1.0 1.0016e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMult 160 1.0 5.3934e-01 1.0 1.05e+09 1.0 0.0e+00 0.0e+00 0.0e+00 6 21 0 0 0 6 21 0 0 0 1942 MatSolve 155 1.0 6.4364e-01 1.0 1.01e+09 1.0 0.0e+00 0.0e+00 0.0e+00 7 20 0 0 0 7 20 0 0 0 1577 MatLUFactorNum 1 1.0 3.0440e-02 1.0 2.57e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 843 MatILUFactorSym 1 1.0 1.4438e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatConvert 1 1.0 1.7564e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyBegin 3 1.0 7.1526e-07 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyEnd 3 1.0 1.3664e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRowIJ 2 1.0 1.4305e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetOrdering 1 1.0 1.2245e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatLoad 1 1.0 6.2274e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 MatView 1 1.0 1.1229e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 KSPSetUp 1 1.0 4.2639e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSolve 5 1.0 2.1151e+00 1.0 5.00e+09 1.0 0.0e+00 0.0e+00 0.0e+00 23 99 0 0 0 23 99 0 0 0 2362 PCSetUp 2 1.0 2.8751e-01 1.0 2.57e+07 1.0 0.0e+00 0.0e+00 0.0e+00 3 1 0 0 0 3 1 0 0 0 89 PCApply 156 1.0 6.9180e-01 1.0 1.01e+09 1.0 0.0e+00 0.0e+00 0.0e+00 8 20 0 0 0 8 20 0 0 0 1467 
------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Vector 67 4 6406112 0. Vector Scatter 1 1 656 0. Matrix 4 1 803112 0. Matrix Null Space 1 1 592 0. Viewer 3 1 816 0. Krylov Solver 1 0 0 0. Preconditioner 2 1 1384 0. Index Set 4 1 776 0. ======================================================================================================================== Average time to get PetscTime(): 2.38419e-08 #PETSc Option Table entries: -log_summary -matload_block_size 1 -pc_hypre_boomeramg_nodal_coarsen 1 -pc_hypre_boomeramg_vec_interp_variant 1 #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 Configure options: --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack --download-mpich --download-hdf5 --download-netcdf --download-hypre --download-metis --download-parmetis --download-trillinos --with-debugging=no ----------------------------------------- Libraries compiled on Sun Jan 8 14:06:45 2017 on ocean Machine characteristics: Linux-3.10.0-327.36.3.el7.x86_64-x86_64-with-centos-7.2.1511-Core Using PETSc directory: /home/valera/petsc Using PETSc arch: arch-linux2-c-debug ----------------------------------------- Using C compiler: /home/valera/petsc/arch-linux2-c-debug/bin/mpicc -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O ${COPTFLAGS} ${CFLAGS} Using Fortran compiler: /home/valera/petsc/arch-linux2-c-debug/bin/mpif90 -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} ----------------------------------------- Using include paths: -I/home/valera/petsc/arch-linux2-c-debug/include 
-I/home/valera/petsc/include -I/home/valera/petsc/include -I/home/valera/petsc/arch-linux2-c-debug/include ----------------------------------------- Using C linker: /home/valera/petsc/arch-linux2-c-debug/bin/mpicc Using Fortran linker: /home/valera/petsc/arch-linux2-c-debug/bin/mpif90 Using libraries: -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib -L/home/valera/petsc/arch-linux2-c-debug/lib -lpetsc -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib -L/home/valera/petsc/arch-linux2-c-debug/lib -lparmetis -lmetis -lHYPRE -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -L/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -lmpicxx -lstdc++ -lflapack -lfblas -lnetcdf -lhdf5hl_fortran -lhdf5_fortran -lhdf5_hl -lhdf5 -lpthread -lm -lmpifort -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpicxx -lstdc++ -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib -L/home/valera/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -L/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -ldl -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib -lmpi -lgcc_s -ldl ----------------------------------------- -------------- next part -------------- WARNING: -log_summary is being deprecated; switch to -log_view ************************************************************************************************************************ *** WIDEN YOUR WINDOW TO 120 CHARACTERS. 
Use 'enscript -r -fCourier9' to print this document *** ************************************************************************************************************************ ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- ./ucmsMR on a arch-linux2-c-debug named ocean with 2 processors, by valera Sun Jan 8 14:27:52 2017 Using Petsc Release Version 3.7.4, unknown Max Max/Min Avg Total Time (sec): 2.558e+01 1.01638 2.537e+01 Objects: 8.700e+01 1.00000 8.700e+01 Flops: 2.296e+10 1.00000 2.296e+10 4.592e+10 Flops/sec: 9.123e+08 1.01638 9.050e+08 1.810e+09 MPI Messages: 1.768e+03 1.00000 1.768e+03 3.535e+03 MPI Message Lengths: 4.961e+07 1.00000 2.807e+04 9.922e+07 MPI Reductions: 5.153e+03 1.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flops and VecAXPY() for complex vectors of length N --> 8N flops Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 2.5372e+01 100.0% 4.5918e+10 100.0% 3.535e+03 100.0% 2.807e+04 100.0% 5.152e+03 100.0% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flops: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %F - percent flops in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flops --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage VecDotNorm2 1759 1.0 7.5034e-01 1.1 7.04e+08 1.0 0.0e+00 0.0e+00 1.8e+03 3 3 0 0 34 3 3 0 0 34 1875 VecMDot 1698 1.0 2.1292e+00 1.2 5.03e+09 1.0 0.0e+00 0.0e+00 1.7e+03 8 22 0 0 33 8 22 0 0 33 4727 VecNorm 1634 1.0 2.0453e-01 1.1 3.27e+08 1.0 0.0e+00 0.0e+00 1.6e+03 1 1 0 0 32 1 1 0 0 32 3196 VecScale 3518 1.0 2.1820e-01 1.0 3.52e+08 1.0 0.0e+00 0.0e+00 0.0e+00 1 2 0 0 0 1 2 0 0 0 3225 VecSet 1769 1.0 1.5442e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 1 0 0 0 0 1 0 0 0 0 0 VecAXPY 3518 1.0 3.4378e-01 1.0 7.04e+08 1.0 0.0e+00 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 4093 VecAYPX 5 1.0 1.2600e-03 2.1 5.00e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 794 VecMAXPY 3396 1.0 3.4249e+00 1.0 1.01e+10 1.0 0.0e+00 0.0e+00 0.0e+00 13 44 0 0 0 13 44 0 0 0 5878 VecAssemblyBegin 12 1.7 2.9197e-03 4.9 0.00e+00 0.0 0.0e+00 0.0e+00 2.1e+01 0 0 0 0 0 0 0 0 0 0 0 VecAssemblyEnd 12 1.7 1.1683e-05 1.6 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecScatterBegin 1774 1.0 2.3620e+0037.3 0.00e+00 0.0 3.5e+03 2.2e+04 1.0e+01 5 0100 79 0 5 0100 79 0 0 VecScatterEnd 1764 1.0 8.7893e-02 2.3 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMult 1764 1.0 3.4805e+00 1.0 5.77e+09 1.0 3.5e+03 2.0e+04 0.0e+00 14 25100 71 0 14 25100 71 0 3318 MatConvert 2 1.0 2.9602e-02 1.0 0.00e+00 0.0 
0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyBegin 1 1.0 6.1677e-02384.4 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyEnd 1 1.0 8.9786e-03 1.0 0.00e+00 0.0 4.0e+00 5.0e+03 8.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRowIJ 4 1.0 2.6226e-06 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatLoad 1 1.0 1.7685e-01 1.0 0.00e+00 0.0 7.0e+00 3.0e+06 1.3e+01 1 0 0 21 0 1 0 0 21 0 0 KSPSetUp 1 1.0 1.9739e-02 1.2 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSolve 5 1.0 1.8353e+01 1.0 2.30e+10 1.0 3.5e+03 2.0e+04 5.1e+03 72100100 71 99 72100100 71 99 2502 PCSetUp 2 1.0 6.5126e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 8.0e+00 3 0 0 0 0 3 0 0 0 0 0 PCApply 1760 1.0 8.2935e+00 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 32 0 0 0 0 32 0 0 0 0 0 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Vector 72 8 7212944 0. Vector Scatter 3 2 1312 0. Matrix 3 0 0 0. Viewer 2 0 0 0. Index Set 4 4 13104 0. Krylov Solver 1 0 0 0. Preconditioner 2 1 1384 0. 
======================================================================================================================== Average time to get PetscTime(): 2.38419e-08 Average time for MPI_Barrier(): 1.99318e-05 Average time for zero size MPI_Send(): 9.41753e-06 #PETSc Option Table entries: -log_summary -matload_block_size 1 -pc_hypre_boomeramg_nodal_coarsen 1 -pc_hypre_boomeramg_vec_interp_variant 1 -pc_type hypre #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 Configure options: --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack --download-mpich --download-hdf5 --download-netcdf --download-hypre --download-metis --download-parmetis --download-trillinos --with-debugging=no ----------------------------------------- Libraries compiled on Sun Jan 8 14:06:45 2017 on ocean Machine characteristics: Linux-3.10.0-327.36.3.el7.x86_64-x86_64-with-centos-7.2.1511-Core Using PETSc directory: /home/valera/petsc Using PETSc arch: arch-linux2-c-debug ----------------------------------------- Using C compiler: /home/valera/petsc/arch-linux2-c-debug/bin/mpicc -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O ${COPTFLAGS} ${CFLAGS} Using Fortran compiler: /home/valera/petsc/arch-linux2-c-debug/bin/mpif90 -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} ----------------------------------------- Using include paths: -I/home/valera/petsc/arch-linux2-c-debug/include -I/home/valera/petsc/include -I/home/valera/petsc/include -I/home/valera/petsc/arch-linux2-c-debug/include ----------------------------------------- Using C linker: /home/valera/petsc/arch-linux2-c-debug/bin/mpicc Using Fortran linker: /home/valera/petsc/arch-linux2-c-debug/bin/mpif90 Using libraries: -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib 
-L/home/valera/petsc/arch-linux2-c-debug/lib -lpetsc -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib -L/home/valera/petsc/arch-linux2-c-debug/lib -lparmetis -lmetis -lHYPRE -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -L/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -lmpicxx -lstdc++ -lflapack -lfblas -lnetcdf -lhdf5hl_fortran -lhdf5_fortran -lhdf5_hl -lhdf5 -lpthread -lm -lmpifort -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpicxx -lstdc++ -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib -L/home/valera/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -L/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -ldl -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib -lmpi -lgcc_s -ldl -------------- next part -------------- WARNING: -log_summary is being deprecated; switch to -log_view ************************************************************************************************************************ *** WIDEN YOUR WINDOW TO 120 CHARACTERS. Use 'enscript -r -fCourier9' to print this document *** ************************************************************************************************************************ ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- ./ucmsMR on a arch-linux2-c-debug named ocean with 2 processors, by valera Sun Jan 8 14:32:12 2017 Using Petsc Release Version 3.7.4, unknown Max Max/Min Avg Total Time (sec): 1.241e+01 1.03508 1.220e+01 Objects: 9.300e+01 1.00000 9.300e+01 Flops: 8.662e+09 1.00000 8.662e+09 1.732e+10 Flops/sec: 7.222e+08 1.03508 7.100e+08 1.420e+09 MPI Messages: 5.535e+02 1.00000 5.535e+02 1.107e+03 MPI Message Lengths: 2.533e+07 1.00000 4.576e+04 5.066e+07 MPI Reductions: 1.548e+03 1.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flops and VecAXPY() for complex vectors of length N --> 8N flops Summary of Stages: 
----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 1.2204e+01 100.0% 1.7325e+10 100.0% 1.107e+03 100.0% 4.576e+04 100.0% 1.547e+03 99.9% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flops: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). %T - percent time in this phase %F - percent flops in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flops --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage VecDotNorm2 545 1.0 4.1106e-01 1.4 2.18e+08 1.0 0.0e+00 0.0e+00 5.4e+02 3 3 0 0 35 3 3 0 0 35 1061 VecMDot 525 1.0 9.6894e-01 1.5 1.48e+09 1.0 0.0e+00 0.0e+00 5.2e+02 7 17 0 0 34 7 17 0 0 34 3061 VecNorm 420 1.0 8.5726e-02 1.4 8.40e+07 1.0 0.0e+00 0.0e+00 4.2e+02 1 1 0 0 27 1 1 0 0 27 1960 VecScale 1090 1.0 8.5441e-02 1.0 1.09e+08 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2551 VecSet 555 1.0 5.4937e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 
0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 1090 1.0 1.2735e-01 1.1 2.18e+08 1.0 0.0e+00 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3424 VecAYPX 5 1.0 1.4422e-03 2.4 5.00e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 693 VecMAXPY 1050 1.0 1.1876e+00 1.1 2.97e+09 1.0 0.0e+00 0.0e+00 0.0e+00 9 34 0 0 0 9 34 0 0 0 4994 VecAssemblyBegin 12 1.7 3.0236e-03 4.9 0.00e+00 0.0 0.0e+00 0.0e+00 2.1e+01 0 0 0 0 1 0 0 0 0 1 0 VecAssemblyEnd 12 1.7 1.1206e-05 1.4 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecScatterBegin 560 1.0 2.3305e+0099.8 0.00e+00 0.0 1.1e+03 2.7e+04 1.0e+01 10 0 99 59 1 10 0 99 59 1 0 VecScatterEnd 550 1.0 5.8907e-02 5.2 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMult 550 1.0 1.2700e+00 1.1 1.80e+09 1.0 1.1e+03 2.0e+04 0.0e+00 10 21 99 43 0 10 21 99 43 0 2835 MatSolve 545 1.0 1.4914e+00 1.2 1.77e+09 1.0 0.0e+00 0.0e+00 0.0e+00 11 20 0 0 0 11 20 0 0 0 2375 MatLUFactorNum 1 1.0 3.4873e-02 2.3 1.27e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 728 MatILUFactorSym 1 1.0 1.3488e-02 1.8 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatConvert 1 1.0 1.7024e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyBegin 1 1.0 6.1171e-02344.9 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyEnd 1 1.0 9.1953e-03 1.0 0.00e+00 0.0 4.0e+00 5.0e+03 8.0e+00 0 0 0 0 1 0 0 0 0 1 0 MatGetRowIJ 3 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetOrdering 1 1.0 1.6165e-03 4.1 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatLoad 1 1.0 1.7585e-01 1.0 0.00e+00 0.0 7.0e+00 3.0e+06 1.3e+01 1 0 1 41 1 1 0 1 41 1 0 KSPSetUp 2 1.0 1.9348e-02 1.1 0.00e+00 0.0 0.0e+00 0.0e+00 2.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSolve 5 1.0 5.2705e+00 1.0 8.66e+09 1.0 1.1e+03 2.0e+04 1.5e+03 43100 99 43 96 43100 99 43 96 3287 PCSetUp 3 1.0 6.7521e-01 1.0 1.27e+07 1.0 0.0e+00 0.0e+00 4.0e+00 5 0 0 0 0 5 0 0 0 0 38 PCSetUpOnBlocks 5 1.0 5.0066e-02 2.2 1.27e+07 1.0 0.0e+00 
0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 507 PCApply 546 1.0 1.5895e+00 1.2 1.77e+09 1.0 0.0e+00 0.0e+00 0.0e+00 12 20 0 0 0 12 20 0 0 0 2229 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Vector 72 6 5609648 0. Vector Scatter 3 2 1312 0. Matrix 4 0 0 0. Viewer 2 0 0 0. Index Set 7 4 13104 0. Krylov Solver 2 0 0 0. Preconditioner 3 1 1384 0. ======================================================================================================================== Average time to get PetscTime(): 2.38419e-08 Average time for MPI_Barrier(): 1.95503e-05 Average time for zero size MPI_Send(): 1.03712e-05 #PETSc Option Table entries: -log_summary -matload_block_size 1 -pc_hypre_boomeramg_nodal_coarsen 1 -pc_hypre_boomeramg_vec_interp_variant 1 #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 Configure options: --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack --download-mpich --download-hdf5 --download-netcdf --download-hypre --download-metis --download-parmetis --download-trillinos --with-debugging=no ----------------------------------------- Libraries compiled on Sun Jan 8 14:06:45 2017 on ocean Machine characteristics: Linux-3.10.0-327.36.3.el7.x86_64-x86_64-with-centos-7.2.1511-Core Using PETSc directory: /home/valera/petsc Using PETSc arch: arch-linux2-c-debug ----------------------------------------- Using C compiler: /home/valera/petsc/arch-linux2-c-debug/bin/mpicc -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O ${COPTFLAGS} ${CFLAGS} Using Fortran compiler: /home/valera/petsc/arch-linux2-c-debug/bin/mpif90 
-Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} ----------------------------------------- Using include paths: -I/home/valera/petsc/arch-linux2-c-debug/include -I/home/valera/petsc/include -I/home/valera/petsc/include -I/home/valera/petsc/arch-linux2-c-debug/include ----------------------------------------- Using C linker: /home/valera/petsc/arch-linux2-c-debug/bin/mpicc Using Fortran linker: /home/valera/petsc/arch-linux2-c-debug/bin/mpif90 Using libraries: -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib -L/home/valera/petsc/arch-linux2-c-debug/lib -lpetsc -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib -L/home/valera/petsc/arch-linux2-c-debug/lib -lparmetis -lmetis -lHYPRE -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -L/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -lmpicxx -lstdc++ -lflapack -lfblas -lnetcdf -lhdf5hl_fortran -lhdf5_fortran -lhdf5_hl -lhdf5 -lpthread -lm -lmpifort -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpicxx -lstdc++ -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib -L/home/valera/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -L/usr/lib/gcc/x86_64-redhat-linux/4.8.5 -ldl -Wl,-rpath,/home/valera/petsc/arch-linux2-c-debug/lib -lmpi -lgcc_s -ldl ----------------------------------------- -------------- next part -------------- [valera at ocean petsc]$ make stream NPMAX=4 cd src/benchmarks/streams; /usr/bin/gmake --no-print-directory PETSC_DIR=/home/valera/petsc PETSC_ARCH=arch-linux2-c-debug stream /home/valera/petsc/arch-linux2-c-debug/bin/mpicc -o MPIVersion.o -c -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O -I/home/valera/petsc/include -I/home/valera/petsc/arch-linux2-c-debug/include `pwd`/MPIVersion.c Running streams with '/home/valera/petsc/arch-linux2-c-debug/bin/mpiexec ' using 'NPMAX=4' Number of MPI processes 1 Processor names ocean Triad: 5998.4683 Rate (MB/s) Number of MPI processes 2 Processor names ocean 
ocean Triad: 23010.7259 Rate (MB/s) Number of MPI processes 3 Processor names ocean ocean ocean Triad: 6295.2156 Rate (MB/s) Number of MPI processes 4 Processor names ocean ocean ocean ocean Triad: 7019.8170 Rate (MB/s) ------------------------------------------------ np speedup 1 1.0 2 3.84 3 1.05 4 1.17 Estimation of possible speedup of MPI programs based on Streams benchmark. It appears you have 1 node(s) Unable to open matplotlib to plot speedup -------------- next part -------------- [valera at ocean petsc]$ make PETSC_DIR=/home/valera/petsc PETSC_ARCH=arch-linux2-c-debug streams cd src/benchmarks/streams; /usr/bin/gmake --no-print-directory PETSC_DIR=/home/valera/petsc PETSC_ARCH=arch-linux2-c-debug streams /home/valera/petsc/arch-linux2-c-debug/bin/mpicc -o MPIVersion.o -c -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O -I/home/valera/petsc/include -I/home/valera/petsc/arch-linux2-c-debug/include `pwd`/MPIVersion.c Running streams with '/home/valera/petsc/arch-linux2-c-debug/bin/mpiexec ' using 'NPMAX=32' Number of MPI processes 1 Processor names ocean Triad: 11830.2146 Rate (MB/s) Number of MPI processes 2 Processor names ocean ocean Triad: 23111.7734 Rate (MB/s) Number of MPI processes 3 Processor names ocean ocean ocean Triad: 6692.7679 Rate (MB/s) Number of MPI processes 4 Processor names ocean ocean ocean ocean Triad: 7043.7175 Rate (MB/s) Number of MPI processes 5 Processor names ocean ocean ocean ocean ocean Triad: 33053.3434 Rate (MB/s) Number of MPI processes 6 Processor names ocean ocean ocean ocean ocean ocean Triad: 33129.8788 Rate (MB/s) Number of MPI processes 7 Processor names ocean ocean ocean ocean ocean ocean ocean Triad: 32379.8370 Rate (MB/s) Number of MPI processes 8 Processor names ocean ocean ocean ocean ocean ocean ocean ocean Triad: 31644.3971 Rate (MB/s) Number of MPI processes 9 Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean Triad: 30214.8803 Rate (MB/s) Number of MPI 
processes 10 Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean Triad: 31700.6859 Rate (MB/s) Number of MPI processes 11 Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean Triad: 32369.1251 Rate (MB/s) Number of MPI processes 12 Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean Triad: 7677.4204 Rate (MB/s) Number of MPI processes 13 Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean Triad: 33298.0308 Rate (MB/s) Number of MPI processes 14 Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean Triad: 33220.6717 Rate (MB/s) Number of MPI processes 15 Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean Triad: 7334.5064 Rate (MB/s) Number of MPI processes 16 Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean Triad: 7463.6337 Rate (MB/s) Number of MPI processes 17 Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean Triad: 14108.2617 Rate (MB/s) Number of MPI processes 18 Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean Triad: 29450.3077 Rate (MB/s) Number of MPI processes 19 Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean Triad: 8997.3655 Rate (MB/s) Number of MPI processes 20 Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean Triad: 9334.5314 Rate (MB/s) Number of MPI processes 21 Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean Triad: 30470.9315 Rate (MB/s) Number of MPI processes 22 
Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean Triad: 19822.1616 Rate (MB/s) Number of MPI processes 23 Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean Triad: 32290.0731 Rate (MB/s) Number of MPI processes 24 Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean Triad: 11822.5303 Rate (MB/s) Number of MPI processes 25 Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean Triad: 11310.1869 Rate (MB/s) Number of MPI processes 26 Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean Triad: 33472.8900 Rate (MB/s) Number of MPI processes 27 Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean Triad: 30328.8841 Rate (MB/s) Number of MPI processes 28 Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean Triad: 31779.9057 Rate (MB/s) Number of MPI processes 29 Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean Triad: 28275.6109 Rate (MB/s) Number of MPI processes 30 Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean 
ocean Triad: 16085.2815 Rate (MB/s) Number of MPI processes 31 Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean Triad: 13092.9246 Rate (MB/s) Number of MPI processes 32 Processor names ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean ocean Triad: 18967.6909 Rate (MB/s) ------------------------------------------------ np speedup 1 1.0 2 1.95 3 0.57 4 0.6 5 2.79 6 2.8 7 2.74 8 2.67 9 2.55 10 2.68 11 2.74 12 0.65 13 2.81 14 2.81 15 0.62 16 0.63 17 1.19 18 2.49 19 0.76 20 0.79 21 2.58 22 1.68 23 2.73 24 1.0 25 0.96 26 2.83 27 2.56 28 2.69 29 2.39 30 1.36 31 1.11 32 1.6 Estimation of possible speedup of MPI programs based on Streams benchmark. It appears you have 1 node(s) Unable to open matplotlib to plot speedup From bsmith at mcs.anl.gov Sun Jan 8 18:05:28 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sun, 8 Jan 2017 18:05:28 -0600 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: References: <87lgum4cer.fsf@jedbrown.org> <87h95a49xg.fsf@jedbrown.org> <0CF3AB08-71F9-42D0-B2C6-0EDFEE4E9635@mcs.anl.gov> Message-ID: <61AE96AD-8578-4197-A00D-E7A1483B5C1F@mcs.anl.gov> Manuel, Ok there are two (actually 3) distinct things you need to deal with to get get any kind of performance out of this machine. 0) When running on the machine you cannot share it with other peoples jobs or you will get timings all over the place so run streams and benchmarks of your code when no one else has jobs running (The Unix top command helps) 1) mpiexec is making bad decisions about process binding (what MPI processes are bound/assigned to what MPI cores). From streams you have np speedup 1 1.0 2 1.95 3 0.57 4 0.6 5 2.79 6 2.8 7 2.74 8 2.67 9 2.55 10 2.68 ..... 
This is nuts. When going from 2 to 3 processes the performance goes WAY down. If the machine is empty and MPI did a good assignment of processes to cores, the speedup should not go down with more cores; it should just stagnate. So you need to find out how to do process binding with MPI; see http://www.mcs.anl.gov/petsc/documentation/faq.html#computers and the links from there. You can run the streams test with binding by, for example, make streams NPMAX=4 MPI_BINDING="--binding cpu:sockets". Once you have a good binding for your MPI, make sure you always run mpiexec with that binding when running your code.

2) Both preconditioners you have tried for your problem are terrible. With block Jacobi it went from 156 linear iterations (for 5 linear solves) to 546 iterations. With AMG it went from 1463!! iterations to 1760. These are huge numbers of iterations for algebraic multigrid! For some reason AMG doesn't like your pressure matrix (even though AMG generally loves pressure matrices). What do you have for boundary conditions for your pressure? Please run with -ksp_view_mat binary -ksp_view_rhs binary and then send the resulting file binaryoutput to petsc-maint at mcs.anl.gov and we'll see if we can figure out why AMG doesn't like it.

> On Jan 8, 2017, at 4:41 PM, Manuel Valera wrote:
>
> Ok, I just did the streams and log_summary tests; I'm attaching the output for each run, with NPMAX=4 and NPMAX=32, also -log_summary runs with -pc_type hypre and without it, with 1 and 2 cores, all of this with debugging turned off.
>
> The matrix is 200,000x200,000, full curvilinear 3d meshes, non-hydrostatic pressure solver.
>
> Thanks a lot for your insight,
>
> Manuel
>
> On Sun, Jan 8, 2017 at 9:48 AM, Barry Smith wrote:
>
> We need to see the -log_summary with hypre on 1 and 2 processes (with debugging turned off); also we need to see the output from
>
> make stream NPMAX=4
>
> run in the PETSc directory.
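As a side check, the np/speedup column in the streams reports quoted above is simply the Triad rate at np processes divided by the single-process rate. A minimal sketch using the numbers from the NPMAX=32 run:

```shell
# Speedup = Triad rate at np processes / Triad rate at np=1.
# Rates below are quoted from the NPMAX=32 streams run earlier in this thread.
base=11830.2146
speedup() { awk -v r="$1" -v b="$base" 'BEGIN { printf "%.2f", r/b }'; }
speedup 23111.7734; echo " (np=2)"
speedup 6692.7679;  echo " (np=3)  <- bandwidth-starved placement"
speedup 7043.7175;  echo " (np=4)"
```

This reproduces the 1.95 / 0.57 / 0.6 figures Barry points at: the collapse from np=2 to np=3 is the signature of bad process placement on a memory-bandwidth-limited benchmark, not a property of the code being run.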
> > > > > On Jan 7, 2017, at 7:38 PM, Manuel Valera wrote: > > > > Ok great, i tried those command line args and this is the result: > > > > when i use -pc_type gamg: > > > > [1]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > > [1]PETSC ERROR: Petsc has generated inconsistent data > > [1]PETSC ERROR: Have un-symmetric graph (apparently). Use '-pc_gamg_sym_graph true' to symetrize the graph or '-pc_gamg_threshold -1.0' if the matrix is structurally symmetric. > > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. > > [1]PETSC ERROR: Petsc Release Version 3.7.4, unknown > > [1]PETSC ERROR: ./ucmsMR on a arch-linux2-c-debug named ocean by valera Sat Jan 7 17:35:05 2017 > > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack --download-mpich --download-hdf5 --download-netcdf --download-hypre --download-metis --download-parmetis --download-trillinos > > [1]PETSC ERROR: #1 smoothAggs() line 462 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c > > [1]PETSC ERROR: #2 PCGAMGCoarsen_AGG() line 998 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c > > [1]PETSC ERROR: #3 PCSetUp_GAMG() line 571 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c > > [1]PETSC ERROR: #4 PCSetUp() line 968 in /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c > > [1]PETSC ERROR: #5 KSPSetUp() line 390 in /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c > > application called MPI_Abort(comm=0x84000002, 77) - process 1 > > > > > > when i use -pc_type gamg and -pc_gamg_sym_graph true: > > > > ------------------------------------------------------------------------ > > [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point Exception,probably divide by zero > > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger > > [0]PETSC ERROR: or see 
http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind > > [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors > > [1]PETSC ERROR: ------------------------------------------------------------------------ > > [1]PETSC ERROR: --------------------- Stack Frames ------------------------------------ > > [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, > > [1]PETSC ERROR: INSTEAD the line number of the start of the function > > [1]PETSC ERROR: is given. > > [1]PETSC ERROR: [1] LAPACKgesvd line 42 /usr/dataC/home/valera/petsc/src/ksp/ksp/impls/gmres/gmreig.c > > [1]PETSC ERROR: [1] KSPComputeExtremeSingularValues_GMRES line 24 /usr/dataC/home/valera/petsc/src/ksp/ksp/impls/gmres/gmreig.c > > [1]PETSC ERROR: [1] KSPComputeExtremeSingularValues line 51 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c > > [1]PETSC ERROR: [1] PCGAMGOptProlongator_AGG line 1187 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c > > [1]PETSC ERROR: [1] PCSetUp_GAMG line 472 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c > > [1]PETSC ERROR: [1] PCSetUp line 930 /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c > > [1]PETSC ERROR: [1] KSPSetUp line 305 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c > > [0] PCGAMGOptProlongator_AGG line 1187 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c > > [0]PETSC ERROR: [0] PCSetUp_GAMG line 472 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c > > [0]PETSC ERROR: [0] PCSetUp line 930 /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c > > [0]PETSC ERROR: [0] KSPSetUp line 305 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c > > [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > > > > when i use -pc_type hypre it actually shows something different on -ksp_view : > > > > KSP Object: 2 MPI processes > > type: gcr > > GCR: 
restart = 30 > > GCR: restarts performed = 37 > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-14, absolute=1e-50, divergence=10000. > > right preconditioning > > using UNPRECONDITIONED norm type for convergence test > > PC Object: 2 MPI processes > > type: hypre > > HYPRE BoomerAMG preconditioning > > HYPRE BoomerAMG: Cycle type V > > HYPRE BoomerAMG: Maximum number of levels 25 > > HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1 > > HYPRE BoomerAMG: Convergence tolerance PER hypre call 0. > > HYPRE BoomerAMG: Threshold for strong coupling 0.25 > > HYPRE BoomerAMG: Interpolation truncation factor 0. > > HYPRE BoomerAMG: Interpolation: max elements per row 0 > > HYPRE BoomerAMG: Number of levels of aggressive coarsening 0 > > HYPRE BoomerAMG: Number of paths for aggressive coarsening 1 > > HYPRE BoomerAMG: Maximum row sums 0.9 > > HYPRE BoomerAMG: Sweeps down 1 > > HYPRE BoomerAMG: Sweeps up 1 > > HYPRE BoomerAMG: Sweeps on coarse 1 > > HYPRE BoomerAMG: Relax down symmetric-SOR/Jacobi > > HYPRE BoomerAMG: Relax up symmetric-SOR/Jacobi > > HYPRE BoomerAMG: Relax on coarse Gaussian-elimination > > HYPRE BoomerAMG: Relax weight (all) 1. > > HYPRE BoomerAMG: Outer relax weight (all) 1. > > HYPRE BoomerAMG: Using CF-relaxation > > HYPRE BoomerAMG: Not using more complex smoothers. > > HYPRE BoomerAMG: Measure type local > > HYPRE BoomerAMG: Coarsen type Falgout > > HYPRE BoomerAMG: Interpolation type classical > > HYPRE BoomerAMG: Using nodal coarsening (with HYPRE_BOOMERAMGSetNodal() 1 > > HYPRE BoomerAMG: HYPRE_BoomerAMGSetInterpVecVariant() 1 > > linear system matrix = precond matrix: > > Mat Object: 2 MPI processes > > type: mpiaij > > rows=200000, cols=200000 > > total: nonzeros=3373340, allocated nonzeros=3373340 > > total number of mallocs used during MatSetValues calls =0 > > not using I-node (on process 0) routines > > > > > > but still the timing is terrible. 
> > > > > On Sat, Jan 7, 2017 at 5:28 PM, Jed Brown wrote: > > Manuel Valera writes: > > > > > Awesome Matt and Jed, > > > > > > The GCR is used because the matrix is not invertible and because this was > > > the algorithm that the previous library used, > > > > > > The preconditioner I'm aiming to use is multigrid; I thought I configured > > > the hypre BoomerAMG solver for this, but I agree that it doesn't show in > > > the log anywhere. How can I be sure it is being used? I sent a -ksp_view log > > > before in this thread > > > > Did you run with -pc_type hypre? > > > > > I had a problem with the matrix block sizes so I couldn't make the PETSc > > > native multigrid solver work, > > > > What block sizes? If the only variable is pressure, the block size > > would be 1 (default). > > > > > This is a nonhydrostatic pressure solver; it is an elliptic problem, so > > > multigrid is a must, > > > > Yes, multigrid should work well. > > > > > From mvalera at mail.sdsu.edu Sun Jan 8 18:22:04 2017 From: mvalera at mail.sdsu.edu (Manuel Valera) Date: Sun, 8 Jan 2017 16:22:04 -0800 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: <61AE96AD-8578-4197-A00D-E7A1483B5C1F@mcs.anl.gov> References: <87lgum4cer.fsf@jedbrown.org> <87h95a49xg.fsf@jedbrown.org> <0CF3AB08-71F9-42D0-B2C6-0EDFEE4E9635@mcs.anl.gov> <61AE96AD-8578-4197-A00D-E7A1483B5C1F@mcs.anl.gov> Message-ID: Ok many thanks Barry, For the cpu:sockets binding I get an ugly error: [valera at ocean petsc]$ make streams NPMAX=4 MPI_BINDING="--binding cpu:sockets" cd src/benchmarks/streams; /usr/bin/gmake --no-print-directory PETSC_DIR=/home/valera/petsc PETSC_ARCH=arch-linux2-c-debug streams /home/valera/petsc/arch-linux2-c-debug/bin/mpicc -o MPIVersion.o -c -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O -I/home/valera/petsc/include -I/home/valera/petsc/arch-linux2-c-debug/include `pwd`/MPIVersion.c Running streams with
'/home/valera/petsc/arch-linux2-c-debug/bin/mpiexec --binding cpu:sockets' using 'NPMAX=4' [proxy:0:0 at ocean] handle_bitmap_binding (tools/topo/hwloc/topo_hwloc.c:203): unrecognized binding string "cpu:sockets" [proxy:0:0 at ocean] HYDT_topo_hwloc_init (tools/topo/hwloc/topo_hwloc.c:415): error binding with bind "cpu:sockets" and map "(null)" [proxy:0:0 at ocean] HYDT_topo_init (tools/topo/topo.c:62): unable to initialize hwloc [proxy:0:0 at ocean] launch_procs (pm/pmiserv/pmip_cb.c:515): unable to initialize process topology [proxy:0:0 at ocean] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:892): launch_procs returned error [proxy:0:0 at ocean] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status [proxy:0:0 at ocean] main (pm/pmiserv/pmip.c:206): demux engine error waiting for event [mpiexec at ocean] control_cb (pm/pmiserv/pmiserv_cb.c:200): assert (!closed) failed [mpiexec at ocean] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status [mpiexec at ocean] HYD_pmci_wait_for_completion (pm/pmiserv/pmiserv_pmci.c:198): error waiting for event [mpiexec at ocean] main (ui/mpich/mpiexec.c:344): process manager error waiting for completion [... three identical repeats of the same error omitted ...] ------------------------------------------------ Im sending the binary file for the other list in a separate mail next, Regards, On Sun, Jan 8, 2017 at 4:05 PM, Barry Smith wrote: > > Manuel, > > Ok there are two (actually 3) distinct things you need to deal with > to get any kind of performance out of this machine. > > 0) When running on the machine you cannot share it with other peoples jobs > or you will get timings all over the place so run streams and benchmarks of > your code when no one else has jobs running (The Unix top command helps) > > 1) mpiexec is making bad decisions about process binding (what MPI > processes are bound/assigned to what MPI cores). > > From streams you have > > np speedup > 1 1.0 > 2 1.95 > 3 0.57 > 4 0.6 > 5 2.79 > 6 2.8 > 7 2.74 > 8 2.67 > 9 2.55 > 10 2.68 > ..... > > This is nuts. When going from 2 to 3 processes the performance goes WAY > down. If the machine is empty and MPI did a good assignment of processes to > cores the speedup should not go down for more cores it should just stagnate. > > So you need to find out how to do process binding with MPI see > http://www.mcs.anl.gov/petsc/documentation/faq.html#computers and the > links from there.
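The 'unrecognized binding string "cpu:sockets"' failures above come from hydra, the launcher MPICH ships: newer hydra versions dropped the old "--binding cpu:sockets" syntax in favor of "-bind-to". A hedged sketch; the accepted values should be checked against "mpiexec --help" for this particular installation.

```shell
# Sketch: this hydra rejects the old "cpu:sockets" string; newer versions
# spell bindings as -bind-to (check `mpiexec --help` for accepted values).
make streams NPMAX=4 MPI_BINDING="-bind-to socket"
# or bind each rank to a single core:
make streams NPMAX=4 MPI_BINDING="-bind-to core"
```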
> [...]
> > <logsumm2jacobi.txt> -------------- next part -------------- An HTML attachment was scrubbed...
URL: From bsmith at mcs.anl.gov Sun Jan 8 18:24:17 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sun, 8 Jan 2017 18:24:17 -0600 Subject: [petsc-users] -log_view hangs unexpectedly // how to optimize my kspsolve In-Reply-To: References: <87lgum4cer.fsf@jedbrown.org> <87h95a49xg.fsf@jedbrown.org> <0CF3AB08-71F9-42D0-B2C6-0EDFEE4E9635@mcs.anl.gov> <61AE96AD-8578-4197-A00D-E7A1483B5C1F@mcs.anl.gov> Message-ID: <8FDEFD4F-5FC5-4D51-AAA7-A0E71B9646AD@mcs.anl.gov> > On Jan 8, 2017, at 6:22 PM, Manuel Valera wrote: > > Ok many thanks Barry, > > For the cpu:sockets binding I get an ugly error: You need to find out for your MPI what binding option to use. Sadly it is different for different MPI implementations and can change over time. The material on our FAQ page is just an example. > > [valera at ocean petsc]$ make streams NPMAX=4 MPI_BINDING="--binding cpu:sockets" > cd src/benchmarks/streams; /usr/bin/gmake --no-print-directory PETSC_DIR=/home/valera/petsc PETSC_ARCH=arch-linux2-c-debug streams > /home/valera/petsc/arch-linux2-c-debug/bin/mpicc -o MPIVersion.o -c -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O -I/home/valera/petsc/include -I/home/valera/petsc/arch-linux2-c-debug/include `pwd`/MPIVersion.c > Running streams with '/home/valera/petsc/arch-linux2-c-debug/bin/mpiexec --binding cpu:sockets' using 'NPMAX=4' > [proxy:0:0 at ocean] handle_bitmap_binding (tools/topo/hwloc/topo_hwloc.c:203): unrecognized binding string "cpu:sockets" > [proxy:0:0 at ocean] HYDT_topo_hwloc_init (tools/topo/hwloc/topo_hwloc.c:415): error binding with bind "cpu:sockets" and map "(null)" > [proxy:0:0 at ocean] HYDT_topo_init (tools/topo/topo.c:62): unable to initialize hwloc > [proxy:0:0 at ocean] launch_procs (pm/pmiserv/pmip_cb.c:515): unable to initialize process topology > [proxy:0:0 at ocean] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:892): launch_procs returned error > [proxy:0:0 at ocean] HYDT_dmxu_poll_wait_for_event
(tools/demux/demux_poll.c:76): callback returned error status > [proxy:0:0 at ocean] main (pm/pmiserv/pmip.c:206): demux engine error waiting for event > [mpiexec at ocean] control_cb (pm/pmiserv/pmiserv_cb.c:200): assert (!closed) failed > [mpiexec at ocean] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status > [mpiexec at ocean] HYD_pmci_wait_for_completion (pm/pmiserv/pmiserv_pmci.c:198): error waiting for event > [... three identical repeats of the same error omitted ...] > [mpiexec at ocean] main
> > Please run with -ksp_view_mat binary -ksp_view_rhs binary and then send the resulting file binaryoutput to petsc-maint at mcs.anl.gov and we'll see if we can figure out why AMG doesn't like it. > > > > > > > > > On Jan 8, 2017, at 4:41 PM, Manuel Valera wrote: > > > > Ok, i just did the streams and log_summary tests, im attaching the output for each run, with NPMAX=4 and NPMAX=32, also -log_summary runs with -pc_type hypre and without it, with 1 and 2 cores, all of this with debugging turned off. > > > > The matrix is 200,000x200,000, full curvilinear 3d meshes, non-hydrostatic pressure solver. > > > > Thanks a lot for your insight, > > > > Manuel > > > > On Sun, Jan 8, 2017 at 9:48 AM, Barry Smith wrote: > > > > we need to see the -log_summary with hypre on 1 and 2 processes (with debugging tuned off) also we need to see the output from > > > > make stream NPMAX=4 > > > > run in the PETSc directory. > > > > > > > > > On Jan 7, 2017, at 7:38 PM, Manuel Valera wrote: > > > > > > Ok great, i tried those command line args and this is the result: > > > > > > when i use -pc_type gamg: > > > > > > [1]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > > > [1]PETSC ERROR: Petsc has generated inconsistent data > > > [1]PETSC ERROR: Have un-symmetric graph (apparently). Use '-pc_gamg_sym_graph true' to symetrize the graph or '-pc_gamg_threshold -1.0' if the matrix is structurally symmetric. > > > [1]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. 
> > > [1]PETSC ERROR: Petsc Release Version 3.7.4, unknown > > > [1]PETSC ERROR: ./ucmsMR on a arch-linux2-c-debug named ocean by valera Sat Jan 7 17:35:05 2017 > > > [1]PETSC ERROR: Configure options --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-fblaslapack --download-mpich --download-hdf5 --download-netcdf --download-hypre --download-metis --download-parmetis --download-trillinos > > > [1]PETSC ERROR: #1 smoothAggs() line 462 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c > > > [1]PETSC ERROR: #2 PCGAMGCoarsen_AGG() line 998 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c > > > [1]PETSC ERROR: #3 PCSetUp_GAMG() line 571 in /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c > > > [1]PETSC ERROR: #4 PCSetUp() line 968 in /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c > > > [1]PETSC ERROR: #5 KSPSetUp() line 390 in /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c > > > application called MPI_Abort(comm=0x84000002, 77) - process 1 > > > > > > > > > when i use -pc_type gamg and -pc_gamg_sym_graph true: > > > > > > ------------------------------------------------------------------------ > > > [0]PETSC ERROR: Caught signal number 8 FPE: Floating Point Exception,probably divide by zero > > > [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger > > > [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind > > > [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors > > > [1]PETSC ERROR: ------------------------------------------------------------------------ > > > [1]PETSC ERROR: --------------------- Stack Frames ------------------------------------ > > > [1]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, > > > [1]PETSC ERROR: INSTEAD the line number of the start of the function > > > [1]PETSC ERROR: is given. 
> > > [1]PETSC ERROR: [1] LAPACKgesvd line 42 /usr/dataC/home/valera/petsc/src/ksp/ksp/impls/gmres/gmreig.c > > > [1]PETSC ERROR: [1] KSPComputeExtremeSingularValues_GMRES line 24 /usr/dataC/home/valera/petsc/src/ksp/ksp/impls/gmres/gmreig.c > > > [1]PETSC ERROR: [1] KSPComputeExtremeSingularValues line 51 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c > > > [1]PETSC ERROR: [1] PCGAMGOptProlongator_AGG line 1187 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c > > > [1]PETSC ERROR: [1] PCSetUp_GAMG line 472 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c > > > [1]PETSC ERROR: [1] PCSetUp line 930 /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c > > > [1]PETSC ERROR: [1] KSPSetUp line 305 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c > > > [0] PCGAMGOptProlongator_AGG line 1187 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/agg.c > > > [0]PETSC ERROR: [0] PCSetUp_GAMG line 472 /usr/dataC/home/valera/petsc/src/ksp/pc/impls/gamg/gamg.c > > > [0]PETSC ERROR: [0] PCSetUp line 930 /usr/dataC/home/valera/petsc/src/ksp/pc/interface/precon.c > > > [0]PETSC ERROR: [0] KSPSetUp line 305 /usr/dataC/home/valera/petsc/src/ksp/ksp/interface/itfunc.c > > > [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > > > > > > when i use -pc_type hypre it actually shows something different on -ksp_view : > > > > > > KSP Object: 2 MPI processes > > > type: gcr > > > GCR: restart = 30 > > > GCR: restarts performed = 37 > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-14, absolute=1e-50, divergence=10000. 
> > > right preconditioning > > > using UNPRECONDITIONED norm type for convergence test > > > PC Object: 2 MPI processes > > > type: hypre > > > HYPRE BoomerAMG preconditioning > > > HYPRE BoomerAMG: Cycle type V > > > HYPRE BoomerAMG: Maximum number of levels 25 > > > HYPRE BoomerAMG: Maximum number of iterations PER hypre call 1 > > > HYPRE BoomerAMG: Convergence tolerance PER hypre call 0. > > > HYPRE BoomerAMG: Threshold for strong coupling 0.25 > > > HYPRE BoomerAMG: Interpolation truncation factor 0. > > > HYPRE BoomerAMG: Interpolation: max elements per row 0 > > > HYPRE BoomerAMG: Number of levels of aggressive coarsening 0 > > > HYPRE BoomerAMG: Number of paths for aggressive coarsening 1 > > > HYPRE BoomerAMG: Maximum row sums 0.9 > > > HYPRE BoomerAMG: Sweeps down 1 > > > HYPRE BoomerAMG: Sweeps up 1 > > > HYPRE BoomerAMG: Sweeps on coarse 1 > > > HYPRE BoomerAMG: Relax down symmetric-SOR/Jacobi > > > HYPRE BoomerAMG: Relax up symmetric-SOR/Jacobi > > > HYPRE BoomerAMG: Relax on coarse Gaussian-elimination > > > HYPRE BoomerAMG: Relax weight (all) 1. > > > HYPRE BoomerAMG: Outer relax weight (all) 1. > > > HYPRE BoomerAMG: Using CF-relaxation > > > HYPRE BoomerAMG: Not using more complex smoothers. > > > HYPRE BoomerAMG: Measure type local > > > HYPRE BoomerAMG: Coarsen type Falgout > > > HYPRE BoomerAMG: Interpolation type classical > > > HYPRE BoomerAMG: Using nodal coarsening (with HYPRE_BOOMERAMGSetNodal() 1 > > > HYPRE BoomerAMG: HYPRE_BoomerAMGSetInterpVecVariant() 1 > > > linear system matrix = precond matrix: > > > Mat Object: 2 MPI processes > > > type: mpiaij > > > rows=200000, cols=200000 > > > total: nonzeros=3373340, allocated nonzeros=3373340 > > > total number of mallocs used during MatSetValues calls =0 > > > not using I-node (on process 0) routines > > > > > > > > > but still the timing is terrible. 
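For reference, both solver configurations discussed in this exchange can be selected entirely from the command line. A hedged sketch (the binary name ./ucmsMR comes from the error output above; the 0.7 strong threshold is a common rule of thumb for 3D problems, not a value anyone recommends in this thread):

```shell
# GAMG, with the graph symmetrization the error message above suggests
mpiexec -n 2 ./ucmsMR -ksp_type gcr -pc_type gamg -pc_gamg_sym_graph true \
    -ksp_view -log_summary

# BoomerAMG through hypre; for 3D problems a stronger coupling threshold often helps
mpiexec -n 2 ./ucmsMR -ksp_type gcr -pc_type hypre -pc_hypre_type boomeramg \
    -pc_hypre_boomeramg_strong_threshold 0.7 -ksp_view -log_summary
```

Comparing the -log_summary output of these runs (and of 1 vs. 2 processes) is exactly the diagnostic requested earlier in the thread.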
> > > > > > > > > > > > > > > On Sat, Jan 7, 2017 at 5:28 PM, Jed Brown wrote: > > > Manuel Valera writes: > > > > > > > Awesome Matt and Jed, > > > > > > > > The GCR is used because the matrix is not invertible and because this was > > > > the algorithm that the previous library used, > > > > > > > > The Preconditioned im aiming to use is multigrid, i thought i configured > > > > the hypre-boomerAmg solver for this, but i agree in that it doesn't show in > > > > the log anywhere, how can i be sure is being used ? i sent -ksp_view log > > > > before in this thread > > > > > > Did you run with -pc_type hypre? > > > > > > > I had a problem with the matrix block sizes so i couldn't make the petsc > > > > native multigrid solver to work, > > > > > > What block sizes? If the only variable is pressure, the block size > > > would be 1 (default). > > > > > > > This is a nonhidrostatic pressure solver, it is an elliptic problem so > > > > multigrid is a must, > > > > > > Yes, multigrid should work well. > > > > > > > > > > > From fande.kong at inl.gov Mon Jan 9 12:26:22 2017 From: fande.kong at inl.gov (Kong, Fande) Date: Mon, 9 Jan 2017 11:26:22 -0700 Subject: [petsc-users] PetscTableCreateHashSize Message-ID: Hi All, Hash size is set manually according to the number of expected keys in the function PetscTableCreateHashSize(). Any reason to restrict the ``n"<3276800? One user here encountered an issue because of this restriction. The messages are as follows: [3]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [3]PETSC ERROR: Argument out of range [3]PETSC ERROR: A really huge hash is being requested.. cannot process: 3497472 [3]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. 
[3]PETSC ERROR: Petsc Release Version 3.7.4, unknown [3]PETSC ERROR: /home/schuseba/projects/64_bit_builds/yak/yak-opt on a linux-gnu-c-opt named r3i3n0 by schuseba Fri Jan 6 23:15:37 2017 [3]PETSC ERROR: Configure options --download-hypre=1 --with-ssl=0 --with-debugging=no --with-pic=1 --with-shared-libraries=1 --with-64-bit-indices=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 --download-metis=1 --download-parmetis=1 --download-fblaslapack=1 --download-superlu_dist=1 -CC=mpicc -CXX=mpicxx -FC=mpif90 -F77=mpif77 -F90=mpif90 -CFLAGS="-fPIC -fopenmp" -CXXFLAGS="-fPIC -fopenmp" -FFLAGS="-fPIC -fopenmp" -FCFLAGS="-fPIC -fopenmp" -F90FLAGS="-fPIC -fopenmp" -F77FLAGS="-fPIC -fopenmp" [3]PETSC ERROR: #1 PetscTableCreateHashSize() line 28 in /home/schuseba/projects/64_bit_builds/petsc/src/sys/utils/ctable.c Fande, -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Mon Jan 9 13:36:40 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 9 Jan 2017 13:36:40 -0600 Subject: [petsc-users] PetscTableCreateHashSize In-Reply-To: References: Message-ID: We can add more entries to the lookup. The stack below looks incomplete. Which routine is calling PetscTableCreateHashSize() with this big size? Satish ------- $ git diff diff --git a/src/sys/utils/ctable.c b/src/sys/utils/ctable.c index cd64284..761a2c6 100644 --- a/src/sys/utils/ctable.c +++ b/src/sys/utils/ctable.c @@ -25,6 +25,7 @@ static PetscErrorCode PetscTableCreateHashSize(PetscInt sz, PetscInt *hsz) else if (sz < 819200) *hsz = 1193557; else if (sz < 1638400) *hsz = 2297059; else if (sz < 3276800) *hsz = 4902383; + else if (sz < 6553600) *hsz = 9179113; else SETERRQ1(PETSC_COMM_SELF,PETSC_ERR_ARG_OUTOFRANGE,"A really huge hash is being requested.. 
cannot process: %D",sz); PetscFunctionReturn(0); } On Mon, 9 Jan 2017, Kong, Fande wrote: > Hi All, > > Hash size is set manually according to the number of expected keys in the > function PetscTableCreateHashSize(). Any reason to restrict the > ``n"<3276800? > > One user here encountered an issue because of this restriction. The > messages are as follows: > > [3]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > > [3]PETSC ERROR: Argument out of range > > [3]PETSC ERROR: A really huge hash is being requested.. cannot process: > 3497472 > > [3]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for > trouble shooting. > > [3]PETSC ERROR: Petsc Release Version 3.7.4, unknown > > [3]PETSC ERROR: /home/schuseba/projects/64_bit_builds/yak/yak-opt on a > linux-gnu-c-opt named r3i3n0 by schuseba Fri Jan 6 23:15:37 2017 > > [3]PETSC ERROR: Configure options --download-hypre=1 --with-ssl=0 > --with-debugging=no --with-pic=1 --with-shared-libraries=1 > --with-64-bit-indices=1 --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 > --download-metis=1 --download-parmetis=1 --download-fblaslapack=1 > --download-superlu_dist=1 -CC=mpicc -CXX=mpicxx -FC=mpif90 -F77=mpif77 > -F90=mpif90 -CFLAGS="-fPIC -fopenmp" -CXXFLAGS="-fPIC -fopenmp" > -FFLAGS="-fPIC -fopenmp" -FCFLAGS="-fPIC -fopenmp" -F90FLAGS="-fPIC > -fopenmp" -F77FLAGS="-fPIC -fopenmp" > > [3]PETSC ERROR: #1 PetscTableCreateHashSize() line 28 in > /home/schuseba/projects/64_bit_builds/petsc/src/sys/utils/ctable.c > > > > > > Fande, > From fande.kong at inl.gov Mon Jan 9 13:41:24 2017 From: fande.kong at inl.gov (Kong, Fande) Date: Mon, 9 Jan 2017 12:41:24 -0700 Subject: [petsc-users] PetscTableCreateHashSize In-Reply-To: References: Message-ID: Thanks, Satish, On Mon, Jan 9, 2017 at 12:36 PM, Satish Balay wrote: > We can add more entries to the lookup. The stack below looks > incomplete. 
Which routine is calling PetscTableCreateHashSize() with > this big size? > call trace: [4]PETSC ERROR: #3 MatSetUpMultiply_MPIAIJ() line 36 in /home/schuseba/projects/64_bit_builds/petsc/src/mat/impls/aij/mpi/mmaij.c [9]PETSC ERROR: #4 MatAssemblyEnd_MPIAIJ() line 747 in /home/schuseba/projects/64_bit_builds/petsc/src/mat/impls/aij/mpi/mpiaij.c [9]PETSC ERROR: #4 MatAssemblyEnd_MPIAIJ() line 747 in /home/schuseba/projects/64_bit_builds/petsc/src/mat/impls/aij/mpi/mpiaij.c > > Satish > > ------- > $ git diff > diff --git a/src/sys/utils/ctable.c b/src/sys/utils/ctable.c > index cd64284..761a2c6 100644 > --- a/src/sys/utils/ctable.c > +++ b/src/sys/utils/ctable.c > @@ -25,6 +25,7 @@ static PetscErrorCode PetscTableCreateHashSize(PetscInt > sz, PetscInt *hsz) > else if (sz < 819200) *hsz = 1193557; > else if (sz < 1638400) *hsz = 2297059; > else if (sz < 3276800) *hsz = 4902383; > + else if (sz < 6553600) *hsz = 9179113; > else SETERRQ1(PETSC_COMM_SELF,PETSC_ERR_ARG_OUTOFRANGE,"A really huge > hash is being requested.. cannot process: %D",sz); > PetscFunctionReturn(0); > } > > On Mon, 9 Jan 2017, Kong, Fande wrote: > > > Hi All, > > > > Hash size is set manually according to the number of expected keys in the > > function PetscTableCreateHashSize(). Any reason to restrict the > > ``n"<3276800? > > > > One user here encountered an issue because of this restriction. The > > messages are as follows: > > > > [3]PETSC ERROR: --------------------- Error Message > > -------------------------------------------------------------- > > > > [3]PETSC ERROR: Argument out of range > > > > [3]PETSC ERROR: A really huge hash is being requested.. cannot process: > > 3497472 > > > > [3]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for
> > trouble shooting. > > > > [3]PETSC ERROR: Petsc Release Version 3.7.4, unknown > > > > [3]PETSC ERROR: /home/schuseba/projects/64_bit_builds/yak/yak-opt on a > > linux-gnu-c-opt named r3i3n0 by schuseba Fri Jan 6 23:15:37 2017 > > > > [3]PETSC ERROR: Configure options --download-hypre=1 --with-ssl=0 > > --with-debugging=no --with-pic=1 --with-shared-libraries=1 > > --with-64-bit-indices=1 --with-cc=mpicc --with-cxx=mpicxx > --with-fc=mpif90 > > --download-metis=1 --download-parmetis=1 --download-fblaslapack=1 > > --download-superlu_dist=1 -CC=mpicc -CXX=mpicxx -FC=mpif90 -F77=mpif77 > > -F90=mpif90 -CFLAGS="-fPIC -fopenmp" -CXXFLAGS="-fPIC -fopenmp" > > -FFLAGS="-fPIC -fopenmp" -FCFLAGS="-fPIC -fopenmp" > > -F90FLAGS="-fPIC > > -fopenmp" -F77FLAGS="-fPIC -fopenmp" > > > > [3]PETSC ERROR: #1 PetscTableCreateHashSize() line 28 in > > /home/schuseba/projects/64_bit_builds/petsc/src/sys/utils/ctable.c > > > > > > > > > > > > Fande, > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Mon Jan 9 13:42:49 2017 From: jed at jedbrown.org (Jed Brown) Date: Mon, 09 Jan 2017 12:42:49 -0700 Subject: [petsc-users] PetscTableCreateHashSize In-Reply-To: References: Message-ID: <8760lo2f5y.fsf@jedbrown.org> Satish Balay writes: > We can add more entries to the lookup. The stack below looks > incomplete. Which routine is calling PetscTableCreateHashSize() with > this big size?
> > Satish > > ------- > $ git diff > diff --git a/src/sys/utils/ctable.c b/src/sys/utils/ctable.c > index cd64284..761a2c6 100644 > --- a/src/sys/utils/ctable.c > +++ b/src/sys/utils/ctable.c > @@ -25,6 +25,7 @@ static PetscErrorCode PetscTableCreateHashSize(PetscInt sz, PetscInt *hsz) > else if (sz < 819200) *hsz = 1193557; > else if (sz < 1638400) *hsz = 2297059; > else if (sz < 3276800) *hsz = 4902383; > + else if (sz < 6553600) *hsz = 9179113; Does anyone else think this is ridiculous? Why not either generate the hash sizes algorithmically or put in enough for MAXINT? -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 800 bytes Desc: not available URL: From balay at mcs.anl.gov Mon Jan 9 13:52:57 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 9 Jan 2017 13:52:57 -0600 Subject: [petsc-users] PetscTableCreateHashSize In-Reply-To: <8760lo2f5y.fsf@jedbrown.org> References: <8760lo2f5y.fsf@jedbrown.org> Message-ID: On Mon, 9 Jan 2017, Jed Brown wrote: > Satish Balay writes: > > > We can add more entries to the lookup. The stack below looks > > incomplete. Which routine is calling PetscTableCreateHashSize() with > > this big size? > > > > Satish > > > > ------- > > $ git diff > > diff --git a/src/sys/utils/ctable.c b/src/sys/utils/ctable.c > > index cd64284..761a2c6 100644 > > --- a/src/sys/utils/ctable.c > > +++ b/src/sys/utils/ctable.c > > @@ -25,6 +25,7 @@ static PetscErrorCode PetscTableCreateHashSize(PetscInt sz, PetscInt *hsz) > > else if (sz < 819200) *hsz = 1193557; > > else if (sz < 1638400) *hsz = 2297059; > > else if (sz < 3276800) *hsz = 4902383; > > + else if (sz < 6553600) *hsz = 9179113; > > Does anyone else think this is ridiculous? Why not either generate the > hash sizes algorithmically or put in enough for MAXINT? 
Sure - I'm using a crappy algorithm [look-up table] to get "prime_number_close_to(1.4*sz)" - as I don't know how to generate these numbers automatically. Will add more entries to this lookup table. Satish From jed at jedbrown.org Mon Jan 9 14:36:14 2017 From: jed at jedbrown.org (Jed Brown) Date: Mon, 09 Jan 2017 13:36:14 -0700 Subject: [petsc-users] PetscTableCreateHashSize In-Reply-To: References: <8760lo2f5y.fsf@jedbrown.org> Message-ID: <87zij00y4h.fsf@jedbrown.org> Satish Balay writes: > Sure - I'm using a crappy algorithm [look-up table] to get > "prime_number_close_to(1.4*sz)" - as I don't know how to generate > these numbers automatically. FWIW, it only needs to be coprime with PETSC_HASH_FACT. -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 800 bytes Desc: not available URL: From balay at mcs.anl.gov Mon Jan 9 15:14:51 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 9 Jan 2017 15:14:51 -0600 Subject: [petsc-users] PetscTableCreateHashSize In-Reply-To: <87zij00y4h.fsf@jedbrown.org> References: <8760lo2f5y.fsf@jedbrown.org> <87zij00y4h.fsf@jedbrown.org> Message-ID: On Mon, 9 Jan 2017, Jed Brown wrote: > Satish Balay writes: > > Sure - I'm using a crappy algorithm [look-up table] to get > > "prime_number_close_to(1.4*sz)" - as I don't know how to generate > > these numbers automatically. > > FWIW, it only needs to be coprime with PETSC_HASH_FACT. Not sure I understand - are you saying the coprime requirement is easier to satisfy than a single prime? I had switched this code to use a double-hashing algorithm - and the requirement here is that the table size be a prime number.
[so I'm attempting to estimate a prime number suitable for the table size] I pushed the following https://bitbucket.org/petsc/petsc/commits/d742c75fd0d514f7fa1873d5b10984bc3f363031 Satish From fande.kong at inl.gov Mon Jan 9 15:58:08 2017 From: fande.kong at inl.gov (Kong, Fande) Date: Mon, 9 Jan 2017 14:58:08 -0700 Subject: [petsc-users] PetscTableCreateHashSize In-Reply-To: References: <8760lo2f5y.fsf@jedbrown.org> <87zij00y4h.fsf@jedbrown.org> Message-ID: Thanks a lot Satish! Like Jed said, it would be better if we could come up with an algorithm for automatically computing a hash size for a given n. Otherwise, we may need to add more entries to the lookup again in the future. Fande, On Mon, Jan 9, 2017 at 2:14 PM, Satish Balay wrote: > On Mon, 9 Jan 2017, Jed Brown wrote: > > > Satish Balay writes: > > > Sure - I'm using a crappy algorithm [look-up table] to get > > > "prime_number_close_to(1.4*sz)" - as I don't know how to generate > > > these numbers automatically. > > > > FWIW, it only needs to be coprime with PETSC_HASH_FACT. > > Not sure I understand - are you saying coprime requirement is easier > satisfy than a single prime? > > I had switched this code to use double-hasing algorithm - and the > requirement here is - the table size be a prime number. [so I'm > attempting to estimate a prime number suitable for the table size] > > I pushed the following > https://bitbucket.org/petsc/petsc/commits/d742c75fd0d514f7fa1873d5b10984bc3f363031 > > Satish > -------------- next part -------------- An HTML attachment was scrubbed...
URL: From jed at jedbrown.org Mon Jan 9 16:22:57 2017 From: jed at jedbrown.org (Jed Brown) Date: Mon, 09 Jan 2017 15:22:57 -0700 Subject: [petsc-users] PetscTableCreateHashSize In-Reply-To: References: <8760lo2f5y.fsf@jedbrown.org> <87zij00y4h.fsf@jedbrown.org> Message-ID: <87h95727r2.fsf@jedbrown.org> Satish Balay writes: > On Mon, 9 Jan 2017, Jed Brown wrote: > >> Satish Balay writes: >> > Sure - I'm using a crappy algorithm [look-up table] to get >> > "prime_number_close_to(1.4*sz)" - as I don't know how to generate >> > these numbers automatically. >> >> FWIW, it only needs to be coprime with PETSC_HASH_FACT. > > Not sure I understand - are you saying coprime requirement is easier > satisfy than a single prime? Yeah, just don't be a multiple of PETSC_HASH_FACT, which is itself prime. > I had switched this code to use double-hasing algorithm - and the > requirement here is - the table size be a prime number. [so I'm > attempting to estimate a prime number suitable for the table size] Why is it not sufficient to be coprime? -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 800 bytes Desc: not available URL: From balay at mcs.anl.gov Mon Jan 9 18:20:58 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 9 Jan 2017 18:20:58 -0600 Subject: [petsc-users] PetscTableCreateHashSize In-Reply-To: References: <8760lo2f5y.fsf@jedbrown.org> <87zij00y4h.fsf@jedbrown.org> Message-ID: I've added up to INT_MAX Let us know if that doesn't work. Satish On Mon, 9 Jan 2017, Kong, Fande wrote: > Thanks a lot Satish! > > Like Jed said, it would be better if we could come up an algorithm for > automatically computing a hash size for a given n. Otherwise, we may need > to add more entries to the lookup again in the future. 
> > Fande, > > On Mon, Jan 9, 2017 at 2:14 PM, Satish Balay wrote: > > > On Mon, 9 Jan 2017, Jed Brown wrote: > > > > > Satish Balay writes: > > > > Sure - I'm using a crappy algorithm [look-up table] to get > > > > "prime_number_close_to(1.4*sz)" - as I don't know how to generate > > > > these numbers automatically. > > > > > > FWIW, it only needs to be coprime with PETSC_HASH_FACT. > > > > Not sure I understand - are you saying coprime requirement is easier > > satisfy than a single prime? > > > > I had switched this code to use double-hasing algorithm - and the > > requirement here is - the table size be a prime number. [so I'm > > attempting to estimate a prime number suitable for the table size] > > > > I pushed the following > > https://bitbucket.org/petsc/petsc/commits/d742c75fd0d514f7fa1873d5b10984bc3f363031 > > > > Satish > > > From balay at mcs.anl.gov Mon Jan 9 18:38:04 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 9 Jan 2017 18:38:04 -0600 Subject: [petsc-users] PetscTableCreateHashSize In-Reply-To: <87h95727r2.fsf@jedbrown.org> References: <8760lo2f5y.fsf@jedbrown.org> <87zij00y4h.fsf@jedbrown.org> <87h95727r2.fsf@jedbrown.org> Message-ID: On Mon, 9 Jan 2017, Jed Brown wrote: > Satish Balay writes: > > > On Mon, 9 Jan 2017, Jed Brown wrote: > > > >> Satish Balay writes: > >> > Sure - I'm using a crappy algorithm [look-up table] to get > >> > "prime_number_close_to(1.4*sz)" - as I don't know how to generate > >> > these numbers automatically. > >> > >> FWIW, it only needs to be coprime with PETSC_HASH_FACT. > > > > Not sure I understand - are you saying coprime requirement is easier > > satisfy than a single prime?
> > Yeah, just don't be a multiple of PETSC_HASH_FACT, which is itself > prime. > > > I had switched this code to use double-hasing algorithm - and the > > requirement here is - the table size be a prime number. [so I'm > > attempting to estimate a prime number suitable for the table size] > > Why is it not sufficient to be coprime? Well, whatever was implemented previously with PETSC_HASH_FACT [a prime number] didn't work well. [there were a couple of reports on it]. Checking double hashing [Introduction to Algorithms, Cormen et al.]: For a hashtable size 'm' - and using hash functions h1(k), h2(k) - it says: h2(k) must be relatively prime to 'm'. [for all possible 'k' values? it's not clear. Also, any constraints on h1(k)?] And it suggested the following as one way to implement it: choose a prime 'm' h1(k) = k mod m h2(k) = 1 + (k mod m') This was simple enough for me - so I updated ctable to use it. I've added entries up to INT_MAX. If it's still lacking (I could potentially add more entries - perhaps for 64-bit indices? or) - feel free to change the algorithm for arbitrary sizes... Satish From jed at jedbrown.org Mon Jan 9 19:24:50 2017 From: jed at jedbrown.org (Jed Brown) Date: Mon, 09 Jan 2017 18:24:50 -0700 Subject: [petsc-users] PetscTableCreateHashSize In-Reply-To: References: <8760lo2f5y.fsf@jedbrown.org> <87zij00y4h.fsf@jedbrown.org> <87h95727r2.fsf@jedbrown.org> Message-ID: <87a8az1zbx.fsf@jedbrown.org> Satish Balay writes: >> Why is it not sufficient to be coprime? > > Well whatever was implemented previsously with PETSC_HASH_FACT [a > prime number] didn't work well. [there were a couple of reports on it]. That was linear probing, right? > Checking double hashing [Intro to algorithms, Coremen et all.]: > > For a hashtable size 'm' - and using hash functions h1(k), h2(k) - it > says: h2(k) must be relatively prime to 'm'. [for all possilbe 'k' > values? its not clear. Also any constraints on h1(k)?]
relatively prime = coprime > And it suggested the following as one way to imlement: > choose a prime 'm' > h1(k) = k mod m > h2(k) = 1 + (k mod m') > > This was simple enough for me - so I updated ctable to use it. I've > added entries up to INT_MAX. Thanks. > If its still lacking (I could potentially add more entries - perhaps > for 64bitincides? or) - feel free to change the algorithm for arbirary > sizes... I would use khash (hash.h) because it is a portable and well-optimized hash implementation. The version in PETSc uses primes and double hashing, though upstream now uses quadratic probing (better cache locality). -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 800 bytes Desc: not available URL: From balay at mcs.anl.gov Mon Jan 9 21:10:33 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 9 Jan 2017 21:10:33 -0600 Subject: [petsc-users] PetscTableCreateHashSize In-Reply-To: <87a8az1zbx.fsf@jedbrown.org> References: <8760lo2f5y.fsf@jedbrown.org> <87zij00y4h.fsf@jedbrown.org> <87h95727r2.fsf@jedbrown.org> <87a8az1zbx.fsf@jedbrown.org> Message-ID: On Mon, 9 Jan 2017, Jed Brown wrote: > Satish Balay writes: > > >> Why is it not sufficient to be coprime? > > > > Well whatever was implemented previsously with PETSC_HASH_FACT [a > > prime number] didn't work well. [there were a couple of reports on it]. > > That was linear probing, right? yes - but not sure if it implemented a good hash function. [had issues with collisions] > > > Checking double hashing [Intro to algorithms, Coremen et all.]: > > > > For a hashtable size 'm' - and using hash functions h1(k), h2(k) - it > > says: h2(k) must be relatively prime to 'm'. [for all possilbe 'k' > > values? its not clear. Also any constraints on h1(k)?] 
> > relatively prime = coprime > > > And it suggested the following as one way to imlement: > > choose a prime 'm' > > h1(k) = k mod m > > h2(k) = 1 + (k mod m') Forgot to mention: choose m' = (m-1) or (m-2) > > > > This was simple enough for me - so I updated ctable to use it. I've > > added entries up to INT_MAX. > > Thanks. > > > If its still lacking (I could potentially add more entries - perhaps > > for 64bitincides? or) - feel free to change the algorithm for arbirary > > sizes... > > I would use khash (hash.h) because it is a portable and well-optimized > hash implementation. The version in PETSc uses primes and double > hashing, though upstream now uses quadratic probing (better cache > locality). I tried looking at it - but it was easier for me to fixup current ctable code. Satish From jed at jedbrown.org Tue Jan 10 01:33:54 2017 From: jed at jedbrown.org (Jed Brown) Date: Tue, 10 Jan 2017 00:33:54 -0700 Subject: [petsc-users] PetscTableCreateHashSize In-Reply-To: References: <8760lo2f5y.fsf@jedbrown.org> <87zij00y4h.fsf@jedbrown.org> <87h95727r2.fsf@jedbrown.org> <87a8az1zbx.fsf@jedbrown.org> Message-ID: <877f631i8t.fsf@jedbrown.org> Satish Balay writes: > I tried looking at it - but it was easier for me to fixup current ctable code. I mentioned it more as a long-term thing. I don't think we need two different hashtable implementations in PETSc. I think khash is better and extensible, so we may as well use it in all new code and migrate petsctable to it when convenient. I don't think it's urgent unless someone has a use case that needs it. -------------- next part -------------- A non-text attachment was scrubbed... 
Name: signature.asc Type: application/pgp-signature Size: 800 bytes Desc: not available URL: From mailinglists at xgm.de Tue Jan 10 06:42:19 2017 From: mailinglists at xgm.de (Florian Lindner) Date: Tue, 10 Jan 2017 13:42:19 +0100 Subject: [petsc-users] Call multiple error handler Message-ID: Hello, I really enjoy the verbosity (line number) of the default PetscTraceBackErrorHandler. However, I want my application to be aborted with the PetscMPIAbortErrorHandler when an error occurs. Can I instruct PETSc to call first one handler, then another one? Thanks, Florian From bsmith at mcs.anl.gov Tue Jan 10 07:20:04 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Tue, 10 Jan 2017 07:20:04 -0600 Subject: [petsc-users] Call multiple error handler In-Reply-To: References: Message-ID: I do not understand what you mean. I hope this is C. By default, when an error is detected, PetscError using PetscTraceBackErrorHandler returns up the stack, printing the function/line number, then returning to the next function, which prints the function/line number, until it gets up to the function main, where it calls MPI_Abort. Now if one of the functions in the stack of functions is a user function that DOES NOT check a return code with CHKERRQ(ierr), then bad stuff happens, because PetscError will print part of the stack of functions but then the user code (that ignores the error code) continues running, generally causing bad and confusing things to happen. This is why we recommend always using CHKERRQ() in user code. From the perspective of PETSc error handling there really isn't any difference between user code and PETSc code. > However, I want my application to be aborted with the PetscMPIAbortErrorHandler when an error occures. Do you mean that in your code when you detect an error you want SETERRQ() to call PetscMPIAbortErrorHandler() but in PETSc code you want it to try to return through the stack printing the PETSc function/line numbers?
You can do this by making your own macro mySETERRQ() that is defined to be a call to PetscMPIAbortErrorHandler and using mySETERRQ() to mark errors in your code. Though I do not recommend this, better to use CHKERRQ() in your code and get error tracebacks for PETSc code and your code. Barry > On Jan 10, 2017, at 6:42 AM, Florian Lindner wrote: > > Hello, > > I really enjoy the verbosity (line number) of the default PetscTraceBackErrorHandler. However, I want my application to > be aborted with the PetscMPIAbortErrorHandler when an error occures. > > Can I instruct PETSc to call first one handler, then another one? > > Thanks, > Florian From fande.kong at inl.gov Tue Jan 10 09:24:29 2017 From: fande.kong at inl.gov (Kong, Fande) Date: Tue, 10 Jan 2017 08:24:29 -0700 Subject: [petsc-users] PetscTableCreateHashSize In-Reply-To: <877f631i8t.fsf@jedbrown.org> References: <8760lo2f5y.fsf@jedbrown.org> <87zij00y4h.fsf@jedbrown.org> <87h95727r2.fsf@jedbrown.org> <87a8az1zbx.fsf@jedbrown.org> <877f631i8t.fsf@jedbrown.org> Message-ID: BTW, one more question: There are some pieces of code in #if defined(PETSC_USE_CTABLE) .... #endif. How to disable ctable? That is, make PETSC_USE_CTABLE false during configuration. Fande, On Tue, Jan 10, 2017 at 12:33 AM, Jed Brown wrote: > Satish Balay writes: > > I tried looking at it - but it was easier for me to fixup current ctable > code. > > I mentioned it more as a long-term thing. I don't think we need two > different hashtable implementations in PETSc. I think khash is better > and extensible, so we may as well use it in all new code and migrate > petsctable to it when convenient. I don't think it's urgent unless > someone has a use case that needs it. > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From balay at mcs.anl.gov Tue Jan 10 09:28:37 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 10 Jan 2017 09:28:37 -0600 Subject: [petsc-users] PetscTableCreateHashSize In-Reply-To: References: <8760lo2f5y.fsf@jedbrown.org> <87zij00y4h.fsf@jedbrown.org> <87h95727r2.fsf@jedbrown.org> <87a8az1zbx.fsf@jedbrown.org> <877f631i8t.fsf@jedbrown.org> Message-ID: With configure option: --with-ctable= Satish On Tue, 10 Jan 2017, Kong, Fande wrote: > BTW, one more question: > > There are some pieces of code in #if defined(PETSC_USE_CTABLE) .... #endif. > How to disable ctable? That is, make PETSC_USE_CTABLE false during > configuration. > > Fande, > > On Tue, Jan 10, 2017 at 12:33 AM, Jed Brown wrote: > > > Satish Balay writes: > > > I tried looking at it - but it was easier for me to fixup current ctable > > code. > > > > I mentioned it more as a long-term thing. I don't think we need two > > different hashtable implementations in PETSc. I think khash is better > > and extensible, so we may as well use it in all new code and migrate > > petsctable to it when convenient. I don't think it's urgent unless > > someone has a use case that needs it. > > > From mailinglists at xgm.de Wed Jan 11 07:00:20 2017 From: mailinglists at xgm.de (Florian Lindner) Date: Wed, 11 Jan 2017 14:00:20 +0100 Subject: [petsc-users] Call multiple error handler In-Reply-To: References: Message-ID: <0ed16536-6df2-d9d3-d21e-9fc5fc70be47@xgm.de> Hey, On 10.01.2017 at 14:20, Barry Smith wrote: > > I do not understand what you mean. I hope this is C. Yes, I'm talking about C(++). I'm using PETSc functions like that: ierr = ISDestroy(&ISlocal); CHKERRV(ierr); Unfortunately that is not always possible, e.g. in this function: std::pair<PetscInt, PetscInt> Vector::ownerRange() { PetscInt range_start, range_end; VecGetOwnershipRange(vector, &range_start, &range_end); return std::make_pair(range_start, range_end); } CHKERRV returns void and CHKERRQ returns an int (IIRC). Neither is possible here.
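One escape hatch for the situation described above - offered only as a sketch, since it trades error-code propagation for an immediate abort - is PETSc's CHKERRABORT macro, which can be used in functions whose return type is neither void nor PetscErrorCode. Rewriting the ownerRange() function above in that style (shown as a free function taking the Vec as a parameter, for self-containment):

```cpp
#include <petscvec.h>
#include <utility>

// CHKERRABORT(comm, ierr) reports the error and calls MPI_Abort instead of
// returning an error code, so the enclosing function's return type does not matter.
std::pair<PetscInt, PetscInt> ownerRange(Vec vector)
{
  PetscInt range_start, range_end;
  PetscErrorCode ierr = VecGetOwnershipRange(vector, &range_start, &range_end);
  CHKERRABORT(PETSC_COMM_WORLD, ierr);
  return std::make_pair(range_start, range_end);
}
```

An error here still kills the whole run rather than unwinding, so this avoids silently ignoring the return code but does not produce a multi-frame traceback.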
But that is not my main issue here. For non PETSc related functions, I do not use CHKERRV and alike. > > By default when an error is detected PetscError using PetscTraceBackErrorHandler returns up the stack printing the > function/line number then returning to the next function which prints the function/line number until it gets up to > the function main where it calls MPI_Abort. Ok, we use petsc like that: void myFunc() { PetscErrorCode ierr = 0; ierr = MatMultTranspose(_matrixA, in, Au); CHKERRV(ierr); [... no further error checking ...] } with the standard error handler this gives a nice traceback, but the application continues to run. With a PetscPushErrorHandler(&PetscMPIAbortErrorHandler, nullptr); above these lines, that gives no traceback, just [0]PETSC ERROR: MatMult() line 2244 in /home/florian/software/petsc/src/mat/interface/matrix.c Null Object: Parameter # 1 and the application aborts. I want to combine these two traits. Print a nice traceback like the first example, then abort, like the second example. > Now if one of the functions in the stack of functions is a user function that DOES NOT check a return code with > CHKERRQ(ierr) then bad stuff happens because PetscError will print part of the stack of functions but then the user > code (that ignores the error code) continues running generally causing bad and confusing things to happen. This is > why we recommend always using CHKERRQ() in user code. > > From the perspective of PETSc error handling there really isn't any difference between user code and PETSc code. I understand that. But changing our application so that every function returns an error code and is checked using CHKERR* is not an option. The PETSc code is buried deeply in the framework. >> However, I want my application to be aborted with the PetscMPIAbortErrorHandler when an error occures. 
> > Do you mean that in your code when you detect an error you want SETERRQ() to call PetscMPIAbortErrorHandler() but in > PETSc code you want it to try to return through the stack printing the PETSc function/line numbers? You can do this > by making your own macro mySETERRQ() that is defined to be a call to PetscMPIAbortErrorHandler and using mySETERRQ() > to mark errors in your code. Though I do not recommend this, better to use CHKERRQ() in your code and get error > tracebacks for PETSc code and your code. Ok, I'll keep that option in mind... Best, Florian >> On Jan 10, 2017, at 6:42 AM, Florian Lindner wrote: >> >> Hello, >> >> I really enjoy the verbosity (line number) of the default PetscTraceBackErrorHandler. However, I want my >> application to be aborted with the PetscMPIAbortErrorHandler when an error occures. >> >> Can I instruct PETSc to call first one handler, then another one? >> >> Thanks, Florian > From knepley at gmail.com Wed Jan 11 07:07:31 2017 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 11 Jan 2017 07:07:31 -0600 Subject: [petsc-users] Call multiple error handler In-Reply-To: <0ed16536-6df2-d9d3-d21e-9fc5fc70be47@xgm.de> References: <0ed16536-6df2-d9d3-d21e-9fc5fc70be47@xgm.de> Message-ID: On Wed, Jan 11, 2017 at 7:00 AM, Florian Lindner wrote: > Hey, > > Am 10.01.2017 um 14:20 schrieb Barry Smith: > > > > I do not understand what you mean. I hope this is C. > > Yes, I'm talking about C(++). > > I'm using PETSc functions like that: > > ierr = ISDestroy(&ISlocal); CHKERRV(ierr); > > Unfortunately that is not alway possible, e.g. in this function: > > std::pair Vector::ownerRange() > { > PetscInt range_start, range_end; > VecGetOwnershipRange(vector, &range_start, &range_end); > return std::make_pair(range_start, range_end); > } > > CHKERRV would returns void, CHKERRQ returns an int (iirc). Neither is > possible here. But that is not my main issue here. > > For non PETSc related functions, I do not use CHKERRV and alike. 
> > > > > By default when an error is detected PetscError using > PetscTraceBackErrorHandler returns up the stack printing the > function/line number then returning to the next function which prints > the function/line number until it gets up to > the function main where it calls MPI_Abort. > > Ok, we use petsc like that: > > void myFunc() { > PetscErrorCode ierr = 0; > ierr = MatMultTranspose(_matrixA, in, Au); CHKERRV(ierr); > [... no further error checking ...] > } > > with the standard error handler this gives a nice traceback, but the > application continues to run. > > With a PetscPushErrorHandler(&PetscMPIAbortErrorHandler, nullptr); above > these lines, that gives no traceback, just > > [0]PETSC ERROR: MatMult() line 2244 in /home/florian/software/petsc/ > src/mat/interface/matrix.c > Null Object: Parameter # 1 > > and the application aborts. > > I want to combine these two traits. Print a nice traceback like the first > example, then abort, like the second example. > The problematic part of this strategy is that you want the handler to "know" where it is. Right now, the handler has no idea where the top of the stack is; it just executes some action. In the normal case, it adds a line to the stack and returns. In the abort case, it aborts. If you want different behavior, one option is to change the check at the level that you think is the top. So, at the uppermost check of a PETSc function, you use CHKERRA instead. Thanks, Matt > > > Now if one of the functions in the stack of functions is a user function > that DOES NOT check a return code with > > CHKERRQ(ierr) then bad stuff happens because PetscError will print part > of the stack of functions but then the user > > code (that ignores the error code) continues running generally causing > bad and confusing things to happen. This is > > why we recommend always using CHKERRQ() in user code. 
> > > > From the perspective of PETSc error handling there really isn't any > difference between user code and PETSc code. > > I understand that. But changing our application so that every function > returns an error code and is checked using > CHKERR* is not an option. The PETSc code is buried deeply in the framework. > > > >> However, I want my application to be aborted with the > PetscMPIAbortErrorHandler when an error occures. > > > > Do you mean that in your code when you detect an error you want > SETERRQ() to call PetscMPIAbortErrorHandler() but in > > PETSc code you want it to try to return through the stack printing the > PETSc function/line numbers? You can do this > > by making your own macro mySETERRQ() that is defined to be a call to > PetscMPIAbortErrorHandler and using mySETERRQ() > > to mark errors in your code. Though I do not recommend this, better to > use CHKERRQ() in your code and get error > > tracebacks for PETSc code and your code. > > Ok, I'll keep that option in mind... > > Best, > Florian > > >> On Jan 10, 2017, at 6:42 AM, Florian Lindner > wrote: > >> > >> Hello, > >> > >> I really enjoy the verbosity (line number) of the default > PetscTraceBackErrorHandler. However, I want my > >> application to be aborted with the PetscMPIAbortErrorHandler when an > error occures. > >> > >> Can I instruct PETSc to call first one handler, then another one? > >> > >> Thanks, Florian > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From Sander.Arens at ugent.be Wed Jan 11 07:23:04 2017 From: Sander.Arens at ugent.be (Sander Arens) Date: Wed, 11 Jan 2017 14:23:04 +0100 Subject: [petsc-users] Visualization of uninterpolated DMPlex with hdf5 is broken Message-ID: Visualization of uninterpolated DMPlex with hdf5 currently does not work. 
I think the culprit is this line. Is this to avoid duplicating output for the topology? If so, I think this is wrong, because the dataset /topology/cells does not have the correct block size and correct values (it outputs point numbers and not vertex numbers) for visualization. Thanks, Sander -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Wed Jan 11 11:17:40 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 11 Jan 2017 11:17:40 -0600 Subject: [petsc-users] Call multiple error handler In-Reply-To: <0ed16536-6df2-d9d3-d21e-9fc5fc70be47@xgm.de> References: <0ed16536-6df2-d9d3-d21e-9fc5fc70be47@xgm.de> Message-ID: <4AE8A346-71D2-448C-8C98-949DED99CB92@mcs.anl.gov> > On Jan 11, 2017, at 7:00 AM, Florian Lindner wrote: > > Hey, > > Am 10.01.2017 um 14:20 schrieb Barry Smith: >> >> I do not understand what you mean. I hope this is C. > > Yes, I'm talking about C(++). > > I'm using PETSc functions like that: > > ierr = ISDestroy(&ISlocal); CHKERRV(ierr); > > Unfortunately that is not alway possible, e.g. in this function: > > std::pair Vector::ownerRange() > { > PetscInt range_start, range_end; > VecGetOwnershipRange(vector, &range_start, &range_end); > return std::make_pair(range_start, range_end); > } > > CHKERRV would returns void, CHKERRQ returns an int (iirc). Neither is possible here. But that is not my main issue here. > > For non PETSc related functions, I do not use CHKERRV and alike. > >> >> By default when an error is detected PetscError using PetscTraceBackErrorHandler returns up the stack printing the >> function/line number then returning to the next function which prints the function/line number until it gets up to >> the function main where it calls MPI_Abort. > > Ok, we use petsc like that: > > void myFunc() { > PetscErrorCode ierr = 0; > ierr = MatMultTranspose(_matrixA, in, Au); CHKERRV(ierr); > [... no further error checking ...] 
> } > > with the standard error handler this gives a nice traceback, but the application continues to run. > > With a PetscPushErrorHandler(&PetscMPIAbortErrorHandler, nullptr); above these lines, that gives no traceback, just > > [0]PETSC ERROR: MatMult() line 2244 in /home/florian/software/petsc/src/mat/interface/matrix.c > Null Object: Parameter # 1 > > and the application aborts. > > I want to combine these two traits. Print a nice traceback like the first example, then abort, like the second example. CHKERRABORT() > > >> Now if one of the functions in the stack of functions is a user function that DOES NOT check a return code with >> CHKERRQ(ierr) then bad stuff happens because PetscError will print part of the stack of functions but then the user >> code (that ignores the error code) continues running generally causing bad and confusing things to happen. This is >> why we recommend always using CHKERRQ() in user code. >> >> From the perspective of PETSc error handling there really isn't any difference between user code and PETSc code. > > I understand that. But changing our application so that every function returns an error code and is checked using > CHKERR* is not an option. The PETSc code is buried deeply in the framework. > > >>> However, I want my application to be aborted with the PetscMPIAbortErrorHandler when an error occures. >> >> Do you mean that in your code when you detect an error you want SETERRQ() to call PetscMPIAbortErrorHandler() but in >> PETSc code you want it to try to return through the stack printing the PETSc function/line numbers? You can do this >> by making your own macro mySETERRQ() that is defined to be a call to PetscMPIAbortErrorHandler and using mySETERRQ() >> to mark errors in your code. Though I do not recommend this, better to use CHKERRQ() in your code and get error >> tracebacks for PETSc code and your code. > > Ok, I'll keep that option in mind... 
> > Best, > Florian > >>> On Jan 10, 2017, at 6:42 AM, Florian Lindner wrote: >>> >>> Hello, >>> >>> I really enjoy the verbosity (line number) of the default PetscTraceBackErrorHandler. However, I want my >>> application to be aborted with the PetscMPIAbortErrorHandler when an error occures. >>> >>> Can I instruct PETSc to call first one handler, then another one? >>> >>> Thanks, Florian >> > From david.knezevic at akselos.com Wed Jan 11 15:34:24 2017 From: david.knezevic at akselos.com (David Knezevic) Date: Wed, 11 Jan 2017 16:34:24 -0500 Subject: [petsc-users] Using PCFIELDSPLIT with -pc_fieldsplit_type schur Message-ID: I have a definite block 2x2 system and I figured it'd be good to apply the PCFIELDSPLIT functionality with Schur complement, as described in Section 4.5 of the manual. The A00 block of my matrix is very small so I figured I'd specify a direct solver (i.e. MUMPS) for that block. So I did the following: - PCFieldSplitSetIS to specify the indices of the two splits - PCFieldSplitGetSubKSP to get the two KSP objects, and to set the solver and PC types for each (MUMPS for A00, ILU+CG for A11) - I set -pc_fieldsplit_schur_fact_type full Below I have pasted the output of "-ksp_view -ksp_monitor -log_view" for a test case. It seems to converge well, but I'm concerned about the speed (about 90 seconds, vs. about 1 second if I use a direct solver for the entire system). I just wanted to check if I'm setting this up in a good way? Many thanks, David ----------------------------------------------------------------------------------- 0 KSP Residual norm 5.405774214400e+04 1 KSP Residual norm 1.849649014371e+02 2 KSP Residual norm 7.462775074989e-02 3 KSP Residual norm 2.680497175260e-04 KSP Object: 1 MPI processes type: cg maximum iterations=1000 tolerances: relative=1e-06, absolute=1e-50, divergence=10000. 
left preconditioning using nonzero initial guess using PRECONDITIONED norm type for convergence test PC Object: 1 MPI processes type: fieldsplit FieldSplit with Schur preconditioner, factorization FULL Preconditioner for the Schur complement formed from A11 Split info: Split number 0 Defined by IS Split number 1 Defined by IS KSP solver for A00 block KSP Object: (fieldsplit_RB_split_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_RB_split_) 1 MPI processes type: cholesky Cholesky: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 0., needed 0. Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=324, cols=324 package used to perform factorization: mumps total: nonzeros=3042, allocated nonzeros=3042 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 2 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 0 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 0 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 0 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -24 ICNTL(28) (use 
parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. RINFO(1) (local estimated flops for the elimination after analysis): [0] 29394. RINFO(2) (local estimated flops for the assembly after factorization): [0] 1092. RINFO(3) (local estimated flops for the elimination after factorization): [0] 29394. INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 1 INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 1 INFO(23) (num of pivots eliminated on this processor after factorization): [0] 324 RINFOG(1) (global estimated flops for the elimination after analysis): 29394. RINFOG(2) (global estimated flops for the assembly after factorization): 1092. RINFOG(3) (global estimated flops for the elimination after factorization): 29394. 
(RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 INFOG(5) (estimated maximum front size in the complete tree): 12 INFOG(6) (number of nodes in the complete tree): 53 INFOG(7) (ordering option effectively use after analysis): 2 INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 INFOG(10) (total integer space store the matrix factors after factorization): 2067 INFOG(11) (order of largest frontal matrix after factorization): 12 INFOG(12) (number of off-diagonal pivots): 0 INFOG(13) (number of delayed pivots after factorization): 0 INFOG(14) (number of memory compress after factorization): 0 INFOG(15) (number of steps of iterative refinement after solution): 0 INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 INFOG(20) (estimated number of entries in the factors): 3042 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: 
number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): -2 INFOG(34) (exponent of the determinant if determinant is requested): 0 linear system matrix = precond matrix: Mat Object: (fieldsplit_RB_split_) 1 MPI processes type: seqaij rows=324, cols=324 total: nonzeros=5760, allocated nonzeros=5760 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 108 nodes, limit used is 5 KSP solver for S = A11 - A10 inv(A00) A01 KSP Object: (fieldsplit_FE_split_) 1 MPI processes type: cg maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test PC Object: (fieldsplit_FE_split_) 1 MPI processes type: bjacobi block Jacobi: number of blocks = 1 Local solve is same for all blocks, in the following KSP and PC objects: KSP Object: (fieldsplit_FE_split_sub_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_FE_split_sub_) 1 MPI processes type: ilu ILU: out-of-place factorization 0 levels of fill tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 1., needed 1. 
Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=28476, cols=28476 package used to perform factorization: petsc total: nonzeros=1017054, allocated nonzeros=1017054 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9492 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: (fieldsplit_FE_split_) 1 MPI processes type: seqaij rows=28476, cols=28476 total: nonzeros=1017054, allocated nonzeros=1017054 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9492 nodes, limit used is 5 linear system matrix followed by preconditioner matrix: Mat Object: (fieldsplit_FE_split_) 1 MPI processes type: schurcomplement rows=28476, cols=28476 Schur complement A11 - A10 inv(A00) A01 A11 Mat Object: (fieldsplit_FE_split_) 1 MPI processes type: seqaij rows=28476, cols=28476 total: nonzeros=1017054, allocated nonzeros=1017054 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9492 nodes, limit used is 5 A10 Mat Object: 1 MPI processes type: seqaij rows=28476, cols=324 total: nonzeros=936, allocated nonzeros=936 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 5717 nodes, limit used is 5 KSP of A00 KSP Object: (fieldsplit_RB_split_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_RB_split_) 1 MPI processes type: cholesky Cholesky: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 0., needed 0. 
Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=324, cols=324 package used to perform factorization: mumps total: nonzeros=3042, allocated nonzeros=3042 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 2 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 0 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 0 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 0 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -24 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. RINFO(1) (local estimated flops for the elimination after analysis): [0] 29394. RINFO(2) (local estimated flops for the assembly after factorization): [0] 1092. RINFO(3) (local estimated flops for the elimination after factorization): [0] 29394. 
INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 1 INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 1 INFO(23) (num of pivots eliminated on this processor after factorization): [0] 324 RINFOG(1) (global estimated flops for the elimination after analysis): 29394. RINFOG(2) (global estimated flops for the assembly after factorization): 1092. RINFOG(3) (global estimated flops for the elimination after factorization): 29394. (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 INFOG(5) (estimated maximum front size in the complete tree): 12 INFOG(6) (number of nodes in the complete tree): 53 INFOG(7) (ordering option effectively use after analysis): 2 INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 INFOG(10) (total integer space store the matrix factors after factorization): 2067 INFOG(11) (order of largest frontal matrix after factorization): 12 INFOG(12) (number of off-diagonal pivots): 0 INFOG(13) (number of delayed pivots after factorization): 0 INFOG(14) (number of memory compress after factorization): 0 INFOG(15) (number of steps of iterative refinement after solution): 0 INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 INFOG(20) 
(estimated number of entries in the factors): 3042 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): -2 INFOG(34) (exponent of the determinant if determinant is requested): 0 linear system matrix = precond matrix: Mat Object: (fieldsplit_RB_split_) 1 MPI processes type: seqaij rows=324, cols=324 total: nonzeros=5760, allocated nonzeros=5760 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 108 nodes, limit used is 5 A01 Mat Object: 1 MPI processes type: seqaij rows=324, cols=28476 total: nonzeros=936, allocated nonzeros=936 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 67 nodes, limit used is 5 Mat Object: (fieldsplit_FE_split_) 1 MPI processes type: seqaij rows=28476, cols=28476 total: nonzeros=1017054, allocated nonzeros=1017054 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9492 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: () 1 MPI processes type: seqaij rows=28800, cols=28800 total: nonzeros=1024686, allocated nonzeros=1024794 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9600 nodes, limit used is 5 ---------------------------------------------- 
PETSc Performance Summary: ---------------------------------------------- /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 16:16:47 2017 Using Petsc Release Version 3.7.3, unknown Max Max/Min Avg Total Time (sec): 9.179e+01 1.00000 9.179e+01 Objects: 1.990e+02 1.00000 1.990e+02 Flops: 1.634e+11 1.00000 1.634e+11 1.634e+11 Flops/sec: 1.780e+09 1.00000 1.780e+09 1.780e+09 MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Reductions: 0.000e+00 0.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flops and VecAXPY() for complex vectors of length N --> 8N flops Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 9.1787e+01 100.0% 1.6336e+11 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flops: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %F - percent flops in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flops --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage VecDot 42 1.0 2.4080e-05 1.0 8.53e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 354 VecTDot 74012 1.0 1.2440e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3388 VecNorm 37020 1.0 8.3580e-01 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2523 VecScale 37008 1.0 3.5800e-01 1.0 1.05e+09 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2944 VecCopy 37034 1.0 2.5754e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 74137 1.0 3.0537e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 74029 1.0 1.7233e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2446 VecAYPX 37001 1.0 1.2214e+00 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 1725 VecAssemblyBegin 68 1.0 2.0432e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAssemblyEnd 68 1.0 2.5988e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecScatterBegin 48 1.0 4.6921e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMult 37017 1.0 4.1269e+01 1.0 7.65e+10 1.0 0.0e+00 0.0e+00 0.0e+00 45 47 0 0 0 45 47 0 0 0 1853 MatMultAdd 37015 1.0 3.3638e+01 1.0 7.53e+10 1.0 0.0e+00 0.0e+00 0.0e+00 37 46 0 0 0 37 46 0 0 0 2238 MatSolve 74021 1.0 4.6602e+01 1.0 7.42e+10 1.0 0.0e+00 0.0e+00 
0.0e+00 51 45 0 0 0 51 45 0 0 0 1593 MatLUFactorNum 1 1.0 1.7209e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1420 MatCholFctrSym 1 1.0 8.8310e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCholFctrNum 1 1.0 3.6907e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatILUFactorSym 1 1.0 3.7372e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyBegin 29 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyEnd 29 1.0 9.9473e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRow 58026 1.0 2.8155e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRowIJ 2 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetSubMatrice 6 1.0 1.5399e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetOrdering 2 1.0 3.0112e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatZeroEntries 6 1.0 2.9490e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatView 7 1.0 3.4356e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSetUp 4 1.0 9.4891e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSolve 1 1.0 8.8793e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 PCSetUp 4 1.0 3.8375e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 637 PCSetUpOnBlocks 5 1.0 2.1250e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1150 PCApply 5 1.0 8.8789e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 KSPSolve_FS_0 5 1.0 7.5364e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSolve_FS_Schu 5 1.0 8.8785e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 KSPSolve_FS_Low 5 1.0 2.1019e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 
------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Vector 91 91 9693912 0. Vector Scatter 24 24 15936 0. Index Set 51 51 537888 0. IS L to G Mapping 3 3 240408 0. Matrix 13 13 64097868 0. Krylov Solver 6 6 7888 0. Preconditioner 6 6 6288 0. Viewer 1 0 0 0. Distributed Mesh 1 1 4624 0. Star Forest Bipartite Graph 2 2 1616 0. Discrete System 1 1 872 0. ======================================================================================================================== Average time to get PetscTime(): 0. #PETSc Option Table entries: -ksp_monitor -ksp_view -log_view #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 Configure options: --with-shared-libraries=1 --with-debugging=0 --download-suitesparse --download-blacs --download-ptscotch=yes --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc --download-hypre --download-ml ----------------------------------------- Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo Machine characteristics: Linux-4.4.0-38-generic-x86_64-with-Ubuntu-16.04-xenial Using PETSc directory: /home/dknez/software/petsc-src Using PETSc arch: arch-linux2-c-opt ----------------------------------------- Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O ${COPTFLAGS} ${CFLAGS} Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} 
----------------------------------------- Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/libmesh_install/opt_real/petsc/include -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi ----------------------------------------- Using C linker: mpicc Using Fortran linker: mpif90 Using libraries: -Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu 
-L/lib/x86_64-linux-gnu -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl ----------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... URL: From dave.mayhem23 at gmail.com Wed Jan 11 15:49:59 2017 From: dave.mayhem23 at gmail.com (Dave May) Date: Wed, 11 Jan 2017 21:49:59 +0000 Subject: [petsc-users] Using PCFIELDSPLIT with -pc_fieldsplit_type schur In-Reply-To: References: Message-ID: It looks like the Schur solve is requiring a huge number of iterates to converge (based on the instances of MatMult). This is killing the performance. Are you sure that A11 is a good approximation to S? You might consider trying the selfp option http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCFieldSplitSetSchurPre.html#PCFieldSplitSetSchurPre Note that the best approx to S is likely both problem and discretisation dependent so if selfp is also terrible, you might want to consider coding up your own approx to S for your specific system. Thanks, Dave On Wed, 11 Jan 2017 at 22:34, David Knezevic wrote: I have a definite block 2x2 system and I figured it'd be good to apply the PCFIELDSPLIT functionality with Schur complement, as described in Section 4.5 of the manual. The A00 block of my matrix is very small so I figured I'd specify a direct solver (i.e. MUMPS) for that block. So I did the following: - PCFieldSplitSetIS to specify the indices of the two splits - PCFieldSplitGetSubKSP to get the two KSP objects, and to set the solver and PC types for each (MUMPS for A00, ILU+CG for A11) - I set -pc_fieldsplit_schur_fact_type full Below I have pasted the output of "-ksp_view -ksp_monitor -log_view" for a test case. It seems to converge well, but I'm concerned about the speed (about 90 seconds, vs. about 1 second if I use a direct solver for the entire system). I just wanted to check if I'm setting this up in a good way? 
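For reference, the setup described above can be sketched as a set of PETSc runtime options. This is only a sketch, not a verified command line: it assumes the two splits are still registered in code via PCFieldSplitSetIS (the options database cannot define IS-based splits by itself), the binary name is illustrative, the `fieldsplit_RB_split_`/`fieldsplit_FE_split_` prefixes are the ones shown by the `-ksp_view` output below, and some option names (e.g. `mat_solver_package`) are specific to the 3.7-era PETSc used here.

```shell
# Hedged sketch of the solver configuration described above, expressed as
# runtime options (splits themselves still set in code via PCFieldSplitSetIS):
mpirun -n 1 ./fe_solver \
  -pc_type fieldsplit \
  -pc_fieldsplit_type schur \
  -pc_fieldsplit_schur_fact_type full \
  -fieldsplit_RB_split_ksp_type preonly \
  -fieldsplit_RB_split_pc_type cholesky \
  -fieldsplit_RB_split_pc_factor_mat_solver_package mumps \
  -fieldsplit_FE_split_ksp_type cg \
  -fieldsplit_FE_split_pc_type bjacobi \
  -ksp_monitor -ksp_view -log_view
```

The `selfp` variant suggested above would correspond to adding `-pc_fieldsplit_schur_precondition selfp`, replacing the default A11-based preconditioner for the Schur complement.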
Many thanks, David ----------------------------------------------------------------------------------- 0 KSP Residual norm 5.405774214400e+04 1 KSP Residual norm 1.849649014371e+02 2 KSP Residual norm 7.462775074989e-02 3 KSP Residual norm 2.680497175260e-04 KSP Object: 1 MPI processes type: cg maximum iterations=1000 tolerances: relative=1e-06, absolute=1e-50, divergence=10000. left preconditioning using nonzero initial guess using PRECONDITIONED norm type for convergence test PC Object: 1 MPI processes type: fieldsplit FieldSplit with Schur preconditioner, factorization FULL Preconditioner for the Schur complement formed from A11 Split info: Split number 0 Defined by IS Split number 1 Defined by IS KSP solver for A00 block KSP Object: (fieldsplit_RB_split_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_RB_split_) 1 MPI processes type: cholesky Cholesky: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 0., needed 0. 
Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=324, cols=324 package used to perform factorization: mumps total: nonzeros=3042, allocated nonzeros=3042 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 2 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 0 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 0 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 0 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -24 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. RINFO(1) (local estimated flops for the elimination after analysis): [0] 29394. RINFO(2) (local estimated flops for the assembly after factorization): [0] 1092. RINFO(3) (local estimated flops for the elimination after factorization): [0] 29394. 
INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 1 INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 1 INFO(23) (num of pivots eliminated on this processor after factorization): [0] 324 RINFOG(1) (global estimated flops for the elimination after analysis): 29394. RINFOG(2) (global estimated flops for the assembly after factorization): 1092. RINFOG(3) (global estimated flops for the elimination after factorization): 29394. (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 INFOG(5) (estimated maximum front size in the complete tree): 12 INFOG(6) (number of nodes in the complete tree): 53 INFOG(7) (ordering option effectively use after analysis): 2 INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 INFOG(10) (total integer space store the matrix factors after factorization): 2067 INFOG(11) (order of largest frontal matrix after factorization): 12 INFOG(12) (number of off-diagonal pivots): 0 INFOG(13) (number of delayed pivots after factorization): 0 INFOG(14) (number of memory compress after factorization): 0 INFOG(15) (number of steps of iterative refinement after solution): 0 INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 INFOG(20) 
(estimated number of entries in the factors): 3042 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): -2 INFOG(34) (exponent of the determinant if determinant is requested): 0 linear system matrix = precond matrix: Mat Object: (fieldsplit_RB_split_) 1 MPI processes type: seqaij rows=324, cols=324 total: nonzeros=5760, allocated nonzeros=5760 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 108 nodes, limit used is 5 KSP solver for S = A11 - A10 inv(A00) A01 KSP Object: (fieldsplit_FE_split_) 1 MPI processes type: cg maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test PC Object: (fieldsplit_FE_split_) 1 MPI processes type: bjacobi block Jacobi: number of blocks = 1 Local solve is same for all blocks, in the following KSP and PC objects: KSP Object: (fieldsplit_FE_split_sub_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_FE_split_sub_) 1 MPI processes type: ilu ILU: out-of-place factorization 0 levels of fill tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 1., needed 1. Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=28476, cols=28476 package used to perform factorization: petsc total: nonzeros=1017054, allocated nonzeros=1017054 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9492 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: (fieldsplit_FE_split_) 1 MPI processes type: seqaij rows=28476, cols=28476 total: nonzeros=1017054, allocated nonzeros=1017054 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9492 nodes, limit used is 5 linear system matrix followed by preconditioner matrix: Mat Object: (fieldsplit_FE_split_) 1 MPI processes type: schurcomplement rows=28476, cols=28476 Schur complement A11 - A10 inv(A00) A01 A11 Mat Object: (fieldsplit_FE_split_) 1 MPI processes type: seqaij rows=28476, cols=28476 total: nonzeros=1017054, allocated nonzeros=1017054 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9492 nodes, limit used is 5 A10 Mat Object: 1 MPI processes type: seqaij rows=28476, cols=324 total: nonzeros=936, allocated nonzeros=936 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 5717 nodes, limit used is 5 KSP of A00 KSP Object: (fieldsplit_RB_split_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_RB_split_) 1 MPI processes type: cholesky Cholesky: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 0., needed 0. 
Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=324, cols=324 package used to perform factorization: mumps total: nonzeros=3042, allocated nonzeros=3042 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 2 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 0 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 0 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 0 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -24 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. RINFO(1) (local estimated flops for the elimination after analysis): [0] 29394. RINFO(2) (local estimated flops for the assembly after factorization): [0] 1092. RINFO(3) (local estimated flops for the elimination after factorization): [0] 29394. 
INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 1 INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 1 INFO(23) (num of pivots eliminated on this processor after factorization): [0] 324 RINFOG(1) (global estimated flops for the elimination after analysis): 29394. RINFOG(2) (global estimated flops for the assembly after factorization): 1092. RINFOG(3) (global estimated flops for the elimination after factorization): 29394. (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 INFOG(5) (estimated maximum front size in the complete tree): 12 INFOG(6) (number of nodes in the complete tree): 53 INFOG(7) (ordering option effectively use after analysis): 2 INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 INFOG(10) (total integer space store the matrix factors after factorization): 2067 INFOG(11) (order of largest frontal matrix after factorization): 12 INFOG(12) (number of off-diagonal pivots): 0 INFOG(13) (number of delayed pivots after factorization): 0 INFOG(14) (number of memory compress after factorization): 0 INFOG(15) (number of steps of iterative refinement after solution): 0 INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 INFOG(20) 
(estimated number of entries in the factors): 3042 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): -2 INFOG(34) (exponent of the determinant if determinant is requested): 0 linear system matrix = precond matrix: Mat Object: (fieldsplit_RB_split_) 1 MPI processes type: seqaij rows=324, cols=324 total: nonzeros=5760, allocated nonzeros=5760 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 108 nodes, limit used is 5 A01 Mat Object: 1 MPI processes type: seqaij rows=324, cols=28476 total: nonzeros=936, allocated nonzeros=936 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 67 nodes, limit used is 5 Mat Object: (fieldsplit_FE_split_) 1 MPI processes type: seqaij rows=28476, cols=28476 total: nonzeros=1017054, allocated nonzeros=1017054 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9492 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: () 1 MPI processes type: seqaij rows=28800, cols=28800 total: nonzeros=1024686, allocated nonzeros=1024794 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9600 nodes, limit used is 5 ---------------------------------------------- 
PETSc Performance Summary: ---------------------------------------------- /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 16:16:47 2017 Using Petsc Release Version 3.7.3, unknown Max Max/Min Avg Total Time (sec): 9.179e+01 1.00000 9.179e+01 Objects: 1.990e+02 1.00000 1.990e+02 Flops: 1.634e+11 1.00000 1.634e+11 1.634e+11 Flops/sec: 1.780e+09 1.00000 1.780e+09 1.780e+09 MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Reductions: 0.000e+00 0.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flops and VecAXPY() for complex vectors of length N --> 8N flops Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 9.1787e+01 100.0% 1.6336e+11 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flops: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %F - percent flops in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flops --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage VecDot 42 1.0 2.4080e-05 1.0 8.53e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 354 VecTDot 74012 1.0 1.2440e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3388 VecNorm 37020 1.0 8.3580e-01 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2523 VecScale 37008 1.0 3.5800e-01 1.0 1.05e+09 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2944 VecCopy 37034 1.0 2.5754e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 74137 1.0 3.0537e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 74029 1.0 1.7233e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2446 VecAYPX 37001 1.0 1.2214e+00 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 1725 VecAssemblyBegin 68 1.0 2.0432e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAssemblyEnd 68 1.0 2.5988e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecScatterBegin 48 1.0 4.6921e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMult 37017 1.0 4.1269e+01 1.0 7.65e+10 1.0 0.0e+00 0.0e+00 0.0e+00 45 47 0 0 0 45 47 0 0 0 1853 MatMultAdd 37015 1.0 3.3638e+01 1.0 7.53e+10 1.0 0.0e+00 0.0e+00 0.0e+00 37 46 0 0 0 37 46 0 0 0 2238 MatSolve 74021 1.0 4.6602e+01 1.0 7.42e+10 1.0 0.0e+00 0.0e+00 
0.0e+00 51 45 0 0 0 51 45 0 0 0 1593 MatLUFactorNum 1 1.0 1.7209e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1420 MatCholFctrSym 1 1.0 8.8310e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCholFctrNum 1 1.0 3.6907e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatILUFactorSym 1 1.0 3.7372e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyBegin 29 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyEnd 29 1.0 9.9473e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRow 58026 1.0 2.8155e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRowIJ 2 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetSubMatrice 6 1.0 1.5399e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetOrdering 2 1.0 3.0112e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatZeroEntries 6 1.0 2.9490e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatView 7 1.0 3.4356e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSetUp 4 1.0 9.4891e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSolve 1 1.0 8.8793e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 PCSetUp 4 1.0 3.8375e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 637 PCSetUpOnBlocks 5 1.0 2.1250e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1150 PCApply 5 1.0 8.8789e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 KSPSolve_FS_0 5 1.0 7.5364e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSolve_FS_Schu 5 1.0 8.8785e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 KSPSolve_FS_Low 5 1.0 2.1019e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 
------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Vector 91 91 9693912 0. Vector Scatter 24 24 15936 0. Index Set 51 51 537888 0. IS L to G Mapping 3 3 240408 0. Matrix 13 13 64097868 0. Krylov Solver 6 6 7888 0. Preconditioner 6 6 6288 0. Viewer 1 0 0 0. Distributed Mesh 1 1 4624 0. Star Forest Bipartite Graph 2 2 1616 0. Discrete System 1 1 872 0. ======================================================================================================================== Average time to get PetscTime(): 0. #PETSc Option Table entries: -ksp_monitor -ksp_view -log_view #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 Configure options: --with-shared-libraries=1 --with-debugging=0 --download-suitesparse --download-blacs --download-ptscotch=yes --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc --download-hypre --download-ml ----------------------------------------- Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo Machine characteristics: Linux-4.4.0-38-generic-x86_64-with-Ubuntu-16.04-xenial Using PETSc directory: /home/dknez/software/petsc-src Using PETSc arch: arch-linux2-c-opt ----------------------------------------- Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O ${COPTFLAGS} ${CFLAGS} Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} 
----------------------------------------- Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/libmesh_install/opt_real/petsc/include -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi ----------------------------------------- Using C linker: mpicc Using Fortran linker: mpif90 Using libraries: -Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu 
-L/lib/x86_64-linux-gnu -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl ----------------------------------------- -------------- next part -------------- An HTML attachment was scrubbed... URL: From jed at jedbrown.org Wed Jan 11 15:51:24 2017 From: jed at jedbrown.org (Jed Brown) Date: Wed, 11 Jan 2017 15:51:24 -0600 Subject: [petsc-users] malconfigured gamg In-Reply-To: <735d76e6-3875-05f1-4f2d-1ab0158d2846@sintef.no> References: <735d76e6-3875-05f1-4f2d-1ab0158d2846@sintef.no> Message-ID: <87mvexz2n7.fsf@jedbrown.org> Arne Morten Kvarving writes: > hi, > > first, this was an user error and i totally acknowledge this, but i > wonder if this might be an oversight in your error checking: if you > configure gamg with ilu/asm smoothing, and are stupid enough to have set > the number of smoother cycles to 0, your program churns along and > apparently converges just fine (towards garbage, but apparently 'sane' > garbage (not 0, not nan, not inf)) My concern here is that skipping smoothing actually makes sense, e.g., for Kaskade cycles (no pre-smoothing). I would suggest checking the unpreconditioned (or true) residual in order to notice when a singular preconditioner causes stagnation (instead of misdiagnosing it as convergence due to the preconditioned residual dropping). > once i set sor as smoother, i got the error message > > 'PETSC ERROR: Relaxation requires global its 0 positive' which pointed > me to my stupid. > > fixing this made both asm and sor work fine. > > it's all wrapped up in a schur/fieldsplit (it's P2/P1 navier-stokes), > constructed by hand due to "surrounding" reasons. but i don't think > that's relevant as such. i've used 3.6.4 as the oldest and 3.7.4 as the > newest version and behavior was the same. if you want logs et al don't > hesitate to ask for them, but i do not think they would add much. 
>
> cheers
>
> arnem

From david.knezevic at akselos.com Wed Jan 11 16:29:02 2017
From: david.knezevic at akselos.com (David Knezevic)
Date: Wed, 11 Jan 2017 17:29:02 -0500
Subject: [petsc-users] Using PCFIELDSPLIT with -pc_fieldsplit_type schur
In-Reply-To:
References:
Message-ID:

Thanks very much for the input. I tried with "selfp" and it's about the same (log below), so I gather that I'll have to look into a user-defined approximation to S.

Thanks,
David

-----------------------------------------
  0 KSP Residual norm 5.405528187695e+04
  1 KSP Residual norm 2.187814910803e+02
  2 KSP Residual norm 1.019051577515e-01
  3 KSP Residual norm 4.370464012859e-04
KSP Object: 1 MPI processes
  type: cg
  maximum iterations=1000
  tolerances: relative=1e-06, absolute=1e-50, divergence=10000.
  left preconditioning
  using nonzero initial guess
  using PRECONDITIONED norm type for convergence test
PC Object: 1 MPI processes
  type: fieldsplit
    FieldSplit with Schur preconditioner, factorization FULL
    Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse
    Split info:
    Split number 0 Defined by IS
    Split number 1 Defined by IS
    KSP solver for A00 block
      KSP Object: (fieldsplit_RB_split_) 1 MPI processes
        type: preonly
        maximum iterations=10000, initial guess is zero
        tolerances: relative=1e-05, absolute=1e-50, divergence=10000.
        left preconditioning
        using NONE norm type for convergence test
      PC Object: (fieldsplit_RB_split_) 1 MPI processes
        type: cholesky
          Cholesky: out-of-place factorization
          tolerance for zero pivot 2.22045e-14
          matrix ordering: natural
          factor fill ratio given 0., needed 0.
Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=324, cols=324 package used to perform factorization: mumps total: nonzeros=3042, allocated nonzeros=3042 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 2 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 0 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 0 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 0 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -24 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. RINFO(1) (local estimated flops for the elimination after analysis): [0] 29394. RINFO(2) (local estimated flops for the assembly after factorization): [0] 1092. RINFO(3) (local estimated flops for the elimination after factorization): [0] 29394. 
INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 1 INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 1 INFO(23) (num of pivots eliminated on this processor after factorization): [0] 324 RINFOG(1) (global estimated flops for the elimination after analysis): 29394. RINFOG(2) (global estimated flops for the assembly after factorization): 1092. RINFOG(3) (global estimated flops for the elimination after factorization): 29394. (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 INFOG(5) (estimated maximum front size in the complete tree): 12 INFOG(6) (number of nodes in the complete tree): 53 INFOG(7) (ordering option effectively use after analysis): 2 INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 INFOG(10) (total integer space store the matrix factors after factorization): 2067 INFOG(11) (order of largest frontal matrix after factorization): 12 INFOG(12) (number of off-diagonal pivots): 0 INFOG(13) (number of delayed pivots after factorization): 0 INFOG(14) (number of memory compress after factorization): 0 INFOG(15) (number of steps of iterative refinement after solution): 0 INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 INFOG(20) 
(estimated number of entries in the factors): 3042 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): -2 INFOG(34) (exponent of the determinant if determinant is requested): 0 linear system matrix = precond matrix: Mat Object: (fieldsplit_RB_split_) 1 MPI processes type: seqaij rows=324, cols=324 total: nonzeros=5760, allocated nonzeros=5760 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 108 nodes, limit used is 5 KSP solver for S = A11 - A10 inv(A00) A01 KSP Object: (fieldsplit_FE_split_) 1 MPI processes type: cg maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test PC Object: (fieldsplit_FE_split_) 1 MPI processes type: bjacobi block Jacobi: number of blocks = 1 Local solve is same for all blocks, in the following KSP and PC objects: KSP Object: (fieldsplit_FE_split_sub_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_FE_split_sub_) 1 MPI processes type: ilu ILU: out-of-place factorization 0 levels of fill tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 1., needed 1. Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=28476, cols=28476 package used to perform factorization: petsc total: nonzeros=1037052, allocated nonzeros=1037052 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9489 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: 1 MPI processes type: seqaij rows=28476, cols=28476 total: nonzeros=1037052, allocated nonzeros=1037052 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9489 nodes, limit used is 5 linear system matrix followed by preconditioner matrix: Mat Object: (fieldsplit_FE_split_) 1 MPI processes type: schurcomplement rows=28476, cols=28476 Schur complement A11 - A10 inv(A00) A01 A11 Mat Object: (fieldsplit_FE_split_) 1 MPI processes type: seqaij rows=28476, cols=28476 total: nonzeros=1017054, allocated nonzeros=1017054 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9492 nodes, limit used is 5 A10 Mat Object: 1 MPI processes type: seqaij rows=28476, cols=324 total: nonzeros=936, allocated nonzeros=936 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 5717 nodes, limit used is 5 KSP of A00 KSP Object: (fieldsplit_RB_split_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_RB_split_) 1 MPI processes type: cholesky Cholesky: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 0., needed 0. 
Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=324, cols=324 package used to perform factorization: mumps total: nonzeros=3042, allocated nonzeros=3042 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 2 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 0 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 0 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 0 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -24 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. RINFO(1) (local estimated flops for the elimination after analysis): [0] 29394. RINFO(2) (local estimated flops for the assembly after factorization): [0] 1092. RINFO(3) (local estimated flops for the elimination after factorization): [0] 29394. 
INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 1 INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 1 INFO(23) (num of pivots eliminated on this processor after factorization): [0] 324 RINFOG(1) (global estimated flops for the elimination after analysis): 29394. RINFOG(2) (global estimated flops for the assembly after factorization): 1092. RINFOG(3) (global estimated flops for the elimination after factorization): 29394. (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 INFOG(5) (estimated maximum front size in the complete tree): 12 INFOG(6) (number of nodes in the complete tree): 53 INFOG(7) (ordering option effectively use after analysis): 2 INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 INFOG(10) (total integer space store the matrix factors after factorization): 2067 INFOG(11) (order of largest frontal matrix after factorization): 12 INFOG(12) (number of off-diagonal pivots): 0 INFOG(13) (number of delayed pivots after factorization): 0 INFOG(14) (number of memory compress after factorization): 0 INFOG(15) (number of steps of iterative refinement after solution): 0 INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 INFOG(20) 
(estimated number of entries in the factors): 3042 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): -2 INFOG(34) (exponent of the determinant if determinant is requested): 0 linear system matrix = precond matrix: Mat Object: (fieldsplit_RB_split_) 1 MPI processes type: seqaij rows=324, cols=324 total: nonzeros=5760, allocated nonzeros=5760 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 108 nodes, limit used is 5 A01 Mat Object: 1 MPI processes type: seqaij rows=324, cols=28476 total: nonzeros=936, allocated nonzeros=936 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 67 nodes, limit used is 5 Mat Object: 1 MPI processes type: seqaij rows=28476, cols=28476 total: nonzeros=1037052, allocated nonzeros=1037052 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9489 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: () 1 MPI processes type: seqaij rows=28800, cols=28800 total: nonzeros=1024686, allocated nonzeros=1024794 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9600 nodes, limit used is 5 ---------------------------------------------- PETSc Performance 
Summary: ---------------------------------------------- /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 17:22:10 2017 Using Petsc Release Version 3.7.3, unknown Max Max/Min Avg Total Time (sec): 9.638e+01 1.00000 9.638e+01 Objects: 2.030e+02 1.00000 2.030e+02 Flops: 1.732e+11 1.00000 1.732e+11 1.732e+11 Flops/sec: 1.797e+09 1.00000 1.797e+09 1.797e+09 MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Reductions: 0.000e+00 0.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flops and VecAXPY() for complex vectors of length N --> 8N flops Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 9.6379e+01 100.0% 1.7318e+11 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flops: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %F - percent flops in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flops --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage VecDot 42 1.0 2.2411e-05 1.0 8.53e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 380 VecTDot 77761 1.0 1.4294e+00 1.0 4.43e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3098 VecNorm 38894 1.0 9.1002e-01 1.0 2.22e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2434 VecScale 38882 1.0 3.7314e-01 1.0 1.11e+09 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2967 VecCopy 38908 1.0 2.1655e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 77887 1.0 3.2034e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 77777 1.0 1.8382e+00 1.0 4.43e+09 1.0 0.0e+00 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2409 VecAYPX 38875 1.0 1.2884e+00 1.0 2.21e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 1718 VecAssemblyBegin 68 1.0 1.9407e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAssemblyEnd 68 1.0 2.6941e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecScatterBegin 48 1.0 4.6349e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMult 38891 1.0 4.3045e+01 1.0 8.03e+10 1.0 0.0e+00 0.0e+00 0.0e+00 45 46 0 0 0 45 46 0 0 0 1866 MatMultAdd 38889 1.0 3.5360e+01 1.0 7.91e+10 1.0 0.0e+00 0.0e+00 0.0e+00 37 46 0 0 0 37 46 0 0 0 2236 MatSolve 77769 1.0 4.8780e+01 1.0 7.95e+10 1.0 0.0e+00 0.0e+00 
0.0e+00 51 46 0 0 0 51 46 0 0 0 1631 MatLUFactorNum 1 1.0 1.9575e-02 1.0 2.49e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1274 MatCholFctrSym 1 1.0 9.4891e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCholFctrNum 1 1.0 3.7885e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatILUFactorSym 1 1.0 4.1780e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatConvert 1 1.0 3.0041e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatScale 2 1.0 2.7180e-05 1.0 2.53e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 930 MatAssemblyBegin 32 1.0 4.0531e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyEnd 32 1.0 1.2032e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRow 114978 1.0 5.9254e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRowIJ 2 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetSubMatrice 6 1.0 1.5707e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetOrdering 2 1.0 3.2425e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatZeroEntries 6 1.0 3.0580e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatView 7 1.0 3.5119e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAXPY 1 1.0 1.9384e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMatMult 1 1.0 2.7120e-03 1.0 3.16e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 117 MatMatMultSym 1 1.0 1.8010e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMatMultNum 1 1.0 6.1703e-04 1.0 3.16e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 513 KSPSetUp 4 1.0 9.8944e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSolve 1 1.0 9.3380e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 PCSetUp 4 1.0 6.6326e-02 1.0 2.53e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 
0 381 PCSetUpOnBlocks 5 1.0 2.4082e-02 1.0 2.49e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1036 PCApply 5 1.0 9.3376e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 KSPSolve_FS_0 5 1.0 7.0214e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSolve_FS_Schu 5 1.0 9.3372e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 KSPSolve_FS_Low 5 1.0 2.1377e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Vector 92 92 9698040 0. Vector Scatter 24 24 15936 0. Index Set 51 51 537876 0. IS L to G Mapping 3 3 240408 0. Matrix 16 16 77377776 0. Krylov Solver 6 6 7888 0. Preconditioner 6 6 6288 0. Viewer 1 0 0 0. Distributed Mesh 1 1 4624 0. Star Forest Bipartite Graph 2 2 1616 0. Discrete System 1 1 872 0. ======================================================================================================================== Average time to get PetscTime(): 0. 
#PETSc Option Table entries: -ksp_monitor -ksp_view -log_view #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 Configure options: --with-shared-libraries=1 --with-debugging=0 --download-suitesparse --download-blacs --download-ptscotch=yes --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc --download-hypre --download-ml ----------------------------------------- Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo Machine characteristics: Linux-4.4.0-38-generic-x86_64-with-Ubuntu-16.04-xenial Using PETSc directory: /home/dknez/software/petsc-src Using PETSc arch: arch-linux2-c-opt ----------------------------------------- Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O ${COPTFLAGS} ${CFLAGS} Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} ----------------------------------------- Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/libmesh_install/opt_real/petsc/include -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi ----------------------------------------- Using C linker: mpicc Using Fortran linker: mpif90 Using libraries: -Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib 
-lpetsc -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl ----------------------------------------- On Wed, Jan 11, 2017 at 4:49 PM, Dave May wrote: > It looks like the Schur solve is requiring a huge number of iterates to > converge (based on the instances of MatMult). > This is killing the performance. > > Are you sure that A11 is a good approximation to S? 
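The selfp option Dave refers to can be requested without code changes. A minimal sketch with standard PETSc run-time options; the second option (lumping) is an assumption based on the "(lumped, if requested)" note in the ksp_view output above, not something tried in the thread:

```
# Build Sp = A11 - A10 inv(diag(A00)) A01 and use it as the matrix from
# which the Schur-complement preconditioner is constructed:
-pc_fieldsplit_schur_precondition selfp

# Optionally lump A00 (row sums) instead of taking its diagonal entrywise:
-mat_schur_complement_ainv_type lump
```

Note that selfp is only as good as diag(A00) is as an approximation of A00; for David's case, where A00 is a small dense-ish block, that can be a poor approximation, which is consistent with the similar iteration counts he reports.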
You might consider > trying the selfp option > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/ > PCFieldSplitSetSchurPre.html#PCFieldSplitSetSchurPre > > Note that the best approx to S is likely both problem and discretisation > dependent so if selfp is also terrible, you might want to consider coding > up your own approx to S for your specific system. > > > Thanks, > Dave > > > On Wed, 11 Jan 2017 at 22:34, David Knezevic > wrote: > > I have a definite block 2x2 system and I figured it'd be good to apply the > PCFIELDSPLIT functionality with Schur complement, as described in Section > 4.5 of the manual. > > The A00 block of my matrix is very small so I figured I'd specify a direct > solver (i.e. MUMPS) for that block. > > So I did the following: > - PCFieldSplitSetIS to specify the indices of the two splits > - PCFieldSplitGetSubKSP to get the two KSP objects, and to set the solver > and PC types for each (MUMPS for A00, ILU+CG for A11) > - I set -pc_fieldsplit_schur_fact_type full > > Below I have pasted the output of "-ksp_view -ksp_monitor -log_view" for a > test case. It seems to converge well, but I'm concerned about the speed > (about 90 seconds, vs. about 1 second if I use a direct solver for the > entire system). I just wanted to check if I'm setting this up in a good way? > > Many thanks, > David > > ------------------------------------------------------------ > ----------------------- > > 0 KSP Residual norm 5.405774214400e+04 > 1 KSP Residual norm 1.849649014371e+02 > 2 KSP Residual norm 7.462775074989e-02 > 3 KSP Residual norm 2.680497175260e-04 > KSP Object: 1 MPI processes > type: cg > maximum iterations=1000 > tolerances: relative=1e-06, absolute=1e-50, divergence=10000. 
> left preconditioning > using nonzero initial guess > using PRECONDITIONED norm type for convergence test > PC Object: 1 MPI processes > type: fieldsplit > FieldSplit with Schur preconditioner, factorization FULL > Preconditioner for the Schur complement formed from A11 > Split info: > Split number 0 Defined by IS > Split number 1 Defined by IS > KSP solver for A00 block > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using NONE norm type for convergence test > PC Object: (fieldsplit_RB_split_) 1 MPI processes > type: cholesky > Cholesky: out-of-place factorization > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 0., needed 0. > Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=324, cols=324 > package used to perform factorization: mumps > total: nonzeros=3042, allocated nonzeros=3042 > total number of mallocs used during MatSetValues calls =0 > MUMPS run parameters: > SYM (matrix type): 2 > PAR (host participation): 1 > ICNTL(1) (output for error): 6 > ICNTL(2) (output of diagnostic msg): 0 > ICNTL(3) (output for global info): 0 > ICNTL(4) (level of printing): 0 > ICNTL(5) (input mat struct): 0 > ICNTL(6) (matrix prescaling): 7 > ICNTL(7) (sequentia matrix ordering):7 > ICNTL(8) (scalling strategy): 77 > ICNTL(10) (max num of refinements): 0 > ICNTL(11) (error analysis): 0 > ICNTL(12) (efficiency control): > 0 > ICNTL(13) (efficiency control): > 0 > ICNTL(14) (percentage of estimated workspace > increase): 20 > ICNTL(18) (input mat struct): > 0 > ICNTL(19) (Shur complement info): > 0 > ICNTL(20) (rhs sparse pattern): > 0 > ICNTL(21) (solution struct): > 0 > ICNTL(22) (in-core/out-of-core facility): > 0 > ICNTL(23) (max size of memory can be allocated > locally):0 > ICNTL(24) (detection of null pivot rows): > 0 > ICNTL(25) (computation of 
a null space basis): > 0 > ICNTL(26) (Schur options for rhs or solution): > 0 > ICNTL(27) (experimental parameter): > -24 > ICNTL(28) (use parallel or sequential ordering): > 1 > ICNTL(29) (parallel ordering): > 0 > ICNTL(30) (user-specified set of entries in inv(A)): > 0 > ICNTL(31) (factors is discarded in the solve phase): > 0 > ICNTL(33) (compute determinant): > 0 > CNTL(1) (relative pivoting threshold): 0.01 > CNTL(2) (stopping criterion of refinement): > 1.49012e-08 > CNTL(3) (absolute pivoting threshold): 0. > CNTL(4) (value of static pivoting): -1. > CNTL(5) (fixation for null pivots): 0. > RINFO(1) (local estimated flops for the elimination > after analysis): > [0] 29394. > RINFO(2) (local estimated flops for the assembly after > factorization): > [0] 1092. > RINFO(3) (local estimated flops for the elimination > after factorization): > [0] 29394. > INFO(15) (estimated size of (in MB) MUMPS internal > data for running numerical factorization): > [0] 1 > INFO(16) (size of (in MB) MUMPS internal data used > during numerical factorization): > [0] 1 > INFO(23) (num of pivots eliminated on this processor > after factorization): > [0] 324 > RINFOG(1) (global estimated flops for the elimination > after analysis): 29394. > RINFOG(2) (global estimated flops for the assembly > after factorization): 1092. > RINFOG(3) (global estimated flops for the elimination > after factorization): 29394. 
> (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): > (0.,0.)*(2^0) > INFOG(3) (estimated real workspace for factors on all > processors after analysis): 3888 > INFOG(4) (estimated integer workspace for factors on > all processors after analysis): 2067 > INFOG(5) (estimated maximum front size in the complete > tree): 12 > INFOG(6) (number of nodes in the complete tree): 53 > INFOG(7) (ordering option effectively use after > analysis): 2 > INFOG(8) (structural symmetry in percent of the > permuted matrix after analysis): 100 > INFOG(9) (total real/complex workspace to store the > matrix factors after factorization): 3888 > INFOG(10) (total integer space store the matrix > factors after factorization): 2067 > INFOG(11) (order of largest frontal matrix after > factorization): 12 > INFOG(12) (number of off-diagonal pivots): 0 > INFOG(13) (number of delayed pivots after > factorization): 0 > INFOG(14) (number of memory compress after > factorization): 0 > INFOG(15) (number of steps of iterative refinement > after solution): 0 > INFOG(16) (estimated size (in MB) of all MUMPS > internal data for factorization after analysis: value on the most memory > consuming processor): 1 > INFOG(17) (estimated size of all MUMPS internal data > for factorization after analysis: sum over all processors): 1 > INFOG(18) (size of all MUMPS internal data allocated > during factorization: value on the most memory consuming processor): 1 > INFOG(19) (size of all MUMPS internal data allocated > during factorization: sum over all processors): 1 > INFOG(20) (estimated number of entries in the > factors): 3042 > INFOG(21) (size in MB of memory effectively used > during factorization - value on the most memory consuming processor): 1 > INFOG(22) (size in MB of memory effectively used > during factorization - sum over all processors): 1 > INFOG(23) (after analysis: value of ICNTL(6) > effectively used): 5 > INFOG(24) (after analysis: value of ICNTL(12) > effectively used): 1 > INFOG(25) (after 
factorization: number of pivots > modified by static pivoting): 0 > INFOG(28) (after factorization: number of null pivots > encountered): 0 > INFOG(29) (after factorization: effective number of > entries in the factors (sum over all processors)): 3042 > INFOG(30, 31) (after solution: size in Mbytes of > memory used during solution phase): 0, 0 > INFOG(32) (after analysis: type of analysis done): 1 > INFOG(33) (value used for ICNTL(8)): -2 > INFOG(34) (exponent of the determinant if determinant > is requested): 0 > linear system matrix = precond matrix: > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > type: seqaij > rows=324, cols=324 > total: nonzeros=5760, allocated nonzeros=5760 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 108 nodes, limit used is 5 > KSP solver for S = A11 - A10 inv(A00) A01 > KSP Object: (fieldsplit_FE_split_) 1 MPI processes > type: cg > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: (fieldsplit_FE_split_) 1 MPI processes > type: bjacobi > block Jacobi: number of blocks = 1 > Local solve is same for all blocks, in the following KSP and PC > objects: > KSP Object: (fieldsplit_FE_split_sub_) 1 MPI > processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using NONE norm type for convergence test > PC Object: (fieldsplit_FE_split_sub_) 1 MPI > processes > type: ilu > ILU: out-of-place factorization > 0 levels of fill > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 1., needed 1. 
> Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=28476, cols=28476 > package used to perform factorization: petsc > total: nonzeros=1017054, allocated nonzeros=1017054 > total number of mallocs used during MatSetValues calls > =0 > using I-node routines: found 9492 nodes, limit used > is 5 > linear system matrix = precond matrix: > Mat Object: (fieldsplit_FE_split_) 1 > MPI processes > type: seqaij > rows=28476, cols=28476 > total: nonzeros=1017054, allocated nonzeros=1017054 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 9492 nodes, limit used is 5 > linear system matrix followed by preconditioner matrix: > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > type: schurcomplement > rows=28476, cols=28476 > Schur complement A11 - A10 inv(A00) A01 > A11 > Mat Object: (fieldsplit_FE_split_) > 1 MPI processes > type: seqaij > rows=28476, cols=28476 > total: nonzeros=1017054, allocated nonzeros=1017054 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 9492 nodes, limit used is 5 > A10 > Mat Object: 1 MPI processes > type: seqaij > rows=28476, cols=324 > total: nonzeros=936, allocated nonzeros=936 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 5717 nodes, limit used is 5 > KSP of A00 > KSP Object: (fieldsplit_RB_split_) > 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000. > left preconditioning > using NONE norm type for convergence test > PC Object: (fieldsplit_RB_split_) > 1 MPI processes > type: cholesky > Cholesky: out-of-place factorization > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 0., needed 0. 
> Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=324, cols=324 > package used to perform factorization: mumps > total: nonzeros=3042, allocated nonzeros=3042 > total number of mallocs used during MatSetValues > calls =0 > MUMPS run parameters: > SYM (matrix type): 2 > PAR (host participation): 1 > ICNTL(1) (output for error): 6 > ICNTL(2) (output of diagnostic msg): 0 > ICNTL(3) (output for global info): 0 > ICNTL(4) (level of printing): 0 > ICNTL(5) (input mat struct): 0 > ICNTL(6) (matrix prescaling): 7 > ICNTL(7) (sequentia matrix ordering):7 > ICNTL(8) (scalling strategy): 77 > ICNTL(10) (max num of refinements): 0 > ICNTL(11) (error analysis): 0 > ICNTL(12) (efficiency control): > 0 > ICNTL(13) (efficiency control): > 0 > ICNTL(14) (percentage of estimated workspace > increase): 20 > ICNTL(18) (input mat struct): > 0 > ICNTL(19) (Shur complement info): > 0 > ICNTL(20) (rhs sparse pattern): > 0 > ICNTL(21) (solution struct): > 0 > ICNTL(22) (in-core/out-of-core facility): > 0 > ICNTL(23) (max size of memory can be allocated > locally):0 > ICNTL(24) (detection of null pivot rows): > 0 > ICNTL(25) (computation of a null space basis): > 0 > ICNTL(26) (Schur options for rhs or solution): > 0 > ICNTL(27) (experimental parameter): > -24 > ICNTL(28) (use parallel or sequential > ordering): 1 > ICNTL(29) (parallel ordering): > 0 > ICNTL(30) (user-specified set of entries in > inv(A)): 0 > ICNTL(31) (factors is discarded in the solve > phase): 0 > ICNTL(33) (compute determinant): > 0 > CNTL(1) (relative pivoting threshold): > 0.01 > CNTL(2) (stopping criterion of refinement): > 1.49012e-08 > CNTL(3) (absolute pivoting threshold): 0. > CNTL(4) (value of static pivoting): > -1. > CNTL(5) (fixation for null pivots): 0. > RINFO(1) (local estimated flops for the > elimination after analysis): > [0] 29394. > RINFO(2) (local estimated flops for the > assembly after factorization): > [0] 1092. 
> RINFO(3) (local estimated flops for the > elimination after factorization): > [0] 29394. > INFO(15) (estimated size of (in MB) MUMPS > internal data for running numerical factorization): > [0] 1 > INFO(16) (size of (in MB) MUMPS internal data > used during numerical factorization): > [0] 1 > INFO(23) (num of pivots eliminated on this > processor after factorization): > [0] 324 > RINFOG(1) (global estimated flops for the > elimination after analysis): 29394. > RINFOG(2) (global estimated flops for the > assembly after factorization): 1092. > RINFOG(3) (global estimated flops for the > elimination after factorization): 29394. > (RINFOG(12) RINFOG(13))*2^INFOG(34) > (determinant): (0.,0.)*(2^0) > INFOG(3) (estimated real workspace for factors > on all processors after analysis): 3888 > INFOG(4) (estimated integer workspace for > factors on all processors after analysis): 2067 > INFOG(5) (estimated maximum front size in the > complete tree): 12 > INFOG(6) (number of nodes in the complete > tree): 53 > INFOG(7) (ordering option effectively use > after analysis): 2 > INFOG(8) (structural symmetry in percent of > the permuted matrix after analysis): 100 > INFOG(9) (total real/complex workspace to > store the matrix factors after factorization): 3888 > INFOG(10) (total integer space store the > matrix factors after factorization): 2067 > INFOG(11) (order of largest frontal matrix > after factorization): 12 > INFOG(12) (number of off-diagonal pivots): 0 > INFOG(13) (number of delayed pivots after > factorization): 0 > INFOG(14) (number of memory compress after > factorization): 0 > INFOG(15) (number of steps of iterative > refinement after solution): 0 > INFOG(16) (estimated size (in MB) of all MUMPS > internal data for factorization after analysis: value on the most memory > consuming processor): 1 > INFOG(17) (estimated size of all MUMPS > internal data for factorization after analysis: sum over all processors): 1 > INFOG(18) (size of all MUMPS internal data > allocated 
during factorization: value on the most memory consuming > processor): 1 > INFOG(19) (size of all MUMPS internal data > allocated during factorization: sum over all processors): 1 > INFOG(20) (estimated number of entries in the > factors): 3042 > INFOG(21) (size in MB of memory effectively > used during factorization - value on the most memory consuming processor): > 1 > INFOG(22) (size in MB of memory effectively > used during factorization - sum over all processors): 1 > INFOG(23) (after analysis: value of ICNTL(6) > effectively used): 5 > INFOG(24) (after analysis: value of ICNTL(12) > effectively used): 1 > INFOG(25) (after factorization: number of > pivots modified by static pivoting): 0 > INFOG(28) (after factorization: number of null > pivots encountered): 0 > INFOG(29) (after factorization: effective > number of entries in the factors (sum over all processors)): 3042 > INFOG(30, 31) (after solution: size in Mbytes > of memory used during solution phase): 0, 0 > INFOG(32) (after analysis: type of analysis > done): 1 > INFOG(33) (value used for ICNTL(8)): -2 > INFOG(34) (exponent of the determinant if > determinant is requested): 0 > linear system matrix = precond matrix: > Mat Object: (fieldsplit_RB_split_) > 1 MPI processes > type: seqaij > rows=324, cols=324 > total: nonzeros=5760, allocated nonzeros=5760 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 108 nodes, limit used is 5 > A01 > Mat Object: 1 MPI processes > type: seqaij > rows=324, cols=28476 > total: nonzeros=936, allocated nonzeros=936 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 67 nodes, limit used is 5 > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > type: seqaij > rows=28476, cols=28476 > total: nonzeros=1017054, allocated nonzeros=1017054 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 9492 nodes, limit used is 5 > linear system matrix = precond matrix: > 
Mat Object: () 1 MPI processes > type: seqaij > rows=28800, cols=28800 > total: nonzeros=1024686, allocated nonzeros=1024794 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 9600 nodes, limit used is 5 > > > ---------------------------------------------- PETSc Performance Summary: > ---------------------------------------------- > > /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a > arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 > 16:16:47 2017 > Using Petsc Release Version 3.7.3, unknown > > Max Max/Min Avg Total > Time (sec): 9.179e+01 1.00000 9.179e+01 > Objects: 1.990e+02 1.00000 1.990e+02 > Flops: 1.634e+11 1.00000 1.634e+11 1.634e+11 > Flops/sec: 1.780e+09 1.00000 1.780e+09 1.780e+09 > MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 > MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 > MPI Reductions: 0.000e+00 0.00000 > > Flop counting convention: 1 flop = 1 real number operation of type > (multiply/divide/add/subtract) > e.g., VecAXPY() for real vectors of length N > --> 2N flops > and VecAXPY() for complex vectors of length N > --> 8N flops > > Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages > --- -- Message Lengths -- -- Reductions -- > Avg %Total Avg %Total counts > %Total Avg %Total counts %Total > 0: Main Stage: 9.1787e+01 100.0% 1.6336e+11 100.0% 0.000e+00 > 0.0% 0.000e+00 0.0% 0.000e+00 0.0% > > ------------------------------------------------------------ > ------------------------------------------------------------ > See the 'Profiling' chapter of the users' manual for details on > interpreting output. > Phase summary info: > Count: number of times phase was executed > Time and Flops: Max - maximum over all processors > Ratio - ratio of maximum to minimum over all processors > Mess: number of messages sent > Avg. 
len: average message length (bytes) > Reduct: number of global reductions > Global: entire computation > Stage: stages of a computation. Set stages with PetscLogStagePush() and > PetscLogStagePop(). > %T - percent time in this phase %F - percent flops in this > phase > %M - percent messages in this phase %L - percent message lengths > in this phase > %R - percent reductions in this phase > Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time > over all processors) > ------------------------------------------------------------ > ------------------------------------------------------------ > Event Count Time (sec) Flops > --- Global --- --- Stage --- Total > Max Ratio Max Ratio Max Ratio Mess Avg len > Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s > ------------------------------------------------------------ > ------------------------------------------------------------ > > --- Event Stage 0: Main Stage > > VecDot 42 1.0 2.4080e-05 1.0 8.53e+03 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 354 > VecTDot 74012 1.0 1.2440e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 1 3 0 0 0 1 3 0 0 0 3388 > VecNorm 37020 1.0 8.3580e-01 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 1 1 0 0 0 1 1 0 0 0 2523 > VecScale 37008 1.0 3.5800e-01 1.0 1.05e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 1 0 0 0 0 1 0 0 0 2944 > VecCopy 37034 1.0 2.5754e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > VecSet 74137 1.0 3.0537e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > VecAXPY 74029 1.0 1.7233e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 2 3 0 0 0 2 3 0 0 0 2446 > VecAYPX 37001 1.0 1.2214e+00 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 1 1 0 0 0 1 1 0 0 0 1725 > VecAssemblyBegin 68 1.0 2.0432e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > VecAssemblyEnd 68 1.0 2.5988e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > VecScatterBegin 48 1.0 4.6921e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 
0 0 0 0 0 0 0 0 > MatMult 37017 1.0 4.1269e+01 1.0 7.65e+10 1.0 0.0e+00 0.0e+00 > 0.0e+00 45 47 0 0 0 45 47 0 0 0 1853 > MatMultAdd 37015 1.0 3.3638e+01 1.0 7.53e+10 1.0 0.0e+00 0.0e+00 > 0.0e+00 37 46 0 0 0 37 46 0 0 0 2238 > MatSolve 74021 1.0 4.6602e+01 1.0 7.42e+10 1.0 0.0e+00 0.0e+00 > 0.0e+00 51 45 0 0 0 51 45 0 0 0 1593 > MatLUFactorNum 1 1.0 1.7209e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 1420 > MatCholFctrSym 1 1.0 8.8310e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatCholFctrNum 1 1.0 3.6907e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatILUFactorSym 1 1.0 3.7372e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatAssemblyBegin 29 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatAssemblyEnd 29 1.0 9.9473e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatGetRow 58026 1.0 2.8155e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatGetRowIJ 2 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatGetSubMatrice 6 1.0 1.5399e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatGetOrdering 2 1.0 3.0112e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatZeroEntries 6 1.0 2.9490e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatView 7 1.0 3.4356e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > KSPSetUp 4 1.0 9.4891e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > KSPSolve 1 1.0 8.8793e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 > 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > PCSetUp 4 1.0 3.8375e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 637 > PCSetUpOnBlocks 5 1.0 2.1250e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 1150 > PCApply 5 1.0 8.8789e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 > 0.0e+00 97100 0 0 0 97100 0 0 0 
1840 > KSPSolve_FS_0 5 1.0 7.5364e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > KSPSolve_FS_Schu 5 1.0 8.8785e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 > 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > KSPSolve_FS_Low 5 1.0 2.1019e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > ------------------------------------------------------------ > ------------------------------------------------------------ > > Memory usage is given in bytes: > > Object Type Creations Destructions Memory Descendants' Mem. > Reports information only for process 0. > > --- Event Stage 0: Main Stage > > Vector 91 91 9693912 0. > Vector Scatter 24 24 15936 0. > Index Set 51 51 537888 0. > IS L to G Mapping 3 3 240408 0. > Matrix 13 13 64097868 0. > Krylov Solver 6 6 7888 0. > Preconditioner 6 6 6288 0. > Viewer 1 0 0 0. > Distributed Mesh 1 1 4624 0. > Star Forest Bipartite Graph 2 2 1616 0. > Discrete System 1 1 872 0. > ============================================================ > ============================================================ > Average time to get PetscTime(): 0. 
> #PETSc Option Table entries: > -ksp_monitor > -ksp_view > -log_view > #End of PETSc Option Table entries > Compiled without FORTRAN kernels > Compiled with full precision matrices (default) > sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 > sizeof(PetscScalar) 8 sizeof(PetscInt) 4 > Configure options: --with-shared-libraries=1 --with-debugging=0 > --download-suitesparse --download-blacs --download-ptscotch=yes > --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl > --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps > --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc > --download-hypre --download-ml > ----------------------------------------- > Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo > Machine characteristics: Linux-4.4.0-38-generic-x86_64- > with-Ubuntu-16.04-xenial > Using PETSc directory: /home/dknez/software/petsc-src > Using PETSc arch: arch-linux2-c-opt > ----------------------------------------- > > Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing > -Wno-unknown-pragmas -fvisibility=hidden -g -O ${COPTFLAGS} ${CFLAGS} > Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 > -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} > ----------------------------------------- > > Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include > -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include > -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include > -I/home/dknez/software/libmesh_install/opt_real/petsc/include > -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent > -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include > -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi > ----------------------------------------- > > Using C linker: mpicc > Using Fortran linker: mpif90 > Using libraries: 
-Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib > -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc > -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib > -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps > -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE > -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib > -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu > -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx > -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod > -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig > -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 > -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 > -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch > -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 > -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm > -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz > -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib > -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu > -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl > -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl > ----------------------------------------- > > > > >
From dave.mayhem23 at gmail.com Wed Jan 11 16:52:14 2017 From: dave.mayhem23 at gmail.com (Dave May) Date: Wed, 11 Jan 2017 22:52:14 +0000 Subject: [petsc-users] Using PCFIELDSPLIT with -pc_fieldsplit_type schur In-Reply-To: References: Message-ID: On 11 January 2017 at 22:29, David Knezevic wrote: > Thanks very much for the input. I tried with "selfp" and it's about the > same (log below), > Yeah, looks similar. > so I gather that I'll have to look into a user-defined approximation to S. > Where does the 2x2 block system come from? Maybe someone on the list knows the right approximation to use for S. > > Thanks, > David > > > ----------------------------------------- > > 0 KSP Residual norm 5.405528187695e+04 > 1 KSP Residual norm 2.187814910803e+02 > 2 KSP Residual norm 1.019051577515e-01 > 3 KSP Residual norm 4.370464012859e-04 > KSP Object: 1 MPI processes > type: cg > maximum iterations=1000 > tolerances: relative=1e-06, absolute=1e-50, divergence=10000. > left preconditioning > using nonzero initial guess > using PRECONDITIONED norm type for convergence test > PC Object: 1 MPI processes > type: fieldsplit > FieldSplit with Schur preconditioner, factorization FULL > Preconditioner for the Schur complement formed from Sp, an assembled > approximation to S, which uses (lumped, if requested) A00's diagonal's > inverse > Split info: > Split number 0 Defined by IS > Split number 1 Defined by IS > KSP solver for A00 block > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using NONE norm type for convergence test > PC Object: (fieldsplit_RB_split_) 1 MPI processes > type: cholesky > Cholesky: out-of-place factorization > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 0., needed 0.
> Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=324, cols=324 > package used to perform factorization: mumps > total: nonzeros=3042, allocated nonzeros=3042 > total number of mallocs used during MatSetValues calls =0 > MUMPS run parameters: > SYM (matrix type): 2 > PAR (host participation): 1 > ICNTL(1) (output for error): 6 > ICNTL(2) (output of diagnostic msg): 0 > ICNTL(3) (output for global info): 0 > ICNTL(4) (level of printing): 0 > ICNTL(5) (input mat struct): 0 > ICNTL(6) (matrix prescaling): 7 > ICNTL(7) (sequentia matrix ordering):7 > ICNTL(8) (scalling strategy): 77 > ICNTL(10) (max num of refinements): 0 > ICNTL(11) (error analysis): 0 > ICNTL(12) (efficiency control): > 0 > ICNTL(13) (efficiency control): > 0 > ICNTL(14) (percentage of estimated workspace > increase): 20 > ICNTL(18) (input mat struct): > 0 > ICNTL(19) (Shur complement info): > 0 > ICNTL(20) (rhs sparse pattern): > 0 > ICNTL(21) (solution struct): > 0 > ICNTL(22) (in-core/out-of-core facility): > 0 > ICNTL(23) (max size of memory can be allocated > locally):0 > ICNTL(24) (detection of null pivot rows): > 0 > ICNTL(25) (computation of a null space basis): > 0 > ICNTL(26) (Schur options for rhs or solution): > 0 > ICNTL(27) (experimental parameter): > -24 > ICNTL(28) (use parallel or sequential ordering): > 1 > ICNTL(29) (parallel ordering): > 0 > ICNTL(30) (user-specified set of entries in inv(A)): > 0 > ICNTL(31) (factors is discarded in the solve phase): > 0 > ICNTL(33) (compute determinant): > 0 > CNTL(1) (relative pivoting threshold): 0.01 > CNTL(2) (stopping criterion of refinement): > 1.49012e-08 > CNTL(3) (absolute pivoting threshold): 0. > CNTL(4) (value of static pivoting): -1. > CNTL(5) (fixation for null pivots): 0. > RINFO(1) (local estimated flops for the elimination > after analysis): > [0] 29394. > RINFO(2) (local estimated flops for the assembly after > factorization): > [0] 1092. 
> RINFO(3) (local estimated flops for the elimination > after factorization): > [0] 29394. > INFO(15) (estimated size of (in MB) MUMPS internal > data for running numerical factorization): > [0] 1 > INFO(16) (size of (in MB) MUMPS internal data used > during numerical factorization): > [0] 1 > INFO(23) (num of pivots eliminated on this processor > after factorization): > [0] 324 > RINFOG(1) (global estimated flops for the elimination > after analysis): 29394. > RINFOG(2) (global estimated flops for the assembly > after factorization): 1092. > RINFOG(3) (global estimated flops for the elimination > after factorization): 29394. > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): > (0.,0.)*(2^0) > INFOG(3) (estimated real workspace for factors on all > processors after analysis): 3888 > INFOG(4) (estimated integer workspace for factors on > all processors after analysis): 2067 > INFOG(5) (estimated maximum front size in the complete > tree): 12 > INFOG(6) (number of nodes in the complete tree): 53 > INFOG(7) (ordering option effectively use after > analysis): 2 > INFOG(8) (structural symmetry in percent of the > permuted matrix after analysis): 100 > INFOG(9) (total real/complex workspace to store the > matrix factors after factorization): 3888 > INFOG(10) (total integer space store the matrix > factors after factorization): 2067 > INFOG(11) (order of largest frontal matrix after > factorization): 12 > INFOG(12) (number of off-diagonal pivots): 0 > INFOG(13) (number of delayed pivots after > factorization): 0 > INFOG(14) (number of memory compress after > factorization): 0 > INFOG(15) (number of steps of iterative refinement > after solution): 0 > INFOG(16) (estimated size (in MB) of all MUMPS > internal data for factorization after analysis: value on the most memory > consuming processor): 1 > INFOG(17) (estimated size of all MUMPS internal data > for factorization after analysis: sum over all processors): 1 > INFOG(18) (size of all MUMPS internal data allocated > 
during factorization: value on the most memory consuming processor): 1 > INFOG(19) (size of all MUMPS internal data allocated > during factorization: sum over all processors): 1 > INFOG(20) (estimated number of entries in the > factors): 3042 > INFOG(21) (size in MB of memory effectively used > during factorization - value on the most memory consuming processor): 1 > INFOG(22) (size in MB of memory effectively used > during factorization - sum over all processors): 1 > INFOG(23) (after analysis: value of ICNTL(6) > effectively used): 5 > INFOG(24) (after analysis: value of ICNTL(12) > effectively used): 1 > INFOG(25) (after factorization: number of pivots > modified by static pivoting): 0 > INFOG(28) (after factorization: number of null pivots > encountered): 0 > INFOG(29) (after factorization: effective number of > entries in the factors (sum over all processors)): 3042 > INFOG(30, 31) (after solution: size in Mbytes of > memory used during solution phase): 0, 0 > INFOG(32) (after analysis: type of analysis done): 1 > INFOG(33) (value used for ICNTL(8)): -2 > INFOG(34) (exponent of the determinant if determinant > is requested): 0 > linear system matrix = precond matrix: > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > type: seqaij > rows=324, cols=324 > total: nonzeros=5760, allocated nonzeros=5760 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 108 nodes, limit used is 5 > KSP solver for S = A11 - A10 inv(A00) A01 > KSP Object: (fieldsplit_FE_split_) 1 MPI processes > type: cg > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
> left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: (fieldsplit_FE_split_) 1 MPI processes > type: bjacobi > block Jacobi: number of blocks = 1 > Local solve is same for all blocks, in the following KSP and PC > objects: > KSP Object: (fieldsplit_FE_split_sub_) 1 MPI > processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using NONE norm type for convergence test > PC Object: (fieldsplit_FE_split_sub_) 1 MPI > processes > type: ilu > ILU: out-of-place factorization > 0 levels of fill > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 1., needed 1. > Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=28476, cols=28476 > package used to perform factorization: petsc > total: nonzeros=1037052, allocated nonzeros=1037052 > total number of mallocs used during MatSetValues calls > =0 > using I-node routines: found 9489 nodes, limit used > is 5 > linear system matrix = precond matrix: > Mat Object: 1 MPI processes > type: seqaij > rows=28476, cols=28476 > total: nonzeros=1037052, allocated nonzeros=1037052 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 9489 nodes, limit used is 5 > linear system matrix followed by preconditioner matrix: > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > type: schurcomplement > rows=28476, cols=28476 > Schur complement A11 - A10 inv(A00) A01 > A11 > Mat Object: (fieldsplit_FE_split_) > 1 MPI processes > type: seqaij > rows=28476, cols=28476 > total: nonzeros=1017054, allocated nonzeros=1017054 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 9492 nodes, limit used is 5 > A10 > Mat Object: 1 MPI processes > type: seqaij > rows=28476, cols=324 > total: nonzeros=936, allocated nonzeros=936 > total number of mallocs used during MatSetValues 
calls =0 > using I-node routines: found 5717 nodes, limit used is 5 > KSP of A00 > KSP Object: (fieldsplit_RB_split_) > 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000. > left preconditioning > using NONE norm type for convergence test > PC Object: (fieldsplit_RB_split_) > 1 MPI processes > type: cholesky > Cholesky: out-of-place factorization > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 0., needed 0. > Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=324, cols=324 > package used to perform factorization: mumps > total: nonzeros=3042, allocated nonzeros=3042 > total number of mallocs used during MatSetValues > calls =0 > MUMPS run parameters: > SYM (matrix type): 2 > PAR (host participation): 1 > ICNTL(1) (output for error): 6 > ICNTL(2) (output of diagnostic msg): 0 > ICNTL(3) (output for global info): 0 > ICNTL(4) (level of printing): 0 > ICNTL(5) (input mat struct): 0 > ICNTL(6) (matrix prescaling): 7 > ICNTL(7) (sequentia matrix ordering):7 > ICNTL(8) (scalling strategy): 77 > ICNTL(10) (max num of refinements): 0 > ICNTL(11) (error analysis): 0 > ICNTL(12) (efficiency control): > 0 > ICNTL(13) (efficiency control): > 0 > ICNTL(14) (percentage of estimated workspace > increase): 20 > ICNTL(18) (input mat struct): > 0 > ICNTL(19) (Shur complement info): > 0 > ICNTL(20) (rhs sparse pattern): > 0 > ICNTL(21) (solution struct): > 0 > ICNTL(22) (in-core/out-of-core facility): > 0 > ICNTL(23) (max size of memory can be allocated > locally):0 > ICNTL(24) (detection of null pivot rows): > 0 > ICNTL(25) (computation of a null space basis): > 0 > ICNTL(26) (Schur options for rhs or solution): > 0 > ICNTL(27) (experimental parameter): > -24 > ICNTL(28) (use parallel or sequential > ordering): 1 > ICNTL(29) (parallel ordering): > 0 > ICNTL(30) (user-specified set of entries in > inv(A)): 0 > ICNTL(31) 
(factors is discarded in the solve > phase): 0 > ICNTL(33) (compute determinant): > 0 > CNTL(1) (relative pivoting threshold): > 0.01 > CNTL(2) (stopping criterion of refinement): > 1.49012e-08 > CNTL(3) (absolute pivoting threshold): 0. > CNTL(4) (value of static pivoting): > -1. > CNTL(5) (fixation for null pivots): 0. > RINFO(1) (local estimated flops for the > elimination after analysis): > [0] 29394. > RINFO(2) (local estimated flops for the > assembly after factorization): > [0] 1092. > RINFO(3) (local estimated flops for the > elimination after factorization): > [0] 29394. > INFO(15) (estimated size of (in MB) MUMPS > internal data for running numerical factorization): > [0] 1 > INFO(16) (size of (in MB) MUMPS internal data > used during numerical factorization): > [0] 1 > INFO(23) (num of pivots eliminated on this > processor after factorization): > [0] 324 > RINFOG(1) (global estimated flops for the > elimination after analysis): 29394. > RINFOG(2) (global estimated flops for the > assembly after factorization): 1092. > RINFOG(3) (global estimated flops for the > elimination after factorization): 29394. 
> (RINFOG(12) RINFOG(13))*2^INFOG(34) > (determinant): (0.,0.)*(2^0) > INFOG(3) (estimated real workspace for factors > on all processors after analysis): 3888 > INFOG(4) (estimated integer workspace for > factors on all processors after analysis): 2067 > INFOG(5) (estimated maximum front size in the > complete tree): 12 > INFOG(6) (number of nodes in the complete > tree): 53 > INFOG(7) (ordering option effectively use > after analysis): 2 > INFOG(8) (structural symmetry in percent of > the permuted matrix after analysis): 100 > INFOG(9) (total real/complex workspace to > store the matrix factors after factorization): 3888 > INFOG(10) (total integer space store the > matrix factors after factorization): 2067 > INFOG(11) (order of largest frontal matrix > after factorization): 12 > INFOG(12) (number of off-diagonal pivots): 0 > INFOG(13) (number of delayed pivots after > factorization): 0 > INFOG(14) (number of memory compress after > factorization): 0 > INFOG(15) (number of steps of iterative > refinement after solution): 0 > INFOG(16) (estimated size (in MB) of all MUMPS > internal data for factorization after analysis: value on the most memory > consuming processor): 1 > INFOG(17) (estimated size of all MUMPS > internal data for factorization after analysis: sum over all processors): 1 > INFOG(18) (size of all MUMPS internal data > allocated during factorization: value on the most memory consuming > processor): 1 > INFOG(19) (size of all MUMPS internal data > allocated during factorization: sum over all processors): 1 > INFOG(20) (estimated number of entries in the > factors): 3042 > INFOG(21) (size in MB of memory effectively > used during factorization - value on the most memory consuming processor): > 1 > INFOG(22) (size in MB of memory effectively > used during factorization - sum over all processors): 1 > INFOG(23) (after analysis: value of ICNTL(6) > effectively used): 5 > INFOG(24) (after analysis: value of ICNTL(12) > effectively used): 1 > INFOG(25) 
(after factorization: number of > pivots modified by static pivoting): 0 > INFOG(28) (after factorization: number of null > pivots encountered): 0 > INFOG(29) (after factorization: effective > number of entries in the factors (sum over all processors)): 3042 > INFOG(30, 31) (after solution: size in Mbytes > of memory used during solution phase): 0, 0 > INFOG(32) (after analysis: type of analysis > done): 1 > INFOG(33) (value used for ICNTL(8)): -2 > INFOG(34) (exponent of the determinant if > determinant is requested): 0 > linear system matrix = precond matrix: > Mat Object: (fieldsplit_RB_split_) > 1 MPI processes > type: seqaij > rows=324, cols=324 > total: nonzeros=5760, allocated nonzeros=5760 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 108 nodes, limit used is 5 > A01 > Mat Object: 1 MPI processes > type: seqaij > rows=324, cols=28476 > total: nonzeros=936, allocated nonzeros=936 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 67 nodes, limit used is 5 > Mat Object: 1 MPI processes > type: seqaij > rows=28476, cols=28476 > total: nonzeros=1037052, allocated nonzeros=1037052 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 9489 nodes, limit used is 5 > linear system matrix = precond matrix: > Mat Object: () 1 MPI processes > type: seqaij > rows=28800, cols=28800 > total: nonzeros=1024686, allocated nonzeros=1024794 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 9600 nodes, limit used is 5 > > ---------------------------------------------- PETSc Performance Summary: > ---------------------------------------------- > > /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a > arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 > 17:22:10 2017 > Using Petsc Release Version 3.7.3, unknown > > Max Max/Min Avg Total > Time (sec): 9.638e+01 1.00000 9.638e+01 > 
Objects: 2.030e+02 1.00000 2.030e+02 > Flops: 1.732e+11 1.00000 1.732e+11 1.732e+11 > Flops/sec: 1.797e+09 1.00000 1.797e+09 1.797e+09 > MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 > MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 > MPI Reductions: 0.000e+00 0.00000 > > Flop counting convention: 1 flop = 1 real number operation of type > (multiply/divide/add/subtract) > e.g., VecAXPY() for real vectors of length N > --> 2N flops > and VecAXPY() for complex vectors of length N > --> 8N flops > > Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages > --- -- Message Lengths -- -- Reductions -- > Avg %Total Avg %Total counts > %Total Avg %Total counts %Total > 0: Main Stage: 9.6379e+01 100.0% 1.7318e+11 100.0% 0.000e+00 > 0.0% 0.000e+00 0.0% 0.000e+00 0.0% > > ------------------------------------------------------------ > ------------------------------------------------------------ > See the 'Profiling' chapter of the users' manual for details on > interpreting output. > Phase summary info: > Count: number of times phase was executed > Time and Flops: Max - maximum over all processors > Ratio - ratio of maximum to minimum over all processors > Mess: number of messages sent > Avg. len: average message length (bytes) > Reduct: number of global reductions > Global: entire computation > Stage: stages of a computation. Set stages with PetscLogStagePush() and > PetscLogStagePop(). 
> %T - percent time in this phase %F - percent flops in this > phase > %M - percent messages in this phase %L - percent message lengths > in this phase > %R - percent reductions in this phase > Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time > over all processors) > ------------------------------------------------------------ > ------------------------------------------------------------ > Event Count Time (sec) Flops > --- Global --- --- Stage --- Total > Max Ratio Max Ratio Max Ratio Mess Avg len > Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s > ------------------------------------------------------------ > ------------------------------------------------------------ > > --- Event Stage 0: Main Stage > > VecDot 42 1.0 2.2411e-05 1.0 8.53e+03 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 380 > VecTDot 77761 1.0 1.4294e+00 1.0 4.43e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 1 3 0 0 0 1 3 0 0 0 3098 > VecNorm 38894 1.0 9.1002e-01 1.0 2.22e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 1 1 0 0 0 1 1 0 0 0 2434 > VecScale 38882 1.0 3.7314e-01 1.0 1.11e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 1 0 0 0 0 1 0 0 0 2967 > VecCopy 38908 1.0 2.1655e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > VecSet 77887 1.0 3.2034e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > VecAXPY 77777 1.0 1.8382e+00 1.0 4.43e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 2 3 0 0 0 2 3 0 0 0 2409 > VecAYPX 38875 1.0 1.2884e+00 1.0 2.21e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 1 1 0 0 0 1 1 0 0 0 1718 > VecAssemblyBegin 68 1.0 1.9407e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > VecAssemblyEnd 68 1.0 2.6941e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > VecScatterBegin 48 1.0 4.6349e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatMult 38891 1.0 4.3045e+01 1.0 8.03e+10 1.0 0.0e+00 0.0e+00 > 0.0e+00 45 46 0 0 0 45 46 0 0 0 1866 > MatMultAdd 38889 1.0 3.5360e+01 1.0 7.91e+10 1.0 0.0e+00 0.0e+00 > 0.0e+00 37 
46 0 0 0 37 46 0 0 0 2236 > MatSolve 77769 1.0 4.8780e+01 1.0 7.95e+10 1.0 0.0e+00 0.0e+00 > 0.0e+00 51 46 0 0 0 51 46 0 0 0 1631 > MatLUFactorNum 1 1.0 1.9575e-02 1.0 2.49e+07 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 1274 > MatCholFctrSym 1 1.0 9.4891e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatCholFctrNum 1 1.0 3.7885e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatILUFactorSym 1 1.0 4.1780e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatConvert 1 1.0 3.0041e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatScale 2 1.0 2.7180e-05 1.0 2.53e+04 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 930 > MatAssemblyBegin 32 1.0 4.0531e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatAssemblyEnd 32 1.0 1.2032e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatGetRow 114978 1.0 5.9254e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatGetRowIJ 2 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatGetSubMatrice 6 1.0 1.5707e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatGetOrdering 2 1.0 3.2425e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatZeroEntries 6 1.0 3.0580e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatView 7 1.0 3.5119e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatAXPY 1 1.0 1.9384e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatMatMult 1 1.0 2.7120e-03 1.0 3.16e+05 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 117 > MatMatMultSym 1 1.0 1.8010e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatMatMultNum 1 1.0 6.1703e-04 1.0 3.16e+05 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 513 > KSPSetUp 4 1.0 9.8944e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > KSPSolve 
1 1.0 9.3380e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 > 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > PCSetUp 4 1.0 6.6326e-02 1.0 2.53e+07 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 381 > PCSetUpOnBlocks 5 1.0 2.4082e-02 1.0 2.49e+07 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 1036 > PCApply 5 1.0 9.3376e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 > 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > KSPSolve_FS_0 5 1.0 7.0214e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > KSPSolve_FS_Schu 5 1.0 9.3372e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 > 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > KSPSolve_FS_Low 5 1.0 2.1377e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > ------------------------------------------------------------ > ------------------------------------------------------------ > > Memory usage is given in bytes: > > Object Type Creations Destructions Memory Descendants' Mem. > Reports information only for process 0. > > --- Event Stage 0: Main Stage > > Vector 92 92 9698040 0. > Vector Scatter 24 24 15936 0. > Index Set 51 51 537876 0. > IS L to G Mapping 3 3 240408 0. > Matrix 16 16 77377776 0. > Krylov Solver 6 6 7888 0. > Preconditioner 6 6 6288 0. > Viewer 1 0 0 0. > Distributed Mesh 1 1 4624 0. > Star Forest Bipartite Graph 2 2 1616 0. > Discrete System 1 1 872 0. > ============================================================ > ============================================================ > Average time to get PetscTime(): 0. 
> #PETSc Option Table entries: > -ksp_monitor > -ksp_view > -log_view > #End of PETSc Option Table entries > Compiled without FORTRAN kernels > Compiled with full precision matrices (default) > sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 > sizeof(PetscScalar) 8 sizeof(PetscInt) 4 > Configure options: --with-shared-libraries=1 --with-debugging=0 > --download-suitesparse --download-blacs --download-ptscotch=yes > --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl > --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps > --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc > --download-hypre --download-ml > ----------------------------------------- > Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo > Machine characteristics: Linux-4.4.0-38-generic-x86_64- > with-Ubuntu-16.04-xenial > Using PETSc directory: /home/dknez/software/petsc-src > Using PETSc arch: arch-linux2-c-opt > ----------------------------------------- > > Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing > -Wno-unknown-pragmas -fvisibility=hidden -g -O ${COPTFLAGS} ${CFLAGS} > Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 > -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} > ----------------------------------------- > > Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include > -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include > -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include > -I/home/dknez/software/libmesh_install/opt_real/petsc/include > -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent > -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include > -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi > ----------------------------------------- > > Using C linker: mpicc > Using Fortran linker: mpif90 > Using libraries: 
> -Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib
> -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc
> -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib
> -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps
> -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE
> -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib
> -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5
> -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu
> -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx
> -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod
> -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig
> -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64
> -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64
> -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch
> -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08
> -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm
> -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz
> -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib
> -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5
> -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu
> -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu
> -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl
> -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl
> -----------------------------------------
>
>
>
> On Wed, Jan 11, 2017 at 4:49 PM, Dave May wrote:
>
>> It looks like the Schur solve is requiring a huge number of iterates to
>> converge (based on the instances of MatMult).
>> This is killing the performance.
>>
>> Are you sure that A11 is a good approximation to S?
>> You might consider
>> trying the selfp option
>>
>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/
>> PC/PCFieldSplitSetSchurPre.html#PCFieldSplitSetSchurPre
>>
>> Note that the best approx to S is likely both problem and discretisation
>> dependent, so if selfp is also terrible, you might want to consider coding
>> up your own approx to S for your specific system.
>>
>>
>> Thanks,
>> Dave
>>
>>
>> On Wed, 11 Jan 2017 at 22:34, David Knezevic
>> wrote:
>>
>> I have a definite block 2x2 system and I figured it'd be good to apply
>> the PCFIELDSPLIT functionality with Schur complement, as described in
>> Section 4.5 of the manual.
>>
>> The A00 block of my matrix is very small, so I figured I'd specify a
>> direct solver (i.e. MUMPS) for that block.
>>
>> So I did the following:
>> - PCFieldSplitSetIS to specify the indices of the two splits
>> - PCFieldSplitGetSubKSP to get the two KSP objects, and to set the solver
>> and PC types for each (MUMPS for A00, ILU+CG for A11)
>> - I set -pc_fieldsplit_schur_fact_type full
>>
>> Below I have pasted the output of "-ksp_view -ksp_monitor -log_view" for
>> a test case. It seems to converge well, but I'm concerned about the speed
>> (about 90 seconds, vs. about 1 second if I use a direct solver for the
>> entire system). I just wanted to check whether I'm setting this up in a
>> good way.
>>
>> Many thanks,
>> David
>>
>> ------------------------------------------------------------
>> -----------------------
>>
>> 0 KSP Residual norm 5.405774214400e+04
>> 1 KSP Residual norm 1.849649014371e+02
>> 2 KSP Residual norm 7.462775074989e-02
>> 3 KSP Residual norm 2.680497175260e-04
>> KSP Object: 1 MPI processes
>> type: cg
>> maximum iterations=1000
>> tolerances: relative=1e-06, absolute=1e-50, divergence=10000.
>> left preconditioning >> using nonzero initial guess >> using PRECONDITIONED norm type for convergence test >> PC Object: 1 MPI processes >> type: fieldsplit >> FieldSplit with Schur preconditioner, factorization FULL >> Preconditioner for the Schur complement formed from A11 >> Split info: >> Split number 0 Defined by IS >> Split number 1 Defined by IS >> KSP solver for A00 block >> KSP Object: (fieldsplit_RB_split_) 1 MPI processes >> type: preonly >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000. >> left preconditioning >> using NONE norm type for convergence test >> PC Object: (fieldsplit_RB_split_) 1 MPI processes >> type: cholesky >> Cholesky: out-of-place factorization >> tolerance for zero pivot 2.22045e-14 >> matrix ordering: natural >> factor fill ratio given 0., needed 0. >> Factored matrix follows: >> Mat Object: 1 MPI processes >> type: seqaij >> rows=324, cols=324 >> package used to perform factorization: mumps >> total: nonzeros=3042, allocated nonzeros=3042 >> total number of mallocs used during MatSetValues calls =0 >> MUMPS run parameters: >> SYM (matrix type): 2 >> PAR (host participation): 1 >> ICNTL(1) (output for error): 6 >> ICNTL(2) (output of diagnostic msg): 0 >> ICNTL(3) (output for global info): 0 >> ICNTL(4) (level of printing): 0 >> ICNTL(5) (input mat struct): 0 >> ICNTL(6) (matrix prescaling): 7 >> ICNTL(7) (sequentia matrix ordering):7 >> ICNTL(8) (scalling strategy): 77 >> ICNTL(10) (max num of refinements): 0 >> ICNTL(11) (error analysis): 0 >> ICNTL(12) (efficiency control): >> 0 >> ICNTL(13) (efficiency control): >> 0 >> ICNTL(14) (percentage of estimated workspace >> increase): 20 >> ICNTL(18) (input mat struct): >> 0 >> ICNTL(19) (Shur complement info): >> 0 >> ICNTL(20) (rhs sparse pattern): >> 0 >> ICNTL(21) (solution struct): >> 0 >> ICNTL(22) (in-core/out-of-core facility): >> 0 >> ICNTL(23) (max size of memory can be allocated >> locally):0 >> ICNTL(24) 
(detection of null pivot rows): >> 0 >> ICNTL(25) (computation of a null space basis): >> 0 >> ICNTL(26) (Schur options for rhs or solution): >> 0 >> ICNTL(27) (experimental parameter): >> -24 >> ICNTL(28) (use parallel or sequential ordering): >> 1 >> ICNTL(29) (parallel ordering): >> 0 >> ICNTL(30) (user-specified set of entries in inv(A)): >> 0 >> ICNTL(31) (factors is discarded in the solve phase): >> 0 >> ICNTL(33) (compute determinant): >> 0 >> CNTL(1) (relative pivoting threshold): 0.01 >> CNTL(2) (stopping criterion of refinement): >> 1.49012e-08 >> CNTL(3) (absolute pivoting threshold): 0. >> CNTL(4) (value of static pivoting): -1. >> CNTL(5) (fixation for null pivots): 0. >> RINFO(1) (local estimated flops for the elimination >> after analysis): >> [0] 29394. >> RINFO(2) (local estimated flops for the assembly >> after factorization): >> [0] 1092. >> RINFO(3) (local estimated flops for the elimination >> after factorization): >> [0] 29394. >> INFO(15) (estimated size of (in MB) MUMPS internal >> data for running numerical factorization): >> [0] 1 >> INFO(16) (size of (in MB) MUMPS internal data used >> during numerical factorization): >> [0] 1 >> INFO(23) (num of pivots eliminated on this processor >> after factorization): >> [0] 324 >> RINFOG(1) (global estimated flops for the elimination >> after analysis): 29394. >> RINFOG(2) (global estimated flops for the assembly >> after factorization): 1092. >> RINFOG(3) (global estimated flops for the elimination >> after factorization): 29394. 
>> (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): >> (0.,0.)*(2^0) >> INFOG(3) (estimated real workspace for factors on all >> processors after analysis): 3888 >> INFOG(4) (estimated integer workspace for factors on >> all processors after analysis): 2067 >> INFOG(5) (estimated maximum front size in the >> complete tree): 12 >> INFOG(6) (number of nodes in the complete tree): 53 >> INFOG(7) (ordering option effectively use after >> analysis): 2 >> INFOG(8) (structural symmetry in percent of the >> permuted matrix after analysis): 100 >> INFOG(9) (total real/complex workspace to store the >> matrix factors after factorization): 3888 >> INFOG(10) (total integer space store the matrix >> factors after factorization): 2067 >> INFOG(11) (order of largest frontal matrix after >> factorization): 12 >> INFOG(12) (number of off-diagonal pivots): 0 >> INFOG(13) (number of delayed pivots after >> factorization): 0 >> INFOG(14) (number of memory compress after >> factorization): 0 >> INFOG(15) (number of steps of iterative refinement >> after solution): 0 >> INFOG(16) (estimated size (in MB) of all MUMPS >> internal data for factorization after analysis: value on the most memory >> consuming processor): 1 >> INFOG(17) (estimated size of all MUMPS internal data >> for factorization after analysis: sum over all processors): 1 >> INFOG(18) (size of all MUMPS internal data allocated >> during factorization: value on the most memory consuming processor): 1 >> INFOG(19) (size of all MUMPS internal data allocated >> during factorization: sum over all processors): 1 >> INFOG(20) (estimated number of entries in the >> factors): 3042 >> INFOG(21) (size in MB of memory effectively used >> during factorization - value on the most memory consuming processor): 1 >> INFOG(22) (size in MB of memory effectively used >> during factorization - sum over all processors): 1 >> INFOG(23) (after analysis: value of ICNTL(6) >> effectively used): 5 >> INFOG(24) (after analysis: value of ICNTL(12) 
>> effectively used): 1 >> INFOG(25) (after factorization: number of pivots >> modified by static pivoting): 0 >> INFOG(28) (after factorization: number of null pivots >> encountered): 0 >> INFOG(29) (after factorization: effective number of >> entries in the factors (sum over all processors)): 3042 >> INFOG(30, 31) (after solution: size in Mbytes of >> memory used during solution phase): 0, 0 >> INFOG(32) (after analysis: type of analysis done): 1 >> INFOG(33) (value used for ICNTL(8)): -2 >> INFOG(34) (exponent of the determinant if determinant >> is requested): 0 >> linear system matrix = precond matrix: >> Mat Object: (fieldsplit_RB_split_) 1 MPI processes >> type: seqaij >> rows=324, cols=324 >> total: nonzeros=5760, allocated nonzeros=5760 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 108 nodes, limit used is 5 >> KSP solver for S = A11 - A10 inv(A00) A01 >> KSP Object: (fieldsplit_FE_split_) 1 MPI processes >> type: cg >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000. >> left preconditioning >> using PRECONDITIONED norm type for convergence test >> PC Object: (fieldsplit_FE_split_) 1 MPI processes >> type: bjacobi >> block Jacobi: number of blocks = 1 >> Local solve is same for all blocks, in the following KSP and PC >> objects: >> KSP Object: (fieldsplit_FE_split_sub_) 1 MPI >> processes >> type: preonly >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000. >> left preconditioning >> using NONE norm type for convergence test >> PC Object: (fieldsplit_FE_split_sub_) 1 MPI >> processes >> type: ilu >> ILU: out-of-place factorization >> 0 levels of fill >> tolerance for zero pivot 2.22045e-14 >> matrix ordering: natural >> factor fill ratio given 1., needed 1. 
>> Factored matrix follows: >> Mat Object: 1 MPI processes >> type: seqaij >> rows=28476, cols=28476 >> package used to perform factorization: petsc >> total: nonzeros=1017054, allocated nonzeros=1017054 >> total number of mallocs used during MatSetValues >> calls =0 >> using I-node routines: found 9492 nodes, limit used >> is 5 >> linear system matrix = precond matrix: >> Mat Object: (fieldsplit_FE_split_) 1 >> MPI processes >> type: seqaij >> rows=28476, cols=28476 >> total: nonzeros=1017054, allocated nonzeros=1017054 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 9492 nodes, limit used is 5 >> linear system matrix followed by preconditioner matrix: >> Mat Object: (fieldsplit_FE_split_) 1 MPI processes >> type: schurcomplement >> rows=28476, cols=28476 >> Schur complement A11 - A10 inv(A00) A01 >> A11 >> Mat Object: (fieldsplit_FE_split_) >> 1 MPI processes >> type: seqaij >> rows=28476, cols=28476 >> total: nonzeros=1017054, allocated nonzeros=1017054 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 9492 nodes, limit used is 5 >> A10 >> Mat Object: 1 MPI processes >> type: seqaij >> rows=28476, cols=324 >> total: nonzeros=936, allocated nonzeros=936 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 5717 nodes, limit used is 5 >> KSP of A00 >> KSP Object: (fieldsplit_RB_split_) >> 1 MPI processes >> type: preonly >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, >> divergence=10000. >> left preconditioning >> using NONE norm type for convergence test >> PC Object: (fieldsplit_RB_split_) >> 1 MPI processes >> type: cholesky >> Cholesky: out-of-place factorization >> tolerance for zero pivot 2.22045e-14 >> matrix ordering: natural >> factor fill ratio given 0., needed 0. 
>> Factored matrix follows: >> Mat Object: 1 MPI processes >> type: seqaij >> rows=324, cols=324 >> package used to perform factorization: mumps >> total: nonzeros=3042, allocated nonzeros=3042 >> total number of mallocs used during MatSetValues >> calls =0 >> MUMPS run parameters: >> SYM (matrix type): 2 >> PAR (host participation): 1 >> ICNTL(1) (output for error): 6 >> ICNTL(2) (output of diagnostic msg): 0 >> ICNTL(3) (output for global info): 0 >> ICNTL(4) (level of printing): 0 >> ICNTL(5) (input mat struct): 0 >> ICNTL(6) (matrix prescaling): 7 >> ICNTL(7) (sequentia matrix ordering):7 >> ICNTL(8) (scalling strategy): 77 >> ICNTL(10) (max num of refinements): 0 >> ICNTL(11) (error analysis): 0 >> ICNTL(12) (efficiency control): >> 0 >> ICNTL(13) (efficiency control): >> 0 >> ICNTL(14) (percentage of estimated workspace >> increase): 20 >> ICNTL(18) (input mat struct): >> 0 >> ICNTL(19) (Shur complement info): >> 0 >> ICNTL(20) (rhs sparse pattern): >> 0 >> ICNTL(21) (solution struct): >> 0 >> ICNTL(22) (in-core/out-of-core facility): >> 0 >> ICNTL(23) (max size of memory can be >> allocated locally):0 >> ICNTL(24) (detection of null pivot rows): >> 0 >> ICNTL(25) (computation of a null space >> basis): 0 >> ICNTL(26) (Schur options for rhs or >> solution): 0 >> ICNTL(27) (experimental parameter): >> -24 >> ICNTL(28) (use parallel or sequential >> ordering): 1 >> ICNTL(29) (parallel ordering): >> 0 >> ICNTL(30) (user-specified set of entries in >> inv(A)): 0 >> ICNTL(31) (factors is discarded in the solve >> phase): 0 >> ICNTL(33) (compute determinant): >> 0 >> CNTL(1) (relative pivoting threshold): >> 0.01 >> CNTL(2) (stopping criterion of refinement): >> 1.49012e-08 >> CNTL(3) (absolute pivoting threshold): >> 0. >> CNTL(4) (value of static pivoting): >> -1. >> CNTL(5) (fixation for null pivots): >> 0. >> RINFO(1) (local estimated flops for the >> elimination after analysis): >> [0] 29394. 
>> RINFO(2) (local estimated flops for the >> assembly after factorization): >> [0] 1092. >> RINFO(3) (local estimated flops for the >> elimination after factorization): >> [0] 29394. >> INFO(15) (estimated size of (in MB) MUMPS >> internal data for running numerical factorization): >> [0] 1 >> INFO(16) (size of (in MB) MUMPS internal data >> used during numerical factorization): >> [0] 1 >> INFO(23) (num of pivots eliminated on this >> processor after factorization): >> [0] 324 >> RINFOG(1) (global estimated flops for the >> elimination after analysis): 29394. >> RINFOG(2) (global estimated flops for the >> assembly after factorization): 1092. >> RINFOG(3) (global estimated flops for the >> elimination after factorization): 29394. >> (RINFOG(12) RINFOG(13))*2^INFOG(34) >> (determinant): (0.,0.)*(2^0) >> INFOG(3) (estimated real workspace for >> factors on all processors after analysis): 3888 >> INFOG(4) (estimated integer workspace for >> factors on all processors after analysis): 2067 >> INFOG(5) (estimated maximum front size in the >> complete tree): 12 >> INFOG(6) (number of nodes in the complete >> tree): 53 >> INFOG(7) (ordering option effectively use >> after analysis): 2 >> INFOG(8) (structural symmetry in percent of >> the permuted matrix after analysis): 100 >> INFOG(9) (total real/complex workspace to >> store the matrix factors after factorization): 3888 >> INFOG(10) (total integer space store the >> matrix factors after factorization): 2067 >> INFOG(11) (order of largest frontal matrix >> after factorization): 12 >> INFOG(12) (number of off-diagonal pivots): 0 >> INFOG(13) (number of delayed pivots after >> factorization): 0 >> INFOG(14) (number of memory compress after >> factorization): 0 >> INFOG(15) (number of steps of iterative >> refinement after solution): 0 >> INFOG(16) (estimated size (in MB) of all >> MUMPS internal data for factorization after analysis: value on the most >> memory consuming processor): 1 >> INFOG(17) (estimated size of all 
MUMPS >> internal data for factorization after analysis: sum over all processors): 1 >> INFOG(18) (size of all MUMPS internal data >> allocated during factorization: value on the most memory consuming >> processor): 1 >> INFOG(19) (size of all MUMPS internal data >> allocated during factorization: sum over all processors): 1 >> INFOG(20) (estimated number of entries in the >> factors): 3042 >> INFOG(21) (size in MB of memory effectively >> used during factorization - value on the most memory consuming processor): >> 1 >> INFOG(22) (size in MB of memory effectively >> used during factorization - sum over all processors): 1 >> INFOG(23) (after analysis: value of ICNTL(6) >> effectively used): 5 >> INFOG(24) (after analysis: value of ICNTL(12) >> effectively used): 1 >> INFOG(25) (after factorization: number of >> pivots modified by static pivoting): 0 >> INFOG(28) (after factorization: number of >> null pivots encountered): 0 >> INFOG(29) (after factorization: effective >> number of entries in the factors (sum over all processors)): 3042 >> INFOG(30, 31) (after solution: size in Mbytes >> of memory used during solution phase): 0, 0 >> INFOG(32) (after analysis: type of analysis >> done): 1 >> INFOG(33) (value used for ICNTL(8)): -2 >> INFOG(34) (exponent of the determinant if >> determinant is requested): 0 >> linear system matrix = precond matrix: >> Mat Object: (fieldsplit_RB_split_) >> 1 MPI processes >> type: seqaij >> rows=324, cols=324 >> total: nonzeros=5760, allocated nonzeros=5760 >> total number of mallocs used during MatSetValues calls >> =0 >> using I-node routines: found 108 nodes, limit used is >> 5 >> A01 >> Mat Object: 1 MPI processes >> type: seqaij >> rows=324, cols=28476 >> total: nonzeros=936, allocated nonzeros=936 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 67 nodes, limit used is 5 >> Mat Object: (fieldsplit_FE_split_) 1 MPI processes >> type: seqaij >> rows=28476, cols=28476 >> total: 
nonzeros=1017054, allocated nonzeros=1017054 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 9492 nodes, limit used is 5 >> linear system matrix = precond matrix: >> Mat Object: () 1 MPI processes >> type: seqaij >> rows=28800, cols=28800 >> total: nonzeros=1024686, allocated nonzeros=1024794 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 9600 nodes, limit used is 5 >> >> >> ---------------------------------------------- PETSc Performance >> Summary: ---------------------------------------------- >> >> /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a >> arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 >> 16:16:47 2017 >> Using Petsc Release Version 3.7.3, unknown >> >> Max Max/Min Avg Total >> Time (sec): 9.179e+01 1.00000 9.179e+01 >> Objects: 1.990e+02 1.00000 1.990e+02 >> Flops: 1.634e+11 1.00000 1.634e+11 1.634e+11 >> Flops/sec: 1.780e+09 1.00000 1.780e+09 1.780e+09 >> MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 >> MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 >> MPI Reductions: 0.000e+00 0.00000 >> >> Flop counting convention: 1 flop = 1 real number operation of type >> (multiply/divide/add/subtract) >> e.g., VecAXPY() for real vectors of length N >> --> 2N flops >> and VecAXPY() for complex vectors of length N >> --> 8N flops >> >> Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages >> --- -- Message Lengths -- -- Reductions -- >> Avg %Total Avg %Total counts >> %Total Avg %Total counts %Total >> 0: Main Stage: 9.1787e+01 100.0% 1.6336e+11 100.0% 0.000e+00 >> 0.0% 0.000e+00 0.0% 0.000e+00 0.0% >> >> ------------------------------------------------------------ >> ------------------------------------------------------------ >> See the 'Profiling' chapter of the users' manual for details on >> interpreting output. 
>> Phase summary info: >> Count: number of times phase was executed >> Time and Flops: Max - maximum over all processors >> Ratio - ratio of maximum to minimum over all processors >> Mess: number of messages sent >> Avg. len: average message length (bytes) >> Reduct: number of global reductions >> Global: entire computation >> Stage: stages of a computation. Set stages with PetscLogStagePush() >> and PetscLogStagePop(). >> %T - percent time in this phase %F - percent flops in this >> phase >> %M - percent messages in this phase %L - percent message >> lengths in this phase >> %R - percent reductions in this phase >> Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time >> over all processors) >> ------------------------------------------------------------ >> ------------------------------------------------------------ >> Event Count Time (sec) Flops >> --- Global --- --- Stage --- Total >> Max Ratio Max Ratio Max Ratio Mess Avg len >> Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s >> ------------------------------------------------------------ >> ------------------------------------------------------------ >> >> --- Event Stage 0: Main Stage >> >> VecDot 42 1.0 2.4080e-05 1.0 8.53e+03 1.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 354 >> VecTDot 74012 1.0 1.2440e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 >> 0.0e+00 1 3 0 0 0 1 3 0 0 0 3388 >> VecNorm 37020 1.0 8.3580e-01 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 >> 0.0e+00 1 1 0 0 0 1 1 0 0 0 2523 >> VecScale 37008 1.0 3.5800e-01 1.0 1.05e+09 1.0 0.0e+00 0.0e+00 >> 0.0e+00 0 1 0 0 0 0 1 0 0 0 2944 >> VecCopy 37034 1.0 2.5754e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> VecSet 74137 1.0 3.0537e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> VecAXPY 74029 1.0 1.7233e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 >> 0.0e+00 2 3 0 0 0 2 3 0 0 0 2446 >> VecAYPX 37001 1.0 1.2214e+00 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 >> 0.0e+00 1 1 0 0 0 1 1 0 0 0 1725 >> VecAssemblyBegin 68 1.0 
2.0432e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> VecAssemblyEnd 68 1.0 2.5988e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> VecScatterBegin 48 1.0 4.6921e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatMult 37017 1.0 4.1269e+01 1.0 7.65e+10 1.0 0.0e+00 0.0e+00 >> 0.0e+00 45 47 0 0 0 45 47 0 0 0 1853 >> MatMultAdd 37015 1.0 3.3638e+01 1.0 7.53e+10 1.0 0.0e+00 0.0e+00 >> 0.0e+00 37 46 0 0 0 37 46 0 0 0 2238 >> MatSolve 74021 1.0 4.6602e+01 1.0 7.42e+10 1.0 0.0e+00 0.0e+00 >> 0.0e+00 51 45 0 0 0 51 45 0 0 0 1593 >> MatLUFactorNum 1 1.0 1.7209e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 1420 >> MatCholFctrSym 1 1.0 8.8310e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatCholFctrNum 1 1.0 3.6907e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatILUFactorSym 1 1.0 3.7372e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatAssemblyBegin 29 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatAssemblyEnd 29 1.0 9.9473e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatGetRow 58026 1.0 2.8155e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatGetRowIJ 2 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatGetSubMatrice 6 1.0 1.5399e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatGetOrdering 2 1.0 3.0112e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatZeroEntries 6 1.0 2.9490e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatView 7 1.0 3.4356e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> KSPSetUp 4 1.0 9.4891e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> KSPSolve 1 1.0 8.8793e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 >> 0.0e+00 97100 0 0 0 97100 0 
0 0 1840 >> PCSetUp 4 1.0 3.8375e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 637 >> PCSetUpOnBlocks 5 1.0 2.1250e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 1150 >> PCApply 5 1.0 8.8789e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 >> 0.0e+00 97100 0 0 0 97100 0 0 0 1840 >> KSPSolve_FS_0 5 1.0 7.5364e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> KSPSolve_FS_Schu 5 1.0 8.8785e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 >> 0.0e+00 97100 0 0 0 97100 0 0 0 1840 >> KSPSolve_FS_Low 5 1.0 2.1019e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> ------------------------------------------------------------ >> ------------------------------------------------------------ >> >> Memory usage is given in bytes: >> >> Object Type Creations Destructions Memory Descendants' >> Mem. >> Reports information only for process 0. >> >> --- Event Stage 0: Main Stage >> >> Vector 91 91 9693912 0. >> Vector Scatter 24 24 15936 0. >> Index Set 51 51 537888 0. >> IS L to G Mapping 3 3 240408 0. >> Matrix 13 13 64097868 0. >> Krylov Solver 6 6 7888 0. >> Preconditioner 6 6 6288 0. >> Viewer 1 0 0 0. >> Distributed Mesh 1 1 4624 0. >> Star Forest Bipartite Graph 2 2 1616 0. >> Discrete System 1 1 872 0. >> ============================================================ >> ============================================================ >> Average time to get PetscTime(): 0. 
>> #PETSc Option Table entries: >> -ksp_monitor >> -ksp_view >> -log_view >> #End of PETSc Option Table entries >> Compiled without FORTRAN kernels >> Compiled with full precision matrices (default) >> sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 >> sizeof(PetscScalar) 8 sizeof(PetscInt) 4 >> Configure options: --with-shared-libraries=1 --with-debugging=0 >> --download-suitesparse --download-blacs --download-ptscotch=yes >> --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl >> --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps >> --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc >> --download-hypre --download-ml >> ----------------------------------------- >> Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo >> Machine characteristics: Linux-4.4.0-38-generic-x86_64- >> with-Ubuntu-16.04-xenial >> Using PETSc directory: /home/dknez/software/petsc-src >> Using PETSc arch: arch-linux2-c-opt >> ----------------------------------------- >> >> Using C compiler: mpicc -fPIC -Wall -Wwrite-strings >> -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O >> ${COPTFLAGS} ${CFLAGS} >> Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 >> -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} >> ----------------------------------------- >> >> Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include >> -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include >> -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include >> -I/home/dknez/software/libmesh_install/opt_real/petsc/include >> -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent >> -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include >> -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi >> ----------------------------------------- >> >> Using C linker: mpicc >> Using Fortran linker: mpif90 >> Using 
libraries: -Wl,-rpath,/home/dknez/softwar >> e/petsc-src/arch-linux2-c-opt/lib -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib >> -lpetsc -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib >> -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps >> -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE >> -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib >> -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 >> -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu >> -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx >> -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod >> -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig >> -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 >> -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 >> -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch >> -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 >> -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm >> -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz >> -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib >> -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 >> -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu >> -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu >> -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl >> -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl >> ----------------------------------------- >> >> >> >> >> >> > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From david.knezevic at akselos.com Wed Jan 11 18:32:30 2017 From: david.knezevic at akselos.com (David Knezevic) Date: Wed, 11 Jan 2017 19:32:30 -0500 Subject: [petsc-users] Using PCFIELDSPLIT with -pc_fieldsplit_type schur In-Reply-To: References: Message-ID: On Wed, Jan 11, 2017 at 5:52 PM, Dave May wrote: > so I gather that I'll have to look into a user-defined approximation to S. >> > > Where does the 2x2 block system come from? > Maybe someone on the list knows the right approximation to use for S. > The model is 3D linear elasticity using a finite element discretization. I applied substructuring to part of the system to "condense" it, and that results in the small A00 block. The A11 block is just standard 3D elasticity; no substructuring was applied there. There are constraints to connect the degrees of freedom on the interface of the substructured and non-substructured regions. If anyone has suggestions for a good way to precondition this type of system, I'd be most appreciative! Thanks, David > ----------------------------------------- >> >> 0 KSP Residual norm 5.405528187695e+04 >> 1 KSP Residual norm 2.187814910803e+02 >> 2 KSP Residual norm 1.019051577515e-01 >> 3 KSP Residual norm 4.370464012859e-04 >> KSP Object: 1 MPI processes >> type: cg >> maximum iterations=1000 >> tolerances: relative=1e-06, absolute=1e-50, divergence=10000. 
>> left preconditioning >> using nonzero initial guess >> using PRECONDITIONED norm type for convergence test >> PC Object: 1 MPI processes >> type: fieldsplit >> FieldSplit with Schur preconditioner, factorization FULL >> Preconditioner for the Schur complement formed from Sp, an assembled >> approximation to S, which uses (lumped, if requested) A00's diagonal's >> inverse >> Split info: >> Split number 0 Defined by IS >> Split number 1 Defined by IS >> KSP solver for A00 block >> KSP Object: (fieldsplit_RB_split_) 1 MPI processes >> type: preonly >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000. >> left preconditioning >> using NONE norm type for convergence test >> PC Object: (fieldsplit_RB_split_) 1 MPI processes >> type: cholesky >> Cholesky: out-of-place factorization >> tolerance for zero pivot 2.22045e-14 >> matrix ordering: natural >> factor fill ratio given 0., needed 0. >> Factored matrix follows: >> Mat Object: 1 MPI processes >> type: seqaij >> rows=324, cols=324 >> package used to perform factorization: mumps >> total: nonzeros=3042, allocated nonzeros=3042 >> total number of mallocs used during MatSetValues calls =0 >> MUMPS run parameters: >> SYM (matrix type): 2 >> PAR (host participation): 1 >> ICNTL(1) (output for error): 6 >> ICNTL(2) (output of diagnostic msg): 0 >> ICNTL(3) (output for global info): 0 >> ICNTL(4) (level of printing): 0 >> ICNTL(5) (input mat struct): 0 >> ICNTL(6) (matrix prescaling): 7 >> ICNTL(7) (sequentia matrix ordering):7 >> ICNTL(8) (scalling strategy): 77 >> ICNTL(10) (max num of refinements): 0 >> ICNTL(11) (error analysis): 0 >> ICNTL(12) (efficiency control): >> 0 >> ICNTL(13) (efficiency control): >> 0 >> ICNTL(14) (percentage of estimated workspace >> increase): 20 >> ICNTL(18) (input mat struct): >> 0 >> ICNTL(19) (Shur complement info): >> 0 >> ICNTL(20) (rhs sparse pattern): >> 0 >> ICNTL(21) (solution struct): >> 0 >> ICNTL(22) 
(in-core/out-of-core facility): >> 0 >> ICNTL(23) (max size of memory can be allocated >> locally):0 >> ICNTL(24) (detection of null pivot rows): >> 0 >> ICNTL(25) (computation of a null space basis): >> 0 >> ICNTL(26) (Schur options for rhs or solution): >> 0 >> ICNTL(27) (experimental parameter): >> -24 >> ICNTL(28) (use parallel or sequential ordering): >> 1 >> ICNTL(29) (parallel ordering): >> 0 >> ICNTL(30) (user-specified set of entries in inv(A)): >> 0 >> ICNTL(31) (factors is discarded in the solve phase): >> 0 >> ICNTL(33) (compute determinant): >> 0 >> CNTL(1) (relative pivoting threshold): 0.01 >> CNTL(2) (stopping criterion of refinement): >> 1.49012e-08 >> CNTL(3) (absolute pivoting threshold): 0. >> CNTL(4) (value of static pivoting): -1. >> CNTL(5) (fixation for null pivots): 0. >> RINFO(1) (local estimated flops for the elimination >> after analysis): >> [0] 29394. >> RINFO(2) (local estimated flops for the assembly >> after factorization): >> [0] 1092. >> RINFO(3) (local estimated flops for the elimination >> after factorization): >> [0] 29394. >> INFO(15) (estimated size of (in MB) MUMPS internal >> data for running numerical factorization): >> [0] 1 >> INFO(16) (size of (in MB) MUMPS internal data used >> during numerical factorization): >> [0] 1 >> INFO(23) (num of pivots eliminated on this processor >> after factorization): >> [0] 324 >> RINFOG(1) (global estimated flops for the elimination >> after analysis): 29394. >> RINFOG(2) (global estimated flops for the assembly >> after factorization): 1092. >> RINFOG(3) (global estimated flops for the elimination >> after factorization): 29394. 
>> (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): >> (0.,0.)*(2^0) >> INFOG(3) (estimated real workspace for factors on all >> processors after analysis): 3888 >> INFOG(4) (estimated integer workspace for factors on >> all processors after analysis): 2067 >> INFOG(5) (estimated maximum front size in the >> complete tree): 12 >> INFOG(6) (number of nodes in the complete tree): 53 >> INFOG(7) (ordering option effectively use after >> analysis): 2 >> INFOG(8) (structural symmetry in percent of the >> permuted matrix after analysis): 100 >> INFOG(9) (total real/complex workspace to store the >> matrix factors after factorization): 3888 >> INFOG(10) (total integer space store the matrix >> factors after factorization): 2067 >> INFOG(11) (order of largest frontal matrix after >> factorization): 12 >> INFOG(12) (number of off-diagonal pivots): 0 >> INFOG(13) (number of delayed pivots after >> factorization): 0 >> INFOG(14) (number of memory compress after >> factorization): 0 >> INFOG(15) (number of steps of iterative refinement >> after solution): 0 >> INFOG(16) (estimated size (in MB) of all MUMPS >> internal data for factorization after analysis: value on the most memory >> consuming processor): 1 >> INFOG(17) (estimated size of all MUMPS internal data >> for factorization after analysis: sum over all processors): 1 >> INFOG(18) (size of all MUMPS internal data allocated >> during factorization: value on the most memory consuming processor): 1 >> INFOG(19) (size of all MUMPS internal data allocated >> during factorization: sum over all processors): 1 >> INFOG(20) (estimated number of entries in the >> factors): 3042 >> INFOG(21) (size in MB of memory effectively used >> during factorization - value on the most memory consuming processor): 1 >> INFOG(22) (size in MB of memory effectively used >> during factorization - sum over all processors): 1 >> INFOG(23) (after analysis: value of ICNTL(6) >> effectively used): 5 >> INFOG(24) (after analysis: value of ICNTL(12) 
>> effectively used): 1 >> INFOG(25) (after factorization: number of pivots >> modified by static pivoting): 0 >> INFOG(28) (after factorization: number of null pivots >> encountered): 0 >> INFOG(29) (after factorization: effective number of >> entries in the factors (sum over all processors)): 3042 >> INFOG(30, 31) (after solution: size in Mbytes of >> memory used during solution phase): 0, 0 >> INFOG(32) (after analysis: type of analysis done): 1 >> INFOG(33) (value used for ICNTL(8)): -2 >> INFOG(34) (exponent of the determinant if determinant >> is requested): 0 >> linear system matrix = precond matrix: >> Mat Object: (fieldsplit_RB_split_) 1 MPI processes >> type: seqaij >> rows=324, cols=324 >> total: nonzeros=5760, allocated nonzeros=5760 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 108 nodes, limit used is 5 >> KSP solver for S = A11 - A10 inv(A00) A01 >> KSP Object: (fieldsplit_FE_split_) 1 MPI processes >> type: cg >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000. >> left preconditioning >> using PRECONDITIONED norm type for convergence test >> PC Object: (fieldsplit_FE_split_) 1 MPI processes >> type: bjacobi >> block Jacobi: number of blocks = 1 >> Local solve is same for all blocks, in the following KSP and PC >> objects: >> KSP Object: (fieldsplit_FE_split_sub_) 1 MPI >> processes >> type: preonly >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000. >> left preconditioning >> using NONE norm type for convergence test >> PC Object: (fieldsplit_FE_split_sub_) 1 MPI >> processes >> type: ilu >> ILU: out-of-place factorization >> 0 levels of fill >> tolerance for zero pivot 2.22045e-14 >> matrix ordering: natural >> factor fill ratio given 1., needed 1. 
>> Factored matrix follows: >> Mat Object: 1 MPI processes >> type: seqaij >> rows=28476, cols=28476 >> package used to perform factorization: petsc >> total: nonzeros=1037052, allocated nonzeros=1037052 >> total number of mallocs used during MatSetValues >> calls =0 >> using I-node routines: found 9489 nodes, limit used >> is 5 >> linear system matrix = precond matrix: >> Mat Object: 1 MPI processes >> type: seqaij >> rows=28476, cols=28476 >> total: nonzeros=1037052, allocated nonzeros=1037052 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 9489 nodes, limit used is 5 >> linear system matrix followed by preconditioner matrix: >> Mat Object: (fieldsplit_FE_split_) 1 MPI processes >> type: schurcomplement >> rows=28476, cols=28476 >> Schur complement A11 - A10 inv(A00) A01 >> A11 >> Mat Object: (fieldsplit_FE_split_) >> 1 MPI processes >> type: seqaij >> rows=28476, cols=28476 >> total: nonzeros=1017054, allocated nonzeros=1017054 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 9492 nodes, limit used is 5 >> A10 >> Mat Object: 1 MPI processes >> type: seqaij >> rows=28476, cols=324 >> total: nonzeros=936, allocated nonzeros=936 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 5717 nodes, limit used is 5 >> KSP of A00 >> KSP Object: (fieldsplit_RB_split_) >> 1 MPI processes >> type: preonly >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, >> divergence=10000. >> left preconditioning >> using NONE norm type for convergence test >> PC Object: (fieldsplit_RB_split_) >> 1 MPI processes >> type: cholesky >> Cholesky: out-of-place factorization >> tolerance for zero pivot 2.22045e-14 >> matrix ordering: natural >> factor fill ratio given 0., needed 0. 
>> Factored matrix follows: >> Mat Object: 1 MPI processes >> type: seqaij >> rows=324, cols=324 >> package used to perform factorization: mumps >> total: nonzeros=3042, allocated nonzeros=3042 >> total number of mallocs used during MatSetValues >> calls =0 >> MUMPS run parameters: >> SYM (matrix type): 2 >> PAR (host participation): 1 >> ICNTL(1) (output for error): 6 >> ICNTL(2) (output of diagnostic msg): 0 >> ICNTL(3) (output for global info): 0 >> ICNTL(4) (level of printing): 0 >> ICNTL(5) (input mat struct): 0 >> ICNTL(6) (matrix prescaling): 7 >> ICNTL(7) (sequentia matrix ordering):7 >> ICNTL(8) (scalling strategy): 77 >> ICNTL(10) (max num of refinements): 0 >> ICNTL(11) (error analysis): 0 >> ICNTL(12) (efficiency control): >> 0 >> ICNTL(13) (efficiency control): >> 0 >> ICNTL(14) (percentage of estimated workspace >> increase): 20 >> ICNTL(18) (input mat struct): >> 0 >> ICNTL(19) (Shur complement info): >> 0 >> ICNTL(20) (rhs sparse pattern): >> 0 >> ICNTL(21) (solution struct): >> 0 >> ICNTL(22) (in-core/out-of-core facility): >> 0 >> ICNTL(23) (max size of memory can be >> allocated locally):0 >> ICNTL(24) (detection of null pivot rows): >> 0 >> ICNTL(25) (computation of a null space >> basis): 0 >> ICNTL(26) (Schur options for rhs or >> solution): 0 >> ICNTL(27) (experimental parameter): >> -24 >> ICNTL(28) (use parallel or sequential >> ordering): 1 >> ICNTL(29) (parallel ordering): >> 0 >> ICNTL(30) (user-specified set of entries in >> inv(A)): 0 >> ICNTL(31) (factors is discarded in the solve >> phase): 0 >> ICNTL(33) (compute determinant): >> 0 >> CNTL(1) (relative pivoting threshold): >> 0.01 >> CNTL(2) (stopping criterion of refinement): >> 1.49012e-08 >> CNTL(3) (absolute pivoting threshold): >> 0. >> CNTL(4) (value of static pivoting): >> -1. >> CNTL(5) (fixation for null pivots): >> 0. >> RINFO(1) (local estimated flops for the >> elimination after analysis): >> [0] 29394. 
>> RINFO(2) (local estimated flops for the >> assembly after factorization): >> [0] 1092. >> RINFO(3) (local estimated flops for the >> elimination after factorization): >> [0] 29394. >> INFO(15) (estimated size of (in MB) MUMPS >> internal data for running numerical factorization): >> [0] 1 >> INFO(16) (size of (in MB) MUMPS internal data >> used during numerical factorization): >> [0] 1 >> INFO(23) (num of pivots eliminated on this >> processor after factorization): >> [0] 324 >> RINFOG(1) (global estimated flops for the >> elimination after analysis): 29394. >> RINFOG(2) (global estimated flops for the >> assembly after factorization): 1092. >> RINFOG(3) (global estimated flops for the >> elimination after factorization): 29394. >> (RINFOG(12) RINFOG(13))*2^INFOG(34) >> (determinant): (0.,0.)*(2^0) >> INFOG(3) (estimated real workspace for >> factors on all processors after analysis): 3888 >> INFOG(4) (estimated integer workspace for >> factors on all processors after analysis): 2067 >> INFOG(5) (estimated maximum front size in the >> complete tree): 12 >> INFOG(6) (number of nodes in the complete >> tree): 53 >> INFOG(7) (ordering option effectively use >> after analysis): 2 >> INFOG(8) (structural symmetry in percent of >> the permuted matrix after analysis): 100 >> INFOG(9) (total real/complex workspace to >> store the matrix factors after factorization): 3888 >> INFOG(10) (total integer space store the >> matrix factors after factorization): 2067 >> INFOG(11) (order of largest frontal matrix >> after factorization): 12 >> INFOG(12) (number of off-diagonal pivots): 0 >> INFOG(13) (number of delayed pivots after >> factorization): 0 >> INFOG(14) (number of memory compress after >> factorization): 0 >> INFOG(15) (number of steps of iterative >> refinement after solution): 0 >> INFOG(16) (estimated size (in MB) of all >> MUMPS internal data for factorization after analysis: value on the most >> memory consuming processor): 1 >> INFOG(17) (estimated size of all 
MUMPS >> internal data for factorization after analysis: sum over all processors): 1 >> INFOG(18) (size of all MUMPS internal data >> allocated during factorization: value on the most memory consuming >> processor): 1 >> INFOG(19) (size of all MUMPS internal data >> allocated during factorization: sum over all processors): 1 >> INFOG(20) (estimated number of entries in the >> factors): 3042 >> INFOG(21) (size in MB of memory effectively >> used during factorization - value on the most memory consuming processor): >> 1 >> INFOG(22) (size in MB of memory effectively >> used during factorization - sum over all processors): 1 >> INFOG(23) (after analysis: value of ICNTL(6) >> effectively used): 5 >> INFOG(24) (after analysis: value of ICNTL(12) >> effectively used): 1 >> INFOG(25) (after factorization: number of >> pivots modified by static pivoting): 0 >> INFOG(28) (after factorization: number of >> null pivots encountered): 0 >> INFOG(29) (after factorization: effective >> number of entries in the factors (sum over all processors)): 3042 >> INFOG(30, 31) (after solution: size in Mbytes >> of memory used during solution phase): 0, 0 >> INFOG(32) (after analysis: type of analysis >> done): 1 >> INFOG(33) (value used for ICNTL(8)): -2 >> INFOG(34) (exponent of the determinant if >> determinant is requested): 0 >> linear system matrix = precond matrix: >> Mat Object: (fieldsplit_RB_split_) >> 1 MPI processes >> type: seqaij >> rows=324, cols=324 >> total: nonzeros=5760, allocated nonzeros=5760 >> total number of mallocs used during MatSetValues calls >> =0 >> using I-node routines: found 108 nodes, limit used is >> 5 >> A01 >> Mat Object: 1 MPI processes >> type: seqaij >> rows=324, cols=28476 >> total: nonzeros=936, allocated nonzeros=936 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 67 nodes, limit used is 5 >> Mat Object: 1 MPI processes >> type: seqaij >> rows=28476, cols=28476 >> total: nonzeros=1037052, allocated 
nonzeros=1037052 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 9489 nodes, limit used is 5 >> linear system matrix = precond matrix: >> Mat Object: () 1 MPI processes >> type: seqaij >> rows=28800, cols=28800 >> total: nonzeros=1024686, allocated nonzeros=1024794 >> total number of mallocs used during MatSetValues calls =0 >> using I-node routines: found 9600 nodes, limit used is 5 >> >> ---------------------------------------------- PETSc Performance >> Summary: ---------------------------------------------- >> >> /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a >> arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 >> 17:22:10 2017 >> Using Petsc Release Version 3.7.3, unknown >> >> Max Max/Min Avg Total >> Time (sec): 9.638e+01 1.00000 9.638e+01 >> Objects: 2.030e+02 1.00000 2.030e+02 >> Flops: 1.732e+11 1.00000 1.732e+11 1.732e+11 >> Flops/sec: 1.797e+09 1.00000 1.797e+09 1.797e+09 >> MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 >> MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 >> MPI Reductions: 0.000e+00 0.00000 >> >> Flop counting convention: 1 flop = 1 real number operation of type >> (multiply/divide/add/subtract) >> e.g., VecAXPY() for real vectors of length N >> --> 2N flops >> and VecAXPY() for complex vectors of length N >> --> 8N flops >> >> Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages >> --- -- Message Lengths -- -- Reductions -- >> Avg %Total Avg %Total counts >> %Total Avg %Total counts %Total >> 0: Main Stage: 9.6379e+01 100.0% 1.7318e+11 100.0% 0.000e+00 >> 0.0% 0.000e+00 0.0% 0.000e+00 0.0% >> >> ------------------------------------------------------------ >> ------------------------------------------------------------ >> See the 'Profiling' chapter of the users' manual for details on >> interpreting output. 
>> Phase summary info: >> Count: number of times phase was executed >> Time and Flops: Max - maximum over all processors >> Ratio - ratio of maximum to minimum over all processors >> Mess: number of messages sent >> Avg. len: average message length (bytes) >> Reduct: number of global reductions >> Global: entire computation >> Stage: stages of a computation. Set stages with PetscLogStagePush() >> and PetscLogStagePop(). >> %T - percent time in this phase %F - percent flops in this >> phase >> %M - percent messages in this phase %L - percent message >> lengths in this phase >> %R - percent reductions in this phase >> Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time >> over all processors) >> ------------------------------------------------------------ >> ------------------------------------------------------------ >> Event Count Time (sec) Flops >> --- Global --- --- Stage --- Total >> Max Ratio Max Ratio Max Ratio Mess Avg len >> Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s >> ------------------------------------------------------------ >> ------------------------------------------------------------ >> >> --- Event Stage 0: Main Stage >> >> VecDot 42 1.0 2.2411e-05 1.0 8.53e+03 1.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 380 >> VecTDot 77761 1.0 1.4294e+00 1.0 4.43e+09 1.0 0.0e+00 0.0e+00 >> 0.0e+00 1 3 0 0 0 1 3 0 0 0 3098 >> VecNorm 38894 1.0 9.1002e-01 1.0 2.22e+09 1.0 0.0e+00 0.0e+00 >> 0.0e+00 1 1 0 0 0 1 1 0 0 0 2434 >> VecScale 38882 1.0 3.7314e-01 1.0 1.11e+09 1.0 0.0e+00 0.0e+00 >> 0.0e+00 0 1 0 0 0 0 1 0 0 0 2967 >> VecCopy 38908 1.0 2.1655e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> VecSet 77887 1.0 3.2034e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> VecAXPY 77777 1.0 1.8382e+00 1.0 4.43e+09 1.0 0.0e+00 0.0e+00 >> 0.0e+00 2 3 0 0 0 2 3 0 0 0 2409 >> VecAYPX 38875 1.0 1.2884e+00 1.0 2.21e+09 1.0 0.0e+00 0.0e+00 >> 0.0e+00 1 1 0 0 0 1 1 0 0 0 1718 >> VecAssemblyBegin 68 1.0 
1.9407e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> VecAssemblyEnd 68 1.0 2.6941e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> VecScatterBegin 48 1.0 4.6349e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatMult 38891 1.0 4.3045e+01 1.0 8.03e+10 1.0 0.0e+00 0.0e+00 >> 0.0e+00 45 46 0 0 0 45 46 0 0 0 1866 >> MatMultAdd 38889 1.0 3.5360e+01 1.0 7.91e+10 1.0 0.0e+00 0.0e+00 >> 0.0e+00 37 46 0 0 0 37 46 0 0 0 2236 >> MatSolve 77769 1.0 4.8780e+01 1.0 7.95e+10 1.0 0.0e+00 0.0e+00 >> 0.0e+00 51 46 0 0 0 51 46 0 0 0 1631 >> MatLUFactorNum 1 1.0 1.9575e-02 1.0 2.49e+07 1.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 1274 >> MatCholFctrSym 1 1.0 9.4891e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatCholFctrNum 1 1.0 3.7885e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatILUFactorSym 1 1.0 4.1780e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatConvert 1 1.0 3.0041e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatScale 2 1.0 2.7180e-05 1.0 2.53e+04 1.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 930 >> MatAssemblyBegin 32 1.0 4.0531e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatAssemblyEnd 32 1.0 1.2032e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatGetRow 114978 1.0 5.9254e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatGetRowIJ 2 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatGetSubMatrice 6 1.0 1.5707e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatGetOrdering 2 1.0 3.2425e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatZeroEntries 6 1.0 3.0580e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatView 7 1.0 3.5119e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 
0 0 0 0 >> MatAXPY 1 1.0 1.9384e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatMatMult 1 1.0 2.7120e-03 1.0 3.16e+05 1.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 117 >> MatMatMultSym 1 1.0 1.8010e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> MatMatMultNum 1 1.0 6.1703e-04 1.0 3.16e+05 1.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 513 >> KSPSetUp 4 1.0 9.8944e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> KSPSolve 1 1.0 9.3380e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 >> 0.0e+00 97100 0 0 0 97100 0 0 0 1855 >> PCSetUp 4 1.0 6.6326e-02 1.0 2.53e+07 1.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 381 >> PCSetUpOnBlocks 5 1.0 2.4082e-02 1.0 2.49e+07 1.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 1036 >> PCApply 5 1.0 9.3376e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 >> 0.0e+00 97100 0 0 0 97100 0 0 0 1855 >> KSPSolve_FS_0 5 1.0 7.0214e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> KSPSolve_FS_Schu 5 1.0 9.3372e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 >> 0.0e+00 97100 0 0 0 97100 0 0 0 1855 >> KSPSolve_FS_Low 5 1.0 2.1377e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> ------------------------------------------------------------ >> ------------------------------------------------------------ >> >> Memory usage is given in bytes: >> >> Object Type Creations Destructions Memory Descendants' >> Mem. >> Reports information only for process 0. >> >> --- Event Stage 0: Main Stage >> >> Vector 92 92 9698040 0. >> Vector Scatter 24 24 15936 0. >> Index Set 51 51 537876 0. >> IS L to G Mapping 3 3 240408 0. >> Matrix 16 16 77377776 0. >> Krylov Solver 6 6 7888 0. >> Preconditioner 6 6 6288 0. >> Viewer 1 0 0 0. >> Distributed Mesh 1 1 4624 0. >> Star Forest Bipartite Graph 2 2 1616 0. >> Discrete System 1 1 872 0. 
>> ============================================================ >> ============================================================ >> Average time to get PetscTime(): 0. >> #PETSc Option Table entries: >> -ksp_monitor >> -ksp_view >> -log_view >> #End of PETSc Option Table entries >> Compiled without FORTRAN kernels >> Compiled with full precision matrices (default) >> sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 >> sizeof(PetscScalar) 8 sizeof(PetscInt) 4 >> Configure options: --with-shared-libraries=1 --with-debugging=0 >> --download-suitesparse --download-blacs --download-ptscotch=yes >> --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl >> --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps >> --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc >> --download-hypre --download-ml >> ----------------------------------------- >> Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo >> Machine characteristics: Linux-4.4.0-38-generic-x86_64- >> with-Ubuntu-16.04-xenial >> Using PETSc directory: /home/dknez/software/petsc-src >> Using PETSc arch: arch-linux2-c-opt >> ----------------------------------------- >> >> Using C compiler: mpicc -fPIC -Wall -Wwrite-strings >> -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O >> ${COPTFLAGS} ${CFLAGS} >> Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 >> -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} >> ----------------------------------------- >> >> Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include >> -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include >> -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include >> -I/home/dknez/software/libmesh_install/opt_real/petsc/include >> -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent >> -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include >> 
-I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi >> ----------------------------------------- >> >> Using C linker: mpicc >> Using Fortran linker: mpif90 >> Using libraries: -Wl,-rpath,/home/dknez/softwar >> e/petsc-src/arch-linux2-c-opt/lib -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib >> -lpetsc -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib >> -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps >> -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE >> -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib >> -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 >> -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu >> -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx >> -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod >> -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig >> -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 >> -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 >> -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch >> -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 >> -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm >> -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz >> -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib >> -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 >> -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu >> -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu >> -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl >> -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl >> ----------------------------------------- >> >> >> >> >> On Wed, Jan 11, 2017 at 4:49 PM, Dave May >> wrote: >> >>> It looks like the Schur solve is requiring a huge number of iterates to >>> converge (based on the instances of MatMult). 
>>> This is killing the performance. >>> >>> Are you sure that A11 is a good approximation to S? You might consider >>> trying the selfp option >>> >>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/ >>> PC/PCFieldSplitSetSchurPre.html#PCFieldSplitSetSchurPre >>> >>> Note that the best approx to S is likely both problem and discretisation >>> dependent so if selfp is also terrible, you might want to consider coding >>> up your own approx to S for your specific system. >>> >>> >>> Thanks, >>> Dave >>> >>> >>> On Wed, 11 Jan 2017 at 22:34, David Knezevic >>> wrote: >>> >>> I have a definite block 2x2 system and I figured it'd be good to apply >>> the PCFIELDSPLIT functionality with Schur complement, as described in >>> Section 4.5 of the manual. >>> >>> The A00 block of my matrix is very small so I figured I'd specify a >>> direct solver (i.e. MUMPS) for that block. >>> >>> So I did the following: >>> - PCFieldSplitSetIS to specify the indices of the two splits >>> - PCFieldSplitGetSubKSP to get the two KSP objects, and to set the >>> solver and PC types for each (MUMPS for A00, ILU+CG for A11) >>> - I set -pc_fieldsplit_schur_fact_type full >>> >>> Below I have pasted the output of "-ksp_view -ksp_monitor -log_view" for >>> a test case. It seems to converge well, but I'm concerned about the speed >>> (about 90 seconds, vs. about 1 second if I use a direct solver for the >>> entire system). I just wanted to check if I'm setting this up in a good way? >>> >>> Many thanks, >>> David >>> >>> ------------------------------------------------------------ >>> ----------------------- >>> >>> 0 KSP Residual norm 5.405774214400e+04 >>> 1 KSP Residual norm 1.849649014371e+02 >>> 2 KSP Residual norm 7.462775074989e-02 >>> 3 KSP Residual norm 2.680497175260e-04 >>> KSP Object: 1 MPI processes >>> type: cg >>> maximum iterations=1000 >>> tolerances: relative=1e-06, absolute=1e-50, divergence=10000. 
>>> left preconditioning >>> using nonzero initial guess >>> using PRECONDITIONED norm type for convergence test >>> PC Object: 1 MPI processes >>> type: fieldsplit >>> FieldSplit with Schur preconditioner, factorization FULL >>> Preconditioner for the Schur complement formed from A11 >>> Split info: >>> Split number 0 Defined by IS >>> Split number 1 Defined by IS >>> KSP solver for A00 block >>> KSP Object: (fieldsplit_RB_split_) 1 MPI processes >>> type: preonly >>> maximum iterations=10000, initial guess is zero >>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000. >>> left preconditioning >>> using NONE norm type for convergence test >>> PC Object: (fieldsplit_RB_split_) 1 MPI processes >>> type: cholesky >>> Cholesky: out-of-place factorization >>> tolerance for zero pivot 2.22045e-14 >>> matrix ordering: natural >>> factor fill ratio given 0., needed 0. >>> Factored matrix follows: >>> Mat Object: 1 MPI processes >>> type: seqaij >>> rows=324, cols=324 >>> package used to perform factorization: mumps >>> total: nonzeros=3042, allocated nonzeros=3042 >>> total number of mallocs used during MatSetValues calls =0 >>> MUMPS run parameters: >>> SYM (matrix type): 2 >>> PAR (host participation): 1 >>> ICNTL(1) (output for error): 6 >>> ICNTL(2) (output of diagnostic msg): 0 >>> ICNTL(3) (output for global info): 0 >>> ICNTL(4) (level of printing): 0 >>> ICNTL(5) (input mat struct): 0 >>> ICNTL(6) (matrix prescaling): 7 >>> ICNTL(7) (sequentia matrix ordering):7 >>> ICNTL(8) (scalling strategy): 77 >>> ICNTL(10) (max num of refinements): 0 >>> ICNTL(11) (error analysis): 0 >>> ICNTL(12) (efficiency control): >>> 0 >>> ICNTL(13) (efficiency control): >>> 0 >>> ICNTL(14) (percentage of estimated workspace >>> increase): 20 >>> ICNTL(18) (input mat struct): >>> 0 >>> ICNTL(19) (Shur complement info): >>> 0 >>> ICNTL(20) (rhs sparse pattern): >>> 0 >>> ICNTL(21) (solution struct): >>> 0 >>> ICNTL(22) (in-core/out-of-core facility): >>> 0 >>> ICNTL(23) (max 
size of memory can be allocated >>> locally):0 >>> ICNTL(24) (detection of null pivot rows): >>> 0 >>> ICNTL(25) (computation of a null space basis): >>> 0 >>> ICNTL(26) (Schur options for rhs or solution): >>> 0 >>> ICNTL(27) (experimental parameter): >>> -24 >>> ICNTL(28) (use parallel or sequential ordering): >>> 1 >>> ICNTL(29) (parallel ordering): >>> 0 >>> ICNTL(30) (user-specified set of entries in inv(A)): >>> 0 >>> ICNTL(31) (factors is discarded in the solve phase): >>> 0 >>> ICNTL(33) (compute determinant): >>> 0 >>> CNTL(1) (relative pivoting threshold): 0.01 >>> CNTL(2) (stopping criterion of refinement): >>> 1.49012e-08 >>> CNTL(3) (absolute pivoting threshold): 0. >>> CNTL(4) (value of static pivoting): -1. >>> CNTL(5) (fixation for null pivots): 0. >>> RINFO(1) (local estimated flops for the elimination >>> after analysis): >>> [0] 29394. >>> RINFO(2) (local estimated flops for the assembly >>> after factorization): >>> [0] 1092. >>> RINFO(3) (local estimated flops for the elimination >>> after factorization): >>> [0] 29394. >>> INFO(15) (estimated size of (in MB) MUMPS internal >>> data for running numerical factorization): >>> [0] 1 >>> INFO(16) (size of (in MB) MUMPS internal data used >>> during numerical factorization): >>> [0] 1 >>> INFO(23) (num of pivots eliminated on this processor >>> after factorization): >>> [0] 324 >>> RINFOG(1) (global estimated flops for the >>> elimination after analysis): 29394. >>> RINFOG(2) (global estimated flops for the assembly >>> after factorization): 1092. >>> RINFOG(3) (global estimated flops for the >>> elimination after factorization): 29394. 
>>> (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): >>> (0.,0.)*(2^0) >>> INFOG(3) (estimated real workspace for factors on >>> all processors after analysis): 3888 >>> INFOG(4) (estimated integer workspace for factors on >>> all processors after analysis): 2067 >>> INFOG(5) (estimated maximum front size in the >>> complete tree): 12 >>> INFOG(6) (number of nodes in the complete tree): 53 >>> INFOG(7) (ordering option effectively use after >>> analysis): 2 >>> INFOG(8) (structural symmetry in percent of the >>> permuted matrix after analysis): 100 >>> INFOG(9) (total real/complex workspace to store the >>> matrix factors after factorization): 3888 >>> INFOG(10) (total integer space store the matrix >>> factors after factorization): 2067 >>> INFOG(11) (order of largest frontal matrix after >>> factorization): 12 >>> INFOG(12) (number of off-diagonal pivots): 0 >>> INFOG(13) (number of delayed pivots after >>> factorization): 0 >>> INFOG(14) (number of memory compress after >>> factorization): 0 >>> INFOG(15) (number of steps of iterative refinement >>> after solution): 0 >>> INFOG(16) (estimated size (in MB) of all MUMPS >>> internal data for factorization after analysis: value on the most memory >>> consuming processor): 1 >>> INFOG(17) (estimated size of all MUMPS internal data >>> for factorization after analysis: sum over all processors): 1 >>> INFOG(18) (size of all MUMPS internal data allocated >>> during factorization: value on the most memory consuming processor): 1 >>> INFOG(19) (size of all MUMPS internal data allocated >>> during factorization: sum over all processors): 1 >>> INFOG(20) (estimated number of entries in the >>> factors): 3042 >>> INFOG(21) (size in MB of memory effectively used >>> during factorization - value on the most memory consuming processor): 1 >>> INFOG(22) (size in MB of memory effectively used >>> during factorization - sum over all processors): 1 >>> INFOG(23) (after analysis: value of ICNTL(6) >>> effectively used): 5 >>> 
INFOG(24) (after analysis: value of ICNTL(12) >>> effectively used): 1 >>> INFOG(25) (after factorization: number of pivots >>> modified by static pivoting): 0 >>> INFOG(28) (after factorization: number of null >>> pivots encountered): 0 >>> INFOG(29) (after factorization: effective number of >>> entries in the factors (sum over all processors)): 3042 >>> INFOG(30, 31) (after solution: size in Mbytes of >>> memory used during solution phase): 0, 0 >>> INFOG(32) (after analysis: type of analysis done): 1 >>> INFOG(33) (value used for ICNTL(8)): -2 >>> INFOG(34) (exponent of the determinant if >>> determinant is requested): 0 >>> linear system matrix = precond matrix: >>> Mat Object: (fieldsplit_RB_split_) 1 MPI processes >>> type: seqaij >>> rows=324, cols=324 >>> total: nonzeros=5760, allocated nonzeros=5760 >>> total number of mallocs used during MatSetValues calls =0 >>> using I-node routines: found 108 nodes, limit used is 5 >>> KSP solver for S = A11 - A10 inv(A00) A01 >>> KSP Object: (fieldsplit_FE_split_) 1 MPI processes >>> type: cg >>> maximum iterations=10000, initial guess is zero >>> tolerances: relative=1e-05, absolute=1e-50, divergence=10000. >>> left preconditioning >>> using PRECONDITIONED norm type for convergence test >>> PC Object: (fieldsplit_FE_split_) 1 MPI processes >>> type: bjacobi >>> block Jacobi: number of blocks = 1 >>> Local solve is same for all blocks, in the following KSP and >>> PC objects: >>> KSP Object: (fieldsplit_FE_split_sub_) 1 >>> MPI processes >>> type: preonly >>> maximum iterations=10000, initial guess is zero >>> tolerances: relative=1e-05, absolute=1e-50, >>> divergence=10000. >>> left preconditioning >>> using NONE norm type for convergence test >>> PC Object: (fieldsplit_FE_split_sub_) 1 MPI >>> processes >>> type: ilu >>> ILU: out-of-place factorization >>> 0 levels of fill >>> tolerance for zero pivot 2.22045e-14 >>> matrix ordering: natural >>> factor fill ratio given 1., needed 1. 
>>> Factored matrix follows: >>> Mat Object: 1 MPI processes >>> type: seqaij >>> rows=28476, cols=28476 >>> package used to perform factorization: petsc >>> total: nonzeros=1017054, allocated nonzeros=1017054 >>> total number of mallocs used during MatSetValues >>> calls =0 >>> using I-node routines: found 9492 nodes, limit >>> used is 5 >>> linear system matrix = precond matrix: >>> Mat Object: (fieldsplit_FE_split_) 1 >>> MPI processes >>> type: seqaij >>> rows=28476, cols=28476 >>> total: nonzeros=1017054, allocated nonzeros=1017054 >>> total number of mallocs used during MatSetValues calls =0 >>> using I-node routines: found 9492 nodes, limit used is 5 >>> linear system matrix followed by preconditioner matrix: >>> Mat Object: (fieldsplit_FE_split_) 1 MPI processes >>> type: schurcomplement >>> rows=28476, cols=28476 >>> Schur complement A11 - A10 inv(A00) A01 >>> A11 >>> Mat Object: (fieldsplit_FE_split_) >>> 1 MPI processes >>> type: seqaij >>> rows=28476, cols=28476 >>> total: nonzeros=1017054, allocated nonzeros=1017054 >>> total number of mallocs used during MatSetValues calls =0 >>> using I-node routines: found 9492 nodes, limit used is >>> 5 >>> A10 >>> Mat Object: 1 MPI processes >>> type: seqaij >>> rows=28476, cols=324 >>> total: nonzeros=936, allocated nonzeros=936 >>> total number of mallocs used during MatSetValues calls =0 >>> using I-node routines: found 5717 nodes, limit used is >>> 5 >>> KSP of A00 >>> KSP Object: (fieldsplit_RB_split_) >>> 1 MPI processes >>> type: preonly >>> maximum iterations=10000, initial guess is zero >>> tolerances: relative=1e-05, absolute=1e-50, >>> divergence=10000. >>> left preconditioning >>> using NONE norm type for convergence test >>> PC Object: (fieldsplit_RB_split_) >>> 1 MPI processes >>> type: cholesky >>> Cholesky: out-of-place factorization >>> tolerance for zero pivot 2.22045e-14 >>> matrix ordering: natural >>> factor fill ratio given 0., needed 0. 
>>> Factored matrix follows: >>> Mat Object: 1 MPI processes >>> type: seqaij >>> rows=324, cols=324 >>> package used to perform factorization: mumps >>> total: nonzeros=3042, allocated nonzeros=3042 >>> total number of mallocs used during MatSetValues >>> calls =0 >>> MUMPS run parameters: >>> SYM (matrix type): 2 >>> PAR (host participation): 1 >>> ICNTL(1) (output for error): 6 >>> ICNTL(2) (output of diagnostic msg): 0 >>> ICNTL(3) (output for global info): 0 >>> ICNTL(4) (level of printing): 0 >>> ICNTL(5) (input mat struct): 0 >>> ICNTL(6) (matrix prescaling): 7 >>> ICNTL(7) (sequentia matrix ordering):7 >>> ICNTL(8) (scalling strategy): 77 >>> ICNTL(10) (max num of refinements): 0 >>> ICNTL(11) (error analysis): 0 >>> ICNTL(12) (efficiency control): >>> 0 >>> ICNTL(13) (efficiency control): >>> 0 >>> ICNTL(14) (percentage of estimated workspace >>> increase): 20 >>> ICNTL(18) (input mat struct): >>> 0 >>> ICNTL(19) (Shur complement info): >>> 0 >>> ICNTL(20) (rhs sparse pattern): >>> 0 >>> ICNTL(21) (solution struct): >>> 0 >>> ICNTL(22) (in-core/out-of-core facility): >>> 0 >>> ICNTL(23) (max size of memory can be >>> allocated locally):0 >>> ICNTL(24) (detection of null pivot rows): >>> 0 >>> ICNTL(25) (computation of a null space >>> basis): 0 >>> ICNTL(26) (Schur options for rhs or >>> solution): 0 >>> ICNTL(27) (experimental parameter): >>> -24 >>> ICNTL(28) (use parallel or sequential >>> ordering): 1 >>> ICNTL(29) (parallel ordering): >>> 0 >>> ICNTL(30) (user-specified set of entries in >>> inv(A)): 0 >>> ICNTL(31) (factors is discarded in the solve >>> phase): 0 >>> ICNTL(33) (compute determinant): >>> 0 >>> CNTL(1) (relative pivoting threshold): >>> 0.01 >>> CNTL(2) (stopping criterion of refinement): >>> 1.49012e-08 >>> CNTL(3) (absolute pivoting threshold): >>> 0. >>> CNTL(4) (value of static pivoting): >>> -1. >>> CNTL(5) (fixation for null pivots): >>> 0. 
>>> RINFO(1) (local estimated flops for the >>> elimination after analysis): >>> [0] 29394. >>> RINFO(2) (local estimated flops for the >>> assembly after factorization): >>> [0] 1092. >>> RINFO(3) (local estimated flops for the >>> elimination after factorization): >>> [0] 29394. >>> INFO(15) (estimated size of (in MB) MUMPS >>> internal data for running numerical factorization): >>> [0] 1 >>> INFO(16) (size of (in MB) MUMPS internal >>> data used during numerical factorization): >>> [0] 1 >>> INFO(23) (num of pivots eliminated on this >>> processor after factorization): >>> [0] 324 >>> RINFOG(1) (global estimated flops for the >>> elimination after analysis): 29394. >>> RINFOG(2) (global estimated flops for the >>> assembly after factorization): 1092. >>> RINFOG(3) (global estimated flops for the >>> elimination after factorization): 29394. >>> (RINFOG(12) RINFOG(13))*2^INFOG(34) >>> (determinant): (0.,0.)*(2^0) >>> INFOG(3) (estimated real workspace for >>> factors on all processors after analysis): 3888 >>> INFOG(4) (estimated integer workspace for >>> factors on all processors after analysis): 2067 >>> INFOG(5) (estimated maximum front size in >>> the complete tree): 12 >>> INFOG(6) (number of nodes in the complete >>> tree): 53 >>> INFOG(7) (ordering option effectively use >>> after analysis): 2 >>> INFOG(8) (structural symmetry in percent of >>> the permuted matrix after analysis): 100 >>> INFOG(9) (total real/complex workspace to >>> store the matrix factors after factorization): 3888 >>> INFOG(10) (total integer space store the >>> matrix factors after factorization): 2067 >>> INFOG(11) (order of largest frontal matrix >>> after factorization): 12 >>> INFOG(12) (number of off-diagonal pivots): 0 >>> INFOG(13) (number of delayed pivots after >>> factorization): 0 >>> INFOG(14) (number of memory compress after >>> factorization): 0 >>> INFOG(15) (number of steps of iterative >>> refinement after solution): 0 >>> INFOG(16) (estimated size (in MB) of all >>> 
MUMPS internal data for factorization after analysis: value on the most >>> memory consuming processor): 1 >>> INFOG(17) (estimated size of all MUMPS >>> internal data for factorization after analysis: sum over all processors): 1 >>> INFOG(18) (size of all MUMPS internal data >>> allocated during factorization: value on the most memory consuming >>> processor): 1 >>> INFOG(19) (size of all MUMPS internal data >>> allocated during factorization: sum over all processors): 1 >>> INFOG(20) (estimated number of entries in >>> the factors): 3042 >>> INFOG(21) (size in MB of memory effectively >>> used during factorization - value on the most memory consuming processor): >>> 1 >>> INFOG(22) (size in MB of memory effectively >>> used during factorization - sum over all processors): 1 >>> INFOG(23) (after analysis: value of ICNTL(6) >>> effectively used): 5 >>> INFOG(24) (after analysis: value of >>> ICNTL(12) effectively used): 1 >>> INFOG(25) (after factorization: number of >>> pivots modified by static pivoting): 0 >>> INFOG(28) (after factorization: number of >>> null pivots encountered): 0 >>> INFOG(29) (after factorization: effective >>> number of entries in the factors (sum over all processors)): 3042 >>> INFOG(30, 31) (after solution: size in >>> Mbytes of memory used during solution phase): 0, 0 >>> INFOG(32) (after analysis: type of analysis >>> done): 1 >>> INFOG(33) (value used for ICNTL(8)): -2 >>> INFOG(34) (exponent of the determinant if >>> determinant is requested): 0 >>> linear system matrix = precond matrix: >>> Mat Object: (fieldsplit_RB_split_) >>> 1 MPI processes >>> type: seqaij >>> rows=324, cols=324 >>> total: nonzeros=5760, allocated nonzeros=5760 >>> total number of mallocs used during MatSetValues calls >>> =0 >>> using I-node routines: found 108 nodes, limit used >>> is 5 >>> A01 >>> Mat Object: 1 MPI processes >>> type: seqaij >>> rows=324, cols=28476 >>> total: nonzeros=936, allocated nonzeros=936 >>> total number of mallocs used during 
MatSetValues calls =0 >>> using I-node routines: found 67 nodes, limit used is 5 >>> Mat Object: (fieldsplit_FE_split_) 1 MPI processes >>> type: seqaij >>> rows=28476, cols=28476 >>> total: nonzeros=1017054, allocated nonzeros=1017054 >>> total number of mallocs used during MatSetValues calls =0 >>> using I-node routines: found 9492 nodes, limit used is 5 >>> linear system matrix = precond matrix: >>> Mat Object: () 1 MPI processes >>> type: seqaij >>> rows=28800, cols=28800 >>> total: nonzeros=1024686, allocated nonzeros=1024794 >>> total number of mallocs used during MatSetValues calls =0 >>> using I-node routines: found 9600 nodes, limit used is 5 >>> >>> >>> ---------------------------------------------- PETSc Performance >>> Summary: ---------------------------------------------- >>> >>> /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a >>> arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 >>> 16:16:47 2017 >>> Using Petsc Release Version 3.7.3, unknown >>> >>> Max Max/Min Avg Total >>> Time (sec): 9.179e+01 1.00000 9.179e+01 >>> Objects: 1.990e+02 1.00000 1.990e+02 >>> Flops: 1.634e+11 1.00000 1.634e+11 1.634e+11 >>> Flops/sec: 1.780e+09 1.00000 1.780e+09 1.780e+09 >>> MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 >>> MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 >>> MPI Reductions: 0.000e+00 0.00000 >>> >>> Flop counting convention: 1 flop = 1 real number operation of type >>> (multiply/divide/add/subtract) >>> e.g., VecAXPY() for real vectors of length N >>> --> 2N flops >>> and VecAXPY() for complex vectors of length >>> N --> 8N flops >>> >>> Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages >>> --- -- Message Lengths -- -- Reductions -- >>> Avg %Total Avg %Total counts >>> %Total Avg %Total counts %Total >>> 0: Main Stage: 9.1787e+01 100.0% 1.6336e+11 100.0% 0.000e+00 >>> 0.0% 0.000e+00 0.0% 0.000e+00 0.0% >>> >>> ------------------------------------------------------------ >>> 
------------------------------------------------------------ >>> See the 'Profiling' chapter of the users' manual for details on >>> interpreting output. >>> Phase summary info: >>> Count: number of times phase was executed >>> Time and Flops: Max - maximum over all processors >>> Ratio - ratio of maximum to minimum over all >>> processors >>> Mess: number of messages sent >>> Avg. len: average message length (bytes) >>> Reduct: number of global reductions >>> Global: entire computation >>> Stage: stages of a computation. Set stages with PetscLogStagePush() >>> and PetscLogStagePop(). >>> %T - percent time in this phase %F - percent flops in this >>> phase >>> %M - percent messages in this phase %L - percent message >>> lengths in this phase >>> %R - percent reductions in this phase >>> Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time >>> over all processors) >>> ------------------------------------------------------------ >>> ------------------------------------------------------------ >>> Event Count Time (sec) Flops >>> --- Global --- --- Stage --- Total >>> Max Ratio Max Ratio Max Ratio Mess Avg len >>> Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s >>> ------------------------------------------------------------ >>> ------------------------------------------------------------ >>> >>> --- Event Stage 0: Main Stage >>> >>> VecDot 42 1.0 2.4080e-05 1.0 8.53e+03 1.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 354 >>> VecTDot 74012 1.0 1.2440e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 >>> 0.0e+00 1 3 0 0 0 1 3 0 0 0 3388 >>> VecNorm 37020 1.0 8.3580e-01 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 >>> 0.0e+00 1 1 0 0 0 1 1 0 0 0 2523 >>> VecScale 37008 1.0 3.5800e-01 1.0 1.05e+09 1.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 1 0 0 0 0 1 0 0 0 2944 >>> VecCopy 37034 1.0 2.5754e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> VecSet 74137 1.0 3.0537e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> VecAXPY 74029 1.0 
1.7233e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 >>> 0.0e+00 2 3 0 0 0 2 3 0 0 0 2446 >>> VecAYPX 37001 1.0 1.2214e+00 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 >>> 0.0e+00 1 1 0 0 0 1 1 0 0 0 1725 >>> VecAssemblyBegin 68 1.0 2.0432e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> VecAssemblyEnd 68 1.0 2.5988e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> VecScatterBegin 48 1.0 4.6921e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> MatMult 37017 1.0 4.1269e+01 1.0 7.65e+10 1.0 0.0e+00 0.0e+00 >>> 0.0e+00 45 47 0 0 0 45 47 0 0 0 1853 >>> MatMultAdd 37015 1.0 3.3638e+01 1.0 7.53e+10 1.0 0.0e+00 0.0e+00 >>> 0.0e+00 37 46 0 0 0 37 46 0 0 0 2238 >>> MatSolve 74021 1.0 4.6602e+01 1.0 7.42e+10 1.0 0.0e+00 0.0e+00 >>> 0.0e+00 51 45 0 0 0 51 45 0 0 0 1593 >>> MatLUFactorNum 1 1.0 1.7209e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 1420 >>> MatCholFctrSym 1 1.0 8.8310e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> MatCholFctrNum 1 1.0 3.6907e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> MatILUFactorSym 1 1.0 3.7372e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> MatAssemblyBegin 29 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> MatAssemblyEnd 29 1.0 9.9473e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> MatGetRow 58026 1.0 2.8155e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> MatGetRowIJ 2 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> MatGetSubMatrice 6 1.0 1.5399e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> MatGetOrdering 2 1.0 3.0112e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> MatZeroEntries 6 1.0 2.9490e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> MatView 7 1.0 3.4356e-03 1.0 
0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> KSPSetUp 4 1.0 9.4891e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> KSPSolve 1 1.0 8.8793e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 >>> 0.0e+00 97100 0 0 0 97100 0 0 0 1840 >>> PCSetUp 4 1.0 3.8375e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 637 >>> PCSetUpOnBlocks 5 1.0 2.1250e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 1150 >>> PCApply 5 1.0 8.8789e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 >>> 0.0e+00 97100 0 0 0 97100 0 0 0 1840 >>> KSPSolve_FS_0 5 1.0 7.5364e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> KSPSolve_FS_Schu 5 1.0 8.8785e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 >>> 0.0e+00 97100 0 0 0 97100 0 0 0 1840 >>> KSPSolve_FS_Low 5 1.0 2.1019e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 >>> 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >>> ------------------------------------------------------------ >>> ------------------------------------------------------------ >>> >>> Memory usage is given in bytes: >>> >>> Object Type Creations Destructions Memory Descendants' >>> Mem. >>> Reports information only for process 0. >>> >>> --- Event Stage 0: Main Stage >>> >>> Vector 91 91 9693912 0. >>> Vector Scatter 24 24 15936 0. >>> Index Set 51 51 537888 0. >>> IS L to G Mapping 3 3 240408 0. >>> Matrix 13 13 64097868 0. >>> Krylov Solver 6 6 7888 0. >>> Preconditioner 6 6 6288 0. >>> Viewer 1 0 0 0. >>> Distributed Mesh 1 1 4624 0. >>> Star Forest Bipartite Graph 2 2 1616 0. >>> Discrete System 1 1 872 0. >>> ============================================================ >>> ============================================================ >>> Average time to get PetscTime(): 0. 
>>> #PETSc Option Table entries: >>> -ksp_monitor >>> -ksp_view >>> -log_view >>> #End of PETSc Option Table entries >>> Compiled without FORTRAN kernels >>> Compiled with full precision matrices (default) >>> sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 >>> sizeof(PetscScalar) 8 sizeof(PetscInt) 4 >>> Configure options: --with-shared-libraries=1 --with-debugging=0 >>> --download-suitesparse --download-blacs --download-ptscotch=yes >>> --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl >>> --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps >>> --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc >>> --download-hypre --download-ml >>> ----------------------------------------- >>> Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo >>> Machine characteristics: Linux-4.4.0-38-generic-x86_64- >>> with-Ubuntu-16.04-xenial >>> Using PETSc directory: /home/dknez/software/petsc-src >>> Using PETSc arch: arch-linux2-c-opt >>> ----------------------------------------- >>> >>> Using C compiler: mpicc -fPIC -Wall -Wwrite-strings >>> -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O >>> ${COPTFLAGS} ${CFLAGS} >>> Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 >>> -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} >>> ----------------------------------------- >>> >>> Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include >>> -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include >>> -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include >>> -I/home/dknez/software/libmesh_install/opt_real/petsc/include >>> -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent >>> -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include >>> -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi >>> ----------------------------------------- >>> >>> Using C linker: mpicc >>> 
Using Fortran linker: mpif90 >>> Using libraries: -Wl,-rpath,/home/dknez/softwar >>> e/petsc-src/arch-linux2-c-opt/lib -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib >>> -lpetsc -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib >>> -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps >>> -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE >>> -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib >>> -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 >>> -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu >>> -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu >>> -L/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -lscalapack -lml -lmpi_cxx >>> -lstdc++ -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd >>> -lsuitesparseconfig -Wl,-rpath,/opt/intel/system_s >>> tudio_2015.2.050/mkl/lib/intel64 -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 >>> -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc >>> -lptesmumps -lptscotch -lptscotcherr -lscotch -lscotcherr -lX11 -lm >>> -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm >>> -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz >>> -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib >>> -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 >>> -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu >>> -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu >>> -L/lib/x86_64-linux-gnu -Wl,-rpath,/usr/lib/x86_64-linux-gnu >>> -L/usr/lib/x86_64-linux-gnu -ldl -Wl,-rpath,/usr/lib/openmpi/lib -lmpi >>> -lgcc_s -lpthread -ldl >>> ----------------------------------------- >>> >>> >>> >>> >>> >>> >> > -------------- next part -------------- An HTML attachment was scrubbed... 
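[Editor's note: a command-line sketch of the configuration under discussion, combining the setup David describes (MUMPS Cholesky on the small A00 block, CG+ILU on A11, full Schur factorization) with the `selfp` option Dave suggests. The split prefixes (`fieldsplit_RB_split_`, `fieldsplit_FE_split_`) and the executable name are taken from the `-ksp_view` and `-log_view` output quoted above; option names follow PETSc 3.7 conventions. Treat this as an untested sketch, not a verified invocation:]

```shell
# Sketch only: split prefixes come from the -ksp_view output above;
# -pc_fieldsplit_schur_precondition selfp is the alternative Schur
# preconditioner Dave points to (instead of the default A11).
./fe_solver-opt_real \
  -pc_type fieldsplit -pc_fieldsplit_type schur \
  -pc_fieldsplit_schur_fact_type full \
  -pc_fieldsplit_schur_precondition selfp \
  -fieldsplit_RB_split_ksp_type preonly \
  -fieldsplit_RB_split_pc_type cholesky \
  -fieldsplit_RB_split_pc_factor_mat_solver_package mumps \
  -fieldsplit_FE_split_ksp_type cg \
  -fieldsplit_FE_split_pc_type ilu \
  -fieldsplit_FE_split_ksp_monitor \
  -ksp_monitor -ksp_view -log_view
```

[If `selfp` does not reduce the Schur iteration count, the remaining route, as Dave notes, is to supply a problem-specific approximation to S through `PCFieldSplitSetSchurPre()` with the user-provided variant.]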
URL: From bsmith at mcs.anl.gov Wed Jan 11 19:32:22 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 11 Jan 2017 19:32:22 -0600 Subject: [petsc-users] Using PCFIELDSPLIT with -pc_fieldsplit_type schur In-Reply-To: References: Message-ID: <9641EA68-FC6B-48DC-8A5E-A8B75E54EA64@mcs.anl.gov> Can you please run with all the monitoring on? So we can see the convergence of all the inner solvers -fieldsplit_FE_split_ksp_monitor Then run again with -fieldsplit_FE_split_ksp_monitor -fieldsplit_FE_split_pc_type cholesky and send both sets of results Barry > On Jan 11, 2017, at 6:32 PM, David Knezevic wrote: > > On Wed, Jan 11, 2017 at 5:52 PM, Dave May wrote: > so I gather that I'll have to look into a user-defined approximation to S. > > Where does the 2x2 block system come from? > Maybe someone on the list knows the right approximation to use for S. > > The model is 3D linear elasticity using a finite element discretization. I applied substructuring to part of the system to "condense" it, and that results in the small A00 block. The A11 block is just standard 3D elasticity; no substructuring was applied there. There are constraints to connect the degrees of freedom on the interface of the substructured and non-substructured regions. > > If anyone has suggestions for a good way to precondition this type of system, I'd be most appreciative! > > Thanks, > David > > > > ----------------------------------------- > > 0 KSP Residual norm 5.405528187695e+04 > 1 KSP Residual norm 2.187814910803e+02 > 2 KSP Residual norm 1.019051577515e-01 > 3 KSP Residual norm 4.370464012859e-04 > KSP Object: 1 MPI processes > type: cg > maximum iterations=1000 > tolerances: relative=1e-06, absolute=1e-50, divergence=10000. 
> left preconditioning > using nonzero initial guess > using PRECONDITIONED norm type for convergence test > PC Object: 1 MPI processes > type: fieldsplit > FieldSplit with Schur preconditioner, factorization FULL > Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse > Split info: > Split number 0 Defined by IS > Split number 1 Defined by IS > KSP solver for A00 block > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using NONE norm type for convergence test > PC Object: (fieldsplit_RB_split_) 1 MPI processes > type: cholesky > Cholesky: out-of-place factorization > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 0., needed 0. > Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=324, cols=324 > package used to perform factorization: mumps > total: nonzeros=3042, allocated nonzeros=3042 > total number of mallocs used during MatSetValues calls =0 > MUMPS run parameters: > SYM (matrix type): 2 > PAR (host participation): 1 > ICNTL(1) (output for error): 6 > ICNTL(2) (output of diagnostic msg): 0 > ICNTL(3) (output for global info): 0 > ICNTL(4) (level of printing): 0 > ICNTL(5) (input mat struct): 0 > ICNTL(6) (matrix prescaling): 7 > ICNTL(7) (sequentia matrix ordering):7 > ICNTL(8) (scalling strategy): 77 > ICNTL(10) (max num of refinements): 0 > ICNTL(11) (error analysis): 0 > ICNTL(12) (efficiency control): 0 > ICNTL(13) (efficiency control): 0 > ICNTL(14) (percentage of estimated workspace increase): 20 > ICNTL(18) (input mat struct): 0 > ICNTL(19) (Shur complement info): 0 > ICNTL(20) (rhs sparse pattern): 0 > ICNTL(21) (solution struct): 0 > ICNTL(22) (in-core/out-of-core facility): 0 > ICNTL(23) (max size of memory can be allocated locally):0 > 
ICNTL(24) (detection of null pivot rows): 0 > ICNTL(25) (computation of a null space basis): 0 > ICNTL(26) (Schur options for rhs or solution): 0 > ICNTL(27) (experimental parameter): -24 > ICNTL(28) (use parallel or sequential ordering): 1 > ICNTL(29) (parallel ordering): 0 > ICNTL(30) (user-specified set of entries in inv(A)): 0 > ICNTL(31) (factors is discarded in the solve phase): 0 > ICNTL(33) (compute determinant): 0 > CNTL(1) (relative pivoting threshold): 0.01 > CNTL(2) (stopping criterion of refinement): 1.49012e-08 > CNTL(3) (absolute pivoting threshold): 0. > CNTL(4) (value of static pivoting): -1. > CNTL(5) (fixation for null pivots): 0. > RINFO(1) (local estimated flops for the elimination after analysis): > [0] 29394. > RINFO(2) (local estimated flops for the assembly after factorization): > [0] 1092. > RINFO(3) (local estimated flops for the elimination after factorization): > [0] 29394. > INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): > [0] 1 > INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): > [0] 1 > INFO(23) (num of pivots eliminated on this processor after factorization): > [0] 324 > RINFOG(1) (global estimated flops for the elimination after analysis): 29394. > RINFOG(2) (global estimated flops for the assembly after factorization): 1092. > RINFOG(3) (global estimated flops for the elimination after factorization): 29394. 
> (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) > INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 > INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 > INFOG(5) (estimated maximum front size in the complete tree): 12 > INFOG(6) (number of nodes in the complete tree): 53 > INFOG(7) (ordering option effectively use after analysis): 2 > INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 > INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 > INFOG(10) (total integer space store the matrix factors after factorization): 2067 > INFOG(11) (order of largest frontal matrix after factorization): 12 > INFOG(12) (number of off-diagonal pivots): 0 > INFOG(13) (number of delayed pivots after factorization): 0 > INFOG(14) (number of memory compress after factorization): 0 > INFOG(15) (number of steps of iterative refinement after solution): 0 > INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 > INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 > INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 > INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 > INFOG(20) (estimated number of entries in the factors): 3042 > INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 > INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 > INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 > INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 > INFOG(25) (after factorization: number of pivots modified by 
static pivoting): 0 > INFOG(28) (after factorization: number of null pivots encountered): 0 > INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 > INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 > INFOG(32) (after analysis: type of analysis done): 1 > INFOG(33) (value used for ICNTL(8)): -2 > INFOG(34) (exponent of the determinant if determinant is requested): 0 > linear system matrix = precond matrix: > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > type: seqaij > rows=324, cols=324 > total: nonzeros=5760, allocated nonzeros=5760 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 108 nodes, limit used is 5 > KSP solver for S = A11 - A10 inv(A00) A01 > KSP Object: (fieldsplit_FE_split_) 1 MPI processes > type: cg > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: (fieldsplit_FE_split_) 1 MPI processes > type: bjacobi > block Jacobi: number of blocks = 1 > Local solve is same for all blocks, in the following KSP and PC objects: > KSP Object: (fieldsplit_FE_split_sub_) 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using NONE norm type for convergence test > PC Object: (fieldsplit_FE_split_sub_) 1 MPI processes > type: ilu > ILU: out-of-place factorization > 0 levels of fill > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 1., needed 1. 
> Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=28476, cols=28476 > package used to perform factorization: petsc > total: nonzeros=1037052, allocated nonzeros=1037052 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 9489 nodes, limit used is 5 > linear system matrix = precond matrix: > Mat Object: 1 MPI processes > type: seqaij > rows=28476, cols=28476 > total: nonzeros=1037052, allocated nonzeros=1037052 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 9489 nodes, limit used is 5 > linear system matrix followed by preconditioner matrix: > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > type: schurcomplement > rows=28476, cols=28476 > Schur complement A11 - A10 inv(A00) A01 > A11 > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > type: seqaij > rows=28476, cols=28476 > total: nonzeros=1017054, allocated nonzeros=1017054 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 9492 nodes, limit used is 5 > A10 > Mat Object: 1 MPI processes > type: seqaij > rows=28476, cols=324 > total: nonzeros=936, allocated nonzeros=936 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 5717 nodes, limit used is 5 > KSP of A00 > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using NONE norm type for convergence test > PC Object: (fieldsplit_RB_split_) 1 MPI processes > type: cholesky > Cholesky: out-of-place factorization > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 0., needed 0. 
> Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=324, cols=324 > package used to perform factorization: mumps > total: nonzeros=3042, allocated nonzeros=3042 > total number of mallocs used during MatSetValues calls =0 > MUMPS run parameters: > SYM (matrix type): 2 > PAR (host participation): 1 > ICNTL(1) (output for error): 6 > ICNTL(2) (output of diagnostic msg): 0 > ICNTL(3) (output for global info): 0 > ICNTL(4) (level of printing): 0 > ICNTL(5) (input mat struct): 0 > ICNTL(6) (matrix prescaling): 7 > ICNTL(7) (sequentia matrix ordering):7 > ICNTL(8) (scalling strategy): 77 > ICNTL(10) (max num of refinements): 0 > ICNTL(11) (error analysis): 0 > ICNTL(12) (efficiency control): 0 > ICNTL(13) (efficiency control): 0 > ICNTL(14) (percentage of estimated workspace increase): 20 > ICNTL(18) (input mat struct): 0 > ICNTL(19) (Shur complement info): 0 > ICNTL(20) (rhs sparse pattern): 0 > ICNTL(21) (solution struct): 0 > ICNTL(22) (in-core/out-of-core facility): 0 > ICNTL(23) (max size of memory can be allocated locally):0 > ICNTL(24) (detection of null pivot rows): 0 > ICNTL(25) (computation of a null space basis): 0 > ICNTL(26) (Schur options for rhs or solution): 0 > ICNTL(27) (experimental parameter): -24 > ICNTL(28) (use parallel or sequential ordering): 1 > ICNTL(29) (parallel ordering): 0 > ICNTL(30) (user-specified set of entries in inv(A)): 0 > ICNTL(31) (factors is discarded in the solve phase): 0 > ICNTL(33) (compute determinant): 0 > CNTL(1) (relative pivoting threshold): 0.01 > CNTL(2) (stopping criterion of refinement): 1.49012e-08 > CNTL(3) (absolute pivoting threshold): 0. > CNTL(4) (value of static pivoting): -1. > CNTL(5) (fixation for null pivots): 0. > RINFO(1) (local estimated flops for the elimination after analysis): > [0] 29394. > RINFO(2) (local estimated flops for the assembly after factorization): > [0] 1092. > RINFO(3) (local estimated flops for the elimination after factorization): > [0] 29394. 
> INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): > [0] 1 > INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): > [0] 1 > INFO(23) (num of pivots eliminated on this processor after factorization): > [0] 324 > RINFOG(1) (global estimated flops for the elimination after analysis): 29394. > RINFOG(2) (global estimated flops for the assembly after factorization): 1092. > RINFOG(3) (global estimated flops for the elimination after factorization): 29394. > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) > INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 > INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 > INFOG(5) (estimated maximum front size in the complete tree): 12 > INFOG(6) (number of nodes in the complete tree): 53 > INFOG(7) (ordering option effectively use after analysis): 2 > INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 > INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 > INFOG(10) (total integer space store the matrix factors after factorization): 2067 > INFOG(11) (order of largest frontal matrix after factorization): 12 > INFOG(12) (number of off-diagonal pivots): 0 > INFOG(13) (number of delayed pivots after factorization): 0 > INFOG(14) (number of memory compress after factorization): 0 > INFOG(15) (number of steps of iterative refinement after solution): 0 > INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 > INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 > INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 > INFOG(19) (size of all MUMPS internal data allocated during 
factorization: sum over all processors): 1 > INFOG(20) (estimated number of entries in the factors): 3042 > INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 > INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 > INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 > INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 > INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 > INFOG(28) (after factorization: number of null pivots encountered): 0 > INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 > INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 > INFOG(32) (after analysis: type of analysis done): 1 > INFOG(33) (value used for ICNTL(8)): -2 > INFOG(34) (exponent of the determinant if determinant is requested): 0 > linear system matrix = precond matrix: > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > type: seqaij > rows=324, cols=324 > total: nonzeros=5760, allocated nonzeros=5760 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 108 nodes, limit used is 5 > A01 > Mat Object: 1 MPI processes > type: seqaij > rows=324, cols=28476 > total: nonzeros=936, allocated nonzeros=936 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 67 nodes, limit used is 5 > Mat Object: 1 MPI processes > type: seqaij > rows=28476, cols=28476 > total: nonzeros=1037052, allocated nonzeros=1037052 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 9489 nodes, limit used is 5 > linear system matrix = precond matrix: > Mat Object: () 1 MPI processes > type: seqaij > rows=28800, cols=28800 > total: nonzeros=1024686, allocated nonzeros=1024794 > total number of mallocs used during MatSetValues calls =0 
> using I-node routines: found 9600 nodes, limit used is 5 > > ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- > > /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 17:22:10 2017 > Using Petsc Release Version 3.7.3, unknown > > Max Max/Min Avg Total > Time (sec): 9.638e+01 1.00000 9.638e+01 > Objects: 2.030e+02 1.00000 2.030e+02 > Flops: 1.732e+11 1.00000 1.732e+11 1.732e+11 > Flops/sec: 1.797e+09 1.00000 1.797e+09 1.797e+09 > MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 > MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 > MPI Reductions: 0.000e+00 0.00000 > > Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) > e.g., VecAXPY() for real vectors of length N --> 2N flops > and VecAXPY() for complex vectors of length N --> 8N flops > > Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- > Avg %Total Avg %Total counts %Total Avg %Total counts %Total > 0: Main Stage: 9.6379e+01 100.0% 1.7318e+11 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% > > ------------------------------------------------------------------------------------------------------------------------ > See the 'Profiling' chapter of the users' manual for details on interpreting output. > Phase summary info: > Count: number of times phase was executed > Time and Flops: Max - maximum over all processors > Ratio - ratio of maximum to minimum over all processors > Mess: number of messages sent > Avg. len: average message length (bytes) > Reduct: number of global reductions > Global: entire computation > Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
> %T - percent time in this phase %F - percent flops in this phase > %M - percent messages in this phase %L - percent message lengths in this phase > %R - percent reductions in this phase > Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) > ------------------------------------------------------------------------------------------------------------------------ > Event Count Time (sec) Flops --- Global --- --- Stage --- Total > Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s > ------------------------------------------------------------------------------------------------------------------------ > > --- Event Stage 0: Main Stage > > VecDot 42 1.0 2.2411e-05 1.0 8.53e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 380 > VecTDot 77761 1.0 1.4294e+00 1.0 4.43e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3098 > VecNorm 38894 1.0 9.1002e-01 1.0 2.22e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2434 > VecScale 38882 1.0 3.7314e-01 1.0 1.11e+09 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2967 > VecCopy 38908 1.0 2.1655e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > VecSet 77887 1.0 3.2034e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > VecAXPY 77777 1.0 1.8382e+00 1.0 4.43e+09 1.0 0.0e+00 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2409 > VecAYPX 38875 1.0 1.2884e+00 1.0 2.21e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 1718 > VecAssemblyBegin 68 1.0 1.9407e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > VecAssemblyEnd 68 1.0 2.6941e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > VecScatterBegin 48 1.0 4.6349e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatMult 38891 1.0 4.3045e+01 1.0 8.03e+10 1.0 0.0e+00 0.0e+00 0.0e+00 45 46 0 0 0 45 46 0 0 0 1866 > MatMultAdd 38889 1.0 3.5360e+01 1.0 7.91e+10 1.0 0.0e+00 0.0e+00 0.0e+00 37 46 0 0 0 37 46 0 0 0 2236 > MatSolve 
77769 1.0 4.8780e+01 1.0 7.95e+10 1.0 0.0e+00 0.0e+00 0.0e+00 51 46 0 0 0 51 46 0 0 0 1631 > MatLUFactorNum 1 1.0 1.9575e-02 1.0 2.49e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1274 > MatCholFctrSym 1 1.0 9.4891e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatCholFctrNum 1 1.0 3.7885e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatILUFactorSym 1 1.0 4.1780e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatConvert 1 1.0 3.0041e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatScale 2 1.0 2.7180e-05 1.0 2.53e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 930 > MatAssemblyBegin 32 1.0 4.0531e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatAssemblyEnd 32 1.0 1.2032e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatGetRow 114978 1.0 5.9254e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatGetRowIJ 2 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatGetSubMatrice 6 1.0 1.5707e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatGetOrdering 2 1.0 3.2425e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatZeroEntries 6 1.0 3.0580e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatView 7 1.0 3.5119e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatAXPY 1 1.0 1.9384e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatMatMult 1 1.0 2.7120e-03 1.0 3.16e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 117 > MatMatMultSym 1 1.0 1.8010e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatMatMultNum 1 1.0 6.1703e-04 1.0 3.16e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 513 > KSPSetUp 4 1.0 9.8944e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > KSPSolve 1 1.0 9.3380e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 
0 0 1855 > PCSetUp 4 1.0 6.6326e-02 1.0 2.53e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 381 > PCSetUpOnBlocks 5 1.0 2.4082e-02 1.0 2.49e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1036 > PCApply 5 1.0 9.3376e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > KSPSolve_FS_0 5 1.0 7.0214e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > KSPSolve_FS_Schu 5 1.0 9.3372e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > KSPSolve_FS_Low 5 1.0 2.1377e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > ------------------------------------------------------------------------------------------------------------------------ > > Memory usage is given in bytes: > > Object Type Creations Destructions Memory Descendants' Mem. > Reports information only for process 0. > > --- Event Stage 0: Main Stage > > Vector 92 92 9698040 0. > Vector Scatter 24 24 15936 0. > Index Set 51 51 537876 0. > IS L to G Mapping 3 3 240408 0. > Matrix 16 16 77377776 0. > Krylov Solver 6 6 7888 0. > Preconditioner 6 6 6288 0. > Viewer 1 0 0 0. > Distributed Mesh 1 1 4624 0. > Star Forest Bipartite Graph 2 2 1616 0. > Discrete System 1 1 872 0. > ======================================================================================================================== > Average time to get PetscTime(): 0. 
> #PETSc Option Table entries: > -ksp_monitor > -ksp_view > -log_view > #End of PETSc Option Table entries > Compiled without FORTRAN kernels > Compiled with full precision matrices (default) > sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 > Configure options: --with-shared-libraries=1 --with-debugging=0 --download-suitesparse --download-blacs --download-ptscotch=yes --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc --download-hypre --download-ml > ----------------------------------------- > Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo > Machine characteristics: Linux-4.4.0-38-generic-x86_64-with-Ubuntu-16.04-xenial > Using PETSc directory: /home/dknez/software/petsc-src > Using PETSc arch: arch-linux2-c-opt > ----------------------------------------- > > Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O ${COPTFLAGS} ${CFLAGS} > Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} > ----------------------------------------- > > Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/libmesh_install/opt_real/petsc/include -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi > ----------------------------------------- > > Using C linker: mpicc > Using Fortran linker: mpif90 > Using libraries: -Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib 
-L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl > ----------------------------------------- > > > > > On Wed, Jan 11, 2017 at 4:49 PM, Dave May wrote: > It looks like the Schur solve is requiring a huge number of iterates to converge (based on the instances of MatMult). > This is killing the performance. > > Are you sure that A11 is a good approximation to S? 
You might consider trying the selfp option > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCFieldSplitSetSchurPre.html#PCFieldSplitSetSchurPre > > Note that the best approx to S is likely both problem and discretisation dependent so if selfp is also terrible, you might want to consider coding up your own approx to S for your specific system. > > > Thanks, > Dave > > > On Wed, 11 Jan 2017 at 22:34, David Knezevic wrote: > I have a definite block 2x2 system and I figured it'd be good to apply the PCFIELDSPLIT functionality with Schur complement, as described in Section 4.5 of the manual. > > The A00 block of my matrix is very small so I figured I'd specify a direct solver (i.e. MUMPS) for that block. > > So I did the following: > - PCFieldSplitSetIS to specify the indices of the two splits > - PCFieldSplitGetSubKSP to get the two KSP objects, and to set the solver and PC types for each (MUMPS for A00, ILU+CG for A11) > - I set -pc_fieldsplit_schur_fact_type full > > Below I have pasted the output of "-ksp_view -ksp_monitor -log_view" for a test case. It seems to converge well, but I'm concerned about the speed (about 90 seconds, vs. about 1 second if I use a direct solver for the entire system). I just wanted to check if I'm setting this up in a good way? > > Many thanks, > David > > ----------------------------------------------------------------------------------- > > 0 KSP Residual norm 5.405774214400e+04 > 1 KSP Residual norm 1.849649014371e+02 > 2 KSP Residual norm 7.462775074989e-02 > 3 KSP Residual norm 2.680497175260e-04 > KSP Object: 1 MPI processes > type: cg > maximum iterations=1000 > tolerances: relative=1e-06, absolute=1e-50, divergence=10000. 
> left preconditioning > using nonzero initial guess > using PRECONDITIONED norm type for convergence test > PC Object: 1 MPI processes > type: fieldsplit > FieldSplit with Schur preconditioner, factorization FULL > Preconditioner for the Schur complement formed from A11 > Split info: > Split number 0 Defined by IS > Split number 1 Defined by IS > KSP solver for A00 block > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using NONE norm type for convergence test > PC Object: (fieldsplit_RB_split_) 1 MPI processes > type: cholesky > Cholesky: out-of-place factorization > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 0., needed 0. > Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=324, cols=324 > package used to perform factorization: mumps > total: nonzeros=3042, allocated nonzeros=3042 > total number of mallocs used during MatSetValues calls =0 > MUMPS run parameters: > SYM (matrix type): 2 > PAR (host participation): 1 > ICNTL(1) (output for error): 6 > ICNTL(2) (output of diagnostic msg): 0 > ICNTL(3) (output for global info): 0 > ICNTL(4) (level of printing): 0 > ICNTL(5) (input mat struct): 0 > ICNTL(6) (matrix prescaling): 7 > ICNTL(7) (sequentia matrix ordering):7 > ICNTL(8) (scalling strategy): 77 > ICNTL(10) (max num of refinements): 0 > ICNTL(11) (error analysis): 0 > ICNTL(12) (efficiency control): 0 > ICNTL(13) (efficiency control): 0 > ICNTL(14) (percentage of estimated workspace increase): 20 > ICNTL(18) (input mat struct): 0 > ICNTL(19) (Shur complement info): 0 > ICNTL(20) (rhs sparse pattern): 0 > ICNTL(21) (solution struct): 0 > ICNTL(22) (in-core/out-of-core facility): 0 > ICNTL(23) (max size of memory can be allocated locally):0 > ICNTL(24) (detection of null pivot rows): 0 > ICNTL(25) (computation of a null space 
basis): 0 > ICNTL(26) (Schur options for rhs or solution): 0 > ICNTL(27) (experimental parameter): -24 > ICNTL(28) (use parallel or sequential ordering): 1 > ICNTL(29) (parallel ordering): 0 > ICNTL(30) (user-specified set of entries in inv(A)): 0 > ICNTL(31) (factors is discarded in the solve phase): 0 > ICNTL(33) (compute determinant): 0 > CNTL(1) (relative pivoting threshold): 0.01 > CNTL(2) (stopping criterion of refinement): 1.49012e-08 > CNTL(3) (absolute pivoting threshold): 0. > CNTL(4) (value of static pivoting): -1. > CNTL(5) (fixation for null pivots): 0. > RINFO(1) (local estimated flops for the elimination after analysis): > [0] 29394. > RINFO(2) (local estimated flops for the assembly after factorization): > [0] 1092. > RINFO(3) (local estimated flops for the elimination after factorization): > [0] 29394. > INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): > [0] 1 > INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): > [0] 1 > INFO(23) (num of pivots eliminated on this processor after factorization): > [0] 324 > RINFOG(1) (global estimated flops for the elimination after analysis): 29394. > RINFOG(2) (global estimated flops for the assembly after factorization): 1092. > RINFOG(3) (global estimated flops for the elimination after factorization): 29394. 
> (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) > INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 > INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 > INFOG(5) (estimated maximum front size in the complete tree): 12 > INFOG(6) (number of nodes in the complete tree): 53 > INFOG(7) (ordering option effectively use after analysis): 2 > INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 > INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 > INFOG(10) (total integer space store the matrix factors after factorization): 2067 > INFOG(11) (order of largest frontal matrix after factorization): 12 > INFOG(12) (number of off-diagonal pivots): 0 > INFOG(13) (number of delayed pivots after factorization): 0 > INFOG(14) (number of memory compress after factorization): 0 > INFOG(15) (number of steps of iterative refinement after solution): 0 > INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 > INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 > INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 > INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 > INFOG(20) (estimated number of entries in the factors): 3042 > INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 > INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 > INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 > INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 > INFOG(25) (after factorization: number of pivots modified by 
static pivoting): 0 > INFOG(28) (after factorization: number of null pivots encountered): 0 > INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 > INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 > INFOG(32) (after analysis: type of analysis done): 1 > INFOG(33) (value used for ICNTL(8)): -2 > INFOG(34) (exponent of the determinant if determinant is requested): 0 > linear system matrix = precond matrix: > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > type: seqaij > rows=324, cols=324 > total: nonzeros=5760, allocated nonzeros=5760 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 108 nodes, limit used is 5 > KSP solver for S = A11 - A10 inv(A00) A01 > KSP Object: (fieldsplit_FE_split_) 1 MPI processes > type: cg > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: (fieldsplit_FE_split_) 1 MPI processes > type: bjacobi > block Jacobi: number of blocks = 1 > Local solve is same for all blocks, in the following KSP and PC objects: > KSP Object: (fieldsplit_FE_split_sub_) 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using NONE norm type for convergence test > PC Object: (fieldsplit_FE_split_sub_) 1 MPI processes > type: ilu > ILU: out-of-place factorization > 0 levels of fill > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 1., needed 1. 
> Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=28476, cols=28476 > package used to perform factorization: petsc > total: nonzeros=1017054, allocated nonzeros=1017054 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 9492 nodes, limit used is 5 > linear system matrix = precond matrix: > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > type: seqaij > rows=28476, cols=28476 > total: nonzeros=1017054, allocated nonzeros=1017054 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 9492 nodes, limit used is 5 > linear system matrix followed by preconditioner matrix: > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > type: schurcomplement > rows=28476, cols=28476 > Schur complement A11 - A10 inv(A00) A01 > A11 > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > type: seqaij > rows=28476, cols=28476 > total: nonzeros=1017054, allocated nonzeros=1017054 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 9492 nodes, limit used is 5 > A10 > Mat Object: 1 MPI processes > type: seqaij > rows=28476, cols=324 > total: nonzeros=936, allocated nonzeros=936 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 5717 nodes, limit used is 5 > KSP of A00 > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using NONE norm type for convergence test > PC Object: (fieldsplit_RB_split_) 1 MPI processes > type: cholesky > Cholesky: out-of-place factorization > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 0., needed 0. 
> Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=324, cols=324 > package used to perform factorization: mumps > total: nonzeros=3042, allocated nonzeros=3042 > total number of mallocs used during MatSetValues calls =0 > MUMPS run parameters: > SYM (matrix type): 2 > PAR (host participation): 1 > ICNTL(1) (output for error): 6 > ICNTL(2) (output of diagnostic msg): 0 > ICNTL(3) (output for global info): 0 > ICNTL(4) (level of printing): 0 > ICNTL(5) (input mat struct): 0 > ICNTL(6) (matrix prescaling): 7 > ICNTL(7) (sequentia matrix ordering):7 > ICNTL(8) (scalling strategy): 77 > ICNTL(10) (max num of refinements): 0 > ICNTL(11) (error analysis): 0 > ICNTL(12) (efficiency control): 0 > ICNTL(13) (efficiency control): 0 > ICNTL(14) (percentage of estimated workspace increase): 20 > ICNTL(18) (input mat struct): 0 > ICNTL(19) (Shur complement info): 0 > ICNTL(20) (rhs sparse pattern): 0 > ICNTL(21) (solution struct): 0 > ICNTL(22) (in-core/out-of-core facility): 0 > ICNTL(23) (max size of memory can be allocated locally):0 > ICNTL(24) (detection of null pivot rows): 0 > ICNTL(25) (computation of a null space basis): 0 > ICNTL(26) (Schur options for rhs or solution): 0 > ICNTL(27) (experimental parameter): -24 > ICNTL(28) (use parallel or sequential ordering): 1 > ICNTL(29) (parallel ordering): 0 > ICNTL(30) (user-specified set of entries in inv(A)): 0 > ICNTL(31) (factors is discarded in the solve phase): 0 > ICNTL(33) (compute determinant): 0 > CNTL(1) (relative pivoting threshold): 0.01 > CNTL(2) (stopping criterion of refinement): 1.49012e-08 > CNTL(3) (absolute pivoting threshold): 0. > CNTL(4) (value of static pivoting): -1. > CNTL(5) (fixation for null pivots): 0. > RINFO(1) (local estimated flops for the elimination after analysis): > [0] 29394. > RINFO(2) (local estimated flops for the assembly after factorization): > [0] 1092. > RINFO(3) (local estimated flops for the elimination after factorization): > [0] 29394. 
> INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): > [0] 1 > INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): > [0] 1 > INFO(23) (num of pivots eliminated on this processor after factorization): > [0] 324 > RINFOG(1) (global estimated flops for the elimination after analysis): 29394. > RINFOG(2) (global estimated flops for the assembly after factorization): 1092. > RINFOG(3) (global estimated flops for the elimination after factorization): 29394. > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) > INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 > INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 > INFOG(5) (estimated maximum front size in the complete tree): 12 > INFOG(6) (number of nodes in the complete tree): 53 > INFOG(7) (ordering option effectively use after analysis): 2 > INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 > INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 > INFOG(10) (total integer space store the matrix factors after factorization): 2067 > INFOG(11) (order of largest frontal matrix after factorization): 12 > INFOG(12) (number of off-diagonal pivots): 0 > INFOG(13) (number of delayed pivots after factorization): 0 > INFOG(14) (number of memory compress after factorization): 0 > INFOG(15) (number of steps of iterative refinement after solution): 0 > INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 > INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 > INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 > INFOG(19) (size of all MUMPS internal data allocated during 
factorization: sum over all processors): 1 > INFOG(20) (estimated number of entries in the factors): 3042 > INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 > INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 > INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 > INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 > INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 > INFOG(28) (after factorization: number of null pivots encountered): 0 > INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 > INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 > INFOG(32) (after analysis: type of analysis done): 1 > INFOG(33) (value used for ICNTL(8)): -2 > INFOG(34) (exponent of the determinant if determinant is requested): 0 > linear system matrix = precond matrix: > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > type: seqaij > rows=324, cols=324 > total: nonzeros=5760, allocated nonzeros=5760 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 108 nodes, limit used is 5 > A01 > Mat Object: 1 MPI processes > type: seqaij > rows=324, cols=28476 > total: nonzeros=936, allocated nonzeros=936 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 67 nodes, limit used is 5 > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > type: seqaij > rows=28476, cols=28476 > total: nonzeros=1017054, allocated nonzeros=1017054 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 9492 nodes, limit used is 5 > linear system matrix = precond matrix: > Mat Object: () 1 MPI processes > type: seqaij > rows=28800, cols=28800 > total: nonzeros=1024686, allocated nonzeros=1024794 > total number of mallocs used during 
MatSetValues calls =0 > using I-node routines: found 9600 nodes, limit used is 5 > > > ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- > > /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 16:16:47 2017 > Using Petsc Release Version 3.7.3, unknown > > Max Max/Min Avg Total > Time (sec): 9.179e+01 1.00000 9.179e+01 > Objects: 1.990e+02 1.00000 1.990e+02 > Flops: 1.634e+11 1.00000 1.634e+11 1.634e+11 > Flops/sec: 1.780e+09 1.00000 1.780e+09 1.780e+09 > MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 > MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 > MPI Reductions: 0.000e+00 0.00000 > > Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) > e.g., VecAXPY() for real vectors of length N --> 2N flops > and VecAXPY() for complex vectors of length N --> 8N flops > > Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- > Avg %Total Avg %Total counts %Total Avg %Total counts %Total > 0: Main Stage: 9.1787e+01 100.0% 1.6336e+11 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% > > ------------------------------------------------------------------------------------------------------------------------ > See the 'Profiling' chapter of the users' manual for details on interpreting output. > Phase summary info: > Count: number of times phase was executed > Time and Flops: Max - maximum over all processors > Ratio - ratio of maximum to minimum over all processors > Mess: number of messages sent > Avg. len: average message length (bytes) > Reduct: number of global reductions > Global: entire computation > Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
> %T - percent time in this phase %F - percent flops in this phase > %M - percent messages in this phase %L - percent message lengths in this phase > %R - percent reductions in this phase > Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) > ------------------------------------------------------------------------------------------------------------------------ > Event Count Time (sec) Flops --- Global --- --- Stage --- Total > Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s > ------------------------------------------------------------------------------------------------------------------------ > > --- Event Stage 0: Main Stage > > VecDot 42 1.0 2.4080e-05 1.0 8.53e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 354 > VecTDot 74012 1.0 1.2440e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3388 > VecNorm 37020 1.0 8.3580e-01 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2523 > VecScale 37008 1.0 3.5800e-01 1.0 1.05e+09 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2944 > VecCopy 37034 1.0 2.5754e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > VecSet 74137 1.0 3.0537e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > VecAXPY 74029 1.0 1.7233e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2446 > VecAYPX 37001 1.0 1.2214e+00 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 1725 > VecAssemblyBegin 68 1.0 2.0432e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > VecAssemblyEnd 68 1.0 2.5988e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > VecScatterBegin 48 1.0 4.6921e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatMult 37017 1.0 4.1269e+01 1.0 7.65e+10 1.0 0.0e+00 0.0e+00 0.0e+00 45 47 0 0 0 45 47 0 0 0 1853 > MatMultAdd 37015 1.0 3.3638e+01 1.0 7.53e+10 1.0 0.0e+00 0.0e+00 0.0e+00 37 46 0 0 0 37 46 0 0 0 2238 > MatSolve 
74021 1.0 4.6602e+01 1.0 7.42e+10 1.0 0.0e+00 0.0e+00 0.0e+00 51 45 0 0 0 51 45 0 0 0 1593 > MatLUFactorNum 1 1.0 1.7209e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1420 > MatCholFctrSym 1 1.0 8.8310e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatCholFctrNum 1 1.0 3.6907e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatILUFactorSym 1 1.0 3.7372e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatAssemblyBegin 29 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatAssemblyEnd 29 1.0 9.9473e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatGetRow 58026 1.0 2.8155e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatGetRowIJ 2 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatGetSubMatrice 6 1.0 1.5399e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatGetOrdering 2 1.0 3.0112e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatZeroEntries 6 1.0 2.9490e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > MatView 7 1.0 3.4356e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > KSPSetUp 4 1.0 9.4891e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > KSPSolve 1 1.0 8.8793e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > PCSetUp 4 1.0 3.8375e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 637 > PCSetUpOnBlocks 5 1.0 2.1250e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1150 > PCApply 5 1.0 8.8789e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > KSPSolve_FS_0 5 1.0 7.5364e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > KSPSolve_FS_Schu 5 1.0 8.8785e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > KSPSolve_FS_Low 5 1.0 2.1019e-03 1.0 0.00e+00 0.0 0.0e+00 
0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > ------------------------------------------------------------------------------------------------------------------------ > > Memory usage is given in bytes: > > Object Type Creations Destructions Memory Descendants' Mem. > Reports information only for process 0. > > --- Event Stage 0: Main Stage > > Vector 91 91 9693912 0. > Vector Scatter 24 24 15936 0. > Index Set 51 51 537888 0. > IS L to G Mapping 3 3 240408 0. > Matrix 13 13 64097868 0. > Krylov Solver 6 6 7888 0. > Preconditioner 6 6 6288 0. > Viewer 1 0 0 0. > Distributed Mesh 1 1 4624 0. > Star Forest Bipartite Graph 2 2 1616 0. > Discrete System 1 1 872 0. > ======================================================================================================================== > Average time to get PetscTime(): 0. > #PETSc Option Table entries: > -ksp_monitor > -ksp_view > -log_view > #End of PETSc Option Table entries > Compiled without FORTRAN kernels > Compiled with full precision matrices (default) > sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 > Configure options: --with-shared-libraries=1 --with-debugging=0 --download-suitesparse --download-blacs --download-ptscotch=yes --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc --download-hypre --download-ml > ----------------------------------------- > Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo > Machine characteristics: Linux-4.4.0-38-generic-x86_64-with-Ubuntu-16.04-xenial > Using PETSc directory: /home/dknez/software/petsc-src > Using PETSc arch: arch-linux2-c-opt > ----------------------------------------- > > Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O ${COPTFLAGS} ${CFLAGS} > Using Fortran compiler: mpif90 
-fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} > ----------------------------------------- > > Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/libmesh_install/opt_real/petsc/include -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi > ----------------------------------------- > > Using C linker: mpicc > Using Fortran linker: mpif90 > Using libraries: -Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 
-Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl > ----------------------------------------- > > > > > > > From bsmith at mcs.anl.gov Wed Jan 11 19:39:49 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 11 Jan 2017 19:39:49 -0600 Subject: [petsc-users] malconfigured gamg In-Reply-To: <87mvexz2n7.fsf@jedbrown.org> References: <735d76e6-3875-05f1-4f2d-1ab0158d2846@sintef.no> <87mvexz2n7.fsf@jedbrown.org> Message-ID: <61854A5B-AE25-4386-A36C-6DB72D079214@mcs.anl.gov> > On Jan 11, 2017, at 3:51 PM, Jed Brown wrote: > > Arne Morten Kvarving writes: > >> hi, >> >> first, this was a user error and i totally acknowledge this, but i >> wonder if this might be an oversight in your error checking: if you >> configure gamg with ilu/asm smoothing, and are stupid enough to have set >> the number of smoother cycles to 0, your program churns along and >> apparently converges just fine (towards garbage, but apparently 'sane' >> garbage (not 0, not nan, not inf)) > > My concern here is that skipping smoothing actually makes sense, e.g., > for Kaskade cycles (no pre-smoothing). I would suggest checking the > unpreconditioned (or true) residual in order to notice when a singular > preconditioner causes stagnation (instead of misdiagnosing it as > convergence due to the preconditioned residual dropping). Jed, Yeah but what about checking that the sum of the number of pre and post smooths >=1 ? > >> once i set sor as smoother, i got the error message >> >> 'PETSC ERROR: Relaxation requires global its 0 positive' which pointed >> me to my stupid. >> >> fixing this made both asm and sor work fine. >> >> it's all wrapped up in a schur/fieldsplit (it's P2/P1 navier-stokes), >> constructed by hand due to "surrounding" reasons. but i don't think >> that's relevant as such. 
i've used 3.6.4 as the oldest and 3.7.4 as the >> newest version and behavior was the same. if you want logs et al don't >> hesitate to ask for them, but i do not think they would add much. >> >> cheers >> >> arnem From david.knezevic at akselos.com Wed Jan 11 19:47:05 2017 From: david.knezevic at akselos.com (David Knezevic) Date: Wed, 11 Jan 2017 20:47:05 -0500 Subject: [petsc-users] Using PCFIELDSPLIT with -pc_fieldsplit_type schur In-Reply-To: <9641EA68-FC6B-48DC-8A5E-A8B75E54EA64@mcs.anl.gov> References: <9641EA68-FC6B-48DC-8A5E-A8B75E54EA64@mcs.anl.gov> Message-ID: I've attached the two log files. Using cholesky for "FE_split" seems to have helped a lot! David -- David J. Knezevic | CTO Akselos | 210 Broadway, #201 | Cambridge, MA | 02139 Phone: +1-617-599-4755 On Wed, Jan 11, 2017 at 8:32 PM, Barry Smith wrote: > > Can you please run with all the monitoring on? So we can see the > convergence of all the inner solvers > -fieldsplit_FE_split_ksp_monitor > > Then run again with > > -fieldsplit_FE_split_ksp_monitor -fieldsplit_FE_split_pc_type cholesky > > > and send both sets of results > > Barry > > > > On Jan 11, 2017, at 6:32 PM, David Knezevic > wrote: > > > > On Wed, Jan 11, 2017 at 5:52 PM, Dave May > wrote: > > so I gather that I'll have to look into a user-defined approximation to > S. > > > > Where does the 2x2 block system come from? > > Maybe someone on the list knows the right approximation to use for S. > > > > The model is 3D linear elasticity using a finite element discretization. > I applied substructuring to part of the system to "condense" it, and that > results in the small A00 block. 
The A11 block is just standard 3D > elasticity; no substructuring was applied there. There are constraints to > connect the degrees of freedom on the interface of the substructured and > non-substructured regions. > > > > If anyone has suggestions for a good way to precondition this type of > system, I'd be most appreciative! > > > > Thanks, > > David > > > > > > > > ----------------------------------------- > > > > 0 KSP Residual norm 5.405528187695e+04 > > 1 KSP Residual norm 2.187814910803e+02 > > 2 KSP Residual norm 1.019051577515e-01 > > 3 KSP Residual norm 4.370464012859e-04 > > KSP Object: 1 MPI processes > > type: cg > > maximum iterations=1000 > > tolerances: relative=1e-06, absolute=1e-50, divergence=10000. > > left preconditioning > > using nonzero initial guess > > using PRECONDITIONED norm type for convergence test > > PC Object: 1 MPI processes > > type: fieldsplit > > FieldSplit with Schur preconditioner, factorization FULL > > Preconditioner for the Schur complement formed from Sp, an assembled > approximation to S, which uses (lumped, if requested) A00's diagonal's > inverse > > Split info: > > Split number 0 Defined by IS > > Split number 1 Defined by IS > > KSP solver for A00 block > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > type: preonly > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > left preconditioning > > using NONE norm type for convergence test > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > type: cholesky > > Cholesky: out-of-place factorization > > tolerance for zero pivot 2.22045e-14 > > matrix ordering: natural > > factor fill ratio given 0., needed 0. 
> > Factored matrix follows: > > Mat Object: 1 MPI processes > > type: seqaij > > rows=324, cols=324 > > package used to perform factorization: mumps > > total: nonzeros=3042, allocated nonzeros=3042 > > total number of mallocs used during MatSetValues calls =0 > > MUMPS run parameters: > > SYM (matrix type): 2 > > PAR (host participation): 1 > > ICNTL(1) (output for error): 6 > > ICNTL(2) (output of diagnostic msg): 0 > > ICNTL(3) (output for global info): 0 > > ICNTL(4) (level of printing): 0 > > ICNTL(5) (input mat struct): 0 > > ICNTL(6) (matrix prescaling): 7 > > ICNTL(7) (sequentia matrix ordering):7 > > ICNTL(8) (scalling strategy): 77 > > ICNTL(10) (max num of refinements): 0 > > ICNTL(11) (error analysis): 0 > > ICNTL(12) (efficiency control): > 0 > > ICNTL(13) (efficiency control): > 0 > > ICNTL(14) (percentage of estimated workspace > increase): 20 > > ICNTL(18) (input mat struct): > 0 > > ICNTL(19) (Shur complement info): > 0 > > ICNTL(20) (rhs sparse pattern): > 0 > > ICNTL(21) (solution struct): > 0 > > ICNTL(22) (in-core/out-of-core facility): > 0 > > ICNTL(23) (max size of memory can be allocated > locally):0 > > ICNTL(24) (detection of null pivot rows): > 0 > > ICNTL(25) (computation of a null space basis): > 0 > > ICNTL(26) (Schur options for rhs or solution): > 0 > > ICNTL(27) (experimental parameter): > -24 > > ICNTL(28) (use parallel or sequential ordering): > 1 > > ICNTL(29) (parallel ordering): > 0 > > ICNTL(30) (user-specified set of entries in > inv(A)): 0 > > ICNTL(31) (factors is discarded in the solve > phase): 0 > > ICNTL(33) (compute determinant): > 0 > > CNTL(1) (relative pivoting threshold): 0.01 > > CNTL(2) (stopping criterion of refinement): > 1.49012e-08 > > CNTL(3) (absolute pivoting threshold): 0. > > CNTL(4) (value of static pivoting): -1. > > CNTL(5) (fixation for null pivots): 0. > > RINFO(1) (local estimated flops for the elimination > after analysis): > > [0] 29394. 
> > RINFO(2) (local estimated flops for the assembly > after factorization): > > [0] 1092. > > RINFO(3) (local estimated flops for the elimination > after factorization): > > [0] 29394. > > INFO(15) (estimated size of (in MB) MUMPS internal > data for running numerical factorization): > > [0] 1 > > INFO(16) (size of (in MB) MUMPS internal data used > during numerical factorization): > > [0] 1 > > INFO(23) (num of pivots eliminated on this processor > after factorization): > > [0] 324 > > RINFOG(1) (global estimated flops for the > elimination after analysis): 29394. > > RINFOG(2) (global estimated flops for the assembly > after factorization): 1092. > > RINFOG(3) (global estimated flops for the > elimination after factorization): 29394. > > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): > (0.,0.)*(2^0) > > INFOG(3) (estimated real workspace for factors on > all processors after analysis): 3888 > > INFOG(4) (estimated integer workspace for factors on > all processors after analysis): 2067 > > INFOG(5) (estimated maximum front size in the > complete tree): 12 > > INFOG(6) (number of nodes in the complete tree): 53 > > INFOG(7) (ordering option effectively use after > analysis): 2 > > INFOG(8) (structural symmetry in percent of the > permuted matrix after analysis): 100 > > INFOG(9) (total real/complex workspace to store the > matrix factors after factorization): 3888 > > INFOG(10) (total integer space store the matrix > factors after factorization): 2067 > > INFOG(11) (order of largest frontal matrix after > factorization): 12 > > INFOG(12) (number of off-diagonal pivots): 0 > > INFOG(13) (number of delayed pivots after > factorization): 0 > > INFOG(14) (number of memory compress after > factorization): 0 > > INFOG(15) (number of steps of iterative refinement > after solution): 0 > > INFOG(16) (estimated size (in MB) of all MUMPS > internal data for factorization after analysis: value on the most memory > consuming processor): 1 > > INFOG(17) (estimated size of 
all MUMPS internal data > for factorization after analysis: sum over all processors): 1 > > INFOG(18) (size of all MUMPS internal data allocated > during factorization: value on the most memory consuming processor): 1 > > INFOG(19) (size of all MUMPS internal data allocated > during factorization: sum over all processors): 1 > > INFOG(20) (estimated number of entries in the > factors): 3042 > > INFOG(21) (size in MB of memory effectively used > during factorization - value on the most memory consuming processor): 1 > > INFOG(22) (size in MB of memory effectively used > during factorization - sum over all processors): 1 > > INFOG(23) (after analysis: value of ICNTL(6) > effectively used): 5 > > INFOG(24) (after analysis: value of ICNTL(12) > effectively used): 1 > > INFOG(25) (after factorization: number of pivots > modified by static pivoting): 0 > > INFOG(28) (after factorization: number of null > pivots encountered): 0 > > INFOG(29) (after factorization: effective number of > entries in the factors (sum over all processors)): 3042 > > INFOG(30, 31) (after solution: size in Mbytes of > memory used during solution phase): 0, 0 > > INFOG(32) (after analysis: type of analysis done): 1 > > INFOG(33) (value used for ICNTL(8)): -2 > > INFOG(34) (exponent of the determinant if > determinant is requested): 0 > > linear system matrix = precond matrix: > > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > > type: seqaij > > rows=324, cols=324 > > total: nonzeros=5760, allocated nonzeros=5760 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 108 nodes, limit used is 5 > > KSP solver for S = A11 - A10 inv(A00) A01 > > KSP Object: (fieldsplit_FE_split_) 1 MPI processes > > type: cg > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
> > left preconditioning > > using PRECONDITIONED norm type for convergence test > > PC Object: (fieldsplit_FE_split_) 1 MPI processes > > type: bjacobi > > block Jacobi: number of blocks = 1 > > Local solve is same for all blocks, in the following KSP and > PC objects: > > KSP Object: (fieldsplit_FE_split_sub_) 1 > MPI processes > > type: preonly > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000. > > left preconditioning > > using NONE norm type for convergence test > > PC Object: (fieldsplit_FE_split_sub_) 1 MPI > processes > > type: ilu > > ILU: out-of-place factorization > > 0 levels of fill > > tolerance for zero pivot 2.22045e-14 > > matrix ordering: natural > > factor fill ratio given 1., needed 1. > > Factored matrix follows: > > Mat Object: 1 MPI processes > > type: seqaij > > rows=28476, cols=28476 > > package used to perform factorization: petsc > > total: nonzeros=1037052, allocated nonzeros=1037052 > > total number of mallocs used during MatSetValues > calls =0 > > using I-node routines: found 9489 nodes, limit > used is 5 > > linear system matrix = precond matrix: > > Mat Object: 1 MPI processes > > type: seqaij > > rows=28476, cols=28476 > > total: nonzeros=1037052, allocated nonzeros=1037052 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 9489 nodes, limit used is 5 > > linear system matrix followed by preconditioner matrix: > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > type: schurcomplement > > rows=28476, cols=28476 > > Schur complement A11 - A10 inv(A00) A01 > > A11 > > Mat Object: (fieldsplit_FE_split_) > 1 MPI processes > > type: seqaij > > rows=28476, cols=28476 > > total: nonzeros=1017054, allocated nonzeros=1017054 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 9492 nodes, limit used is > 5 > > A10 > > Mat Object: 1 MPI processes > > type: seqaij > > rows=28476, 
cols=324 > > total: nonzeros=936, allocated nonzeros=936 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 5717 nodes, limit used is > 5 > > KSP of A00 > > KSP Object: (fieldsplit_RB_split_) > 1 MPI processes > > type: preonly > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000. > > left preconditioning > > using NONE norm type for convergence test > > PC Object: (fieldsplit_RB_split_) > 1 MPI processes > > type: cholesky > > Cholesky: out-of-place factorization > > tolerance for zero pivot 2.22045e-14 > > matrix ordering: natural > > factor fill ratio given 0., needed 0. > > Factored matrix follows: > > Mat Object: 1 MPI processes > > type: seqaij > > rows=324, cols=324 > > package used to perform factorization: mumps > > total: nonzeros=3042, allocated nonzeros=3042 > > total number of mallocs used during MatSetValues > calls =0 > > MUMPS run parameters: > > SYM (matrix type): 2 > > PAR (host participation): 1 > > ICNTL(1) (output for error): 6 > > ICNTL(2) (output of diagnostic msg): 0 > > ICNTL(3) (output for global info): 0 > > ICNTL(4) (level of printing): 0 > > ICNTL(5) (input mat struct): 0 > > ICNTL(6) (matrix prescaling): 7 > > ICNTL(7) (sequentia matrix ordering):7 > > ICNTL(8) (scalling strategy): 77 > > ICNTL(10) (max num of refinements): 0 > > ICNTL(11) (error analysis): 0 > > ICNTL(12) (efficiency control): > 0 > > ICNTL(13) (efficiency control): > 0 > > ICNTL(14) (percentage of estimated workspace > increase): 20 > > ICNTL(18) (input mat struct): > 0 > > ICNTL(19) (Shur complement info): > 0 > > ICNTL(20) (rhs sparse pattern): > 0 > > ICNTL(21) (solution struct): > 0 > > ICNTL(22) (in-core/out-of-core facility): > 0 > > ICNTL(23) (max size of memory can be > allocated locally):0 > > ICNTL(24) (detection of null pivot rows): > 0 > > ICNTL(25) (computation of a null space > basis): 0 > > ICNTL(26) (Schur options for rhs or > solution): 0 > > 
ICNTL(27) (experimental parameter): > -24 > > ICNTL(28) (use parallel or sequential > ordering): 1 > > ICNTL(29) (parallel ordering): > 0 > > ICNTL(30) (user-specified set of entries in > inv(A)): 0 > > ICNTL(31) (factors is discarded in the solve > phase): 0 > > ICNTL(33) (compute determinant): > 0 > > CNTL(1) (relative pivoting threshold): > 0.01 > > CNTL(2) (stopping criterion of refinement): > 1.49012e-08 > > CNTL(3) (absolute pivoting threshold): > 0. > > CNTL(4) (value of static pivoting): > -1. > > CNTL(5) (fixation for null pivots): > 0. > > RINFO(1) (local estimated flops for the > elimination after analysis): > > [0] 29394. > > RINFO(2) (local estimated flops for the > assembly after factorization): > > [0] 1092. > > RINFO(3) (local estimated flops for the > elimination after factorization): > > [0] 29394. > > INFO(15) (estimated size of (in MB) MUMPS > internal data for running numerical factorization): > > [0] 1 > > INFO(16) (size of (in MB) MUMPS internal > data used during numerical factorization): > > [0] 1 > > INFO(23) (num of pivots eliminated on this > processor after factorization): > > [0] 324 > > RINFOG(1) (global estimated flops for the > elimination after analysis): 29394. > > RINFOG(2) (global estimated flops for the > assembly after factorization): 1092. > > RINFOG(3) (global estimated flops for the > elimination after factorization): 29394. 
> > (RINFOG(12) RINFOG(13))*2^INFOG(34) > (determinant): (0.,0.)*(2^0) > > INFOG(3) (estimated real workspace for > factors on all processors after analysis): 3888 > > INFOG(4) (estimated integer workspace for > factors on all processors after analysis): 2067 > > INFOG(5) (estimated maximum front size in > the complete tree): 12 > > INFOG(6) (number of nodes in the complete > tree): 53 > > INFOG(7) (ordering option effectively use > after analysis): 2 > > INFOG(8) (structural symmetry in percent of > the permuted matrix after analysis): 100 > > INFOG(9) (total real/complex workspace to > store the matrix factors after factorization): 3888 > > INFOG(10) (total integer space store the > matrix factors after factorization): 2067 > > INFOG(11) (order of largest frontal matrix > after factorization): 12 > > INFOG(12) (number of off-diagonal pivots): 0 > > INFOG(13) (number of delayed pivots after > factorization): 0 > > INFOG(14) (number of memory compress after > factorization): 0 > > INFOG(15) (number of steps of iterative > refinement after solution): 0 > > INFOG(16) (estimated size (in MB) of all > MUMPS internal data for factorization after analysis: value on the most > memory consuming processor): 1 > > INFOG(17) (estimated size of all MUMPS > internal data for factorization after analysis: sum over all processors): 1 > > INFOG(18) (size of all MUMPS internal data > allocated during factorization: value on the most memory consuming > processor): 1 > > INFOG(19) (size of all MUMPS internal data > allocated during factorization: sum over all processors): 1 > > INFOG(20) (estimated number of entries in > the factors): 3042 > > INFOG(21) (size in MB of memory effectively > used during factorization - value on the most memory consuming processor): 1 > > INFOG(22) (size in MB of memory effectively > used during factorization - sum over all processors): 1 > > INFOG(23) (after analysis: value of ICNTL(6) > effectively used): 5 > > INFOG(24) (after analysis: value of > 
ICNTL(12) effectively used): 1 > > INFOG(25) (after factorization: number of > pivots modified by static pivoting): 0 > > INFOG(28) (after factorization: number of > null pivots encountered): 0 > > INFOG(29) (after factorization: effective > number of entries in the factors (sum over all processors)): 3042 > > INFOG(30, 31) (after solution: size in > Mbytes of memory used during solution phase): 0, 0 > > INFOG(32) (after analysis: type of analysis > done): 1 > > INFOG(33) (value used for ICNTL(8)): -2 > > INFOG(34) (exponent of the determinant if > determinant is requested): 0 > > linear system matrix = precond matrix: > > Mat Object: (fieldsplit_RB_split_) > 1 MPI processes > > type: seqaij > > rows=324, cols=324 > > total: nonzeros=5760, allocated nonzeros=5760 > > total number of mallocs used during MatSetValues calls > =0 > > using I-node routines: found 108 nodes, limit used > is 5 > > A01 > > Mat Object: 1 MPI processes > > type: seqaij > > rows=324, cols=28476 > > total: nonzeros=936, allocated nonzeros=936 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 67 nodes, limit used is 5 > > Mat Object: 1 MPI processes > > type: seqaij > > rows=28476, cols=28476 > > total: nonzeros=1037052, allocated nonzeros=1037052 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 9489 nodes, limit used is 5 > > linear system matrix = precond matrix: > > Mat Object: () 1 MPI processes > > type: seqaij > > rows=28800, cols=28800 > > total: nonzeros=1024686, allocated nonzeros=1024794 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 9600 nodes, limit used is 5 > > > > ---------------------------------------------- PETSc Performance > Summary: ---------------------------------------------- > > > > /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a > arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 > 17:22:10 
2017 > > Using Petsc Release Version 3.7.3, unknown > > > > Max Max/Min Avg Total > > Time (sec): 9.638e+01 1.00000 9.638e+01 > > Objects: 2.030e+02 1.00000 2.030e+02 > > Flops: 1.732e+11 1.00000 1.732e+11 1.732e+11 > > Flops/sec: 1.797e+09 1.00000 1.797e+09 1.797e+09 > > MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > MPI Reductions: 0.000e+00 0.00000 > > > > Flop counting convention: 1 flop = 1 real number operation of type > (multiply/divide/add/subtract) > > e.g., VecAXPY() for real vectors of length N > --> 2N flops > > and VecAXPY() for complex vectors of length > N --> 8N flops > > > > Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages > --- -- Message Lengths -- -- Reductions -- > > Avg %Total Avg %Total counts > %Total Avg %Total counts %Total > > 0: Main Stage: 9.6379e+01 100.0% 1.7318e+11 100.0% 0.000e+00 > 0.0% 0.000e+00 0.0% 0.000e+00 0.0% > > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > See the 'Profiling' chapter of the users' manual for details on > interpreting output. > > Phase summary info: > > Count: number of times phase was executed > > Time and Flops: Max - maximum over all processors > > Ratio - ratio of maximum to minimum over all > processors > > Mess: number of messages sent > > Avg. len: average message length (bytes) > > Reduct: number of global reductions > > Global: entire computation > > Stage: stages of a computation. Set stages with PetscLogStagePush() > and PetscLogStagePop(). 
> > %T - percent time in this phase %F - percent flops in this > phase > > %M - percent messages in this phase %L - percent message > lengths in this phase > > %R - percent reductions in this phase > > Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time > over all processors) > > ------------------------------------------------------------ > ------------------------------------------------------------ > > Event Count Time (sec) Flops > --- Global --- --- Stage --- Total > > Max Ratio Max Ratio Max Ratio Mess Avg len > Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > > --- Event Stage 0: Main Stage > > > > VecDot 42 1.0 2.2411e-05 1.0 8.53e+03 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 380 > > VecTDot 77761 1.0 1.4294e+00 1.0 4.43e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 1 3 0 0 0 1 3 0 0 0 3098 > > VecNorm 38894 1.0 9.1002e-01 1.0 2.22e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 1 1 0 0 0 1 1 0 0 0 2434 > > VecScale 38882 1.0 3.7314e-01 1.0 1.11e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 1 0 0 0 0 1 0 0 0 2967 > > VecCopy 38908 1.0 2.1655e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > VecSet 77887 1.0 3.2034e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > VecAXPY 77777 1.0 1.8382e+00 1.0 4.43e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 2 3 0 0 0 2 3 0 0 0 2409 > > VecAYPX 38875 1.0 1.2884e+00 1.0 2.21e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 1 1 0 0 0 1 1 0 0 0 1718 > > VecAssemblyBegin 68 1.0 1.9407e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > VecAssemblyEnd 68 1.0 2.6941e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > VecScatterBegin 48 1.0 4.6349e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatMult 38891 1.0 4.3045e+01 1.0 8.03e+10 1.0 0.0e+00 0.0e+00 > 0.0e+00 45 46 0 0 0 45 46 0 0 0 1866 > > MatMultAdd 38889 1.0 
3.5360e+01 1.0 7.91e+10 1.0 0.0e+00 0.0e+00 > 0.0e+00 37 46 0 0 0 37 46 0 0 0 2236 > > MatSolve 77769 1.0 4.8780e+01 1.0 7.95e+10 1.0 0.0e+00 0.0e+00 > 0.0e+00 51 46 0 0 0 51 46 0 0 0 1631 > > MatLUFactorNum 1 1.0 1.9575e-02 1.0 2.49e+07 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 1274 > > MatCholFctrSym 1 1.0 9.4891e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatCholFctrNum 1 1.0 3.7885e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatILUFactorSym 1 1.0 4.1780e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatConvert 1 1.0 3.0041e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatScale 2 1.0 2.7180e-05 1.0 2.53e+04 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 930 > > MatAssemblyBegin 32 1.0 4.0531e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatAssemblyEnd 32 1.0 1.2032e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatGetRow 114978 1.0 5.9254e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatGetRowIJ 2 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatGetSubMatrice 6 1.0 1.5707e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatGetOrdering 2 1.0 3.2425e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatZeroEntries 6 1.0 3.0580e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatView 7 1.0 3.5119e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatAXPY 1 1.0 1.9384e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatMatMult 1 1.0 2.7120e-03 1.0 3.16e+05 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 117 > > MatMatMultSym 1 1.0 1.8010e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatMatMultNum 1 1.0 6.1703e-04 1.0 3.16e+05 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 513 > > 
KSPSetUp 4 1.0 9.8944e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > KSPSolve 1 1.0 9.3380e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 > 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > PCSetUp 4 1.0 6.6326e-02 1.0 2.53e+07 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 381 > > PCSetUpOnBlocks 5 1.0 2.4082e-02 1.0 2.49e+07 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 1036 > > PCApply 5 1.0 9.3376e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 > 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > KSPSolve_FS_0 5 1.0 7.0214e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > KSPSolve_FS_Schu 5 1.0 9.3372e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 > 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > KSPSolve_FS_Low 5 1.0 2.1377e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > > Memory usage is given in bytes: > > > > Object Type Creations Destructions Memory Descendants' > Mem. > > Reports information only for process 0. > > > > --- Event Stage 0: Main Stage > > > > Vector 92 92 9698040 0. > > Vector Scatter 24 24 15936 0. > > Index Set 51 51 537876 0. > > IS L to G Mapping 3 3 240408 0. > > Matrix 16 16 77377776 0. > > Krylov Solver 6 6 7888 0. > > Preconditioner 6 6 6288 0. > > Viewer 1 0 0 0. > > Distributed Mesh 1 1 4624 0. > > Star Forest Bipartite Graph 2 2 1616 0. > > Discrete System 1 1 872 0. > > ============================================================ > ============================================================ > > Average time to get PetscTime(): 0. 
> > #PETSc Option Table entries: > > -ksp_monitor > > -ksp_view > > -log_view > > #End of PETSc Option Table entries > > Compiled without FORTRAN kernels > > Compiled with full precision matrices (default) > > sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 > sizeof(PetscScalar) 8 sizeof(PetscInt) 4 > > Configure options: --with-shared-libraries=1 --with-debugging=0 > --download-suitesparse --download-blacs --download-ptscotch=yes > --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl > --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps > --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc > --download-hypre --download-ml > > ----------------------------------------- > > Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo > > Machine characteristics: Linux-4.4.0-38-generic-x86_64- > with-Ubuntu-16.04-xenial > > Using PETSc directory: /home/dknez/software/petsc-src > > Using PETSc arch: arch-linux2-c-opt > > ----------------------------------------- > > > > Using C compiler: mpicc -fPIC -Wall -Wwrite-strings > -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O > ${COPTFLAGS} ${CFLAGS} > > Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 > -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} > > ----------------------------------------- > > > > Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include > -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include > -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include > -I/home/dknez/software/libmesh_install/opt_real/petsc/include > -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent > -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include > -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi > > ----------------------------------------- > > > > Using C linker: mpicc > > Using Fortran linker: mpif90 > > 
Using libraries: -Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib > -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc > -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib > -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps > -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE > -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib > -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu > -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx > -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod > -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig > -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 > -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 > -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch > -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 > -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm > -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz > -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib > -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu > -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl > -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl > > ----------------------------------------- > > > > > > > > > > On Wed, Jan 11, 2017 at 4:49 PM, Dave May > wrote: > > It looks like the Schur solve is requiring a huge number of iterates to > converge (based on the instances of MatMult). > > This is killing the performance. > > > > Are you sure that A11 is a good approximation to S? 
You might consider > trying the selfp option > > > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCFieldSplitSetSchurPre.html#PCFieldSplitSetSchurPre > > > > Note that the best approx to S is likely both problem and discretisation > dependent, so if selfp is also terrible, you might want to consider coding > up your own approx to S for your specific system. > > > > > > Thanks, > > Dave > > > > > > On Wed, 11 Jan 2017 at 22:34, David Knezevic > wrote: > > I have a definite block 2x2 system and I figured it'd be good to apply > the PCFIELDSPLIT functionality with Schur complement, as described in > Section 4.5 of the manual. > > > > The A00 block of my matrix is very small so I figured I'd specify a > direct solver (i.e. MUMPS) for that block. > > > > So I did the following: > > - PCFieldSplitSetIS to specify the indices of the two splits > > - PCFieldSplitGetSubKSP to get the two KSP objects, and to set the > solver and PC types for each (MUMPS for A00, ILU+CG for A11) > > - I set -pc_fieldsplit_schur_fact_type full > > > > Below I have pasted the output of "-ksp_view -ksp_monitor -log_view" for > a test case. It seems to converge well, but I'm concerned about the speed > (about 90 seconds, vs. about 1 second if I use a direct solver for the > entire system). I just wanted to check whether I'm setting this up in a good way. > > > > Many thanks, > > David > > > > ------------------------------------------------------------ > ----------------------- > > > > 0 KSP Residual norm 5.405774214400e+04 > > 1 KSP Residual norm 1.849649014371e+02 > > 2 KSP Residual norm 7.462775074989e-02 > > 3 KSP Residual norm 2.680497175260e-04 > > KSP Object: 1 MPI processes > > type: cg > > maximum iterations=1000 > > tolerances: relative=1e-06, absolute=1e-50, divergence=10000. 
> > left preconditioning > > using nonzero initial guess > > using PRECONDITIONED norm type for convergence test > > PC Object: 1 MPI processes > > type: fieldsplit > > FieldSplit with Schur preconditioner, factorization FULL > > Preconditioner for the Schur complement formed from A11 > > Split info: > > Split number 0 Defined by IS > > Split number 1 Defined by IS > > KSP solver for A00 block > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > type: preonly > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > left preconditioning > > using NONE norm type for convergence test > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > type: cholesky > > Cholesky: out-of-place factorization > > tolerance for zero pivot 2.22045e-14 > > matrix ordering: natural > > factor fill ratio given 0., needed 0. > > Factored matrix follows: > > Mat Object: 1 MPI processes > > type: seqaij > > rows=324, cols=324 > > package used to perform factorization: mumps > > total: nonzeros=3042, allocated nonzeros=3042 > > total number of mallocs used during MatSetValues calls =0 > > MUMPS run parameters: > > SYM (matrix type): 2 > > PAR (host participation): 1 > > ICNTL(1) (output for error): 6 > > ICNTL(2) (output of diagnostic msg): 0 > > ICNTL(3) (output for global info): 0 > > ICNTL(4) (level of printing): 0 > > ICNTL(5) (input mat struct): 0 > > ICNTL(6) (matrix prescaling): 7 > > ICNTL(7) (sequentia matrix ordering):7 > > ICNTL(8) (scalling strategy): 77 > > ICNTL(10) (max num of refinements): 0 > > ICNTL(11) (error analysis): 0 > > ICNTL(12) (efficiency control): > 0 > > ICNTL(13) (efficiency control): > 0 > > ICNTL(14) (percentage of estimated workspace > increase): 20 > > ICNTL(18) (input mat struct): > 0 > > ICNTL(19) (Shur complement info): > 0 > > ICNTL(20) (rhs sparse pattern): > 0 > > ICNTL(21) (solution struct): > 0 > > ICNTL(22) (in-core/out-of-core facility): > 0 > > ICNTL(23) (max size of memory 
can be allocated > locally):0 > > ICNTL(24) (detection of null pivot rows): > 0 > > ICNTL(25) (computation of a null space basis): > 0 > > ICNTL(26) (Schur options for rhs or solution): > 0 > > ICNTL(27) (experimental parameter): > -24 > > ICNTL(28) (use parallel or sequential ordering): > 1 > > ICNTL(29) (parallel ordering): > 0 > > ICNTL(30) (user-specified set of entries in > inv(A)): 0 > > ICNTL(31) (factors is discarded in the solve > phase): 0 > > ICNTL(33) (compute determinant): > 0 > > CNTL(1) (relative pivoting threshold): 0.01 > > CNTL(2) (stopping criterion of refinement): > 1.49012e-08 > > CNTL(3) (absolute pivoting threshold): 0. > > CNTL(4) (value of static pivoting): -1. > > CNTL(5) (fixation for null pivots): 0. > > RINFO(1) (local estimated flops for the elimination > after analysis): > > [0] 29394. > > RINFO(2) (local estimated flops for the assembly > after factorization): > > [0] 1092. > > RINFO(3) (local estimated flops for the elimination > after factorization): > > [0] 29394. > > INFO(15) (estimated size of (in MB) MUMPS internal > data for running numerical factorization): > > [0] 1 > > INFO(16) (size of (in MB) MUMPS internal data used > during numerical factorization): > > [0] 1 > > INFO(23) (num of pivots eliminated on this processor > after factorization): > > [0] 324 > > RINFOG(1) (global estimated flops for the > elimination after analysis): 29394. > > RINFOG(2) (global estimated flops for the assembly > after factorization): 1092. > > RINFOG(3) (global estimated flops for the > elimination after factorization): 29394. 
> > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): > (0.,0.)*(2^0) > > INFOG(3) (estimated real workspace for factors on > all processors after analysis): 3888 > > INFOG(4) (estimated integer workspace for factors on > all processors after analysis): 2067 > > INFOG(5) (estimated maximum front size in the > complete tree): 12 > > INFOG(6) (number of nodes in the complete tree): 53 > > INFOG(7) (ordering option effectively use after > analysis): 2 > > INFOG(8) (structural symmetry in percent of the > permuted matrix after analysis): 100 > > INFOG(9) (total real/complex workspace to store the > matrix factors after factorization): 3888 > > INFOG(10) (total integer space store the matrix > factors after factorization): 2067 > > INFOG(11) (order of largest frontal matrix after > factorization): 12 > > INFOG(12) (number of off-diagonal pivots): 0 > > INFOG(13) (number of delayed pivots after > factorization): 0 > > INFOG(14) (number of memory compress after > factorization): 0 > > INFOG(15) (number of steps of iterative refinement > after solution): 0 > > INFOG(16) (estimated size (in MB) of all MUMPS > internal data for factorization after analysis: value on the most memory > consuming processor): 1 > > INFOG(17) (estimated size of all MUMPS internal data > for factorization after analysis: sum over all processors): 1 > > INFOG(18) (size of all MUMPS internal data allocated > during factorization: value on the most memory consuming processor): 1 > > INFOG(19) (size of all MUMPS internal data allocated > during factorization: sum over all processors): 1 > > INFOG(20) (estimated number of entries in the > factors): 3042 > > INFOG(21) (size in MB of memory effectively used > during factorization - value on the most memory consuming processor): 1 > > INFOG(22) (size in MB of memory effectively used > during factorization - sum over all processors): 1 > > INFOG(23) (after analysis: value of ICNTL(6) > effectively used): 5 > > INFOG(24) (after analysis: value of ICNTL(12) 
> effectively used): 1 > > INFOG(25) (after factorization: number of pivots > modified by static pivoting): 0 > > INFOG(28) (after factorization: number of null > pivots encountered): 0 > > INFOG(29) (after factorization: effective number of > entries in the factors (sum over all processors)): 3042 > > INFOG(30, 31) (after solution: size in Mbytes of > memory used during solution phase): 0, 0 > > INFOG(32) (after analysis: type of analysis done): 1 > > INFOG(33) (value used for ICNTL(8)): -2 > > INFOG(34) (exponent of the determinant if > determinant is requested): 0 > > linear system matrix = precond matrix: > > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > > type: seqaij > > rows=324, cols=324 > > total: nonzeros=5760, allocated nonzeros=5760 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 108 nodes, limit used is 5 > > KSP solver for S = A11 - A10 inv(A00) A01 > > KSP Object: (fieldsplit_FE_split_) 1 MPI processes > > type: cg > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > left preconditioning > > using PRECONDITIONED norm type for convergence test > > PC Object: (fieldsplit_FE_split_) 1 MPI processes > > type: bjacobi > > block Jacobi: number of blocks = 1 > > Local solve is same for all blocks, in the following KSP and > PC objects: > > KSP Object: (fieldsplit_FE_split_sub_) 1 > MPI processes > > type: preonly > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000. > > left preconditioning > > using NONE norm type for convergence test > > PC Object: (fieldsplit_FE_split_sub_) 1 MPI > processes > > type: ilu > > ILU: out-of-place factorization > > 0 levels of fill > > tolerance for zero pivot 2.22045e-14 > > matrix ordering: natural > > factor fill ratio given 1., needed 1. 
> > Factored matrix follows: > > Mat Object: 1 MPI processes > > type: seqaij > > rows=28476, cols=28476 > > package used to perform factorization: petsc > > total: nonzeros=1017054, allocated nonzeros=1017054 > > total number of mallocs used during MatSetValues > calls =0 > > using I-node routines: found 9492 nodes, limit > used is 5 > > linear system matrix = precond matrix: > > Mat Object: (fieldsplit_FE_split_) 1 > MPI processes > > type: seqaij > > rows=28476, cols=28476 > > total: nonzeros=1017054, allocated nonzeros=1017054 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 9492 nodes, limit used is 5 > > linear system matrix followed by preconditioner matrix: > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > type: schurcomplement > > rows=28476, cols=28476 > > Schur complement A11 - A10 inv(A00) A01 > > A11 > > Mat Object: (fieldsplit_FE_split_) > 1 MPI processes > > type: seqaij > > rows=28476, cols=28476 > > total: nonzeros=1017054, allocated nonzeros=1017054 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 9492 nodes, limit used is > 5 > > A10 > > Mat Object: 1 MPI processes > > type: seqaij > > rows=28476, cols=324 > > total: nonzeros=936, allocated nonzeros=936 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 5717 nodes, limit used is > 5 > > KSP of A00 > > KSP Object: (fieldsplit_RB_split_) > 1 MPI processes > > type: preonly > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000. > > left preconditioning > > using NONE norm type for convergence test > > PC Object: (fieldsplit_RB_split_) > 1 MPI processes > > type: cholesky > > Cholesky: out-of-place factorization > > tolerance for zero pivot 2.22045e-14 > > matrix ordering: natural > > factor fill ratio given 0., needed 0. 
> > Factored matrix follows: > > Mat Object: 1 MPI processes > > type: seqaij > > rows=324, cols=324 > > package used to perform factorization: mumps > > total: nonzeros=3042, allocated nonzeros=3042 > > total number of mallocs used during MatSetValues > calls =0 > > MUMPS run parameters: > > SYM (matrix type): 2 > > PAR (host participation): 1 > > ICNTL(1) (output for error): 6 > > ICNTL(2) (output of diagnostic msg): 0 > > ICNTL(3) (output for global info): 0 > > ICNTL(4) (level of printing): 0 > > ICNTL(5) (input mat struct): 0 > > ICNTL(6) (matrix prescaling): 7 > > ICNTL(7) (sequentia matrix ordering):7 > > ICNTL(8) (scalling strategy): 77 > > ICNTL(10) (max num of refinements): 0 > > ICNTL(11) (error analysis): 0 > > ICNTL(12) (efficiency control): > 0 > > ICNTL(13) (efficiency control): > 0 > > ICNTL(14) (percentage of estimated workspace > increase): 20 > > ICNTL(18) (input mat struct): > 0 > > ICNTL(19) (Shur complement info): > 0 > > ICNTL(20) (rhs sparse pattern): > 0 > > ICNTL(21) (solution struct): > 0 > > ICNTL(22) (in-core/out-of-core facility): > 0 > > ICNTL(23) (max size of memory can be > allocated locally):0 > > ICNTL(24) (detection of null pivot rows): > 0 > > ICNTL(25) (computation of a null space > basis): 0 > > ICNTL(26) (Schur options for rhs or > solution): 0 > > ICNTL(27) (experimental parameter): > -24 > > ICNTL(28) (use parallel or sequential > ordering): 1 > > ICNTL(29) (parallel ordering): > 0 > > ICNTL(30) (user-specified set of entries in > inv(A)): 0 > > ICNTL(31) (factors is discarded in the solve > phase): 0 > > ICNTL(33) (compute determinant): > 0 > > CNTL(1) (relative pivoting threshold): > 0.01 > > CNTL(2) (stopping criterion of refinement): > 1.49012e-08 > > CNTL(3) (absolute pivoting threshold): > 0. > > CNTL(4) (value of static pivoting): > -1. > > CNTL(5) (fixation for null pivots): > 0. > > RINFO(1) (local estimated flops for the > elimination after analysis): > > [0] 29394. 
> > RINFO(2) (local estimated flops for the > assembly after factorization): > > [0] 1092. > > RINFO(3) (local estimated flops for the > elimination after factorization): > > [0] 29394. > > INFO(15) (estimated size of (in MB) MUMPS > internal data for running numerical factorization): > > [0] 1 > > INFO(16) (size of (in MB) MUMPS internal > data used during numerical factorization): > > [0] 1 > > INFO(23) (num of pivots eliminated on this > processor after factorization): > > [0] 324 > > RINFOG(1) (global estimated flops for the > elimination after analysis): 29394. > > RINFOG(2) (global estimated flops for the > assembly after factorization): 1092. > > RINFOG(3) (global estimated flops for the > elimination after factorization): 29394. > > (RINFOG(12) RINFOG(13))*2^INFOG(34) > (determinant): (0.,0.)*(2^0) > > INFOG(3) (estimated real workspace for > factors on all processors after analysis): 3888 > > INFOG(4) (estimated integer workspace for > factors on all processors after analysis): 2067 > > INFOG(5) (estimated maximum front size in > the complete tree): 12 > > INFOG(6) (number of nodes in the complete > tree): 53 > > INFOG(7) (ordering option effectively used > after analysis): 2 > > INFOG(8) (structural symmetry in percent of > the permuted matrix after analysis): 100 > > INFOG(9) (total real/complex workspace to > store the matrix factors after factorization): 3888 > > INFOG(10) (total integer space to store the > matrix factors after factorization): 2067 > > INFOG(11) (order of largest frontal matrix > after factorization): 12 > > INFOG(12) (number of off-diagonal pivots): 0 > > INFOG(13) (number of delayed pivots after > factorization): 0 > > INFOG(14) (number of memory compresses after > factorization): 0 > > INFOG(15) (number of steps of iterative > refinement after solution): 0 > > INFOG(16) (estimated size (in MB) of all > MUMPS internal data for factorization after analysis: value on the most > memory consuming processor): 1 > > INFOG(17) (estimated size of 
all MUMPS > internal data for factorization after analysis: sum over all processors): 1 > > INFOG(18) (size of all MUMPS internal data > allocated during factorization: value on the most memory consuming > processor): 1 > > INFOG(19) (size of all MUMPS internal data > allocated during factorization: sum over all processors): 1 > > INFOG(20) (estimated number of entries in > the factors): 3042 > > INFOG(21) (size in MB of memory effectively > used during factorization - value on the most memory consuming processor): 1 > > INFOG(22) (size in MB of memory effectively > used during factorization - sum over all processors): 1 > > INFOG(23) (after analysis: value of ICNTL(6) > effectively used): 5 > > INFOG(24) (after analysis: value of > ICNTL(12) effectively used): 1 > > INFOG(25) (after factorization: number of > pivots modified by static pivoting): 0 > > INFOG(28) (after factorization: number of > null pivots encountered): 0 > > INFOG(29) (after factorization: effective > number of entries in the factors (sum over all processors)): 3042 > > INFOG(30, 31) (after solution: size in > Mbytes of memory used during solution phase): 0, 0 > > INFOG(32) (after analysis: type of analysis > done): 1 > > INFOG(33) (value used for ICNTL(8)): -2 > > INFOG(34) (exponent of the determinant if > determinant is requested): 0 > > linear system matrix = precond matrix: > > Mat Object: (fieldsplit_RB_split_) > 1 MPI processes > > type: seqaij > > rows=324, cols=324 > > total: nonzeros=5760, allocated nonzeros=5760 > > total number of mallocs used during MatSetValues calls > =0 > > using I-node routines: found 108 nodes, limit used > is 5 > > A01 > > Mat Object: 1 MPI processes > > type: seqaij > > rows=324, cols=28476 > > total: nonzeros=936, allocated nonzeros=936 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 67 nodes, limit used is 5 > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > type: seqaij > > rows=28476, cols=28476 > > 
total: nonzeros=1017054, allocated nonzeros=1017054 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 9492 nodes, limit used is 5 > > linear system matrix = precond matrix: > > Mat Object: () 1 MPI processes > > type: seqaij > > rows=28800, cols=28800 > > total: nonzeros=1024686, allocated nonzeros=1024794 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 9600 nodes, limit used is 5 > > > > > > ---------------------------------------------- PETSc Performance > Summary: ---------------------------------------------- > > > > /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a > arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 > 16:16:47 2017 > > Using Petsc Release Version 3.7.3, unknown > > > > Max Max/Min Avg Total > > Time (sec): 9.179e+01 1.00000 9.179e+01 > > Objects: 1.990e+02 1.00000 1.990e+02 > > Flops: 1.634e+11 1.00000 1.634e+11 1.634e+11 > > Flops/sec: 1.780e+09 1.00000 1.780e+09 1.780e+09 > > MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > MPI Reductions: 0.000e+00 0.00000 > > > > Flop counting convention: 1 flop = 1 real number operation of type > (multiply/divide/add/subtract) > > e.g., VecAXPY() for real vectors of length N > --> 2N flops > > and VecAXPY() for complex vectors of length > N --> 8N flops > > > > Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages > --- -- Message Lengths -- -- Reductions -- > > Avg %Total Avg %Total counts > %Total Avg %Total counts %Total > > 0: Main Stage: 9.1787e+01 100.0% 1.6336e+11 100.0% 0.000e+00 > 0.0% 0.000e+00 0.0% 0.000e+00 0.0% > > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > See the 'Profiling' chapter of the users' manual for details on > interpreting output. 
> > Phase summary info: > > Count: number of times phase was executed > > Time and Flops: Max - maximum over all processors > > Ratio - ratio of maximum to minimum over all > processors > > Mess: number of messages sent > > Avg. len: average message length (bytes) > > Reduct: number of global reductions > > Global: entire computation > > Stage: stages of a computation. Set stages with PetscLogStagePush() > and PetscLogStagePop(). > > %T - percent time in this phase %F - percent flops in this > phase > > %M - percent messages in this phase %L - percent message > lengths in this phase > > %R - percent reductions in this phase > > Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time > over all processors) > > ------------------------------------------------------------ > ------------------------------------------------------------ > > Event Count Time (sec) Flops > --- Global --- --- Stage --- Total > > Max Ratio Max Ratio Max Ratio Mess Avg len > Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > > --- Event Stage 0: Main Stage > > > > VecDot 42 1.0 2.4080e-05 1.0 8.53e+03 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 354 > > VecTDot 74012 1.0 1.2440e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 1 3 0 0 0 1 3 0 0 0 3388 > > VecNorm 37020 1.0 8.3580e-01 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 1 1 0 0 0 1 1 0 0 0 2523 > > VecScale 37008 1.0 3.5800e-01 1.0 1.05e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 1 0 0 0 0 1 0 0 0 2944 > > VecCopy 37034 1.0 2.5754e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > VecSet 74137 1.0 3.0537e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > VecAXPY 74029 1.0 1.7233e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 2 3 0 0 0 2 3 0 0 0 2446 > > VecAYPX 37001 1.0 1.2214e+00 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 > 0.0e+00 1 1 0 0 0 1 1 0 0 0 1725 > > 
VecAssemblyBegin 68 1.0 2.0432e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > VecAssemblyEnd 68 1.0 2.5988e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > VecScatterBegin 48 1.0 4.6921e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatMult 37017 1.0 4.1269e+01 1.0 7.65e+10 1.0 0.0e+00 0.0e+00 > 0.0e+00 45 47 0 0 0 45 47 0 0 0 1853 > > MatMultAdd 37015 1.0 3.3638e+01 1.0 7.53e+10 1.0 0.0e+00 0.0e+00 > 0.0e+00 37 46 0 0 0 37 46 0 0 0 2238 > > MatSolve 74021 1.0 4.6602e+01 1.0 7.42e+10 1.0 0.0e+00 0.0e+00 > 0.0e+00 51 45 0 0 0 51 45 0 0 0 1593 > > MatLUFactorNum 1 1.0 1.7209e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 1420 > > MatCholFctrSym 1 1.0 8.8310e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatCholFctrNum 1 1.0 3.6907e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatILUFactorSym 1 1.0 3.7372e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatAssemblyBegin 29 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatAssemblyEnd 29 1.0 9.9473e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatGetRow 58026 1.0 2.8155e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatGetRowIJ 2 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatGetSubMatrice 6 1.0 1.5399e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatGetOrdering 2 1.0 3.0112e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatZeroEntries 6 1.0 2.9490e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatView 7 1.0 3.4356e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > KSPSetUp 4 1.0 9.4891e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > KSPSolve 1 1.0 8.8793e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 > 
0.0e+00 97100 0 0 0 97100 0 0 0 1840 > > PCSetUp 4 1.0 3.8375e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 637 > > PCSetUpOnBlocks 5 1.0 2.1250e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 1150 > > PCApply 5 1.0 8.8789e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 > 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > > KSPSolve_FS_0 5 1.0 7.5364e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > KSPSolve_FS_Schu 5 1.0 8.8785e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 > 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > > KSPSolve_FS_Low 5 1.0 2.1019e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 > 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > > Memory usage is given in bytes: > > > > Object Type Creations Destructions Memory Descendants' > Mem. > > Reports information only for process 0. > > > > --- Event Stage 0: Main Stage > > > > Vector 91 91 9693912 0. > > Vector Scatter 24 24 15936 0. > > Index Set 51 51 537888 0. > > IS L to G Mapping 3 3 240408 0. > > Matrix 13 13 64097868 0. > > Krylov Solver 6 6 7888 0. > > Preconditioner 6 6 6288 0. > > Viewer 1 0 0 0. > > Distributed Mesh 1 1 4624 0. > > Star Forest Bipartite Graph 2 2 1616 0. > > Discrete System 1 1 872 0. > > ============================================================ > ============================================================ > > Average time to get PetscTime(): 0. 
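The headline rates in the -log_view summary above can be checked by hand: PETSc's Mflop/s column is 10^-6 times the flop count divided by the elapsed time. A minimal sketch, using the KSPSolve totals transcribed from this run's summary:

```python
# Recompute the Mflop/s that -log_view reports for KSPSolve,
# using the totals transcribed from the summary above.
total_flops = 1.6336e11  # "Flops" total for the Main Stage
solve_time = 8.8793e01   # KSPSolve max time in seconds

# PETSc's rate column: 10^-6 * (sum of flops) / (max time).
mflops = 1e-6 * total_flops / solve_time
print(round(mflops))  # matches the 1840 Mflop/s in the KSPSolve row
```

Since KSPSolve accounts for 97% of the wall time here, essentially all of the run's 1.78e+09 Flops/sec average throughput comes from this one event.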
> > #PETSc Option Table entries: > > -ksp_monitor > > -ksp_view > > -log_view > > #End of PETSc Option Table entries > > Compiled without FORTRAN kernels > > Compiled with full precision matrices (default) > > sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 > sizeof(PetscScalar) 8 sizeof(PetscInt) 4 > > Configure options: --with-shared-libraries=1 --with-debugging=0 > --download-suitesparse --download-blacs --download-ptscotch=yes > --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl > --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps > --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc > --download-hypre --download-ml > > ----------------------------------------- > > Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo > > Machine characteristics: Linux-4.4.0-38-generic-x86_64- > with-Ubuntu-16.04-xenial > > Using PETSc directory: /home/dknez/software/petsc-src > > Using PETSc arch: arch-linux2-c-opt > > ----------------------------------------- > > > > Using C compiler: mpicc -fPIC -Wall -Wwrite-strings > -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O > ${COPTFLAGS} ${CFLAGS} > > Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 > -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} > > ----------------------------------------- > > > > Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include > -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include > -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include > -I/home/dknez/software/libmesh_install/opt_real/petsc/include > -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent > -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include > -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi > > ----------------------------------------- > > > > Using C linker: mpicc > > Using Fortran linker: mpif90 > > 
Using libraries: -Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib > -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc > -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib > -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps > -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE > -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib > -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu > -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx > -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod > -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig > -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 > -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 > -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch > -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 > -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm > -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz > -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib > -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu > -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl > -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl > > ----------------------------------------- -------------- next part -------------- Residual norms for fieldsplit_FE_split_ solve. 
0 KSP Residual norm 7.439873800415e+00 1 KSP Residual norm 2.271092312202e+00 2 KSP Residual norm 1.579282324756e+00 3 KSP Residual norm 1.107108552974e+00 4 KSP Residual norm 8.906995999187e-01 5 KSP Residual norm 7.119535581527e-01 6 KSP Residual norm 6.292448465808e-01 7 KSP Residual norm 5.574190632829e-01 8 KSP Residual norm 4.914453903768e-01 9 KSP Residual norm 4.169590427313e-01 10 KSP Residual norm 3.833759307828e-01 11 KSP Residual norm 3.525084900775e-01 12 KSP Residual norm 3.090712658867e-01 13 KSP Residual norm 3.072058092791e-01 14 KSP Residual norm 2.791154478844e-01 15 KSP Residual norm 2.596487823324e-01 16 KSP Residual norm 2.454855113839e-01 17 KSP Residual norm 2.323657304086e-01 18 KSP Residual norm 2.180987080607e-01 19 KSP Residual norm 2.071787317658e-01 20 KSP Residual norm 1.892488656995e-01 21 KSP Residual norm 1.812493522062e-01 22 KSP Residual norm 1.876453055369e-01 23 KSP Residual norm 1.683189793079e-01 24 KSP Residual norm 1.543469424065e-01 25 KSP Residual norm 1.496069368476e-01 26 KSP Residual norm 1.493645624754e-01 27 KSP Residual norm 1.525727733866e-01 28 KSP Residual norm 1.504760809841e-01 29 KSP Residual norm 1.398787428655e-01 30 KSP Residual norm 1.377742552240e-01 31 KSP Residual norm 1.350150247010e-01 32 KSP Residual norm 1.275144910850e-01 33 KSP Residual norm 1.273327004584e-01 34 KSP Residual norm 1.216415065625e-01 35 KSP Residual norm 1.109570051668e-01 36 KSP Residual norm 1.074358443291e-01 37 KSP Residual norm 1.061971546557e-01 38 KSP Residual norm 1.067809057557e-01 39 KSP Residual norm 1.063479786887e-01 40 KSP Residual norm 1.034518470284e-01 41 KSP Residual norm 9.721337391778e-02 42 KSP Residual norm 9.699203830334e-02 43 KSP Residual norm 9.481065986826e-02 44 KSP Residual norm 9.136671340793e-02 45 KSP Residual norm 8.851274329317e-02 46 KSP Residual norm 8.467762039012e-02 47 KSP Residual norm 8.471801429555e-02 48 KSP Residual norm 8.278633364455e-02 49 KSP Residual norm 7.838491006512e-02 50 KSP 
Residual norm 7.670240644506e-02 51 KSP Residual norm 8.043224588019e-02 52 KSP Residual norm 8.363745629948e-02 53 KSP Residual norm 7.940546973611e-02 54 KSP Residual norm 7.456475032696e-02 55 KSP Residual norm 7.700637275547e-02 56 KSP Residual norm 7.543373247528e-02 57 KSP Residual norm 7.094501517914e-02 58 KSP Residual norm 7.075418803924e-02 59 KSP Residual norm 6.974463189188e-02 60 KSP Residual norm 6.564358161289e-02 61 KSP Residual norm 6.777118961072e-02 62 KSP Residual norm 6.572645071229e-02 63 KSP Residual norm 6.592491054825e-02 64 KSP Residual norm 6.361704393145e-02 65 KSP Residual norm 6.091631661495e-02 66 KSP Residual norm 6.045947178947e-02 67 KSP Residual norm 5.960456641459e-02 68 KSP Residual norm 5.849383480514e-02 69 KSP Residual norm 5.706157714189e-02 70 KSP Residual norm 5.785957008296e-02 71 KSP Residual norm 5.598254854109e-02 72 KSP Residual norm 5.456921702305e-02 73 KSP Residual norm 5.612718298077e-02 74 KSP Residual norm 5.404331315421e-02 75 KSP Residual norm 5.445184161126e-02 76 KSP Residual norm 5.513790692539e-02 77 KSP Residual norm 5.603152577219e-02 78 KSP Residual norm 5.594578881463e-02 79 KSP Residual norm 5.270506297482e-02 80 KSP Residual norm 4.969434451616e-02 81 KSP Residual norm 5.066917256846e-02 82 KSP Residual norm 5.145815352597e-02 83 KSP Residual norm 4.855115729361e-02 84 KSP Residual norm 4.872335394452e-02 85 KSP Residual norm 4.674867276039e-02 86 KSP Residual norm 4.754130141657e-02 87 KSP Residual norm 4.833727177651e-02 88 KSP Residual norm 4.753752603403e-02 89 KSP Residual norm 4.711922842054e-02 90 KSP Residual norm 4.405539388922e-02 91 KSP Residual norm 4.296516931381e-02 92 KSP Residual norm 4.339330987856e-02 93 KSP Residual norm 4.309048328104e-02 94 KSP Residual norm 4.249286390589e-02 95 KSP Residual norm 4.279072317861e-02 96 KSP Residual norm 4.138481489705e-02 97 KSP Residual norm 4.075783749198e-02 98 KSP Residual norm 4.150832258048e-02 99 KSP Residual norm 4.196726309881e-02 100 
KSP Residual norm 4.511508557881e-02 101 KSP Residual norm 4.353430099295e-02 102 KSP Residual norm 4.110659276090e-02 103 KSP Residual norm 4.216362776668e-02 104 KSP Residual norm 4.132142431466e-02 105 KSP Residual norm 3.937463571948e-02 106 KSP Residual norm 3.847820147612e-02 107 KSP Residual norm 3.903461144864e-02 108 KSP Residual norm 3.873827003588e-02 109 KSP Residual norm 3.757181226921e-02 110 KSP Residual norm 3.713370747089e-02 111 KSP Residual norm 3.797595762562e-02 112 KSP Residual norm 3.904892207809e-02 113 KSP Residual norm 3.832213482446e-02 114 KSP Residual norm 3.673918137725e-02 115 KSP Residual norm 3.568169366392e-02 116 KSP Residual norm 3.463483623212e-02 117 KSP Residual norm 3.474865380315e-02 118 KSP Residual norm 3.369100450734e-02 119 KSP Residual norm 3.429462851161e-02 120 KSP Residual norm 3.356928597959e-02 121 KSP Residual norm 3.388831153274e-02 122 KSP Residual norm 3.403530170365e-02 123 KSP Residual norm 3.432423924461e-02 124 KSP Residual norm 3.480510081872e-02 125 KSP Residual norm 3.501270104127e-02 126 KSP Residual norm 3.436014578560e-02 127 KSP Residual norm 3.313137022795e-02 128 KSP Residual norm 3.224202549572e-02 129 KSP Residual norm 3.282496094068e-02 130 KSP Residual norm 3.179581680589e-02 131 KSP Residual norm 3.126804650993e-02 132 KSP Residual norm 3.164061472181e-02 133 KSP Residual norm 3.064596217610e-02 134 KSP Residual norm 3.123484050542e-02 135 KSP Residual norm 3.098163173034e-02 136 KSP Residual norm 3.021371993367e-02 137 KSP Residual norm 2.997438018349e-02 138 KSP Residual norm 2.996861618503e-02 139 KSP Residual norm 2.952174027773e-02 140 KSP Residual norm 2.781275108201e-02 141 KSP Residual norm 2.831949666222e-02 142 KSP Residual norm 2.943315787910e-02 143 KSP Residual norm 2.861351453584e-02 144 KSP Residual norm 2.857243223511e-02 145 KSP Residual norm 2.776445330757e-02 146 KSP Residual norm 2.815377297347e-02 147 KSP Residual norm 2.833952777968e-02 148 KSP Residual norm 
2.820315095721e-02 149 KSP Residual norm 2.887229075208e-02 150 KSP Residual norm 2.979067721535e-02 151 KSP Residual norm 2.815700116316e-02 152 KSP Residual norm 2.680023054442e-02 153 KSP Residual norm 2.639722486097e-02 154 KSP Residual norm 2.662100285008e-02 155 KSP Residual norm 2.675384875846e-02 156 KSP Residual norm 2.701241246267e-02 157 KSP Residual norm 2.643754530991e-02 158 KSP Residual norm 2.592855185871e-02 159 KSP Residual norm 2.618977338412e-02 160 KSP Residual norm 2.586809270174e-02 161 KSP Residual norm 2.576998073277e-02 162 KSP Residual norm 2.540380238842e-02 163 KSP Residual norm 2.495114311948e-02 164 KSP Residual norm 2.455716236236e-02 165 KSP Residual norm 2.428454731484e-02 166 KSP Residual norm 2.464342635564e-02 167 KSP Residual norm 2.452483143323e-02 168 KSP Residual norm 2.481544666658e-02 169 KSP Residual norm 2.477918836230e-02 170 KSP Residual norm 2.359566978599e-02 171 KSP Residual norm 2.418262225393e-02 172 KSP Residual norm 2.564699145458e-02 173 KSP Residual norm 2.500981972636e-02 174 KSP Residual norm 2.471425418109e-02 175 KSP Residual norm 2.445163739021e-02 176 KSP Residual norm 2.385452288757e-02 177 KSP Residual norm 2.428112914859e-02 178 KSP Residual norm 2.291705127975e-02 179 KSP Residual norm 2.271629416191e-02 180 KSP Residual norm 2.345648624740e-02 181 KSP Residual norm 2.411307961322e-02 182 KSP Residual norm 2.305778032055e-02 183 KSP Residual norm 2.363029408956e-02 184 KSP Residual norm 2.334430183050e-02 185 KSP Residual norm 2.288119542251e-02 186 KSP Residual norm 2.268194883960e-02 187 KSP Residual norm 2.194114600599e-02 188 KSP Residual norm 2.138125374716e-02 189 KSP Residual norm 2.122308178911e-02 190 KSP Residual norm 2.167011211336e-02 191 KSP Residual norm 2.124093196590e-02 192 KSP Residual norm 2.254672827834e-02 193 KSP Residual norm 2.259893744483e-02 194 KSP Residual norm 2.124131449373e-02 195 KSP Residual norm 2.128570772460e-02 196 KSP Residual norm 2.272443514094e-02 197 KSP 
Residual norm 2.248357368551e-02 198 KSP Residual norm 2.186082830509e-02 199 KSP Residual norm 2.193211823715e-02 200 KSP Residual norm 2.092614010486e-02 201 KSP Residual norm 2.058552529496e-02 202 KSP Residual norm 2.093249615703e-02 203 KSP Residual norm 2.045325798910e-02 204 KSP Residual norm 2.043511462793e-02 205 KSP Residual norm 2.037030539329e-02 206 KSP Residual norm 2.103460538310e-02 207 KSP Residual norm 2.069703021464e-02 208 KSP Residual norm 1.978197990018e-02 209 KSP Residual norm 1.997727225256e-02 210 KSP Residual norm 2.014133842310e-02 211 KSP Residual norm 1.909869680533e-02 212 KSP Residual norm 1.898652371759e-02 213 KSP Residual norm 1.917618301995e-02 214 KSP Residual norm 1.903312625797e-02 215 KSP Residual norm 1.935123404273e-02 216 KSP Residual norm 1.876736458793e-02 217 KSP Residual norm 1.907165437333e-02 218 KSP Residual norm 1.931891473034e-02 219 KSP Residual norm 1.947700505686e-02 220 KSP Residual norm 1.946867709414e-02 221 KSP Residual norm 1.975097371142e-02 222 KSP Residual norm 1.936728796435e-02 223 KSP Residual norm 1.861169070605e-02 224 KSP Residual norm 1.821669203618e-02 225 KSP Residual norm 1.830030189024e-02 226 KSP Residual norm 1.847574748920e-02 227 KSP Residual norm 1.821934647650e-02 228 KSP Residual norm 1.858550789703e-02 229 KSP Residual norm 1.790620122765e-02 230 KSP Residual norm 1.736046756570e-02 231 KSP Residual norm 1.808056503558e-02 232 KSP Residual norm 1.831571970355e-02 233 KSP Residual norm 1.782550071852e-02 234 KSP Residual norm 1.704250503644e-02 235 KSP Residual norm 1.775557915607e-02 236 KSP Residual norm 1.749168705311e-02 237 KSP Residual norm 1.716635016316e-02 238 KSP Residual norm 1.697804812505e-02 239 KSP Residual norm 1.719949296811e-02 240 KSP Residual norm 1.798197854926e-02 241 KSP Residual norm 1.773491134572e-02 242 KSP Residual norm 1.724256440113e-02 243 KSP Residual norm 1.777189498102e-02 244 KSP Residual norm 1.819945790203e-02 245 KSP Residual norm 
1.762431996374e-02 246 KSP Residual norm 1.711668380042e-02 247 KSP Residual norm 1.717847950917e-02 248 KSP Residual norm 1.675314264329e-02 249 KSP Residual norm 1.672527101557e-02 250 KSP Residual norm 1.652103275422e-02 251 KSP Residual norm 1.696676153706e-02 252 KSP Residual norm 1.715357344334e-02 253 KSP Residual norm 1.706981861601e-02 254 KSP Residual norm 1.696030869204e-02 255 KSP Residual norm 1.763423572015e-02 256 KSP Residual norm 1.829751997263e-02 257 KSP Residual norm 1.972223165522e-02 258 KSP Residual norm 2.130118969187e-02 259 KSP Residual norm 2.315722127923e-02 260 KSP Residual norm 2.684768883739e-02 261 KSP Residual norm 3.348163940639e-02 262 KSP Residual norm 4.199127730378e-02 263 KSP Residual norm 5.398653105045e-02 264 KSP Residual norm 6.821839369420e-02 265 KSP Residual norm 8.029032178293e-02 266 KSP Residual norm 8.815811508553e-02 267 KSP Residual norm 8.962694266764e-02 268 KSP Residual norm 8.065993997821e-02 269 KSP Residual norm 6.648938465331e-02 270 KSP Residual norm 5.345676431768e-02 271 KSP Residual norm 3.972291572909e-02 272 KSP Residual norm 2.946517479139e-02 273 KSP Residual norm 2.384222409654e-02 274 KSP Residual norm 2.191852135161e-02 275 KSP Residual norm 2.190584659264e-02 276 KSP Residual norm 2.207283053774e-02 277 KSP Residual norm 2.285429680636e-02 278 KSP Residual norm 2.042181640152e-02 279 KSP Residual norm 1.666191118634e-02 280 KSP Residual norm 1.233344503183e-02 281 KSP Residual norm 9.732943526464e-03 282 KSP Residual norm 8.231148322685e-03 283 KSP Residual norm 7.658532417128e-03 284 KSP Residual norm 7.159090758549e-03 285 KSP Residual norm 6.837740854101e-03 286 KSP Residual norm 6.771864567159e-03 287 KSP Residual norm 6.506633131436e-03 288 KSP Residual norm 6.552854756680e-03 289 KSP Residual norm 6.465354304817e-03 290 KSP Residual norm 6.348201270809e-03 291 KSP Residual norm 6.054566113069e-03 292 KSP Residual norm 5.368641816405e-03 293 KSP Residual norm 4.740328570452e-03 294 KSP 
Residual norm 4.455220605625e-03 295 KSP Residual norm 4.414378942258e-03 296 KSP Residual norm 4.365393288580e-03 297 KSP Residual norm 4.112409955106e-03 298 KSP Residual norm 3.683251112994e-03 299 KSP Residual norm 3.338614520090e-03 300 KSP Residual norm 3.099327570392e-03 301 KSP Residual norm 3.109917829137e-03 302 KSP Residual norm 3.004996253210e-03 303 KSP Residual norm 2.830520603323e-03 304 KSP Residual norm 2.748235291994e-03 305 KSP Residual norm 2.914293446395e-03 306 KSP Residual norm 2.780149150920e-03 307 KSP Residual norm 2.608542174674e-03 308 KSP Residual norm 2.404201656147e-03 309 KSP Residual norm 2.247889289592e-03 310 KSP Residual norm 2.359008329300e-03 311 KSP Residual norm 2.662652962160e-03 312 KSP Residual norm 2.917896860107e-03 313 KSP Residual norm 2.781060424836e-03 314 KSP Residual norm 2.499901538412e-03 315 KSP Residual norm 2.318409392919e-03 316 KSP Residual norm 2.237948977105e-03 317 KSP Residual norm 2.275842789015e-03 318 KSP Residual norm 2.095361798572e-03 319 KSP Residual norm 1.939363510796e-03 320 KSP Residual norm 1.824756647376e-03 321 KSP Residual norm 1.855785739942e-03 322 KSP Residual norm 1.876266124266e-03 323 KSP Residual norm 1.706083418355e-03 324 KSP Residual norm 1.504223272392e-03 325 KSP Residual norm 1.367003023878e-03 326 KSP Residual norm 1.325025357987e-03 327 KSP Residual norm 1.217685553397e-03 328 KSP Residual norm 1.147595137980e-03 329 KSP Residual norm 1.140757899810e-03 330 KSP Residual norm 1.161337539671e-03 331 KSP Residual norm 1.189634577655e-03 332 KSP Residual norm 1.156853032944e-03 333 KSP Residual norm 1.168543981069e-03 334 KSP Residual norm 1.255758863060e-03 335 KSP Residual norm 1.329900079847e-03 336 KSP Residual norm 1.277788599679e-03 337 KSP Residual norm 1.156283394140e-03 338 KSP Residual norm 1.170787202744e-03 339 KSP Residual norm 1.165756028629e-03 340 KSP Residual norm 1.149594359750e-03 341 KSP Residual norm 1.035743087984e-03 342 KSP Residual norm 
343 KSP Residual norm 8.729284667762e-04
344 KSP Residual norm 9.123661560199e-04
345 KSP Residual norm 9.403865973752e-04
[KSP monitor output for iterations 346-1350 retained in the original log: the residual norm oscillates between roughly 1e-03 and 9e-03 with no sustained decrease]
1351 KSP Residual norm 5.387775229912e-03
1352 KSP Residual norm 5.181412421696e-03
1353 KSP Residual norm 4.931952401920e-03
KSP Residual norm 4.570619979257e-03 1355 KSP Residual norm 4.377392593060e-03 1356 KSP Residual norm 4.134943800022e-03 1357 KSP Residual norm 4.050101621145e-03 1358 KSP Residual norm 3.927170173902e-03 1359 KSP Residual norm 4.002863746419e-03 1360 KSP Residual norm 4.133904576647e-03 1361 KSP Residual norm 4.093282465061e-03 1362 KSP Residual norm 3.857297451022e-03 1363 KSP Residual norm 3.620285409349e-03 1364 KSP Residual norm 3.382937730415e-03 1365 KSP Residual norm 3.338970913338e-03 1366 KSP Residual norm 3.621615143604e-03 1367 KSP Residual norm 4.004203600945e-03 1368 KSP Residual norm 4.313641226175e-03 1369 KSP Residual norm 4.381419457289e-03 1370 KSP Residual norm 4.062557649672e-03 1371 KSP Residual norm 3.804951568249e-03 1372 KSP Residual norm 3.824966688333e-03 1373 KSP Residual norm 4.170571410610e-03 1374 KSP Residual norm 4.359063888330e-03 1375 KSP Residual norm 4.369708161276e-03 1376 KSP Residual norm 4.616366604858e-03 1377 KSP Residual norm 4.813132711751e-03 1378 KSP Residual norm 4.841581890431e-03 1379 KSP Residual norm 4.615457216910e-03 1380 KSP Residual norm 4.364865174047e-03 1381 KSP Residual norm 4.402076551154e-03 1382 KSP Residual norm 4.550847289956e-03 1383 KSP Residual norm 4.653118821445e-03 1384 KSP Residual norm 4.848951209101e-03 1385 KSP Residual norm 5.335005979295e-03 1386 KSP Residual norm 5.990831239637e-03 1387 KSP Residual norm 6.390408526740e-03 1388 KSP Residual norm 6.238132730307e-03 1389 KSP Residual norm 5.777213476124e-03 1390 KSP Residual norm 5.745474019357e-03 1391 KSP Residual norm 6.171919713719e-03 1392 KSP Residual norm 6.814628576131e-03 1393 KSP Residual norm 7.092930622436e-03 1394 KSP Residual norm 7.202829647083e-03 1395 KSP Residual norm 7.263068243672e-03 1396 KSP Residual norm 7.991866027692e-03 1397 KSP Residual norm 8.076857344675e-03 1398 KSP Residual norm 7.416722959321e-03 1399 KSP Residual norm 7.256745039815e-03 1400 KSP Residual norm 7.817460645521e-03 1401 KSP Residual norm 
7.888386414770e-03 1402 KSP Residual norm 7.778721146485e-03 1403 KSP Residual norm 7.633145130674e-03 1404 KSP Residual norm 7.080884138097e-03 1405 KSP Residual norm 6.392244539346e-03 1406 KSP Residual norm 5.964969378442e-03 1407 KSP Residual norm 5.759052737011e-03 1408 KSP Residual norm 5.509325648131e-03 1409 KSP Residual norm 4.965502665671e-03 1410 KSP Residual norm 4.553387456256e-03 1411 KSP Residual norm 4.350348933840e-03 1412 KSP Residual norm 4.450579817108e-03 1413 KSP Residual norm 4.462209314224e-03 1414 KSP Residual norm 4.759025680116e-03 1415 KSP Residual norm 4.843115994220e-03 1416 KSP Residual norm 4.883781434826e-03 1417 KSP Residual norm 4.855960636670e-03 1418 KSP Residual norm 4.909991832520e-03 1419 KSP Residual norm 5.167335712909e-03 1420 KSP Residual norm 5.062535842783e-03 1421 KSP Residual norm 4.861047026837e-03 1422 KSP Residual norm 4.614075594659e-03 1423 KSP Residual norm 4.448845132952e-03 1424 KSP Residual norm 4.341318580693e-03 1425 KSP Residual norm 4.446510898985e-03 1426 KSP Residual norm 4.663442738450e-03 1427 KSP Residual norm 4.765644861878e-03 1428 KSP Residual norm 4.659291336051e-03 1429 KSP Residual norm 4.359520779209e-03 1430 KSP Residual norm 4.223229345176e-03 1431 KSP Residual norm 4.418285143963e-03 1432 KSP Residual norm 4.619124062440e-03 1433 KSP Residual norm 4.553885494409e-03 1434 KSP Residual norm 4.280900988546e-03 1435 KSP Residual norm 4.085018266935e-03 1436 KSP Residual norm 4.004797334160e-03 1437 KSP Residual norm 4.007557730599e-03 1438 KSP Residual norm 3.818920853936e-03 1439 KSP Residual norm 3.735385033394e-03 1440 KSP Residual norm 3.797665363864e-03 1441 KSP Residual norm 3.647434711828e-03 1442 KSP Residual norm 3.434667605772e-03 1443 KSP Residual norm 3.579061679853e-03 1444 KSP Residual norm 3.929397396732e-03 1445 KSP Residual norm 4.161246364486e-03 1446 KSP Residual norm 4.247942879325e-03 1447 KSP Residual norm 4.214530157465e-03 1448 KSP Residual norm 4.349880370678e-03 1449 
KSP Residual norm 4.382618450311e-03 1450 KSP Residual norm 4.317896908899e-03 1451 KSP Residual norm 4.318923942325e-03 1452 KSP Residual norm 4.597371089027e-03 1453 KSP Residual norm 4.734297417514e-03 1454 KSP Residual norm 4.723529661083e-03 1455 KSP Residual norm 4.535019309761e-03 1456 KSP Residual norm 4.557228968325e-03 1457 KSP Residual norm 4.675657336082e-03 1458 KSP Residual norm 4.678101252096e-03 1459 KSP Residual norm 4.440029523124e-03 1460 KSP Residual norm 4.443447061774e-03 1461 KSP Residual norm 4.675489876803e-03 1462 KSP Residual norm 4.697316439515e-03 1463 KSP Residual norm 4.347006366960e-03 1464 KSP Residual norm 4.116358055477e-03 1465 KSP Residual norm 3.909385741593e-03 1466 KSP Residual norm 3.427271155185e-03 1467 KSP Residual norm 3.148050704572e-03 1468 KSP Residual norm 3.146758567557e-03 1469 KSP Residual norm 3.326439000327e-03 1470 KSP Residual norm 3.528227381870e-03 1471 KSP Residual norm 3.734636049772e-03 1472 KSP Residual norm 3.801175141192e-03 1473 KSP Residual norm 3.682733054341e-03 1474 KSP Residual norm 3.869391327748e-03 1475 KSP Residual norm 4.124262041395e-03 1476 KSP Residual norm 4.301928175170e-03 1477 KSP Residual norm 4.360539607347e-03 1478 KSP Residual norm 4.695576840578e-03 1479 KSP Residual norm 5.338862907112e-03 1480 KSP Residual norm 5.857255626224e-03 1481 KSP Residual norm 5.758243446024e-03 1482 KSP Residual norm 5.429528389948e-03 1483 KSP Residual norm 5.332596773347e-03 1484 KSP Residual norm 5.317884347570e-03 1485 KSP Residual norm 5.251252274662e-03 1486 KSP Residual norm 4.970344100120e-03 1487 KSP Residual norm 4.679405455370e-03 1488 KSP Residual norm 4.257698337737e-03 1489 KSP Residual norm 3.922890835350e-03 1490 KSP Residual norm 3.740125726498e-03 1491 KSP Residual norm 3.739714064191e-03 1492 KSP Residual norm 3.966049899764e-03 1493 KSP Residual norm 4.027427417459e-03 1494 KSP Residual norm 3.958089315333e-03 1495 KSP Residual norm 3.911692723710e-03 1496 KSP Residual norm 
4.016878615819e-03 1497 KSP Residual norm 4.250091344628e-03 1498 KSP Residual norm 4.518796865137e-03 1499 KSP Residual norm 5.010301973299e-03 1500 KSP Residual norm 5.578390370620e-03 1501 KSP Residual norm 5.614031960271e-03 1502 KSP Residual norm 5.487587594523e-03 1503 KSP Residual norm 5.490917249746e-03 1504 KSP Residual norm 5.410518625817e-03 1505 KSP Residual norm 4.795851091615e-03 1506 KSP Residual norm 3.971918432854e-03 1507 KSP Residual norm 3.703618718688e-03 1508 KSP Residual norm 3.914709198679e-03 1509 KSP Residual norm 4.020164070164e-03 1510 KSP Residual norm 4.140657598389e-03 1511 KSP Residual norm 4.045678848300e-03 1512 KSP Residual norm 3.877274234083e-03 1513 KSP Residual norm 3.773905053915e-03 1514 KSP Residual norm 3.777227522832e-03 1515 KSP Residual norm 3.615497735700e-03 1516 KSP Residual norm 3.329500584338e-03 1517 KSP Residual norm 3.211498061362e-03 1518 KSP Residual norm 3.401609027037e-03 1519 KSP Residual norm 3.792488962233e-03 1520 KSP Residual norm 3.770037286683e-03 1521 KSP Residual norm 3.396567762266e-03 1522 KSP Residual norm 3.019110598211e-03 1523 KSP Residual norm 3.152200151492e-03 1524 KSP Residual norm 3.557575669406e-03 1525 KSP Residual norm 3.872615871153e-03 1526 KSP Residual norm 4.159628700692e-03 1527 KSP Residual norm 4.183185965547e-03 1528 KSP Residual norm 4.076722224240e-03 1529 KSP Residual norm 3.792985283323e-03 1530 KSP Residual norm 3.455971465179e-03 1531 KSP Residual norm 3.346246982667e-03 1532 KSP Residual norm 3.461595065848e-03 1533 KSP Residual norm 3.719177306018e-03 1534 KSP Residual norm 3.738392400040e-03 1535 KSP Residual norm 3.321691072143e-03 1536 KSP Residual norm 3.081074856988e-03 1537 KSP Residual norm 3.326777042453e-03 1538 KSP Residual norm 3.959519687464e-03 1539 KSP Residual norm 4.448615395345e-03 1540 KSP Residual norm 4.638141453377e-03 1541 KSP Residual norm 4.626773464162e-03 1542 KSP Residual norm 4.483684621375e-03 1543 KSP Residual norm 4.342412403729e-03 1544 
KSP Residual norm 4.205782505569e-03 1545 KSP Residual norm 4.106900721390e-03 1546 KSP Residual norm 4.095639807755e-03 1547 KSP Residual norm 4.162839274005e-03 1548 KSP Residual norm 4.317606778375e-03 1549 KSP Residual norm 4.299305149905e-03 1550 KSP Residual norm 4.148649072744e-03 1551 KSP Residual norm 4.073325903721e-03 1552 KSP Residual norm 4.011903492316e-03 1553 KSP Residual norm 3.927519683330e-03 1554 KSP Residual norm 3.728278189639e-03 1555 KSP Residual norm 3.524235386843e-03 1556 KSP Residual norm 3.605233908790e-03 1557 KSP Residual norm 3.997424334329e-03 1558 KSP Residual norm 4.697881418515e-03 1559 KSP Residual norm 4.893748047315e-03 1560 KSP Residual norm 4.745987021851e-03 1561 KSP Residual norm 4.450249967684e-03 1562 KSP Residual norm 4.357138704257e-03 1563 KSP Residual norm 4.367710481440e-03 1564 KSP Residual norm 4.484935772773e-03 1565 KSP Residual norm 4.499326997528e-03 1566 KSP Residual norm 4.146770349785e-03 1567 KSP Residual norm 3.712414470514e-03 1568 KSP Residual norm 3.548515634102e-03 1569 KSP Residual norm 3.660986761216e-03 1570 KSP Residual norm 3.799659058599e-03 1571 KSP Residual norm 3.814251344302e-03 1572 KSP Residual norm 3.782927834391e-03 1573 KSP Residual norm 3.856112517795e-03 1574 KSP Residual norm 4.053264140504e-03 1575 KSP Residual norm 4.098474476568e-03 1576 KSP Residual norm 3.687961478176e-03 1577 KSP Residual norm 3.334331419984e-03 1578 KSP Residual norm 3.340775962920e-03 1579 KSP Residual norm 3.647538053510e-03 1580 KSP Residual norm 3.732505656428e-03 1581 KSP Residual norm 3.569907419767e-03 1582 KSP Residual norm 3.284915346407e-03 1583 KSP Residual norm 2.867992050476e-03 1584 KSP Residual norm 2.609457809758e-03 1585 KSP Residual norm 2.548729593534e-03 1586 KSP Residual norm 2.703303121263e-03 1587 KSP Residual norm 2.991301617554e-03 1588 KSP Residual norm 3.181960046360e-03 1589 KSP Residual norm 2.812457127578e-03 1590 KSP Residual norm 2.357118285072e-03 1591 KSP Residual norm 
2.331637798308e-03 1592 KSP Residual norm 2.767640590317e-03 1593 KSP Residual norm 3.120168921408e-03 1594 KSP Residual norm 3.071446073340e-03 1595 KSP Residual norm 2.901376408602e-03 1596 KSP Residual norm 2.736985639322e-03 1597 KSP Residual norm 2.685199238895e-03 1598 KSP Residual norm 2.662069200765e-03 1599 KSP Residual norm 2.723556510636e-03 1600 KSP Residual norm 2.865503849469e-03 1601 KSP Residual norm 3.124754319916e-03 1602 KSP Residual norm 3.501928923907e-03 1603 KSP Residual norm 3.710686527071e-03 1604 KSP Residual norm 3.698074422762e-03 1605 KSP Residual norm 3.433141741764e-03 1606 KSP Residual norm 3.119678872135e-03 1607 KSP Residual norm 2.928591276034e-03 1608 KSP Residual norm 2.740490664497e-03 1609 KSP Residual norm 2.700187040953e-03 1610 KSP Residual norm 3.058181422370e-03 1611 KSP Residual norm 3.631169145322e-03 1612 KSP Residual norm 3.865620502931e-03 1613 KSP Residual norm 3.410310938254e-03 1614 KSP Residual norm 2.778380603588e-03 1615 KSP Residual norm 2.519935144611e-03 1616 KSP Residual norm 2.536730914580e-03 1617 KSP Residual norm 2.722169589183e-03 1618 KSP Residual norm 2.981607527716e-03 1619 KSP Residual norm 3.074443842388e-03 1620 KSP Residual norm 3.102897043425e-03 1621 KSP Residual norm 3.334968227213e-03 1622 KSP Residual norm 3.765759167624e-03 1623 KSP Residual norm 4.260428606374e-03 1624 KSP Residual norm 4.692079458287e-03 1625 KSP Residual norm 5.031307315013e-03 1626 KSP Residual norm 4.421918845491e-03 1627 KSP Residual norm 3.287483165129e-03 1628 KSP Residual norm 2.369767919189e-03 1629 KSP Residual norm 1.932654903757e-03 1630 KSP Residual norm 1.962317074491e-03 1631 KSP Residual norm 2.434378646182e-03 1632 KSP Residual norm 3.165878044867e-03 1633 KSP Residual norm 4.008110394569e-03 1634 KSP Residual norm 4.243942144954e-03 1635 KSP Residual norm 3.502225516401e-03 1636 KSP Residual norm 2.676679314493e-03 1637 KSP Residual norm 2.113255870404e-03 1638 KSP Residual norm 1.943585178476e-03 1639 
KSP Residual norm 2.182514404814e-03 1640 KSP Residual norm 2.842236509915e-03 1641 KSP Residual norm 3.797676250207e-03 1642 KSP Residual norm 4.796159154955e-03 1643 KSP Residual norm 4.855718878594e-03 1644 KSP Residual norm 3.805743905144e-03 1645 KSP Residual norm 2.836189035614e-03 1646 KSP Residual norm 2.311683718706e-03 1647 KSP Residual norm 2.304059112016e-03 1648 KSP Residual norm 3.001923838056e-03 1649 KSP Residual norm 4.465398931268e-03 1650 KSP Residual norm 6.316248645066e-03 1651 KSP Residual norm 7.460452089176e-03 1652 KSP Residual norm 6.157762247948e-03 1653 KSP Residual norm 4.028858910124e-03 1654 KSP Residual norm 2.607315308302e-03 1655 KSP Residual norm 2.135887579079e-03 1656 KSP Residual norm 2.214752212489e-03 1657 KSP Residual norm 2.889084953386e-03 1658 KSP Residual norm 4.448359025874e-03 1659 KSP Residual norm 6.312792290081e-03 1660 KSP Residual norm 6.454085180289e-03 1661 KSP Residual norm 4.821242484495e-03 1662 KSP Residual norm 3.136267667322e-03 1663 KSP Residual norm 2.048069925869e-03 1664 KSP Residual norm 1.642042732627e-03 1665 KSP Residual norm 1.776685047647e-03 1666 KSP Residual norm 2.623757732771e-03 1667 KSP Residual norm 4.458232614128e-03 1668 KSP Residual norm 6.978633797115e-03 1669 KSP Residual norm 7.237631617823e-03 1670 KSP Residual norm 4.985964037370e-03 1671 KSP Residual norm 2.915990817911e-03 1672 KSP Residual norm 2.034845647516e-03 1673 KSP Residual norm 1.913779478677e-03 1674 KSP Residual norm 2.437839848479e-03 1675 KSP Residual norm 3.816997921800e-03 1676 KSP Residual norm 5.453049433777e-03 1677 KSP Residual norm 5.652942559526e-03 1678 KSP Residual norm 4.474314780586e-03 1679 KSP Residual norm 2.979968710938e-03 1680 KSP Residual norm 1.962384551434e-03 1681 KSP Residual norm 1.629787231112e-03 1682 KSP Residual norm 1.775963421001e-03 1683 KSP Residual norm 2.570855608573e-03 1684 KSP Residual norm 4.218418470023e-03 1685 KSP Residual norm 6.808798194916e-03 1686 KSP Residual norm 
8.665640165091e-03 1687 KSP Residual norm 6.816417271790e-03 1688 KSP Residual norm 4.077925925808e-03 1689 KSP Residual norm 2.529468344520e-03 1690 KSP Residual norm 2.079852452529e-03 1691 KSP Residual norm 2.169906661041e-03 1692 KSP Residual norm 2.748732070558e-03 1693 KSP Residual norm 4.293496737558e-03 1694 KSP Residual norm 7.106427579240e-03 1695 KSP Residual norm 1.039582615490e-02 1696 KSP Residual norm 1.066412513256e-02 1697 KSP Residual norm 7.363116007261e-03 1698 KSP Residual norm 4.484806027494e-03 1699 KSP Residual norm 3.194517941305e-03 1700 KSP Residual norm 2.997403024709e-03 1701 KSP Residual norm 3.715245506908e-03 1702 KSP Residual norm 5.667183771566e-03 1703 KSP Residual norm 9.895874135020e-03 1704 KSP Residual norm 1.713628494096e-02 1705 KSP Residual norm 2.253742675078e-02 1706 KSP Residual norm 1.815633226840e-02 1707 KSP Residual norm 1.027253732195e-02 1708 KSP Residual norm 5.585731982869e-03 1709 KSP Residual norm 3.560817037432e-03 1710 KSP Residual norm 3.174754585629e-03 1711 KSP Residual norm 4.087044780875e-03 1712 KSP Residual norm 6.310730407470e-03 1713 KSP Residual norm 8.889169687391e-03 1714 KSP Residual norm 1.010227705484e-02 1715 KSP Residual norm 9.757653183414e-03 1716 KSP Residual norm 1.042924992465e-02 1717 KSP Residual norm 1.236414617044e-02 1718 KSP Residual norm 1.344340520603e-02 1719 KSP Residual norm 9.861580508839e-03 1720 KSP Residual norm 5.760429650522e-03 1721 KSP Residual norm 3.968634397526e-03 1722 KSP Residual norm 3.860048899695e-03 1723 KSP Residual norm 4.885282153439e-03 1724 KSP Residual norm 6.411846189810e-03 1725 KSP Residual norm 7.528899153385e-03 1726 KSP Residual norm 8.282565560188e-03 1727 KSP Residual norm 1.028477470185e-02 1728 KSP Residual norm 1.406352667328e-02 1729 KSP Residual norm 1.613317508911e-02 1730 KSP Residual norm 1.265179056408e-02 1731 KSP Residual norm 8.433894035220e-03 1732 KSP Residual norm 7.024467576599e-03 1733 KSP Residual norm 7.522036580324e-03 1734 
KSP Residual norm 8.086010468802e-03 1735 KSP Residual norm 6.775999043073e-03 1736 KSP Residual norm 4.982660999927e-03 1737 KSP Residual norm 4.153635535655e-03 1738 KSP Residual norm 4.498743086892e-03 1739 KSP Residual norm 6.376384796902e-03 1740 KSP Residual norm 9.019305036726e-03 1741 KSP Residual norm 1.008571307583e-02 1742 KSP Residual norm 9.867622730951e-03 1743 KSP Residual norm 1.130154844008e-02 1744 KSP Residual norm 1.487352700809e-02 1745 KSP Residual norm 1.674566260071e-02 1746 KSP Residual norm 1.295908724076e-02 1747 KSP Residual norm 9.270210719206e-03 1748 KSP Residual norm 7.608571110303e-03 1749 KSP Residual norm 6.682015809861e-03 1750 KSP Residual norm 5.806877257439e-03 1751 KSP Residual norm 4.662759516076e-03 1752 KSP Residual norm 3.842666916488e-03 1753 KSP Residual norm 3.669687472434e-03 1754 KSP Residual norm 4.150693018521e-03 1755 KSP Residual norm 5.197551156139e-03 1756 KSP Residual norm 5.944114827337e-03 1757 KSP Residual norm 6.456393033999e-03 1758 KSP Residual norm 7.760231053459e-03 1759 KSP Residual norm 1.037947368702e-02 1760 KSP Residual norm 1.496028745626e-02 1761 KSP Residual norm 1.618786763382e-02 1762 KSP Residual norm 1.370970144802e-02 1763 KSP Residual norm 1.180123064819e-02 1764 KSP Residual norm 1.206538297991e-02 1765 KSP Residual norm 1.128858176079e-02 1766 KSP Residual norm 8.433779869320e-03 1767 KSP Residual norm 5.728690607563e-03 1768 KSP Residual norm 4.715870157427e-03 1769 KSP Residual norm 4.903849994668e-03 1770 KSP Residual norm 5.860927693618e-03 1771 KSP Residual norm 5.955039528640e-03 1772 KSP Residual norm 5.623262887997e-03 1773 KSP Residual norm 6.396771032506e-03 1774 KSP Residual norm 9.031367977476e-03 1775 KSP Residual norm 1.299959176062e-02 1776 KSP Residual norm 1.473141113096e-02 1777 KSP Residual norm 1.301728743945e-02 1778 KSP Residual norm 1.186704670476e-02 1779 KSP Residual norm 1.271037519749e-02 1780 KSP Residual norm 1.390386482594e-02 1781 KSP Residual norm 
1.203407490800e-02 1782 KSP Residual norm 9.695334509794e-03 1783 KSP Residual norm 8.568230932420e-03 1784 KSP Residual norm 8.648198914198e-03 1785 KSP Residual norm 8.377636695143e-03 1786 KSP Residual norm 7.007933502942e-03 1787 KSP Residual norm 5.982063968369e-03 1788 KSP Residual norm 6.296043628802e-03 1789 KSP Residual norm 7.651552723818e-03 1790 KSP Residual norm 8.561237485700e-03 1791 KSP Residual norm 8.559295288210e-03 1792 KSP Residual norm 8.819457619708e-03 1793 KSP Residual norm 1.100979421964e-02 1794 KSP Residual norm 1.512814836318e-02 1795 KSP Residual norm 1.747930112508e-02 1796 KSP Residual norm 1.503911210909e-02 1797 KSP Residual norm 1.333828397169e-02 1798 KSP Residual norm 1.503110014031e-02 1799 KSP Residual norm 1.858874048740e-02 1800 KSP Residual norm 1.754785446477e-02 1801 KSP Residual norm 1.265005935329e-02 1802 KSP Residual norm 9.644563862227e-03 1803 KSP Residual norm 8.339675321860e-03 1804 KSP Residual norm 8.328279073357e-03 1805 KSP Residual norm 7.687861874409e-03 1806 KSP Residual norm 5.869926944451e-03 1807 KSP Residual norm 4.854194518142e-03 1808 KSP Residual norm 4.751080118652e-03 1809 KSP Residual norm 5.304959976306e-03 1810 KSP Residual norm 5.582226363115e-03 1811 KSP Residual norm 5.301484986900e-03 1812 KSP Residual norm 5.622322252902e-03 1813 KSP Residual norm 7.282562214031e-03 1814 KSP Residual norm 9.685663358062e-03 1815 KSP Residual norm 1.110743276654e-02 1816 KSP Residual norm 1.089423023046e-02 1817 KSP Residual norm 1.200160432866e-02 1818 KSP Residual norm 1.600239670961e-02 1819 KSP Residual norm 2.114175274990e-02 1820 KSP Residual norm 2.168298795445e-02 1821 KSP Residual norm 1.924855736963e-02 1822 KSP Residual norm 1.925728569694e-02 1823 KSP Residual norm 2.156804080003e-02 1824 KSP Residual norm 2.174795961672e-02 1825 KSP Residual norm 1.857980960454e-02 1826 KSP Residual norm 1.580064745947e-02 1827 KSP Residual norm 1.464780188666e-02 1828 KSP Residual norm 1.419629565106e-02 1829 
KSP Residual norm 1.199934486506e-02 1830 KSP Residual norm 9.297607562041e-03 1831 KSP Residual norm 7.973058376218e-03 1832 KSP Residual norm 8.071779389779e-03 1833 KSP Residual norm 7.655424487629e-03 1834 KSP Residual norm 6.147071787004e-03 1835 KSP Residual norm 5.092429298832e-03 1836 KSP Residual norm 5.236830339569e-03 1837 KSP Residual norm 5.903585382046e-03 1838 KSP Residual norm 6.270704482619e-03 1839 KSP Residual norm 6.429789049653e-03 1840 KSP Residual norm 7.336468878096e-03 1841 KSP Residual norm 9.369292088197e-03 1842 KSP Residual norm 1.139526872507e-02 1843 KSP Residual norm 1.185995814746e-02 1844 KSP Residual norm 1.241981684210e-02 1845 KSP Residual norm 1.554131717488e-02 1846 KSP Residual norm 1.982680913327e-02 1847 KSP Residual norm 2.082327701056e-02 1848 KSP Residual norm 1.847297822087e-02 1849 KSP Residual norm 1.726200472182e-02 1850 KSP Residual norm 1.819892023894e-02 1851 KSP Residual norm 1.812721678711e-02 1852 KSP Residual norm 1.545821640092e-02 1853 KSP Residual norm 1.351946708467e-02 1854 KSP Residual norm 1.321421672781e-02 1855 KSP Residual norm 1.203583764597e-02 1856 KSP Residual norm 9.275182152481e-03 1857 KSP Residual norm 7.166020487625e-03 1858 KSP Residual norm 6.667041846874e-03 1859 KSP Residual norm 6.979920613231e-03 1860 KSP Residual norm 6.228819166919e-03 1861 KSP Residual norm 5.296069511341e-03 1862 KSP Residual norm 5.395934083308e-03 1863 KSP Residual norm 6.589992548511e-03 1864 KSP Residual norm 7.994790437338e-03 1865 KSP Residual norm 8.198103360516e-03 1866 KSP Residual norm 8.356807517315e-03 1867 KSP Residual norm 9.552692725974e-03 1868 KSP Residual norm 1.180930871093e-02 1869 KSP Residual norm 1.369013031159e-02 1870 KSP Residual norm 1.405625655625e-02 1871 KSP Residual norm 1.499059973109e-02 1872 KSP Residual norm 1.749830190835e-02 1873 KSP Residual norm 1.923240732681e-02 1874 KSP Residual norm 1.792581177987e-02 1875 KSP Residual norm 1.595420253315e-02 1876 KSP Residual norm 
1.592437927271e-02 1877 KSP Residual norm 1.571061792241e-02 1878 KSP Residual norm 1.330409361058e-02 1879 KSP Residual norm 1.011402845723e-02 1880 KSP Residual norm 8.385714312478e-03 1881 KSP Residual norm 7.619838379779e-03 1882 KSP Residual norm 7.079068831244e-03 1883 KSP Residual norm 6.263483895898e-03 1884 KSP Residual norm 5.416913551314e-03 1885 KSP Residual norm 5.248740770795e-03 1886 KSP Residual norm 5.823178578214e-03 1887 KSP Residual norm 5.856710332746e-03 1888 KSP Residual norm 5.340607713094e-03 1889 KSP Residual norm 5.372666907277e-03 1890 KSP Residual norm 6.254021658222e-03 1891 KSP Residual norm 7.002555893646e-03 1892 KSP Residual norm 7.119015191351e-03 1893 KSP Residual norm 7.398487660308e-03 1894 KSP Residual norm 8.598149935475e-03 1895 KSP Residual norm 9.082542888550e-03 1896 KSP Residual norm 8.823606871381e-03 1897 KSP Residual norm 8.351271646168e-03 1898 KSP Residual norm 8.792697059084e-03 1899 KSP Residual norm 9.434861567697e-03 1900 KSP Residual norm 8.739188946527e-03 1901 KSP Residual norm 7.439345048644e-03 1902 KSP Residual norm 6.988512905492e-03 1903 KSP Residual norm 7.266598646058e-03 1904 KSP Residual norm 6.860594789317e-03 1905 KSP Residual norm 5.869662347953e-03 1906 KSP Residual norm 5.189292046807e-03 1907 KSP Residual norm 4.698378415098e-03 1908 KSP Residual norm 4.125932694582e-03 1909 KSP Residual norm 3.623733579068e-03 1910 KSP Residual norm 3.397717857275e-03 1911 KSP Residual norm 3.450602416975e-03 1912 KSP Residual norm 3.371679742809e-03 1913 KSP Residual norm 3.211538717453e-03 1914 KSP Residual norm 3.381570484386e-03 1915 KSP Residual norm 3.752374891032e-03 1916 KSP Residual norm 4.180224899146e-03 1917 KSP Residual norm 4.425439107566e-03 1918 KSP Residual norm 4.789578199346e-03 1919 KSP Residual norm 5.284732777086e-03 1920 KSP Residual norm 5.612706090822e-03 1921 KSP Residual norm 5.861795033631e-03 1922 KSP Residual norm 6.262650523811e-03 1923 KSP Residual norm 7.405189723965e-03 1924 
KSP Residual norm 8.588777389142e-03 1925 KSP Residual norm 9.036165294655e-03 1926 KSP Residual norm 9.216393583533e-03 1927 KSP Residual norm 1.015521724006e-02 1928 KSP Residual norm 1.063074433343e-02 1929 KSP Residual norm 9.895151313230e-03 1930 KSP Residual norm 9.097228298051e-03 1931 KSP Residual norm 8.685730919586e-03 1932 KSP Residual norm 8.534987793340e-03 1933 KSP Residual norm 7.863482725812e-03 1934 KSP Residual norm 6.232590539545e-03 1935 KSP Residual norm 5.209143529906e-03 1936 KSP Residual norm 4.983723053720e-03 1937 KSP Residual norm 4.813332251178e-03 1938 KSP Residual norm 4.284521837285e-03 1939 KSP Residual norm 3.984519392867e-03 1940 KSP Residual norm 4.077399442331e-03 1941 KSP Residual norm 4.352043481077e-03 1942 KSP Residual norm 4.461858984941e-03 1943 KSP Residual norm 4.380420741836e-03 1944 KSP Residual norm 4.734130093898e-03 1945 KSP Residual norm 5.852860035227e-03 1946 KSP Residual norm 6.975950093485e-03 1947 KSP Residual norm 7.052942525567e-03 1948 KSP Residual norm 6.928412056660e-03 1949 KSP Residual norm 7.888872516775e-03 1950 KSP Residual norm 9.111899173159e-03 1951 KSP Residual norm 9.278983874453e-03 1952 KSP Residual norm 9.858700588481e-03 1953 KSP Residual norm 1.179287626505e-02 1954 KSP Residual norm 1.282807255394e-02 1955 KSP Residual norm 1.262679691404e-02 1956 KSP Residual norm 1.255775353882e-02 1957 KSP Residual norm 1.370183873868e-02 1958 KSP Residual norm 1.400213065512e-02 1959 KSP Residual norm 1.283815157619e-02 1960 KSP Residual norm 1.235915610084e-02 1961 KSP Residual norm 1.266753849066e-02 1962 KSP Residual norm 1.244201832218e-02 1963 KSP Residual norm 1.041024275825e-02 1964 KSP Residual norm 8.842758841380e-03 1965 KSP Residual norm 8.240799359774e-03 1966 KSP Residual norm 7.948646573032e-03 1967 KSP Residual norm 6.445879239365e-03 1968 KSP Residual norm 5.559004256048e-03 1969 KSP Residual norm 5.066960361645e-03 1970 KSP Residual norm 4.673923122700e-03 1971 KSP Residual norm 
[KSP monitor output, iterations 1972-2968: the residual norm oscillates between roughly 8.7e-04 and 1.8e-02 with no sustained decrease, i.e. the iteration stagnates rather than converging; the full per-iteration log is omitted here.]
KSP Residual norm 2.009009158130e-03 2970 KSP Residual norm 1.882226516836e-03 2971 KSP Residual norm 2.086086727174e-03 2972 KSP Residual norm 2.055611734290e-03 2973 KSP Residual norm 1.741240774921e-03 2974 KSP Residual norm 1.725175564643e-03 2975 KSP Residual norm 1.755915800229e-03 2976 KSP Residual norm 1.509528375407e-03 2977 KSP Residual norm 1.357637627266e-03 2978 KSP Residual norm 1.460549071024e-03 2979 KSP Residual norm 1.553492067748e-03 2980 KSP Residual norm 1.364009061525e-03 2981 KSP Residual norm 1.272013168916e-03 2982 KSP Residual norm 1.186072933342e-03 2983 KSP Residual norm 9.951101622586e-04 2984 KSP Residual norm 9.363652021728e-04 2985 KSP Residual norm 1.004675852639e-03 2986 KSP Residual norm 1.036029921093e-03 2987 KSP Residual norm 1.064462437764e-03 2988 KSP Residual norm 1.224997879950e-03 2989 KSP Residual norm 1.388057630221e-03 2990 KSP Residual norm 1.372660186962e-03 2991 KSP Residual norm 1.424045125325e-03 2992 KSP Residual norm 1.500297885504e-03 2993 KSP Residual norm 1.473977518522e-03 2994 KSP Residual norm 1.397325608450e-03 2995 KSP Residual norm 1.446577138766e-03 2996 KSP Residual norm 1.432956780215e-03 2997 KSP Residual norm 1.368682613462e-03 2998 KSP Residual norm 1.330389794315e-03 2999 KSP Residual norm 1.354215374244e-03 3000 KSP Residual norm 1.319668006136e-03 3001 KSP Residual norm 1.273994985774e-03 3002 KSP Residual norm 1.371379762475e-03 3003 KSP Residual norm 1.430753016456e-03 3004 KSP Residual norm 1.306663632696e-03 3005 KSP Residual norm 1.286112809999e-03 3006 KSP Residual norm 1.350415243744e-03 3007 KSP Residual norm 1.372214677136e-03 3008 KSP Residual norm 1.418110778472e-03 3009 KSP Residual norm 1.736144243771e-03 3010 KSP Residual norm 1.962396016012e-03 3011 KSP Residual norm 1.875622461591e-03 3012 KSP Residual norm 1.828878048051e-03 3013 KSP Residual norm 1.968093592774e-03 3014 KSP Residual norm 2.024599310890e-03 3015 KSP Residual norm 1.914434050527e-03 3016 KSP Residual norm 
1.949234213420e-03 3017 KSP Residual norm 2.059691091928e-03 3018 KSP Residual norm 2.230823155566e-03 3019 KSP Residual norm 2.452639340379e-03 3020 KSP Residual norm 2.884304396912e-03 3021 KSP Residual norm 3.358685292056e-03 3022 KSP Residual norm 3.553951903749e-03 3023 KSP Residual norm 3.449118175846e-03 3024 KSP Residual norm 3.300312969099e-03 3025 KSP Residual norm 3.141321863979e-03 3026 KSP Residual norm 3.189177447526e-03 3027 KSP Residual norm 3.586824420308e-03 3028 KSP Residual norm 4.318036306854e-03 3029 KSP Residual norm 4.643788568772e-03 3030 KSP Residual norm 4.370505229741e-03 3031 KSP Residual norm 4.161352510344e-03 3032 KSP Residual norm 4.426873460573e-03 3033 KSP Residual norm 4.727376409745e-03 3034 KSP Residual norm 4.935447116989e-03 3035 KSP Residual norm 5.195492858474e-03 3036 KSP Residual norm 5.262987991305e-03 3037 KSP Residual norm 4.988286243106e-03 3038 KSP Residual norm 4.838626754328e-03 3039 KSP Residual norm 5.135821517572e-03 3040 KSP Residual norm 5.128463280593e-03 3041 KSP Residual norm 4.894818274532e-03 3042 KSP Residual norm 5.157080366224e-03 3043 KSP Residual norm 5.899655632507e-03 3044 KSP Residual norm 6.311604201119e-03 3045 KSP Residual norm 6.503921463717e-03 3046 KSP Residual norm 7.045433203190e-03 3047 KSP Residual norm 7.822428362162e-03 3048 KSP Residual norm 7.801763914628e-03 3049 KSP Residual norm 7.803226763047e-03 3050 KSP Residual norm 8.170601241380e-03 3051 KSP Residual norm 7.843857344975e-03 3052 KSP Residual norm 7.654351738200e-03 3053 KSP Residual norm 8.644931839356e-03 3054 KSP Residual norm 9.699516109829e-03 3055 KSP Residual norm 9.647065952106e-03 3056 KSP Residual norm 1.006655408950e-02 3057 KSP Residual norm 1.023806234184e-02 3058 KSP Residual norm 9.532513741776e-03 3059 KSP Residual norm 9.513932902273e-03 3060 KSP Residual norm 1.005696053542e-02 3061 KSP Residual norm 9.512926491812e-03 3062 KSP Residual norm 8.959904449497e-03 3063 KSP Residual norm 9.498596804941e-03 3064 
KSP Residual norm 1.065376693957e-02 3065 KSP Residual norm 1.010658942622e-02 3066 KSP Residual norm 9.678241257087e-03 3067 KSP Residual norm 1.021317781344e-02 3068 KSP Residual norm 1.109162939418e-02 3069 KSP Residual norm 1.158666376649e-02 3070 KSP Residual norm 1.202557056106e-02 3071 KSP Residual norm 1.211144817720e-02 3072 KSP Residual norm 1.120304665676e-02 3073 KSP Residual norm 1.083553887500e-02 3074 KSP Residual norm 1.121522622165e-02 3075 KSP Residual norm 1.121260214238e-02 3076 KSP Residual norm 1.046440502336e-02 3077 KSP Residual norm 1.004708331370e-02 3078 KSP Residual norm 1.073391842848e-02 3079 KSP Residual norm 1.156613829371e-02 3080 KSP Residual norm 1.236452671945e-02 3081 KSP Residual norm 1.309923551733e-02 3082 KSP Residual norm 1.438114956817e-02 3083 KSP Residual norm 1.426149827407e-02 3084 KSP Residual norm 1.224280180376e-02 3085 KSP Residual norm 1.096181143829e-02 3086 KSP Residual norm 1.111920170108e-02 3087 KSP Residual norm 1.118340676655e-02 3088 KSP Residual norm 1.000130923922e-02 3089 KSP Residual norm 9.754845073746e-03 3090 KSP Residual norm 1.014216156263e-02 3091 KSP Residual norm 9.540691283703e-03 3092 KSP Residual norm 9.401965344054e-03 3093 KSP Residual norm 9.624902854579e-03 3094 KSP Residual norm 9.147961612330e-03 3095 KSP Residual norm 7.906191690820e-03 3096 KSP Residual norm 7.592020558342e-03 3097 KSP Residual norm 7.524035202212e-03 3098 KSP Residual norm 7.266627699893e-03 3099 KSP Residual norm 6.751911787101e-03 3100 KSP Residual norm 6.387038264013e-03 3101 KSP Residual norm 5.486121905795e-03 3102 KSP Residual norm 4.760691147831e-03 3103 KSP Residual norm 4.980285403119e-03 3104 KSP Residual norm 5.753887376636e-03 3105 KSP Residual norm 6.060393070534e-03 3106 KSP Residual norm 5.800149107397e-03 3107 KSP Residual norm 5.712092792131e-03 3108 KSP Residual norm 5.723257999910e-03 3109 KSP Residual norm 5.585497985460e-03 3110 KSP Residual norm 5.299240610802e-03 3111 KSP Residual norm 
5.185844312278e-03 3112 KSP Residual norm 5.180691238305e-03 3113 KSP Residual norm 5.170254632558e-03 3114 KSP Residual norm 5.031353871658e-03 3115 KSP Residual norm 4.516523814593e-03 3116 KSP Residual norm 3.891943664907e-03 3117 KSP Residual norm 3.527455407802e-03 3118 KSP Residual norm 3.276607605164e-03 3119 KSP Residual norm 3.137818567100e-03 3120 KSP Residual norm 3.510031082913e-03 3121 KSP Residual norm 4.090076535163e-03 3122 KSP Residual norm 4.204439159335e-03 3123 KSP Residual norm 4.519044874314e-03 3124 KSP Residual norm 4.711943201661e-03 3125 KSP Residual norm 4.108752640532e-03 3126 KSP Residual norm 3.479065521297e-03 3127 KSP Residual norm 3.495031577776e-03 3128 KSP Residual norm 3.934896362242e-03 3129 KSP Residual norm 4.070916898692e-03 3130 KSP Residual norm 4.288349051719e-03 3131 KSP Residual norm 4.317228547289e-03 3132 KSP Residual norm 3.946454044654e-03 3133 KSP Residual norm 3.779196047954e-03 3134 KSP Residual norm 4.032446591841e-03 3135 KSP Residual norm 4.018336837785e-03 3136 KSP Residual norm 3.607753498867e-03 3137 KSP Residual norm 3.481587781544e-03 3138 KSP Residual norm 3.455709777074e-03 3139 KSP Residual norm 3.363059582249e-03 3140 KSP Residual norm 3.313987873533e-03 3141 KSP Residual norm 3.237254013235e-03 3142 KSP Residual norm 2.768398774635e-03 3143 KSP Residual norm 2.468370301959e-03 3144 KSP Residual norm 2.535141088666e-03 3145 KSP Residual norm 2.928032024313e-03 3146 KSP Residual norm 3.847497590560e-03 3147 KSP Residual norm 4.242685138425e-03 3148 KSP Residual norm 3.615538528394e-03 3149 KSP Residual norm 3.113010251107e-03 3150 KSP Residual norm 3.162645118937e-03 3151 KSP Residual norm 3.317127403664e-03 3152 KSP Residual norm 3.471925769185e-03 3153 KSP Residual norm 3.828187151022e-03 3154 KSP Residual norm 4.090509587600e-03 3155 KSP Residual norm 3.897198613880e-03 3156 KSP Residual norm 3.639927871554e-03 3157 KSP Residual norm 3.528236759149e-03 3158 KSP Residual norm 3.438661404826e-03 3159 
KSP Residual norm 3.636352107113e-03 3160 KSP Residual norm 3.997719762216e-03 3161 KSP Residual norm 4.112496172284e-03 3162 KSP Residual norm 4.057648658384e-03 3163 KSP Residual norm 4.270657152407e-03 3164 KSP Residual norm 4.405984332441e-03 3165 KSP Residual norm 3.807035656515e-03 3166 KSP Residual norm 3.531658843093e-03 3167 KSP Residual norm 3.943140142473e-03 3168 KSP Residual norm 3.994005998218e-03 3169 KSP Residual norm 3.925806568403e-03 3170 KSP Residual norm 4.283447489515e-03 3171 KSP Residual norm 4.995750857361e-03 3172 KSP Residual norm 5.209611202407e-03 3173 KSP Residual norm 5.080346313836e-03 3174 KSP Residual norm 4.799734984449e-03 3175 KSP Residual norm 4.163699720599e-03 3176 KSP Residual norm 4.081417388337e-03 3177 KSP Residual norm 4.614329415968e-03 3178 KSP Residual norm 5.245797563056e-03 3179 KSP Residual norm 5.297680341569e-03 3180 KSP Residual norm 5.401341806411e-03 3181 KSP Residual norm 6.006059778329e-03 3182 KSP Residual norm 7.005651171496e-03 3183 KSP Residual norm 8.532646190371e-03 3184 KSP Residual norm 9.728916729099e-03 3185 KSP Residual norm 9.381147206066e-03 3186 KSP Residual norm 8.961874693029e-03 3187 KSP Residual norm 9.086673303911e-03 3188 KSP Residual norm 8.251175181581e-03 3189 KSP Residual norm 6.911038858331e-03 3190 KSP Residual norm 6.568919068833e-03 3191 KSP Residual norm 6.805271037313e-03 3192 KSP Residual norm 7.204067554180e-03 3193 KSP Residual norm 8.129184294157e-03 3194 KSP Residual norm 9.005369610526e-03 3195 KSP Residual norm 9.442528385542e-03 3196 KSP Residual norm 9.774101977640e-03 3197 KSP Residual norm 1.033055682201e-02 3198 KSP Residual norm 1.043221180782e-02 3199 KSP Residual norm 1.078135156047e-02 3200 KSP Residual norm 1.200591990423e-02 3201 KSP Residual norm 1.305363207925e-02 3202 KSP Residual norm 1.378489495984e-02 3203 KSP Residual norm 1.356199718732e-02 3204 KSP Residual norm 1.269109848995e-02 3205 KSP Residual norm 1.269489526675e-02 3206 KSP Residual norm 
1.451094867081e-02 3207 KSP Residual norm 1.567832583739e-02 3208 KSP Residual norm 1.532058906301e-02 3209 KSP Residual norm 1.576409641088e-02 3210 KSP Residual norm 1.643058038446e-02 3211 KSP Residual norm 1.557642354238e-02 3212 KSP Residual norm 1.552121527862e-02 3213 KSP Residual norm 1.628124738715e-02 3214 KSP Residual norm 1.537868036131e-02 3215 KSP Residual norm 1.411653453381e-02 3216 KSP Residual norm 1.438663611149e-02 3217 KSP Residual norm 1.586387854502e-02 3218 KSP Residual norm 1.532058834136e-02 3219 KSP Residual norm 1.381368567635e-02 3220 KSP Residual norm 1.269730312762e-02 3221 KSP Residual norm 1.185916433179e-02 3222 KSP Residual norm 1.137845876087e-02 3223 KSP Residual norm 1.122832688659e-02 3224 KSP Residual norm 1.142049772445e-02 3225 KSP Residual norm 1.150003013665e-02 3226 KSP Residual norm 1.329890835724e-02 3227 KSP Residual norm 1.690954609903e-02 3228 KSP Residual norm 1.907361819568e-02 3229 KSP Residual norm 1.836512146272e-02 3230 KSP Residual norm 1.644260808452e-02 3231 KSP Residual norm 1.429160276489e-02 3232 KSP Residual norm 1.238839069472e-02 3233 KSP Residual norm 1.206765651211e-02 3234 KSP Residual norm 1.162683853596e-02 3235 KSP Residual norm 1.080653244405e-02 3236 KSP Residual norm 9.770249579341e-03 3237 KSP Residual norm 9.143515025012e-03 3238 KSP Residual norm 8.858832879797e-03 3239 KSP Residual norm 9.023756798777e-03 3240 KSP Residual norm 1.001068990928e-02 3241 KSP Residual norm 1.069471425598e-02 3242 KSP Residual norm 1.020298319345e-02 3243 KSP Residual norm 8.865505274785e-03 3244 KSP Residual norm 7.383176934210e-03 3245 KSP Residual norm 7.414517430798e-03 3246 KSP Residual norm 9.001557144410e-03 3247 KSP Residual norm 1.022769544403e-02 3248 KSP Residual norm 9.785680285920e-03 3249 KSP Residual norm 9.142389815218e-03 3250 KSP Residual norm 8.536210708611e-03 3251 KSP Residual norm 6.867894613730e-03 3252 KSP Residual norm 5.895947524297e-03 3253 KSP Residual norm 5.818159631934e-03 3254 
KSP Residual norm 5.969135701545e-03 3255 KSP Residual norm 6.268075658928e-03 3256 KSP Residual norm 7.399276665667e-03 3257 KSP Residual norm 7.661427943296e-03 3258 KSP Residual norm 6.453249799503e-03 3259 KSP Residual norm 5.921344561978e-03 3260 KSP Residual norm 5.982986714829e-03 3261 KSP Residual norm 5.199720903837e-03 3262 KSP Residual norm 4.422412439531e-03 3263 KSP Residual norm 4.694366066974e-03 3264 KSP Residual norm 5.853588950046e-03 3265 KSP Residual norm 6.551662871828e-03 3266 KSP Residual norm 6.490056762985e-03 3267 KSP Residual norm 5.951388959900e-03 3268 KSP Residual norm 4.516412582620e-03 3269 KSP Residual norm 2.983165029024e-03 3270 KSP Residual norm 2.039595003348e-03 3271 KSP Residual norm 1.794976143482e-03 3272 KSP Residual norm 1.939361591023e-03 3273 KSP Residual norm 2.531434460060e-03 3274 KSP Residual norm 3.598802316831e-03 3275 KSP Residual norm 4.674419793336e-03 3276 KSP Residual norm 4.842539121028e-03 3277 KSP Residual norm 3.996983218608e-03 3278 KSP Residual norm 3.140164118259e-03 3279 KSP Residual norm 2.515405621939e-03 3280 KSP Residual norm 2.408966869639e-03 3281 KSP Residual norm 2.720855275036e-03 3282 KSP Residual norm 3.355954732026e-03 3283 KSP Residual norm 4.167591031500e-03 3284 KSP Residual norm 4.680635771816e-03 3285 KSP Residual norm 3.889527774632e-03 3286 KSP Residual norm 2.755805924728e-03 3287 KSP Residual norm 2.315840756050e-03 3288 KSP Residual norm 2.308845251531e-03 3289 KSP Residual norm 2.636645978699e-03 3290 KSP Residual norm 3.402433647969e-03 3291 KSP Residual norm 4.251620632374e-03 3292 KSP Residual norm 3.793730879779e-03 3293 KSP Residual norm 2.832866760978e-03 3294 KSP Residual norm 2.180505219503e-03 3295 KSP Residual norm 1.697614738254e-03 3296 KSP Residual norm 1.665784154063e-03 3297 KSP Residual norm 2.161024671321e-03 3298 KSP Residual norm 3.050830942266e-03 3299 KSP Residual norm 3.509991938504e-03 3300 KSP Residual norm 3.542214302572e-03 3301 KSP Residual norm 
3.106173541680e-03 3302 KSP Residual norm 2.376905396518e-03 3303 KSP Residual norm 2.173447876136e-03 3304 KSP Residual norm 2.308632256805e-03 3305 KSP Residual norm 2.176363396384e-03 3306 KSP Residual norm 1.813052173020e-03 3307 KSP Residual norm 1.624029390453e-03 3308 KSP Residual norm 1.559582321585e-03 3309 KSP Residual norm 1.519515691497e-03 3310 KSP Residual norm 1.893851598042e-03 3311 KSP Residual norm 2.836681978166e-03 3312 KSP Residual norm 3.660679422597e-03 3313 KSP Residual norm 3.559310041600e-03 3314 KSP Residual norm 3.105166606565e-03 3315 KSP Residual norm 2.343716525519e-03 3316 KSP Residual norm 1.722636638757e-03 3317 KSP Residual norm 1.443878171809e-03 3318 KSP Residual norm 1.516598879532e-03 3319 KSP Residual norm 1.954914482820e-03 3320 KSP Residual norm 2.782514594175e-03 3321 KSP Residual norm 3.700013789797e-03 3322 KSP Residual norm 3.772175780929e-03 3323 KSP Residual norm 2.935475918005e-03 3324 KSP Residual norm 2.083878885861e-03 3325 KSP Residual norm 1.703560777235e-03 3326 KSP Residual norm 1.706262679731e-03 3327 KSP Residual norm 2.224070904846e-03 3328 KSP Residual norm 3.404123082333e-03 3329 KSP Residual norm 5.017399865633e-03 3330 KSP Residual norm 5.328545072862e-03 3331 KSP Residual norm 4.110554961613e-03 3332 KSP Residual norm 3.055932463793e-03 3333 KSP Residual norm 2.434093665887e-03 3334 KSP Residual norm 2.245025757791e-03 3335 KSP Residual norm 2.597381919391e-03 3336 KSP Residual norm 3.608560344999e-03 3337 KSP Residual norm 5.024544092493e-03 3338 KSP Residual norm 6.757773177279e-03 3339 KSP Residual norm 7.236990599686e-03 3340 KSP Residual norm 5.334018579363e-03 3341 KSP Residual norm 3.670426817844e-03 3342 KSP Residual norm 3.458658010460e-03 3343 KSP Residual norm 4.177897662309e-03 3344 KSP Residual norm 5.361834008504e-03 3345 KSP Residual norm 7.448364528135e-03 3346 KSP Residual norm 1.161156259565e-02 3347 KSP Residual norm 1.466722425199e-02 3348 KSP Residual norm 1.223571276010e-02 3349 
KSP Residual norm 8.515548591274e-03 3350 KSP Residual norm 5.560222541152e-03 3351 KSP Residual norm 3.737752004715e-03 3352 KSP Residual norm 3.374691477137e-03 3353 KSP Residual norm 4.248075403035e-03 3354 KSP Residual norm 6.454549110546e-03 3355 KSP Residual norm 9.253753441211e-03 3356 KSP Residual norm 1.172261739627e-02 3357 KSP Residual norm 1.293159358061e-02 3358 KSP Residual norm 1.327533129647e-02 3359 KSP Residual norm 1.421858697334e-02 3360 KSP Residual norm 1.588189853110e-02 3361 KSP Residual norm 1.468433443893e-02 3362 KSP Residual norm 1.061499850396e-02 3363 KSP Residual norm 7.238306118037e-03 3364 KSP Residual norm 5.295484481415e-03 3365 KSP Residual norm 5.030433624303e-03 3366 KSP Residual norm 6.448894481015e-03 3367 KSP Residual norm 1.005121245882e-02 3368 KSP Residual norm 1.460014037975e-02 3369 KSP Residual norm 1.624683604566e-02 3370 KSP Residual norm 1.433959148724e-02 3371 KSP Residual norm 1.160450343864e-02 3372 KSP Residual norm 9.605938004674e-03 3373 KSP Residual norm 8.899127743228e-03 3374 KSP Residual norm 9.370299718388e-03 3375 KSP Residual norm 9.380021626241e-03 3376 KSP Residual norm 8.342611820587e-03 3377 KSP Residual norm 6.985668548425e-03 3378 KSP Residual norm 6.327982854763e-03 3379 KSP Residual norm 6.591958145806e-03 3380 KSP Residual norm 7.177543030292e-03 3381 KSP Residual norm 7.853954858680e-03 3382 KSP Residual norm 8.120645681643e-03 3383 KSP Residual norm 9.053611504499e-03 3384 KSP Residual norm 1.136035037033e-02 3385 KSP Residual norm 1.304283230750e-02 3386 KSP Residual norm 1.163407764338e-02 3387 KSP Residual norm 8.409491414743e-03 3388 KSP Residual norm 6.076571569911e-03 3389 KSP Residual norm 5.063308200875e-03 3390 KSP Residual norm 5.248174315404e-03 3391 KSP Residual norm 6.560021912350e-03 3392 KSP Residual norm 7.415905842330e-03 3393 KSP Residual norm 7.258096522426e-03 3394 KSP Residual norm 7.519030670535e-03 3395 KSP Residual norm 7.947065118337e-03 3396 KSP Residual norm 
8.439483136767e-03 3397 KSP Residual norm 9.153467368035e-03 3398 KSP Residual norm 8.306626157337e-03 3399 KSP Residual norm 5.999621700177e-03 3400 KSP Residual norm 4.249803258353e-03 3401 KSP Residual norm 3.266740729237e-03 3402 KSP Residual norm 2.962905410642e-03 3403 KSP Residual norm 3.294654846176e-03 3404 KSP Residual norm 3.473894366545e-03 3405 KSP Residual norm 3.310995992788e-03 3406 KSP Residual norm 3.726455952379e-03 3407 KSP Residual norm 4.885673388954e-03 3408 KSP Residual norm 5.546168850270e-03 3409 KSP Residual norm 5.740402294123e-03 3410 KSP Residual norm 6.079770275235e-03 3411 KSP Residual norm 6.014014871281e-03 3412 KSP Residual norm 6.020096376792e-03 3413 KSP Residual norm 5.968133126080e-03 3414 KSP Residual norm 5.059945350920e-03 3415 KSP Residual norm 3.587378586357e-03 3416 KSP Residual norm 2.822033047601e-03 3417 KSP Residual norm 2.513662279448e-03 3418 KSP Residual norm 2.379184307503e-03 3419 KSP Residual norm 2.329629264453e-03 3420 KSP Residual norm 2.204936400537e-03 3421 KSP Residual norm 2.085485526127e-03 3422 KSP Residual norm 2.239133298867e-03 3423 KSP Residual norm 2.781163346688e-03 3424 KSP Residual norm 3.467984178772e-03 3425 KSP Residual norm 3.766608644314e-03 3426 KSP Residual norm 3.542590969970e-03 3427 KSP Residual norm 3.021545797448e-03 3428 KSP Residual norm 2.575151560536e-03 3429 KSP Residual norm 2.360653493124e-03 3430 KSP Residual norm 2.316791073053e-03 3431 KSP Residual norm 2.217026763308e-03 3432 KSP Residual norm 1.992379378075e-03 3433 KSP Residual norm 1.753332197379e-03 3434 KSP Residual norm 1.757412270373e-03 3435 KSP Residual norm 2.202389081792e-03 3436 KSP Residual norm 2.686035122139e-03 3437 KSP Residual norm 2.499700048378e-03 3438 KSP Residual norm 2.244939603433e-03 3439 KSP Residual norm 2.330553771967e-03 3440 KSP Residual norm 2.624392189817e-03 3441 KSP Residual norm 2.623874272572e-03 3442 KSP Residual norm 2.070401979000e-03 3443 KSP Residual norm 1.444355694733e-03 3444 
KSP Residual norm 1.094870722720e-03 3445 KSP Residual norm 9.607246377444e-04 3446 KSP Residual norm 8.651114234745e-04 3447 KSP Residual norm 7.499264278678e-04 3448 KSP Residual norm 6.984004105417e-04 3449 KSP Residual norm 7.806927786775e-04 3450 KSP Residual norm 1.030933255808e-03 3451 KSP Residual norm 1.398237545698e-03 3452 KSP Residual norm 1.562644637794e-03 3453 KSP Residual norm 1.511551810266e-03 3454 KSP Residual norm 1.473179602700e-03 3455 KSP Residual norm 1.517294639902e-03 3456 KSP Residual norm 1.480657264659e-03 3457 KSP Residual norm 1.201818362205e-03 3458 KSP Residual norm 8.991204921500e-04 3459 KSP Residual norm 8.256666138973e-04 3460 KSP Residual norm 8.916031729385e-04 3461 KSP Residual norm 9.093828933303e-04 3462 KSP Residual norm 7.921076595761e-04 3463 KSP Residual norm 6.812389654892e-04 3464 KSP Residual norm 6.586554280163e-04 3465 KSP Residual norm 7.291705292598e-04 3466 KSP Residual norm 8.036112210929e-04 3467 KSP Residual norm 8.366608045404e-04 3468 KSP Residual norm 7.963892254702e-04 3469 KSP Residual norm 7.877505216885e-04 3470 KSP Residual norm 8.353415436455e-04 3471 KSP Residual norm 9.277720257402e-04 3472 KSP Residual norm 1.101942335291e-03 3473 KSP Residual norm 1.067069499995e-03 3474 KSP Residual norm 8.006057154162e-04 3475 KSP Residual norm 6.566476734925e-04 3476 KSP Residual norm 6.350126093126e-04 3477 KSP Residual norm 6.742107335825e-04 3478 KSP Residual norm 6.354443420061e-04 3479 KSP Residual norm 5.179732380869e-04 3480 KSP Residual norm 4.121368114192e-04 3481 KSP Residual norm 3.787878580931e-04 3482 KSP Residual norm 4.440028813957e-04 3483 KSP Residual norm 5.114775212055e-04 3484 KSP Residual norm 4.743065925748e-04 3485 KSP Residual norm 4.294702343993e-04 3486 KSP Residual norm 4.657060853571e-04 3487 KSP Residual norm 5.852635504900e-04 3488 KSP Residual norm 7.011058864148e-04 3489 KSP Residual norm 7.195693799062e-04 3490 KSP Residual norm 6.535302888204e-04 3491 KSP Residual norm 
5.841926553952e-04 3492 KSP Residual norm 5.497100454829e-04 3493 KSP Residual norm 5.255295851755e-04 3494 KSP Residual norm 4.356822844171e-04 3495 KSP Residual norm 3.685597805375e-04 3496 KSP Residual norm 3.330568822722e-04 3497 KSP Residual norm 3.421860146338e-04 3498 KSP Residual norm 3.690993881108e-04 3499 KSP Residual norm 3.532750971556e-04 3500 KSP Residual norm 3.049433515967e-04 3501 KSP Residual norm 2.931011824341e-04 3502 KSP Residual norm 3.440663736710e-04 3503 KSP Residual norm 4.632817903386e-04 3504 KSP Residual norm 5.561854642997e-04 3505 KSP Residual norm 5.630559169398e-04 3506 KSP Residual norm 5.380430164398e-04 3507 KSP Residual norm 6.104820630153e-04 3508 KSP Residual norm 8.274878968856e-04 3509 KSP Residual norm 9.875315649218e-04 3510 KSP Residual norm 9.092201828164e-04 3511 KSP Residual norm 7.757570077409e-04 3512 KSP Residual norm 7.559988057923e-04 3513 KSP Residual norm 8.803879684377e-04 3514 KSP Residual norm 9.508967221048e-04 3515 KSP Residual norm 7.389022793062e-04 3516 KSP Residual norm 6.193006140846e-04 3517 KSP Residual norm 6.570425920917e-04 3518 KSP Residual norm 7.832796504841e-04 3519 KSP Residual norm 8.532010926067e-04 3520 KSP Residual norm 8.208984225632e-04 3521 KSP Residual norm 7.932871011582e-04 3522 KSP Residual norm 8.798936491137e-04 3523 KSP Residual norm 1.070383262272e-03 3524 KSP Residual norm 1.195189003322e-03 3525 KSP Residual norm 1.205191158875e-03 3526 KSP Residual norm 1.266488170870e-03 3527 KSP Residual norm 1.554374548755e-03 3528 KSP Residual norm 2.023641804266e-03 3529 KSP Residual norm 2.045493343792e-03 3530 KSP Residual norm 1.478040580461e-03 3531 KSP Residual norm 1.075182917332e-03 3532 KSP Residual norm 9.771798637772e-04 3533 KSP Residual norm 1.067532871406e-03 3534 KSP Residual norm 1.030407985674e-03 3535 KSP Residual norm 8.620271812682e-04 3536 KSP Residual norm 8.317543992205e-04 3537 KSP Residual norm 1.015062917753e-03 3538 KSP Residual norm 1.334676394032e-03 3539 
KSP Residual norm 1.496384546331e-03 3540 KSP Residual norm 1.441118149914e-03 3541 KSP Residual norm 1.525326786434e-03 3542 KSP Residual norm 1.931801313192e-03 3543 KSP Residual norm 2.280673015354e-03 3544 KSP Residual norm 2.018253041515e-03 3545 KSP Residual norm 1.678637829758e-03 3546 KSP Residual norm 1.639551928886e-03 3547 KSP Residual norm 1.806076301212e-03 3548 KSP Residual norm 1.997638456010e-03 3549 KSP Residual norm 1.877914332868e-03 3550 KSP Residual norm 1.484076951522e-03 3551 KSP Residual norm 1.330692024164e-03 3552 KSP Residual norm 1.437317909713e-03 3553 KSP Residual norm 1.613975703049e-03 3554 KSP Residual norm 1.673509057881e-03 3555 KSP Residual norm 1.613179476473e-03 3556 KSP Residual norm 1.724258525116e-03 3557 KSP Residual norm 2.249061196567e-03 3558 KSP Residual norm 3.032330505208e-03 3559 KSP Residual norm 3.311623721401e-03 3560 KSP Residual norm 3.128816736141e-03 3561 KSP Residual norm 3.171573670495e-03 3562 KSP Residual norm 3.733094582663e-03 3563 KSP Residual norm 4.097017261153e-03 3564 KSP Residual norm 3.513162399956e-03 3565 KSP Residual norm 2.948428313749e-03 3566 KSP Residual norm 2.901393034600e-03 3567 KSP Residual norm 2.984534470525e-03 3568 KSP Residual norm 2.779805984533e-03 3569 KSP Residual norm 2.432290259025e-03 3570 KSP Residual norm 2.389515820843e-03 3571 KSP Residual norm 2.593743319429e-03 3572 KSP Residual norm 2.785783894119e-03 3573 KSP Residual norm 2.501197636017e-03 3574 KSP Residual norm 2.082103648580e-03 3575 KSP Residual norm 2.041599248973e-03 3576 KSP Residual norm 2.399253747622e-03 3577 KSP Residual norm 2.732195497203e-03 3578 KSP Residual norm 2.709450757970e-03 3579 KSP Residual norm 2.626285332542e-03 3580 KSP Residual norm 3.036264845884e-03 3581 KSP Residual norm 4.021274289136e-03 3582 KSP Residual norm 4.881630819856e-03 3583 KSP Residual norm 4.877683385218e-03 3584 KSP Residual norm 4.338828882271e-03 3585 KSP Residual norm 4.245469786156e-03 3586 KSP Residual norm 
[Roughly 1000 lines of raw KSP residual-monitor output trimmed: iterations 3586 through 4584 report "KSP Residual norm" values oscillating between about 8.0e-04 and 8.0e-02 with no sustained decrease, i.e. the iterative solve is stagnating rather than converging.]
KSP Residual norm 4.354906564723e-03 4585 KSP Residual norm 4.314425355341e-03 4586 KSP Residual norm 4.232315007032e-03 4587 KSP Residual norm 4.262720403959e-03 4588 KSP Residual norm 4.539999824623e-03 4589 KSP Residual norm 4.867119743444e-03 4590 KSP Residual norm 5.381570729403e-03 4591 KSP Residual norm 6.450614716025e-03 4592 KSP Residual norm 7.690289509690e-03 4593 KSP Residual norm 8.000161345832e-03 4594 KSP Residual norm 7.819448166282e-03 4595 KSP Residual norm 8.008696594533e-03 4596 KSP Residual norm 7.867188305389e-03 4597 KSP Residual norm 7.625887126866e-03 4598 KSP Residual norm 8.289235801512e-03 4599 KSP Residual norm 9.568200090172e-03 4600 KSP Residual norm 9.906722861646e-03 4601 KSP Residual norm 9.959100563549e-03 4602 KSP Residual norm 1.002358188073e-02 4603 KSP Residual norm 9.892212356164e-03 4604 KSP Residual norm 1.007772310809e-02 4605 KSP Residual norm 1.094742253074e-02 4606 KSP Residual norm 1.204157128724e-02 4607 KSP Residual norm 1.281646958731e-02 4608 KSP Residual norm 1.215726535352e-02 4609 KSP Residual norm 1.093467492496e-02 4610 KSP Residual norm 1.069801877084e-02 4611 KSP Residual norm 1.077395405875e-02 4612 KSP Residual norm 1.075960710753e-02 4613 KSP Residual norm 1.013651319526e-02 4614 KSP Residual norm 9.710897936448e-03 4615 KSP Residual norm 9.182699053929e-03 4616 KSP Residual norm 8.293399133489e-03 4617 KSP Residual norm 7.723147755421e-03 4618 KSP Residual norm 7.991858294129e-03 4619 KSP Residual norm 8.938344640381e-03 4620 KSP Residual norm 8.654439004309e-03 4621 KSP Residual norm 7.181141301408e-03 4622 KSP Residual norm 6.357230595613e-03 4623 KSP Residual norm 6.422117995639e-03 4624 KSP Residual norm 6.483063368787e-03 4625 KSP Residual norm 6.865442041828e-03 4626 KSP Residual norm 7.410354335749e-03 4627 KSP Residual norm 7.197010267209e-03 4628 KSP Residual norm 6.824861693545e-03 4629 KSP Residual norm 6.897666276806e-03 4630 KSP Residual norm 7.506296015931e-03 4631 KSP Residual norm 
7.905088162358e-03 4632 KSP Residual norm 7.862757023938e-03 4633 KSP Residual norm 8.544522738893e-03 4634 KSP Residual norm 9.360255424249e-03 4635 KSP Residual norm 9.179099451290e-03 4636 KSP Residual norm 8.286752076018e-03 4637 KSP Residual norm 7.746846310036e-03 4638 KSP Residual norm 7.534871355381e-03 4639 KSP Residual norm 7.282719682088e-03 4640 KSP Residual norm 7.553492203465e-03 4641 KSP Residual norm 9.295155557681e-03 4642 KSP Residual norm 1.171992700109e-02 4643 KSP Residual norm 1.286637312900e-02 4644 KSP Residual norm 1.271570269138e-02 4645 KSP Residual norm 1.298333595908e-02 4646 KSP Residual norm 1.347439695833e-02 4647 KSP Residual norm 1.305011359740e-02 4648 KSP Residual norm 1.306731634648e-02 4649 KSP Residual norm 1.289962866463e-02 4650 KSP Residual norm 1.268087597179e-02 4651 KSP Residual norm 1.181452450529e-02 4652 KSP Residual norm 1.162013706209e-02 4653 KSP Residual norm 1.227906872619e-02 4654 KSP Residual norm 1.325070303771e-02 4655 KSP Residual norm 1.337762070145e-02 4656 KSP Residual norm 1.315158500085e-02 4657 KSP Residual norm 1.280614725143e-02 4658 KSP Residual norm 1.279197585693e-02 4659 KSP Residual norm 1.297450067139e-02 4660 KSP Residual norm 1.288594217896e-02 4661 KSP Residual norm 1.324791314990e-02 4662 KSP Residual norm 1.332737571601e-02 4663 KSP Residual norm 1.377272971364e-02 4664 KSP Residual norm 1.470093687998e-02 4665 KSP Residual norm 1.521424957280e-02 4666 KSP Residual norm 1.426627944548e-02 4667 KSP Residual norm 1.334366572410e-02 4668 KSP Residual norm 1.269344829471e-02 4669 KSP Residual norm 1.157214060039e-02 4670 KSP Residual norm 1.089210980287e-02 4671 KSP Residual norm 1.138383261846e-02 4672 KSP Residual norm 1.239333059775e-02 4673 KSP Residual norm 1.238727732204e-02 4674 KSP Residual norm 1.115915590468e-02 4675 KSP Residual norm 9.818285958862e-03 4676 KSP Residual norm 8.565492420102e-03 4677 KSP Residual norm 8.386541780752e-03 4678 KSP Residual norm 9.516688861134e-03 4679 
KSP Residual norm 1.055328475215e-02 4680 KSP Residual norm 1.103554801045e-02 4681 KSP Residual norm 1.214862504166e-02 4682 KSP Residual norm 1.334068555972e-02 4683 KSP Residual norm 1.261599459542e-02 4684 KSP Residual norm 1.226182142533e-02 4685 KSP Residual norm 1.213480550847e-02 4686 KSP Residual norm 1.128132550377e-02 4687 KSP Residual norm 1.061403127221e-02 4688 KSP Residual norm 1.063349230992e-02 4689 KSP Residual norm 1.100848987199e-02 4690 KSP Residual norm 1.182402965199e-02 4691 KSP Residual norm 1.241531199741e-02 4692 KSP Residual norm 1.178730273607e-02 4693 KSP Residual norm 1.118470952312e-02 4694 KSP Residual norm 1.088443059126e-02 4695 KSP Residual norm 1.124772832917e-02 4696 KSP Residual norm 1.227516148048e-02 4697 KSP Residual norm 1.361596978640e-02 4698 KSP Residual norm 1.481280206513e-02 4699 KSP Residual norm 1.503265598500e-02 4700 KSP Residual norm 1.405032574327e-02 4701 KSP Residual norm 1.349902341937e-02 4702 KSP Residual norm 1.431737388001e-02 4703 KSP Residual norm 1.510908150518e-02 4704 KSP Residual norm 1.493536282337e-02 4705 KSP Residual norm 1.551422531292e-02 4706 KSP Residual norm 1.734336603421e-02 4707 KSP Residual norm 2.002501626185e-02 4708 KSP Residual norm 2.437462491444e-02 4709 KSP Residual norm 2.787437687602e-02 4710 KSP Residual norm 2.969851828531e-02 4711 KSP Residual norm 3.218934254707e-02 4712 KSP Residual norm 3.334929843073e-02 4713 KSP Residual norm 3.362831222543e-02 4714 KSP Residual norm 3.716477564943e-02 4715 KSP Residual norm 3.912634467630e-02 4716 KSP Residual norm 3.385720793202e-02 4717 KSP Residual norm 3.148142322739e-02 4718 KSP Residual norm 3.614145952465e-02 4719 KSP Residual norm 4.372417586063e-02 4720 KSP Residual norm 4.681159484674e-02 4721 KSP Residual norm 4.927379738417e-02 4722 KSP Residual norm 5.332342467052e-02 4723 KSP Residual norm 5.711397446185e-02 4724 KSP Residual norm 5.630133139287e-02 4725 KSP Residual norm 5.150282811884e-02 4726 KSP Residual norm 
4.848854259112e-02 4727 KSP Residual norm 4.839766712392e-02 4728 KSP Residual norm 4.434141718208e-02 4729 KSP Residual norm 3.899315766144e-02 4730 KSP Residual norm 3.714692830255e-02 4731 KSP Residual norm 3.808062959376e-02 4732 KSP Residual norm 4.016749477394e-02 4733 KSP Residual norm 4.266357705396e-02 4734 KSP Residual norm 4.339169673220e-02 4735 KSP Residual norm 4.319612095025e-02 4736 KSP Residual norm 4.215492209268e-02 4737 KSP Residual norm 4.114426642373e-02 4738 KSP Residual norm 3.930242419604e-02 4739 KSP Residual norm 3.635129552145e-02 4740 KSP Residual norm 3.390812645207e-02 4741 KSP Residual norm 3.148750122687e-02 4742 KSP Residual norm 2.999444339151e-02 4743 KSP Residual norm 2.840582120708e-02 4744 KSP Residual norm 2.762142759199e-02 4745 KSP Residual norm 2.756049086040e-02 4746 KSP Residual norm 2.528557249255e-02 4747 KSP Residual norm 1.985618261662e-02 4748 KSP Residual norm 1.704874133330e-02 4749 KSP Residual norm 1.702916365609e-02 4750 KSP Residual norm 1.760842219336e-02 4751 KSP Residual norm 1.789682033856e-02 4752 KSP Residual norm 1.713223185387e-02 4753 KSP Residual norm 1.569896355416e-02 4754 KSP Residual norm 1.436561731637e-02 4755 KSP Residual norm 1.359599960161e-02 4756 KSP Residual norm 1.300462886490e-02 4757 KSP Residual norm 1.245207510571e-02 4758 KSP Residual norm 1.202992508680e-02 4759 KSP Residual norm 1.253906473485e-02 4760 KSP Residual norm 1.370882739476e-02 4761 KSP Residual norm 1.427135535699e-02 4762 KSP Residual norm 1.239537244118e-02 4763 KSP Residual norm 1.231282471630e-02 4764 KSP Residual norm 1.422684655278e-02 4765 KSP Residual norm 1.553672450769e-02 4766 KSP Residual norm 1.642684069468e-02 4767 KSP Residual norm 1.800600010207e-02 4768 KSP Residual norm 1.919540710086e-02 4769 KSP Residual norm 1.740662476578e-02 4770 KSP Residual norm 1.477995780205e-02 4771 KSP Residual norm 1.399626756825e-02 4772 KSP Residual norm 1.511396159601e-02 4773 KSP Residual norm 1.688380847740e-02 4774 
KSP Residual norm 1.916919031668e-02 4775 KSP Residual norm 2.244179738950e-02 4776 KSP Residual norm 2.656126852966e-02 4777 KSP Residual norm 2.897145629632e-02 4778 KSP Residual norm 2.940996576390e-02 4779 KSP Residual norm 2.825507853185e-02 4780 KSP Residual norm 2.845705802397e-02 4781 KSP Residual norm 2.886157826641e-02 4782 KSP Residual norm 2.701527474462e-02 4783 KSP Residual norm 2.642677855064e-02 4784 KSP Residual norm 2.901462934352e-02 4785 KSP Residual norm 3.402873026474e-02 4786 KSP Residual norm 3.853720495238e-02 4787 KSP Residual norm 3.850369558944e-02 4788 KSP Residual norm 3.196678228647e-02 4789 KSP Residual norm 2.634458486650e-02 4790 KSP Residual norm 2.410137337534e-02 4791 KSP Residual norm 2.516763698972e-02 4792 KSP Residual norm 2.772430350738e-02 4793 KSP Residual norm 2.689201040742e-02 4794 KSP Residual norm 2.436219820416e-02 4795 KSP Residual norm 2.261589709462e-02 4796 KSP Residual norm 2.277671648243e-02 4797 KSP Residual norm 2.562926450131e-02 4798 KSP Residual norm 3.042263996726e-02 4799 KSP Residual norm 3.251990926068e-02 4800 KSP Residual norm 3.085956605997e-02 4801 KSP Residual norm 2.646868399630e-02 4802 KSP Residual norm 2.386192171524e-02 4803 KSP Residual norm 2.261458721256e-02 4804 KSP Residual norm 2.062660091819e-02 4805 KSP Residual norm 1.869516568364e-02 4806 KSP Residual norm 1.760108480192e-02 4807 KSP Residual norm 1.754443504284e-02 4808 KSP Residual norm 1.796042878408e-02 4809 KSP Residual norm 1.788378531861e-02 4810 KSP Residual norm 1.530127019498e-02 4811 KSP Residual norm 1.215105364291e-02 4812 KSP Residual norm 1.084502145397e-02 4813 KSP Residual norm 1.040249650107e-02 4814 KSP Residual norm 9.958002318289e-03 4815 KSP Residual norm 9.097127761621e-03 4816 KSP Residual norm 8.910933318721e-03 4817 KSP Residual norm 9.315951713342e-03 4818 KSP Residual norm 9.950766199381e-03 4819 KSP Residual norm 9.998912624615e-03 4820 KSP Residual norm 9.455630025168e-03 4821 KSP Residual norm 
8.270036764891e-03 4822 KSP Residual norm 7.859719892972e-03 4823 KSP Residual norm 8.155333463607e-03 4824 KSP Residual norm 8.272047439841e-03 4825 KSP Residual norm 8.085623764538e-03 4826 KSP Residual norm 7.547949352706e-03 4827 KSP Residual norm 6.644991843527e-03 4828 KSP Residual norm 5.462484033838e-03 4829 KSP Residual norm 4.988850411074e-03 4830 KSP Residual norm 5.102907622420e-03 4831 KSP Residual norm 5.449278214594e-03 4832 KSP Residual norm 5.720242187523e-03 4833 KSP Residual norm 6.082833267210e-03 4834 KSP Residual norm 5.929903839415e-03 4835 KSP Residual norm 4.995252054354e-03 4836 KSP Residual norm 4.491057907517e-03 4837 KSP Residual norm 4.837108278648e-03 4838 KSP Residual norm 6.055193862859e-03 4839 KSP Residual norm 6.920685278467e-03 4840 KSP Residual norm 7.231459824192e-03 4841 KSP Residual norm 8.107222146470e-03 4842 KSP Residual norm 9.308239239686e-03 4843 KSP Residual norm 9.871463989251e-03 4844 KSP Residual norm 1.002484830643e-02 4845 KSP Residual norm 1.026884090543e-02 4846 KSP Residual norm 1.037122151751e-02 4847 KSP Residual norm 9.770518532811e-03 4848 KSP Residual norm 9.286292396229e-03 4849 KSP Residual norm 9.804405914052e-03 4850 KSP Residual norm 1.042391520388e-02 4851 KSP Residual norm 1.065719222685e-02 4852 KSP Residual norm 1.111461650311e-02 4853 KSP Residual norm 1.172781563300e-02 4854 KSP Residual norm 1.265211396877e-02 4855 KSP Residual norm 1.306519731946e-02 4856 KSP Residual norm 1.287582554304e-02 4857 KSP Residual norm 1.232046664975e-02 4858 KSP Residual norm 1.194964402615e-02 4859 KSP Residual norm 1.148504283675e-02 4860 KSP Residual norm 1.070334349280e-02 4861 KSP Residual norm 1.065549411294e-02 4862 KSP Residual norm 1.068936895043e-02 4863 KSP Residual norm 1.001884044768e-02 4864 KSP Residual norm 8.632443770420e-03 4865 KSP Residual norm 6.928568699760e-03 4866 KSP Residual norm 5.646926488273e-03 4867 KSP Residual norm 4.978013413895e-03 4868 KSP Residual norm 4.959400548923e-03 4869 
KSP Residual norm 5.160469696160e-03 4870 KSP Residual norm 4.971745575577e-03 4871 KSP Residual norm 4.928852565150e-03 4872 KSP Residual norm 5.549808642864e-03 4873 KSP Residual norm 6.227012788396e-03 4874 KSP Residual norm 5.897499897790e-03 4875 KSP Residual norm 4.857923099915e-03 4876 KSP Residual norm 4.121677201552e-03 4877 KSP Residual norm 4.073826787684e-03 4878 KSP Residual norm 4.450419215251e-03 4879 KSP Residual norm 5.263443308304e-03 4880 KSP Residual norm 6.394327906829e-03 4881 KSP Residual norm 6.381629515620e-03 4882 KSP Residual norm 5.412036775608e-03 4883 KSP Residual norm 4.694628047564e-03 4884 KSP Residual norm 4.657643136149e-03 4885 KSP Residual norm 5.266059715368e-03 4886 KSP Residual norm 6.271757358305e-03 4887 KSP Residual norm 6.796039189475e-03 4888 KSP Residual norm 6.379619582637e-03 4889 KSP Residual norm 6.246866960254e-03 4890 KSP Residual norm 6.591449778191e-03 4891 KSP Residual norm 6.257950698969e-03 4892 KSP Residual norm 6.083635246191e-03 4893 KSP Residual norm 6.257576495005e-03 4894 KSP Residual norm 5.832728742303e-03 4895 KSP Residual norm 4.989826804701e-03 4896 KSP Residual norm 4.185729388971e-03 4897 KSP Residual norm 3.749863191958e-03 4898 KSP Residual norm 4.055073930552e-03 4899 KSP Residual norm 5.203687735174e-03 4900 KSP Residual norm 7.289298072190e-03 4901 KSP Residual norm 9.765633568643e-03 4902 KSP Residual norm 1.080706924468e-02 4903 KSP Residual norm 1.002662072119e-02 4904 KSP Residual norm 8.988931475513e-03 4905 KSP Residual norm 8.453173556504e-03 4906 KSP Residual norm 9.320737266791e-03 4907 KSP Residual norm 1.047363050911e-02 4908 KSP Residual norm 1.076803557135e-02 4909 KSP Residual norm 1.039213410634e-02 4910 KSP Residual norm 9.155568292591e-03 4911 KSP Residual norm 7.626370109756e-03 4912 KSP Residual norm 6.557297619764e-03 4913 KSP Residual norm 5.688947686707e-03 4914 KSP Residual norm 5.307608385522e-03 4915 KSP Residual norm 5.402110500513e-03 4916 KSP Residual norm 
5.758432233635e-03 4917 KSP Residual norm 6.434845501485e-03 4918 KSP Residual norm 7.111541646617e-03 4919 KSP Residual norm 7.488030948720e-03 4920 KSP Residual norm 6.705354653259e-03 4921 KSP Residual norm 4.790018037051e-03 4922 KSP Residual norm 3.362452309240e-03 4923 KSP Residual norm 2.946761601425e-03 4924 KSP Residual norm 3.390387075941e-03 4925 KSP Residual norm 4.241381450779e-03 4926 KSP Residual norm 5.270902723724e-03 4927 KSP Residual norm 6.059402611143e-03 4928 KSP Residual norm 5.337757436964e-03 4929 KSP Residual norm 3.833830493822e-03 4930 KSP Residual norm 2.747645358915e-03 4931 KSP Residual norm 2.150688673279e-03 4932 KSP Residual norm 1.914781624795e-03 4933 KSP Residual norm 2.077408449524e-03 4934 KSP Residual norm 2.632585692938e-03 4935 KSP Residual norm 3.300334592556e-03 4936 KSP Residual norm 3.179171046809e-03 4937 KSP Residual norm 2.228488962779e-03 4938 KSP Residual norm 1.514179510410e-03 4939 KSP Residual norm 1.178432803597e-03 4940 KSP Residual norm 1.092308238180e-03 4941 KSP Residual norm 1.194188530520e-03 4942 KSP Residual norm 1.459530746680e-03 4943 KSP Residual norm 1.974357583893e-03 4944 KSP Residual norm 2.456511377489e-03 4945 KSP Residual norm 2.363429800002e-03 4946 KSP Residual norm 1.747724840607e-03 4947 KSP Residual norm 1.238167988902e-03 4948 KSP Residual norm 9.347618818802e-04 4949 KSP Residual norm 7.846093234227e-04 4950 KSP Residual norm 7.969019913978e-04 4951 KSP Residual norm 1.018032594440e-03 4952 KSP Residual norm 1.435193586315e-03 4953 KSP Residual norm 1.763016426696e-03 4954 KSP Residual norm 1.879673692883e-03 4955 KSP Residual norm 1.648663413833e-03 4956 KSP Residual norm 1.115507322067e-03 4957 KSP Residual norm 6.986300219611e-04 4958 KSP Residual norm 5.518190849376e-04 4959 KSP Residual norm 5.930730453886e-04 4960 KSP Residual norm 7.720168068322e-04 4961 KSP Residual norm 9.852553178381e-04 4962 KSP Residual norm 1.136653159694e-03 4963 KSP Residual norm 1.180083614422e-03 4964 
KSP Residual norm 1.160501693820e-03 4965 KSP Residual norm 1.130479996735e-03 4966 KSP Residual norm 1.193048737706e-03 4967 KSP Residual norm 1.337340833074e-03 4968 KSP Residual norm 1.407270243137e-03 4969 KSP Residual norm 1.332676891727e-03 4970 KSP Residual norm 1.301055778962e-03 4971 KSP Residual norm 1.361806020942e-03 4972 KSP Residual norm 1.541484114909e-03 4973 KSP Residual norm 1.818375177566e-03 4974 KSP Residual norm 2.031378473856e-03 4975 KSP Residual norm 1.853145278024e-03 4976 KSP Residual norm 1.524647736242e-03 4977 KSP Residual norm 1.342015063503e-03 4978 KSP Residual norm 1.405340684928e-03 4979 KSP Residual norm 1.737159120482e-03 4980 KSP Residual norm 2.422470991332e-03 4981 KSP Residual norm 3.063469428059e-03 4982 KSP Residual norm 3.263232649914e-03 4983 KSP Residual norm 3.286345157348e-03 4984 KSP Residual norm 3.329342521906e-03 4985 KSP Residual norm 3.326627977020e-03 4986 KSP Residual norm 3.371354442555e-03 4987 KSP Residual norm 3.421220349236e-03 4988 KSP Residual norm 3.287196782563e-03 4989 KSP Residual norm 2.715823964119e-03 4990 KSP Residual norm 2.214979943780e-03 4991 KSP Residual norm 2.114034858605e-03 4992 KSP Residual norm 2.269898558350e-03 4993 KSP Residual norm 2.499906013821e-03 4994 KSP Residual norm 2.912036386996e-03 4995 KSP Residual norm 3.387337502437e-03 4996 KSP Residual norm 3.235464518676e-03 4997 KSP Residual norm 2.880509673571e-03 4998 KSP Residual norm 2.816461095183e-03 4999 KSP Residual norm 3.022250534837e-03 5000 KSP Residual norm 3.291150101357e-03 5001 KSP Residual norm 3.389669223662e-03 5002 KSP Residual norm 2.817208090928e-03 5003 KSP Residual norm 1.978803521597e-03 5004 KSP Residual norm 1.422343727108e-03 5005 KSP Residual norm 1.144685926365e-03 5006 KSP Residual norm 1.234667529448e-03 5007 KSP Residual norm 1.819585014719e-03 5008 KSP Residual norm 2.750600411515e-03 5009 KSP Residual norm 3.165378437491e-03 5010 KSP Residual norm 2.677495677307e-03 5011 KSP Residual norm 
2.163050280073e-03 5012 KSP Residual norm 1.874207928539e-03 5013 KSP Residual norm 1.535687724493e-03 5014 KSP Residual norm 1.313885632045e-03 5015 KSP Residual norm 1.408869915294e-03 5016 KSP Residual norm 1.588029743271e-03 5017 KSP Residual norm 1.588344572732e-03 5018 KSP Residual norm 1.669796293906e-03 5019 KSP Residual norm 1.944841238776e-03 5020 KSP Residual norm 2.045359968243e-03 5021 KSP Residual norm 1.656331145003e-03 5022 KSP Residual norm 1.279755905280e-03 5023 KSP Residual norm 1.079348218324e-03 5024 KSP Residual norm 1.096175544553e-03 5025 KSP Residual norm 1.288021654318e-03 5026 KSP Residual norm 1.594553342392e-03 5027 KSP Residual norm 1.888061940387e-03 5028 KSP Residual norm 2.171248386832e-03 5029 KSP Residual norm 2.454370553550e-03 5030 KSP Residual norm 2.816579734572e-03 5031 KSP Residual norm 3.084512066111e-03 5032 KSP Residual norm 3.027622341693e-03 5033 KSP Residual norm 2.978665746558e-03 5034 KSP Residual norm 2.690866475698e-03 5035 KSP Residual norm 2.110124306155e-03 5036 KSP Residual norm 1.674828335178e-03 5037 KSP Residual norm 1.566565908324e-03 5038 KSP Residual norm 1.747937220098e-03 5039 KSP Residual norm 2.248892010782e-03 5040 KSP Residual norm 2.857705442715e-03 5041 KSP Residual norm 3.085106545901e-03 5042 KSP Residual norm 3.051309956733e-03 5043 KSP Residual norm 3.033356990345e-03 5044 KSP Residual norm 3.192142597189e-03 5045 KSP Residual norm 3.375624152832e-03 5046 KSP Residual norm 3.336374499905e-03 5047 KSP Residual norm 3.050958470633e-03 5048 KSP Residual norm 2.756607565141e-03 5049 KSP Residual norm 2.515508106626e-03 5050 KSP Residual norm 2.489483992298e-03 5051 KSP Residual norm 2.849603288255e-03 5052 KSP Residual norm 3.546447799366e-03 5053 KSP Residual norm 4.151401419809e-03 5054 KSP Residual norm 4.446572591243e-03 5055 KSP Residual norm 4.856234194189e-03 5056 KSP Residual norm 5.256038219276e-03 5057 KSP Residual norm 5.845514047672e-03 5058 KSP Residual norm 6.592947114170e-03 5059 
KSP Residual norm 6.076006162852e-03 5060 KSP Residual norm 5.404819106328e-03 5061 KSP Residual norm 5.269282759973e-03 5062 KSP Residual norm 5.001933187235e-03 5063 KSP Residual norm 4.622849954769e-03 5064 KSP Residual norm 4.823257629216e-03 5065 KSP Residual norm 5.573911968753e-03 5066 KSP Residual norm 6.516973487883e-03 5067 KSP Residual norm 7.820591843174e-03 5068 KSP Residual norm 8.378533585387e-03 5069 KSP Residual norm 7.809567523651e-03 5070 KSP Residual norm 6.823901788272e-03 5071 KSP Residual norm 6.260748730520e-03 5072 KSP Residual norm 5.898450566380e-03 5073 KSP Residual norm 5.687804058418e-03 5074 KSP Residual norm 5.262967099664e-03 5075 KSP Residual norm 4.844076719264e-03 5076 KSP Residual norm 4.651855957704e-03 5077 KSP Residual norm 4.479522372641e-03 5078 KSP Residual norm 4.820802430663e-03 5079 KSP Residual norm 5.157007390652e-03 5080 KSP Residual norm 5.347094848023e-03 5081 KSP Residual norm 5.480147694964e-03 5082 KSP Residual norm 6.197823277763e-03 5083 KSP Residual norm 6.953930398315e-03 5084 KSP Residual norm 7.494722045878e-03 5085 KSP Residual norm 7.122049835994e-03 5086 KSP Residual norm 5.250897891001e-03 5087 KSP Residual norm 3.733753304278e-03 5088 KSP Residual norm 3.000130609437e-03 5089 KSP Residual norm 2.730444865874e-03 5090 KSP Residual norm 2.806777592697e-03 5091 KSP Residual norm 3.534693244892e-03 5092 KSP Residual norm 4.742855747346e-03 5093 KSP Residual norm 5.618857680117e-03 5094 KSP Residual norm 5.727560122837e-03 5095 KSP Residual norm 5.665085254908e-03 5096 KSP Residual norm 5.572251597088e-03 5097 KSP Residual norm 5.392866905234e-03 5098 KSP Residual norm 4.797175112142e-03 5099 KSP Residual norm 4.115744724931e-03 5100 KSP Residual norm 3.808753573586e-03 5101 KSP Residual norm 3.911638581068e-03 5102 KSP Residual norm 3.978371081644e-03 5103 KSP Residual norm 4.160433694042e-03 5104 KSP Residual norm 4.717609959807e-03 5105 KSP Residual norm 5.529647619336e-03 5106 KSP Residual norm 
6.057214307890e-03 5107 KSP Residual norm 6.661957144616e-03 5108 KSP Residual norm 7.161526349291e-03 5109 KSP Residual norm 7.570901013840e-03 5110 KSP Residual norm 7.707589057577e-03 5111 KSP Residual norm 7.683271579517e-03 5112 KSP Residual norm 7.846003147522e-03 5113 KSP Residual norm 8.308064350657e-03 5114 KSP Residual norm 8.337705614986e-03 5115 KSP Residual norm 7.612423043577e-03 5116 KSP Residual norm 7.332948570313e-03 5117 KSP Residual norm 7.268416620020e-03 5118 KSP Residual norm 7.415675864539e-03 5119 KSP Residual norm 8.621764085841e-03 5120 KSP Residual norm 1.058892453647e-02 5121 KSP Residual norm 1.178819456823e-02 5122 KSP Residual norm 1.225711412480e-02 5123 KSP Residual norm 1.273167378411e-02 5124 KSP Residual norm 1.301503302738e-02 5125 KSP Residual norm 1.221850412513e-02 5126 KSP Residual norm 1.124446749456e-02 5127 KSP Residual norm 1.023472666998e-02 5128 KSP Residual norm 9.532791955133e-03 5129 KSP Residual norm 9.174097741455e-03 5130 KSP Residual norm 9.694037708432e-03 5131 KSP Residual norm 1.055795212558e-02 5132 KSP Residual norm 9.425549028743e-03 5133 KSP Residual norm 8.619620720571e-03 5134 KSP Residual norm 1.002930961693e-02 5135 KSP Residual norm 1.278178777314e-02 5136 KSP Residual norm 1.347910571283e-02 5137 KSP Residual norm 1.170006737430e-02 5138 KSP Residual norm 9.876322953708e-03 5139 KSP Residual norm 9.995696068193e-03 5140 KSP Residual norm 1.197374794613e-02 5141 KSP Residual norm 1.180659063410e-02 5142 KSP Residual norm 8.960345861898e-03 5143 KSP Residual norm 6.670112656017e-03 5144 KSP Residual norm 5.325361947471e-03 5145 KSP Residual norm 4.736550206560e-03 5146 KSP Residual norm 5.072528972719e-03 5147 KSP Residual norm 5.757265321682e-03 5148 KSP Residual norm 5.833410810155e-03 5149 KSP Residual norm 5.502048773279e-03 5150 KSP Residual norm 5.020828627945e-03 5151 KSP Residual norm 4.879906807749e-03 5152 KSP Residual norm 5.473968702837e-03 5153 KSP Residual norm 6.806856885144e-03 5154 
KSP Residual norm 8.018509552909e-03 5155 KSP Residual norm 8.304671294585e-03 5156 KSP Residual norm 8.427119056107e-03 5157 KSP Residual norm 9.083636959881e-03 5158 KSP Residual norm 1.031694186913e-02 5159 KSP Residual norm 1.026507915940e-02 5160 KSP Residual norm 9.187457077705e-03 5161 KSP Residual norm 8.958463131120e-03 5162 KSP Residual norm 9.005353946366e-03 5163 KSP Residual norm 7.998277276331e-03 5164 KSP Residual norm 7.076361672102e-03 5165 KSP Residual norm 6.012448325201e-03 5166 KSP Residual norm 4.614796846782e-03 5167 KSP Residual norm 4.092226890317e-03 5168 KSP Residual norm 4.552101862680e-03 5169 KSP Residual norm 5.165059680433e-03 5170 KSP Residual norm 5.132031450739e-03 5171 KSP Residual norm 5.228096972927e-03 5172 KSP Residual norm 6.330401772911e-03 5173 KSP Residual norm 9.020369001129e-03 5174 KSP Residual norm 1.191885618250e-02 5175 KSP Residual norm 1.155960131928e-02 5176 KSP Residual norm 9.926623465561e-03 5177 KSP Residual norm 9.926733624389e-03 5178 KSP Residual norm 1.195694135051e-02 5179 KSP Residual norm 1.338259629072e-02 5180 KSP Residual norm 1.389102283443e-02 5181 KSP Residual norm 1.267728231799e-02 5182 KSP Residual norm 1.165462408308e-02 5183 KSP Residual norm 1.221733246193e-02 5184 KSP Residual norm 1.256365927293e-02 5185 KSP Residual norm 1.147165485488e-02 5186 KSP Residual norm 9.948083793328e-03 5187 KSP Residual norm 8.952059063007e-03 5188 KSP Residual norm 9.548145245048e-03 5189 KSP Residual norm 1.198090008198e-02 5190 KSP Residual norm 1.421032375516e-02 5191 KSP Residual norm 1.455043651700e-02 5192 KSP Residual norm 1.597607475038e-02 5193 KSP Residual norm 1.731138927366e-02 5194 KSP Residual norm 1.869657793075e-02 5195 KSP Residual norm 2.056260700247e-02 5196 KSP Residual norm 1.989650020402e-02 5197 KSP Residual norm 1.644841553315e-02 5198 KSP Residual norm 1.402123831286e-02 5199 KSP Residual norm 1.393307731848e-02 5200 KSP Residual norm 1.499889495236e-02 5201 KSP Residual norm 
[... KSP monitor output condensed: iterations 5202-5619 of the preceding solve, residual norm oscillating downward from roughly 1.5e-02, ending at 6.968710138343e-05 ...]
  0 KSP Residual norm 5.405774214400e+04
    Residual norms for fieldsplit_FE_split_ solve.
[... KSP monitor output condensed: fieldsplit_FE_split_ iterations 0-584, residual norm decreasing slowly and non-monotonically from 7.439873800415e+00 to 2.546071482432e-03; log truncated mid-stream at iteration 585 ...]
Residual norm 2.576192819880e-03 586 KSP Residual norm 2.684760642875e-03 587 KSP Residual norm 2.844454005347e-03 588 KSP Residual norm 2.891295774829e-03 589 KSP Residual norm 2.827164491452e-03 590 KSP Residual norm 2.978210284874e-03 591 KSP Residual norm 3.118874246324e-03 592 KSP Residual norm 3.226927540750e-03 593 KSP Residual norm 3.297702822485e-03 594 KSP Residual norm 3.269635997402e-03 595 KSP Residual norm 3.272395692426e-03 596 KSP Residual norm 3.127220033345e-03 597 KSP Residual norm 3.002089958033e-03 598 KSP Residual norm 2.887952193552e-03 599 KSP Residual norm 2.789545707468e-03 600 KSP Residual norm 2.728822784045e-03 601 KSP Residual norm 2.586721450953e-03 602 KSP Residual norm 2.456861877398e-03 603 KSP Residual norm 2.270195608825e-03 604 KSP Residual norm 2.234015372155e-03 605 KSP Residual norm 2.271006954022e-03 606 KSP Residual norm 2.274709919172e-03 607 KSP Residual norm 2.319140505599e-03 608 KSP Residual norm 2.371304192676e-03 609 KSP Residual norm 2.381183182868e-03 610 KSP Residual norm 2.473342013503e-03 611 KSP Residual norm 2.668702366905e-03 612 KSP Residual norm 2.840263324868e-03 613 KSP Residual norm 2.887342238157e-03 614 KSP Residual norm 2.945717643492e-03 615 KSP Residual norm 2.954708258451e-03 616 KSP Residual norm 2.782576372141e-03 617 KSP Residual norm 2.604890421262e-03 618 KSP Residual norm 2.554034090005e-03 619 KSP Residual norm 2.623418706237e-03 620 KSP Residual norm 2.561484842595e-03 621 KSP Residual norm 2.393195287692e-03 622 KSP Residual norm 2.160609863780e-03 623 KSP Residual norm 1.979602551998e-03 624 KSP Residual norm 1.963931478777e-03 625 KSP Residual norm 1.948555791659e-03 626 KSP Residual norm 2.017840029712e-03 627 KSP Residual norm 2.046199056507e-03 628 KSP Residual norm 1.993108418353e-03 629 KSP Residual norm 1.974961947376e-03 630 KSP Residual norm 2.061584360663e-03 631 KSP Residual norm 2.054957286204e-03 632 KSP Residual norm 1.990859027025e-03 633 KSP Residual norm 
2.041368256010e-03 634 KSP Residual norm 2.058655836834e-03 635 KSP Residual norm 2.099752038269e-03 636 KSP Residual norm 2.131260502055e-03 637 KSP Residual norm 2.152145468088e-03 638 KSP Residual norm 2.186736970547e-03 639 KSP Residual norm 2.226068625504e-03 640 KSP Residual norm 2.414508229979e-03 641 KSP Residual norm 2.641630675320e-03 642 KSP Residual norm 2.875761700403e-03 643 KSP Residual norm 3.128751719400e-03 644 KSP Residual norm 3.218863877361e-03 645 KSP Residual norm 3.286440896976e-03 646 KSP Residual norm 3.573443161029e-03 647 KSP Residual norm 3.886465854832e-03 648 KSP Residual norm 3.996486091627e-03 649 KSP Residual norm 3.952223090674e-03 650 KSP Residual norm 4.148696249293e-03 651 KSP Residual norm 4.079176608710e-03 652 KSP Residual norm 3.918952724596e-03 653 KSP Residual norm 3.923035202176e-03 654 KSP Residual norm 3.944710633091e-03 655 KSP Residual norm 3.857676629281e-03 656 KSP Residual norm 3.736280082312e-03 657 KSP Residual norm 3.793661571477e-03 658 KSP Residual norm 3.873896125551e-03 659 KSP Residual norm 3.877810193883e-03 660 KSP Residual norm 4.169433784065e-03 661 KSP Residual norm 4.323131896838e-03 662 KSP Residual norm 4.296835496769e-03 663 KSP Residual norm 4.207135380912e-03 664 KSP Residual norm 4.031582293437e-03 665 KSP Residual norm 4.218523456173e-03 666 KSP Residual norm 4.529317974038e-03 667 KSP Residual norm 4.934835798003e-03 668 KSP Residual norm 5.301936631949e-03 669 KSP Residual norm 5.277983685620e-03 670 KSP Residual norm 5.019195591950e-03 671 KSP Residual norm 4.851201693366e-03 672 KSP Residual norm 4.734902022510e-03 673 KSP Residual norm 4.976852212368e-03 674 KSP Residual norm 5.193092769362e-03 675 KSP Residual norm 5.074930831870e-03 676 KSP Residual norm 5.032785592826e-03 677 KSP Residual norm 5.158280095896e-03 678 KSP Residual norm 5.322468916220e-03 679 KSP Residual norm 5.634887770503e-03 680 KSP Residual norm 5.909765459769e-03 681 KSP Residual norm 6.020179701853e-03 682 KSP 
Residual norm 5.783216033479e-03 683 KSP Residual norm 5.624748149217e-03 684 KSP Residual norm 5.501514634248e-03 685 KSP Residual norm 5.696721575211e-03 686 KSP Residual norm 5.715606984051e-03 687 KSP Residual norm 5.562257632383e-03 688 KSP Residual norm 5.544174264842e-03 689 KSP Residual norm 5.485930659494e-03 690 KSP Residual norm 4.956546863206e-03 691 KSP Residual norm 4.632123369208e-03 692 KSP Residual norm 4.604542140537e-03 693 KSP Residual norm 4.866250739707e-03 694 KSP Residual norm 4.773117405448e-03 695 KSP Residual norm 4.708889265211e-03 696 KSP Residual norm 4.918986654095e-03 697 KSP Residual norm 5.108781269711e-03 698 KSP Residual norm 4.924217059739e-03 699 KSP Residual norm 4.618712978303e-03 700 KSP Residual norm 4.540370025483e-03 701 KSP Residual norm 4.648699130247e-03 702 KSP Residual norm 4.530699767673e-03 703 KSP Residual norm 4.208925421903e-03 704 KSP Residual norm 4.188225308855e-03 705 KSP Residual norm 4.422248684790e-03 706 KSP Residual norm 4.568392073765e-03 707 KSP Residual norm 4.616808828879e-03 708 KSP Residual norm 4.588543206558e-03 709 KSP Residual norm 4.639488790088e-03 710 KSP Residual norm 4.444272664026e-03 711 KSP Residual norm 4.234628347681e-03 712 KSP Residual norm 4.127758383775e-03 713 KSP Residual norm 4.226146707318e-03 714 KSP Residual norm 4.315745072182e-03 715 KSP Residual norm 4.414551383475e-03 716 KSP Residual norm 4.526361096101e-03 717 KSP Residual norm 4.821404255311e-03 718 KSP Residual norm 5.048407023781e-03 719 KSP Residual norm 4.865432067942e-03 720 KSP Residual norm 4.736673154911e-03 721 KSP Residual norm 4.844045662115e-03 722 KSP Residual norm 5.025216696831e-03 723 KSP Residual norm 5.063807199625e-03 724 KSP Residual norm 5.199316362538e-03 725 KSP Residual norm 5.073180119090e-03 726 KSP Residual norm 4.867937378468e-03 727 KSP Residual norm 4.622519181293e-03 728 KSP Residual norm 4.301126032899e-03 729 KSP Residual norm 4.102680159217e-03 730 KSP Residual norm 
4.182928190220e-03 731 KSP Residual norm 4.286367784157e-03 732 KSP Residual norm 4.391122318785e-03 733 KSP Residual norm 4.595538944216e-03 734 KSP Residual norm 4.706566736553e-03 735 KSP Residual norm 4.503865315399e-03 736 KSP Residual norm 4.316431285229e-03 737 KSP Residual norm 4.430407108087e-03 738 KSP Residual norm 4.506848843703e-03 739 KSP Residual norm 4.433850111061e-03 740 KSP Residual norm 4.380771081996e-03 741 KSP Residual norm 4.222631885474e-03 742 KSP Residual norm 4.147771849579e-03 743 KSP Residual norm 4.365639718295e-03 744 KSP Residual norm 4.717559413792e-03 745 KSP Residual norm 4.814102031643e-03 746 KSP Residual norm 4.664000257623e-03 747 KSP Residual norm 4.828979911030e-03 748 KSP Residual norm 4.936597793490e-03 749 KSP Residual norm 4.861591122631e-03 750 KSP Residual norm 4.672726595736e-03 751 KSP Residual norm 4.922548149667e-03 752 KSP Residual norm 5.229361787727e-03 753 KSP Residual norm 5.249145190642e-03 754 KSP Residual norm 4.948681226659e-03 755 KSP Residual norm 4.630066570005e-03 756 KSP Residual norm 4.599738801059e-03 757 KSP Residual norm 4.929083792314e-03 758 KSP Residual norm 5.566633954187e-03 759 KSP Residual norm 5.769661533521e-03 760 KSP Residual norm 5.644415061329e-03 761 KSP Residual norm 5.704979658186e-03 762 KSP Residual norm 5.491941722453e-03 763 KSP Residual norm 5.126667613508e-03 764 KSP Residual norm 4.851766976706e-03 765 KSP Residual norm 4.811919456483e-03 766 KSP Residual norm 4.907107900783e-03 767 KSP Residual norm 5.248085780992e-03 768 KSP Residual norm 5.489611963591e-03 769 KSP Residual norm 5.983835818817e-03 770 KSP Residual norm 6.525916643132e-03 771 KSP Residual norm 6.778487524504e-03 772 KSP Residual norm 6.669372553854e-03 773 KSP Residual norm 6.135866213372e-03 774 KSP Residual norm 5.783129247859e-03 775 KSP Residual norm 5.784201879151e-03 776 KSP Residual norm 5.741559179621e-03 777 KSP Residual norm 5.703354544397e-03 778 KSP Residual norm 5.482250606638e-03 779 KSP 
Residual norm 5.310252890419e-03 780 KSP Residual norm 5.335711864097e-03 781 KSP Residual norm 5.313727521953e-03 782 KSP Residual norm 5.298550285404e-03 783 KSP Residual norm 5.277860914991e-03 784 KSP Residual norm 5.562270366961e-03 785 KSP Residual norm 5.773676284701e-03 786 KSP Residual norm 5.937828098629e-03 787 KSP Residual norm 5.933205595583e-03 788 KSP Residual norm 5.706620882406e-03 789 KSP Residual norm 5.746541406510e-03 790 KSP Residual norm 5.682632330616e-03 791 KSP Residual norm 5.718373329312e-03 792 KSP Residual norm 5.858127842517e-03 793 KSP Residual norm 6.036029033969e-03 794 KSP Residual norm 5.937362017402e-03 795 KSP Residual norm 5.662821329724e-03 796 KSP Residual norm 5.570868424837e-03 797 KSP Residual norm 5.736255872605e-03 798 KSP Residual norm 5.718267824764e-03 799 KSP Residual norm 5.523708236915e-03 800 KSP Residual norm 5.260143665610e-03 801 KSP Residual norm 5.273276972924e-03 802 KSP Residual norm 5.326177610156e-03 803 KSP Residual norm 5.470489474939e-03 804 KSP Residual norm 5.563660821212e-03 805 KSP Residual norm 5.450467821544e-03 806 KSP Residual norm 5.257754749895e-03 807 KSP Residual norm 5.173273041024e-03 808 KSP Residual norm 5.173399710786e-03 809 KSP Residual norm 5.129745167935e-03 810 KSP Residual norm 4.990486065819e-03 811 KSP Residual norm 4.752904983010e-03 812 KSP Residual norm 4.620248446524e-03 813 KSP Residual norm 4.643793500577e-03 814 KSP Residual norm 4.670321867182e-03 815 KSP Residual norm 4.617087182743e-03 816 KSP Residual norm 4.700188292896e-03 817 KSP Residual norm 4.605612711004e-03 818 KSP Residual norm 4.383306226908e-03 819 KSP Residual norm 4.491049783797e-03 820 KSP Residual norm 4.641695104821e-03 821 KSP Residual norm 4.728778395632e-03 822 KSP Residual norm 4.726685398043e-03 823 KSP Residual norm 4.719323445919e-03 824 KSP Residual norm 4.397459455653e-03 825 KSP Residual norm 4.202202803142e-03 826 KSP Residual norm 4.187108103326e-03 827 KSP Residual norm 
4.413636897633e-03 828 KSP Residual norm 4.294243790417e-03 829 KSP Residual norm 4.237713512230e-03 830 KSP Residual norm 4.230817171428e-03 831 KSP Residual norm 4.361317085853e-03 832 KSP Residual norm 4.322420049048e-03 833 KSP Residual norm 4.333158250701e-03 834 KSP Residual norm 4.272257205512e-03 835 KSP Residual norm 4.166208845377e-03 836 KSP Residual norm 4.276427624622e-03 837 KSP Residual norm 4.262789381603e-03 838 KSP Residual norm 4.390849186044e-03 839 KSP Residual norm 4.392670055004e-03 840 KSP Residual norm 4.197153591994e-03 841 KSP Residual norm 3.985740589991e-03 842 KSP Residual norm 3.751916382322e-03 843 KSP Residual norm 3.744476290366e-03 844 KSP Residual norm 3.703285656437e-03 845 KSP Residual norm 3.681674911257e-03 846 KSP Residual norm 3.665749856416e-03 847 KSP Residual norm 3.795682207139e-03 848 KSP Residual norm 3.879211906228e-03 849 KSP Residual norm 3.904866739752e-03 850 KSP Residual norm 4.053463668093e-03 851 KSP Residual norm 4.310206873307e-03 852 KSP Residual norm 4.398468828346e-03 853 KSP Residual norm 4.303875562632e-03 854 KSP Residual norm 4.281086768268e-03 855 KSP Residual norm 4.381377636333e-03 856 KSP Residual norm 4.408508579163e-03 857 KSP Residual norm 4.261455674502e-03 858 KSP Residual norm 4.085046370784e-03 859 KSP Residual norm 3.962246614559e-03 860 KSP Residual norm 4.017324998468e-03 861 KSP Residual norm 4.080511395918e-03 862 KSP Residual norm 4.036490084737e-03 863 KSP Residual norm 3.937069305169e-03 864 KSP Residual norm 3.707986416453e-03 865 KSP Residual norm 3.777951030973e-03 866 KSP Residual norm 3.974784104611e-03 867 KSP Residual norm 4.188430303811e-03 868 KSP Residual norm 4.219085975362e-03 869 KSP Residual norm 4.145977225117e-03 870 KSP Residual norm 4.181287665007e-03 871 KSP Residual norm 4.195306731661e-03 872 KSP Residual norm 4.174729054987e-03 873 KSP Residual norm 4.178829217510e-03 874 KSP Residual norm 4.152426882362e-03 875 KSP Residual norm 4.337268072075e-03 876 KSP 
Residual norm 4.575096886853e-03 877 KSP Residual norm 4.540538034903e-03 878 KSP Residual norm 4.689722606122e-03 879 KSP Residual norm 5.092002143298e-03 880 KSP Residual norm 5.426101401836e-03 881 KSP Residual norm 5.319771478453e-03 882 KSP Residual norm 5.065400101422e-03 883 KSP Residual norm 4.987627471669e-03 884 KSP Residual norm 5.231010313901e-03 885 KSP Residual norm 5.697968184051e-03 886 KSP Residual norm 5.984576951654e-03 887 KSP Residual norm 5.890014849194e-03 888 KSP Residual norm 5.765833588141e-03 889 KSP Residual norm 5.831964254872e-03 890 KSP Residual norm 5.804212026051e-03 891 KSP Residual norm 5.453299776526e-03 892 KSP Residual norm 5.581785894480e-03 893 KSP Residual norm 5.914960551582e-03 894 KSP Residual norm 6.215274237221e-03 895 KSP Residual norm 6.380147448218e-03 896 KSP Residual norm 6.434006988794e-03 897 KSP Residual norm 6.956152793164e-03 898 KSP Residual norm 7.569943236908e-03 899 KSP Residual norm 7.196213063372e-03 900 KSP Residual norm 6.669265142295e-03 901 KSP Residual norm 6.291602518634e-03 902 KSP Residual norm 6.219791968697e-03 903 KSP Residual norm 6.338503625427e-03 904 KSP Residual norm 6.424207519846e-03 905 KSP Residual norm 6.358795522748e-03 906 KSP Residual norm 6.065467535689e-03 907 KSP Residual norm 5.900664281144e-03 908 KSP Residual norm 5.842698544577e-03 909 KSP Residual norm 5.930779381444e-03 910 KSP Residual norm 5.701726896468e-03 911 KSP Residual norm 5.451726110990e-03 912 KSP Residual norm 5.358715189907e-03 913 KSP Residual norm 5.074947630587e-03 914 KSP Residual norm 4.879003170423e-03 915 KSP Residual norm 4.508233945089e-03 916 KSP Residual norm 4.443694702769e-03 917 KSP Residual norm 4.613572364187e-03 918 KSP Residual norm 4.738259183511e-03 919 KSP Residual norm 4.791155625810e-03 920 KSP Residual norm 4.974137880502e-03 921 KSP Residual norm 5.104963370756e-03 922 KSP Residual norm 5.161325592815e-03 923 KSP Residual norm 5.066610625764e-03 924 KSP Residual norm 
4.859808367323e-03 925 KSP Residual norm 4.759496591612e-03 926 KSP Residual norm 4.877206386720e-03 927 KSP Residual norm 5.057897513934e-03 928 KSP Residual norm 5.111313050571e-03 929 KSP Residual norm 5.389458752792e-03 930 KSP Residual norm 5.737138995252e-03 931 KSP Residual norm 5.998194270249e-03 932 KSP Residual norm 6.146434423575e-03 933 KSP Residual norm 6.569709275569e-03 934 KSP Residual norm 6.962879787477e-03 935 KSP Residual norm 7.372751593413e-03 936 KSP Residual norm 7.516081815966e-03 937 KSP Residual norm 7.345681334036e-03 938 KSP Residual norm 7.387023502760e-03 939 KSP Residual norm 7.751916785347e-03 940 KSP Residual norm 8.124853985422e-03 941 KSP Residual norm 8.161143065742e-03 942 KSP Residual norm 8.067366672832e-03 943 KSP Residual norm 8.229867127952e-03 944 KSP Residual norm 8.853793072529e-03 945 KSP Residual norm 9.269567121478e-03 946 KSP Residual norm 8.751506073752e-03 947 KSP Residual norm 7.938945593248e-03 948 KSP Residual norm 7.797694446958e-03 949 KSP Residual norm 7.929228242394e-03 950 KSP Residual norm 7.719452507282e-03 951 KSP Residual norm 6.832883878296e-03 952 KSP Residual norm 6.443643850214e-03 953 KSP Residual norm 6.295045856499e-03 954 KSP Residual norm 6.150883178803e-03 955 KSP Residual norm 6.074214553054e-03 956 KSP Residual norm 6.034663057893e-03 957 KSP Residual norm 6.236959412573e-03 958 KSP Residual norm 6.307983143271e-03 959 KSP Residual norm 6.337338786166e-03 960 KSP Residual norm 6.424917217463e-03 961 KSP Residual norm 6.797528340310e-03 962 KSP Residual norm 6.747058303445e-03 963 KSP Residual norm 6.474594117430e-03 964 KSP Residual norm 5.993114301998e-03 965 KSP Residual norm 5.624010238615e-03 966 KSP Residual norm 5.593720784244e-03 967 KSP Residual norm 5.589581870468e-03 968 KSP Residual norm 5.603005508742e-03 969 KSP Residual norm 5.810391484899e-03 970 KSP Residual norm 6.194263875096e-03 971 KSP Residual norm 6.136652736529e-03 972 KSP Residual norm 6.283690392221e-03 973 KSP 
Residual norm 6.519842963053e-03 974 KSP Residual norm 6.895802471345e-03 975 KSP Residual norm 6.992540210060e-03 976 KSP Residual norm 6.948142695474e-03 977 KSP Residual norm 6.902910092755e-03 978 KSP Residual norm 6.810624040762e-03 979 KSP Residual norm 6.933177851430e-03 980 KSP Residual norm 7.308598793955e-03 981 KSP Residual norm 7.517714501679e-03 982 KSP Residual norm 7.832519795887e-03 983 KSP Residual norm 7.864611364591e-03 984 KSP Residual norm 7.684466785193e-03 985 KSP Residual norm 7.640868812113e-03 986 KSP Residual norm 7.935507353152e-03 987 KSP Residual norm 8.163967874078e-03 988 KSP Residual norm 8.055085011071e-03 989 KSP Residual norm 7.676106656276e-03 990 KSP Residual norm 7.230936801529e-03 991 KSP Residual norm 6.951908145572e-03 992 KSP Residual norm 6.802094733110e-03 993 KSP Residual norm 6.683949692399e-03 994 KSP Residual norm 6.336700308828e-03 995 KSP Residual norm 6.244222851116e-03 996 KSP Residual norm 6.304898269266e-03 997 KSP Residual norm 6.330769690003e-03 998 KSP Residual norm 6.228526816184e-03 999 KSP Residual norm 6.236334612293e-03 1000 KSP Residual norm 6.082543349162e-03 1001 KSP Residual norm 5.951192334012e-03 1002 KSP Residual norm 5.896320518952e-03 1003 KSP Residual norm 5.893906943159e-03 1004 KSP Residual norm 6.189911727147e-03 1005 KSP Residual norm 6.298401986625e-03 1006 KSP Residual norm 6.394011235232e-03 1007 KSP Residual norm 6.515404444493e-03 1008 KSP Residual norm 6.720948446026e-03 1009 KSP Residual norm 6.589878288593e-03 1010 KSP Residual norm 6.152351062045e-03 1011 KSP Residual norm 5.969266592480e-03 1012 KSP Residual norm 6.170609409981e-03 1013 KSP Residual norm 6.469621506901e-03 1014 KSP Residual norm 6.459914206986e-03 1015 KSP Residual norm 6.593668533056e-03 1016 KSP Residual norm 6.622514784875e-03 1017 KSP Residual norm 6.656735647554e-03 1018 KSP Residual norm 6.751257601391e-03 1019 KSP Residual norm 6.988646562368e-03 1020 KSP Residual norm 6.905687803374e-03 1021 KSP Residual 
norm 6.655272945212e-03 1022 KSP Residual norm 6.777996924187e-03 1023 KSP Residual norm 6.947630339348e-03 1024 KSP Residual norm 6.950024896731e-03 1025 KSP Residual norm 6.581465729748e-03 1026 KSP Residual norm 6.623424889169e-03 1027 KSP Residual norm 6.544497469118e-03 1028 KSP Residual norm 6.765260623953e-03 1029 KSP Residual norm 6.915449248189e-03 1030 KSP Residual norm 6.372236580133e-03 1031 KSP Residual norm 5.953128974582e-03 1032 KSP Residual norm 6.015530242281e-03 1033 KSP Residual norm 5.943168745891e-03 1034 KSP Residual norm 5.596915041857e-03 1035 KSP Residual norm 5.408257600816e-03 1036 KSP Residual norm 5.462516314601e-03 1037 KSP Residual norm 5.452288052399e-03 1038 KSP Residual norm 5.531964783471e-03 1039 KSP Residual norm 5.396516226761e-03 1040 KSP Residual norm 4.956288530837e-03 1041 KSP Residual norm 4.675443151148e-03 1042 KSP Residual norm 4.634914726898e-03 1043 KSP Residual norm 4.719420543451e-03 1044 KSP Residual norm 4.878425027179e-03 1045 KSP Residual norm 4.992200285686e-03 1046 KSP Residual norm 4.883623561089e-03 1047 KSP Residual norm 4.902805121842e-03 1048 KSP Residual norm 5.038729498491e-03 1049 KSP Residual norm 5.137707181042e-03 1050 KSP Residual norm 4.830803807506e-03 1051 KSP Residual norm 4.320867223425e-03 1052 KSP Residual norm 4.142120709162e-03 1053 KSP Residual norm 4.298950610032e-03 1054 KSP Residual norm 4.467252788161e-03 1055 KSP Residual norm 4.522559064521e-03 1056 KSP Residual norm 4.761306312461e-03 1057 KSP Residual norm 5.150319637346e-03 1058 KSP Residual norm 5.322174446922e-03 1059 KSP Residual norm 5.238802254595e-03 1060 KSP Residual norm 5.061485394297e-03 1061 KSP Residual norm 4.909992936371e-03 1062 KSP Residual norm 4.685635154322e-03 1063 KSP Residual norm 4.549642150475e-03 1064 KSP Residual norm 4.315222756062e-03 1065 KSP Residual norm 4.281712726966e-03 1066 KSP Residual norm 4.628655632869e-03 1067 KSP Residual norm 5.104523083976e-03 1068 KSP Residual norm 5.189322907695e-03 
1069 KSP Residual norm 5.221327773111e-03 1070 KSP Residual norm 5.319418446647e-03 1071 KSP Residual norm 5.285463126603e-03 1072 KSP Residual norm 5.180630969047e-03 1073 KSP Residual norm 5.214370896858e-03 1074 KSP Residual norm 5.560363588384e-03 1075 KSP Residual norm 5.820058047521e-03 1076 KSP Residual norm 6.161044287324e-03 1077 KSP Residual norm 6.134595772506e-03 1078 KSP Residual norm 6.162806832956e-03 1079 KSP Residual norm 6.015504687689e-03 1080 KSP Residual norm 5.965365763166e-03 1081 KSP Residual norm 5.925516590688e-03 1082 KSP Residual norm 6.050030721362e-03 1083 KSP Residual norm 6.035467347951e-03 1084 KSP Residual norm 5.840636688796e-03 1085 KSP Residual norm 5.974492387850e-03 1086 KSP Residual norm 6.306378068772e-03 1087 KSP Residual norm 6.223532094991e-03 1088 KSP Residual norm 5.978102326085e-03 1089 KSP Residual norm 6.032699383338e-03 1090 KSP Residual norm 5.963182048539e-03 1091 KSP Residual norm 5.845853825071e-03 1092 KSP Residual norm 5.703606769746e-03 1093 KSP Residual norm 5.589634794580e-03 1094 KSP Residual norm 5.945482515465e-03 1095 KSP Residual norm 6.096712270725e-03 1096 KSP Residual norm 5.727527640483e-03 1097 KSP Residual norm 5.463535057390e-03 1098 KSP Residual norm 5.621914974956e-03 1099 KSP Residual norm 5.892077814275e-03 1100 KSP Residual norm 6.139083451418e-03 1101 KSP Residual norm 5.740656746572e-03 1102 KSP Residual norm 5.328292130255e-03 1103 KSP Residual norm 5.258243410082e-03 1104 KSP Residual norm 5.294375311027e-03 1105 KSP Residual norm 5.457460865136e-03 1106 KSP Residual norm 5.774135867417e-03 1107 KSP Residual norm 6.231248846901e-03 1108 KSP Residual norm 6.634947482888e-03 1109 KSP Residual norm 6.915589060378e-03 1110 KSP Residual norm 6.842662490898e-03 1111 KSP Residual norm 6.653520411095e-03 1112 KSP Residual norm 6.507144896527e-03 1113 KSP Residual norm 6.203930956071e-03 1114 KSP Residual norm 6.190651197615e-03 1115 KSP Residual norm 6.443233750548e-03 1116 KSP Residual norm 
6.846407285740e-03 1117 KSP Residual norm 6.813516923484e-03 1118 KSP Residual norm 6.586346121190e-03 1119 KSP Residual norm 6.716516964440e-03 1120 KSP Residual norm 6.750677507897e-03 1121 KSP Residual norm 6.650235124616e-03 1122 KSP Residual norm 6.340073963061e-03 1123 KSP Residual norm 6.442977105372e-03 1124 KSP Residual norm 6.607485326192e-03 1125 KSP Residual norm 6.654648159593e-03 1126 KSP Residual norm 6.390913213498e-03 1127 KSP Residual norm 6.139694331253e-03 1128 KSP Residual norm 5.975839996696e-03 1129 KSP Residual norm 6.181452135028e-03 1130 KSP Residual norm 6.483924116115e-03 1131 KSP Residual norm 6.244388643648e-03 1132 KSP Residual norm 5.726665913960e-03 1133 KSP Residual norm 5.548101276250e-03 1134 KSP Residual norm 5.670483843427e-03 1135 KSP Residual norm 5.815484357897e-03 1136 KSP Residual norm 5.836656679452e-03 1137 KSP Residual norm 6.126786438304e-03 1138 KSP Residual norm 6.131637697385e-03 1139 KSP Residual norm 5.939461551154e-03 1140 KSP Residual norm 5.899863029397e-03 1141 KSP Residual norm 5.879249371999e-03 1142 KSP Residual norm 5.733033436179e-03 1143 KSP Residual norm 5.810084705179e-03 1144 KSP Residual norm 6.192124782418e-03 1145 KSP Residual norm 6.198880886066e-03 1146 KSP Residual norm 6.126373924098e-03 1147 KSP Residual norm 6.202550295473e-03 1148 KSP Residual norm 6.619257825958e-03 1149 KSP Residual norm 6.958254861449e-03 1150 KSP Residual norm 6.903498120753e-03 1151 KSP Residual norm 6.853397498305e-03 1152 KSP Residual norm 6.783288123518e-03 1153 KSP Residual norm 6.684459902594e-03 1154 KSP Residual norm 7.000778316486e-03 1155 KSP Residual norm 6.912436408032e-03 1156 KSP Residual norm 6.819457534321e-03 1157 KSP Residual norm 6.677222508219e-03 1158 KSP Residual norm 6.455952608254e-03 1159 KSP Residual norm 6.346410790821e-03 1160 KSP Residual norm 6.331219961493e-03 1161 KSP Residual norm 6.640611156020e-03 1162 KSP Residual norm 6.790349677284e-03 1163 KSP Residual norm 6.332263850028e-03 1164 
KSP Residual norm 5.816647448194e-03 1165 KSP Residual norm 5.697215514246e-03 1166 KSP Residual norm 6.016051161797e-03 1167 KSP Residual norm 6.340316880162e-03 1168 KSP Residual norm 6.101820359558e-03 1169 KSP Residual norm 5.802519263927e-03 1170 KSP Residual norm 5.442038846680e-03 1171 KSP Residual norm 4.842404703879e-03 1172 KSP Residual norm 4.307367171043e-03 1173 KSP Residual norm 4.312710721844e-03 1174 KSP Residual norm 4.615300758975e-03 1175 KSP Residual norm 4.835514304094e-03 1176 KSP Residual norm 4.971895126694e-03 1177 KSP Residual norm 4.922519270854e-03 1178 KSP Residual norm 5.027507820926e-03 1179 KSP Residual norm 5.381926369022e-03 1180 KSP Residual norm 5.924644512727e-03 1181 KSP Residual norm 5.891518828823e-03 1182 KSP Residual norm 5.675322830638e-03 1183 KSP Residual norm 5.488681315201e-03 1184 KSP Residual norm 5.464539823754e-03 1185 KSP Residual norm 5.191120797280e-03 1186 KSP Residual norm 5.036484482861e-03 1187 KSP Residual norm 5.514703417422e-03 1188 KSP Residual norm 6.021271127233e-03 1189 KSP Residual norm 6.102166452512e-03 1190 KSP Residual norm 5.759277572695e-03 1191 KSP Residual norm 5.365159664280e-03 1192 KSP Residual norm 4.831505966819e-03 1193 KSP Residual norm 4.669862169270e-03 1194 KSP Residual norm 4.619647530157e-03 1195 KSP Residual norm 4.692323156185e-03 1196 KSP Residual norm 4.827179343472e-03 1197 KSP Residual norm 4.722460358927e-03 1198 KSP Residual norm 4.457709639624e-03 1199 KSP Residual norm 4.232960701882e-03 1200 KSP Residual norm 3.913306868682e-03 1201 KSP Residual norm 3.938209539604e-03 1202 KSP Residual norm 4.209949450264e-03 1203 KSP Residual norm 4.348061906313e-03 1204 KSP Residual norm 4.309578170217e-03 1205 KSP Residual norm 4.215557309157e-03 1206 KSP Residual norm 4.328624858550e-03 1207 KSP Residual norm 4.679769902461e-03 1208 KSP Residual norm 4.772875673194e-03 1209 KSP Residual norm 4.840061492543e-03 1210 KSP Residual norm 4.894836284673e-03 1211 KSP Residual norm 
5.106793066553e-03
 1212 KSP Residual norm 5.419370252828e-03
 1213 KSP Residual norm 5.602897497417e-03
 1214 KSP Residual norm 5.620211820518e-03
 [... -ksp_monitor output for iterations 1215-2205 elided: the residual norm does not decrease, oscillating between roughly 1.6e-03 (iteration 1681) and 2.3e-02 (iteration 1705) with no sustained convergence ...]
 2206 KSP Residual norm 6.406017848159e-03
 2207 KSP Residual norm 5.924109479508e-03
 2208 KSP Residual norm 5.357465387764e-03
 2209
KSP Residual norm 4.984026665317e-03 2210 KSP Residual norm 4.746244849070e-03 2211 KSP Residual norm 4.376851720421e-03 2212 KSP Residual norm 3.720427440057e-03 2213 KSP Residual norm 3.146408949809e-03 2214 KSP Residual norm 2.890006160957e-03 2215 KSP Residual norm 2.756658730239e-03 2216 KSP Residual norm 2.653215495302e-03 2217 KSP Residual norm 2.650206256948e-03 2218 KSP Residual norm 2.721887896256e-03 2219 KSP Residual norm 2.684622175643e-03 2220 KSP Residual norm 2.488504584793e-03 2221 KSP Residual norm 2.386959412109e-03 2222 KSP Residual norm 2.371450010664e-03 2223 KSP Residual norm 2.499419977970e-03 2224 KSP Residual norm 2.693033707163e-03 2225 KSP Residual norm 2.860677622571e-03 2226 KSP Residual norm 2.952863582608e-03 2227 KSP Residual norm 2.988598797066e-03 2228 KSP Residual norm 2.890748879091e-03 2229 KSP Residual norm 2.868441200643e-03 2230 KSP Residual norm 2.991111970293e-03 2231 KSP Residual norm 3.172985514033e-03 2232 KSP Residual norm 3.292606908435e-03 2233 KSP Residual norm 3.392155811553e-03 2234 KSP Residual norm 3.535230701617e-03 2235 KSP Residual norm 3.959373637352e-03 2236 KSP Residual norm 4.565212140205e-03 2237 KSP Residual norm 4.821196987496e-03 2238 KSP Residual norm 4.922415050281e-03 2239 KSP Residual norm 5.072974539554e-03 2240 KSP Residual norm 5.263813883195e-03 2241 KSP Residual norm 5.426306929832e-03 2242 KSP Residual norm 6.120664810580e-03 2243 KSP Residual norm 7.159099015502e-03 2244 KSP Residual norm 7.409615555114e-03 2245 KSP Residual norm 7.513554331491e-03 2246 KSP Residual norm 8.000313332997e-03 2247 KSP Residual norm 8.774490252024e-03 2248 KSP Residual norm 9.039336379211e-03 2249 KSP Residual norm 9.416574883628e-03 2250 KSP Residual norm 1.045552940291e-02 2251 KSP Residual norm 1.095124765772e-02 2252 KSP Residual norm 1.091067731545e-02 2253 KSP Residual norm 1.072798400570e-02 2254 KSP Residual norm 1.053330165208e-02 2255 KSP Residual norm 1.087027560563e-02 2256 KSP Residual norm 
1.094340292209e-02 2257 KSP Residual norm 9.944151818544e-03 2258 KSP Residual norm 9.524326498058e-03 2259 KSP Residual norm 9.857883498125e-03 2260 KSP Residual norm 9.999241781813e-03 2261 KSP Residual norm 8.917422673018e-03 2262 KSP Residual norm 8.019990947943e-03 2263 KSP Residual norm 7.923931148596e-03 2264 KSP Residual norm 7.630228933650e-03 2265 KSP Residual norm 6.890681148470e-03 2266 KSP Residual norm 6.397314247109e-03 2267 KSP Residual norm 6.076721021358e-03 2268 KSP Residual norm 5.769028879956e-03 2269 KSP Residual norm 5.336649542499e-03 2270 KSP Residual norm 4.881407930542e-03 2271 KSP Residual norm 4.577405831613e-03 2272 KSP Residual norm 4.213415851037e-03 2273 KSP Residual norm 3.775766616320e-03 2274 KSP Residual norm 3.404244489367e-03 2275 KSP Residual norm 3.261894222734e-03 2276 KSP Residual norm 3.100577799789e-03 2277 KSP Residual norm 2.763231017518e-03 2278 KSP Residual norm 2.567503732718e-03 2279 KSP Residual norm 2.629261305111e-03 2280 KSP Residual norm 2.583372462179e-03 2281 KSP Residual norm 2.257492720860e-03 2282 KSP Residual norm 2.168975357396e-03 2283 KSP Residual norm 2.355179931970e-03 2284 KSP Residual norm 2.492067164151e-03 2285 KSP Residual norm 2.456665907090e-03 2286 KSP Residual norm 2.437810288837e-03 2287 KSP Residual norm 2.740218703631e-03 2288 KSP Residual norm 2.946953069909e-03 2289 KSP Residual norm 2.787872921852e-03 2290 KSP Residual norm 2.725536734613e-03 2291 KSP Residual norm 3.033412099267e-03 2292 KSP Residual norm 3.272583682207e-03 2293 KSP Residual norm 3.204235568301e-03 2294 KSP Residual norm 3.421108353676e-03 2295 KSP Residual norm 3.971217627917e-03 2296 KSP Residual norm 4.128587634961e-03 2297 KSP Residual norm 4.298336760100e-03 2298 KSP Residual norm 4.722062357837e-03 2299 KSP Residual norm 5.697560142610e-03 2300 KSP Residual norm 5.968990862527e-03 2301 KSP Residual norm 5.826561794337e-03 2302 KSP Residual norm 6.359427330795e-03 2303 KSP Residual norm 7.376375276938e-03 2304 
KSP Residual norm 7.922245439289e-03 2305 KSP Residual norm 8.028445372328e-03 2306 KSP Residual norm 7.699912691885e-03 2307 KSP Residual norm 7.587940128727e-03 2308 KSP Residual norm 7.960298833559e-03 2309 KSP Residual norm 8.461080082996e-03 2310 KSP Residual norm 9.332967294487e-03 2311 KSP Residual norm 1.057501275364e-02 2312 KSP Residual norm 1.151537427737e-02 2313 KSP Residual norm 1.208983731323e-02 2314 KSP Residual norm 1.270610053976e-02 2315 KSP Residual norm 1.401458135679e-02 2316 KSP Residual norm 1.465214520382e-02 2317 KSP Residual norm 1.486866786002e-02 2318 KSP Residual norm 1.560342833672e-02 2319 KSP Residual norm 1.699230182908e-02 2320 KSP Residual norm 1.811264580263e-02 2321 KSP Residual norm 1.771933729362e-02 2322 KSP Residual norm 1.820507682733e-02 2323 KSP Residual norm 1.790918445477e-02 2324 KSP Residual norm 1.672804542537e-02 2325 KSP Residual norm 1.552464898903e-02 2326 KSP Residual norm 1.542876923443e-02 2327 KSP Residual norm 1.452627185034e-02 2328 KSP Residual norm 1.355062176346e-02 2329 KSP Residual norm 1.260040776986e-02 2330 KSP Residual norm 1.229162073681e-02 2331 KSP Residual norm 1.228076319119e-02 2332 KSP Residual norm 1.154472090508e-02 2333 KSP Residual norm 1.048177215151e-02 2334 KSP Residual norm 9.744028166372e-03 2335 KSP Residual norm 9.596511059772e-03 2336 KSP Residual norm 9.486410482497e-03 2337 KSP Residual norm 8.745430932840e-03 2338 KSP Residual norm 8.210290136575e-03 2339 KSP Residual norm 7.839236326358e-03 2340 KSP Residual norm 7.194744679873e-03 2341 KSP Residual norm 6.571357371524e-03 2342 KSP Residual norm 6.106849801378e-03 2343 KSP Residual norm 6.022124082466e-03 2344 KSP Residual norm 5.880372351873e-03 2345 KSP Residual norm 5.944846889075e-03 2346 KSP Residual norm 5.961756628698e-03 2347 KSP Residual norm 5.826538667703e-03 2348 KSP Residual norm 5.760630918210e-03 2349 KSP Residual norm 5.492884703535e-03 2350 KSP Residual norm 5.410281952695e-03 2351 KSP Residual norm 
5.494273029143e-03 2352 KSP Residual norm 5.303777160006e-03 2353 KSP Residual norm 5.299306677113e-03 2354 KSP Residual norm 5.593943123853e-03 2355 KSP Residual norm 5.947341230891e-03 2356 KSP Residual norm 6.248050330786e-03 2357 KSP Residual norm 6.672418445200e-03 2358 KSP Residual norm 6.840759883722e-03 2359 KSP Residual norm 7.311645521061e-03 2360 KSP Residual norm 8.528165651140e-03 2361 KSP Residual norm 9.359232085623e-03 2362 KSP Residual norm 8.997214443045e-03 2363 KSP Residual norm 9.151512001640e-03 2364 KSP Residual norm 9.814919143160e-03 2365 KSP Residual norm 9.473961467521e-03 2366 KSP Residual norm 8.916794002765e-03 2367 KSP Residual norm 9.580099819334e-03 2368 KSP Residual norm 1.092803287468e-02 2369 KSP Residual norm 1.177066696453e-02 2370 KSP Residual norm 1.233610734730e-02 2371 KSP Residual norm 1.349596697859e-02 2372 KSP Residual norm 1.422597580577e-02 2373 KSP Residual norm 1.394041333853e-02 2374 KSP Residual norm 1.324928418055e-02 2375 KSP Residual norm 1.396539289495e-02 2376 KSP Residual norm 1.514717693073e-02 2377 KSP Residual norm 1.451979579451e-02 2378 KSP Residual norm 1.264633392368e-02 2379 KSP Residual norm 1.110823036001e-02 2380 KSP Residual norm 1.006475915249e-02 2381 KSP Residual norm 9.365164277252e-03 2382 KSP Residual norm 9.045684726161e-03 2383 KSP Residual norm 8.579824342906e-03 2384 KSP Residual norm 8.162256717764e-03 2385 KSP Residual norm 7.491664866784e-03 2386 KSP Residual norm 7.139888523418e-03 2387 KSP Residual norm 7.212352332472e-03 2388 KSP Residual norm 7.129900453900e-03 2389 KSP Residual norm 6.429884715097e-03 2390 KSP Residual norm 6.118678763045e-03 2391 KSP Residual norm 6.653860543467e-03 2392 KSP Residual norm 6.582069714587e-03 2393 KSP Residual norm 5.762185306889e-03 2394 KSP Residual norm 5.456077407301e-03 2395 KSP Residual norm 5.436151997470e-03 2396 KSP Residual norm 5.019341766606e-03 2397 KSP Residual norm 4.700956991695e-03 2398 KSP Residual norm 4.668023157564e-03 2399 
KSP Residual norm 4.501515700463e-03 2400 KSP Residual norm 4.197022164665e-03 2401 KSP Residual norm 3.817706011035e-03 2402 KSP Residual norm 3.664318386502e-03 2403 KSP Residual norm 3.767331470893e-03 2404 KSP Residual norm 3.850171435180e-03 2405 KSP Residual norm 3.890255062260e-03 2406 KSP Residual norm 4.075339454006e-03 2407 KSP Residual norm 4.343737239892e-03 2408 KSP Residual norm 4.331519772355e-03 2409 KSP Residual norm 4.268280541827e-03 2410 KSP Residual norm 4.464476774624e-03 2411 KSP Residual norm 4.943672287012e-03 2412 KSP Residual norm 5.317709547685e-03 2413 KSP Residual norm 5.459401763318e-03 2414 KSP Residual norm 5.646754403152e-03 2415 KSP Residual norm 5.774969211463e-03 2416 KSP Residual norm 5.708560645311e-03 2417 KSP Residual norm 5.674091351982e-03 2418 KSP Residual norm 5.854876560222e-03 2419 KSP Residual norm 6.260401374731e-03 2420 KSP Residual norm 6.621278399721e-03 2421 KSP Residual norm 7.282592497201e-03 2422 KSP Residual norm 8.093100105437e-03 2423 KSP Residual norm 8.303737134056e-03 2424 KSP Residual norm 8.401936399922e-03 2425 KSP Residual norm 8.811703568770e-03 2426 KSP Residual norm 8.942527065268e-03 2427 KSP Residual norm 9.454131455613e-03 2428 KSP Residual norm 1.056739724498e-02 2429 KSP Residual norm 1.131059497836e-02 2430 KSP Residual norm 1.135639791659e-02 2431 KSP Residual norm 1.205126873730e-02 2432 KSP Residual norm 1.262495951635e-02 2433 KSP Residual norm 1.303848674425e-02 2434 KSP Residual norm 1.301252885924e-02 2435 KSP Residual norm 1.377500347167e-02 2436 KSP Residual norm 1.429601251787e-02 2437 KSP Residual norm 1.392106005918e-02 2438 KSP Residual norm 1.360317743723e-02 2439 KSP Residual norm 1.426602654522e-02 2440 KSP Residual norm 1.510382328407e-02 2441 KSP Residual norm 1.473928783444e-02 2442 KSP Residual norm 1.380171657314e-02 2443 KSP Residual norm 1.268300260363e-02 2444 KSP Residual norm 1.255281825396e-02 2445 KSP Residual norm 1.268661415517e-02 2446 KSP Residual norm 
1.295737164497e-02 2447 KSP Residual norm 1.355555123857e-02 2448 KSP Residual norm 1.355735990527e-02 2449 KSP Residual norm 1.336200081031e-02 2450 KSP Residual norm 1.252572155735e-02 2451 KSP Residual norm 1.131979365474e-02 2452 KSP Residual norm 1.100821509650e-02 2453 KSP Residual norm 1.087773496952e-02 2454 KSP Residual norm 1.013125318378e-02 2455 KSP Residual norm 9.805815814668e-03 2456 KSP Residual norm 1.004451530965e-02 2457 KSP Residual norm 9.575804427701e-03 2458 KSP Residual norm 8.652734196563e-03 2459 KSP Residual norm 8.163438150136e-03 2460 KSP Residual norm 8.041074727141e-03 2461 KSP Residual norm 7.346137486217e-03 2462 KSP Residual norm 6.565328169772e-03 2463 KSP Residual norm 6.495238995987e-03 2464 KSP Residual norm 6.543818246286e-03 2465 KSP Residual norm 5.844422743171e-03 2466 KSP Residual norm 5.161768107683e-03 2467 KSP Residual norm 4.998513002561e-03 2468 KSP Residual norm 5.023178132147e-03 2469 KSP Residual norm 4.955750454620e-03 2470 KSP Residual norm 4.914149665942e-03 2471 KSP Residual norm 5.082266464149e-03 2472 KSP Residual norm 5.226795434602e-03 2473 KSP Residual norm 5.083034164029e-03 2474 KSP Residual norm 4.786180670349e-03 2475 KSP Residual norm 4.723714363499e-03 2476 KSP Residual norm 4.706592527681e-03 2477 KSP Residual norm 4.400559829879e-03 2478 KSP Residual norm 4.111435196105e-03 2479 KSP Residual norm 4.337075346385e-03 2480 KSP Residual norm 4.945178145864e-03 2481 KSP Residual norm 5.092298303078e-03 2482 KSP Residual norm 4.784661524695e-03 2483 KSP Residual norm 4.919441824071e-03 2484 KSP Residual norm 5.886707767174e-03 2485 KSP Residual norm 6.150378407138e-03 2486 KSP Residual norm 6.227021440015e-03 2487 KSP Residual norm 6.745898636979e-03 2488 KSP Residual norm 7.047764964447e-03 2489 KSP Residual norm 6.892213427333e-03 2490 KSP Residual norm 7.331046814200e-03 2491 KSP Residual norm 8.227580999757e-03 2492 KSP Residual norm 9.024133996447e-03 2493 KSP Residual norm 9.058584392188e-03 2494 
KSP Residual norm 9.224694390954e-03 2495 KSP Residual norm 9.945749770545e-03 2496 KSP Residual norm 1.027578826375e-02 2497 KSP Residual norm 1.025977864628e-02 2498 KSP Residual norm 1.047135323925e-02 2499 KSP Residual norm 1.128179504584e-02 2500 KSP Residual norm 1.163186215290e-02 2501 KSP Residual norm 1.180260491874e-02 2502 KSP Residual norm 1.228759810586e-02 2503 KSP Residual norm 1.261920600272e-02 2504 KSP Residual norm 1.246019870243e-02 2505 KSP Residual norm 1.259940043368e-02 2506 KSP Residual norm 1.235165793218e-02 2507 KSP Residual norm 1.155175172210e-02 2508 KSP Residual norm 1.109419638823e-02 2509 KSP Residual norm 1.112496659786e-02 2510 KSP Residual norm 1.105540691914e-02 2511 KSP Residual norm 1.118139652667e-02 2512 KSP Residual norm 1.045814729055e-02 2513 KSP Residual norm 9.565959357082e-03 2514 KSP Residual norm 9.665109194128e-03 2515 KSP Residual norm 1.033527019297e-02 2516 KSP Residual norm 1.020689592293e-02 2517 KSP Residual norm 9.411272897569e-03 2518 KSP Residual norm 8.552625261265e-03 2519 KSP Residual norm 8.020197971840e-03 2520 KSP Residual norm 8.009439687359e-03 2521 KSP Residual norm 8.357946587350e-03 2522 KSP Residual norm 8.362529612484e-03 2523 KSP Residual norm 8.114648870068e-03 2524 KSP Residual norm 7.807525075191e-03 2525 KSP Residual norm 7.818755683000e-03 2526 KSP Residual norm 7.290236106522e-03 2527 KSP Residual norm 6.766378491383e-03 2528 KSP Residual norm 6.419448174050e-03 2529 KSP Residual norm 6.016044262508e-03 2530 KSP Residual norm 5.457758626948e-03 2531 KSP Residual norm 5.225608365999e-03 2532 KSP Residual norm 5.140382521091e-03 2533 KSP Residual norm 4.847419115490e-03 2534 KSP Residual norm 4.344336303964e-03 2535 KSP Residual norm 3.953836922469e-03 2536 KSP Residual norm 3.759166852375e-03 2537 KSP Residual norm 3.465317723701e-03 2538 KSP Residual norm 3.376037501877e-03 2539 KSP Residual norm 3.465844879655e-03 2540 KSP Residual norm 3.188246161516e-03 2541 KSP Residual norm 
2.806258819422e-03 2542 KSP Residual norm 2.574138120822e-03 2543 KSP Residual norm 2.400153103637e-03 2544 KSP Residual norm 2.256897887269e-03 2545 KSP Residual norm 2.172604881480e-03 2546 KSP Residual norm 2.119275275677e-03 2547 KSP Residual norm 1.934906777052e-03 2548 KSP Residual norm 1.887190889666e-03 2549 KSP Residual norm 2.005042240433e-03 2550 KSP Residual norm 2.003622964132e-03 2551 KSP Residual norm 2.017525563253e-03 2552 KSP Residual norm 2.120766630389e-03 2553 KSP Residual norm 2.030221752738e-03 2554 KSP Residual norm 1.822857717387e-03 2555 KSP Residual norm 1.858280486180e-03 2556 KSP Residual norm 2.066510567964e-03 2557 KSP Residual norm 2.082650460288e-03 2558 KSP Residual norm 1.942917568133e-03 2559 KSP Residual norm 1.939991779578e-03 2560 KSP Residual norm 1.964283335486e-03 2561 KSP Residual norm 1.952533731378e-03 2562 KSP Residual norm 1.942083965188e-03 2563 KSP Residual norm 2.056157903246e-03 2564 KSP Residual norm 2.194497253377e-03 2565 KSP Residual norm 2.371419632789e-03 2566 KSP Residual norm 2.518347065455e-03 2567 KSP Residual norm 2.701302247258e-03 2568 KSP Residual norm 2.795331332162e-03 2569 KSP Residual norm 2.670469528493e-03 2570 KSP Residual norm 2.486617843393e-03 2571 KSP Residual norm 2.490357246694e-03 2572 KSP Residual norm 2.644629342573e-03 2573 KSP Residual norm 2.717135486971e-03 2574 KSP Residual norm 2.817669991730e-03 2575 KSP Residual norm 3.134078794211e-03 2576 KSP Residual norm 3.439559563310e-03 2577 KSP Residual norm 3.525304337067e-03 2578 KSP Residual norm 3.848407219676e-03 2579 KSP Residual norm 4.172120908185e-03 2580 KSP Residual norm 4.147151486596e-03 2581 KSP Residual norm 4.244007764660e-03 2582 KSP Residual norm 4.653581934327e-03 2583 KSP Residual norm 5.337753974150e-03 2584 KSP Residual norm 5.579099169522e-03 2585 KSP Residual norm 5.293906845109e-03 2586 KSP Residual norm 5.188560162742e-03 2587 KSP Residual norm 5.434809181416e-03 2588 KSP Residual norm 5.777278090177e-03 2589 
KSP Residual norm 5.798176790722e-03 2590 KSP Residual norm 6.013124695274e-03 2591 KSP Residual norm 6.449577667500e-03 2592 KSP Residual norm 6.434569035132e-03 2593 KSP Residual norm 6.647043507976e-03 2594 KSP Residual norm 6.766497340129e-03 2595 KSP Residual norm 6.586268634271e-03 2596 KSP Residual norm 6.362863368277e-03 2597 KSP Residual norm 6.972034378214e-03 2598 KSP Residual norm 7.666433212361e-03 2599 KSP Residual norm 7.030591364164e-03 2600 KSP Residual norm 6.261952143498e-03 2601 KSP Residual norm 6.334862749894e-03 2602 KSP Residual norm 6.318795570805e-03 2603 KSP Residual norm 6.172005139057e-03 2604 KSP Residual norm 6.471332821120e-03 2605 KSP Residual norm 6.783784346312e-03 2606 KSP Residual norm 6.416069229820e-03 2607 KSP Residual norm 6.007075002044e-03 2608 KSP Residual norm 5.893058127475e-03 2609 KSP Residual norm 5.683576052440e-03 2610 KSP Residual norm 5.408949172070e-03 2611 KSP Residual norm 5.115837579870e-03 2612 KSP Residual norm 4.845334779024e-03 2613 KSP Residual norm 4.544017248392e-03 2614 KSP Residual norm 4.322561110461e-03 2615 KSP Residual norm 4.051869859407e-03 2616 KSP Residual norm 3.633203941098e-03 2617 KSP Residual norm 3.730359285905e-03 2618 KSP Residual norm 3.953531389059e-03 2619 KSP Residual norm 3.546282614388e-03 2620 KSP Residual norm 2.946557927049e-03 2621 KSP Residual norm 2.744013009653e-03 2622 KSP Residual norm 2.797304890528e-03 2623 KSP Residual norm 2.776063401053e-03 2624 KSP Residual norm 2.666853252490e-03 2625 KSP Residual norm 2.570553908970e-03 2626 KSP Residual norm 2.418137297250e-03 2627 KSP Residual norm 2.342030722541e-03 2628 KSP Residual norm 2.384143106336e-03 2629 KSP Residual norm 2.331581495599e-03 2630 KSP Residual norm 2.118101736682e-03 2631 KSP Residual norm 1.966008740335e-03 2632 KSP Residual norm 1.929165865116e-03 2633 KSP Residual norm 1.858383094757e-03 2634 KSP Residual norm 1.840139851182e-03 2635 KSP Residual norm 1.730583423718e-03 2636 KSP Residual norm 
1.576364084886e-03 2637 KSP Residual norm 1.582887351903e-03 2638 KSP Residual norm 1.615390108123e-03 2639 KSP Residual norm 1.430278769497e-03 2640 KSP Residual norm 1.309774816050e-03 2641 KSP Residual norm 1.307817759557e-03 2642 KSP Residual norm 1.213478325933e-03 2643 KSP Residual norm 1.083290090130e-03 2644 KSP Residual norm 1.085094157745e-03 2645 KSP Residual norm 1.149726040631e-03 2646 KSP Residual norm 1.111283849541e-03 2647 KSP Residual norm 9.855745611789e-04 2648 KSP Residual norm 9.046436388309e-04 2649 KSP Residual norm 8.680357480033e-04 2650 KSP Residual norm 8.813872816119e-04 2651 KSP Residual norm 9.193231486463e-04 2652 KSP Residual norm 9.709608864798e-04 2653 KSP Residual norm 9.548075389208e-04 2654 KSP Residual norm 8.900324621046e-04 2655 KSP Residual norm 8.725312938322e-04 2656 KSP Residual norm 9.082437403545e-04 2657 KSP Residual norm 9.385644497781e-04 2658 KSP Residual norm 9.624593688495e-04 2659 KSP Residual norm 1.012088976398e-03 2660 KSP Residual norm 1.011955489881e-03 2661 KSP Residual norm 9.749066420987e-04 2662 KSP Residual norm 9.855223442161e-04 2663 KSP Residual norm 1.059256654704e-03 2664 KSP Residual norm 1.092440140365e-03 2665 KSP Residual norm 1.090564467516e-03 2666 KSP Residual norm 1.065117419279e-03 2667 KSP Residual norm 1.108393848916e-03 2668 KSP Residual norm 1.210653186918e-03 2669 KSP Residual norm 1.256173308257e-03 2670 KSP Residual norm 1.234334641031e-03 2671 KSP Residual norm 1.310454239223e-03 2672 KSP Residual norm 1.416780503419e-03 2673 KSP Residual norm 1.422885149903e-03 2674 KSP Residual norm 1.388075037904e-03 2675 KSP Residual norm 1.366737962308e-03 2676 KSP Residual norm 1.379201938358e-03 2677 KSP Residual norm 1.351799042850e-03 2678 KSP Residual norm 1.475734405630e-03 2679 KSP Residual norm 1.713326884038e-03 2680 KSP Residual norm 1.828599164666e-03 2681 KSP Residual norm 1.857615319113e-03 2682 KSP Residual norm 1.988256942276e-03 2683 KSP Residual norm 2.095755035175e-03 2684 
KSP Residual norm 2.175776595377e-03 2685 KSP Residual norm 2.437862726306e-03 2686 KSP Residual norm 2.872368978122e-03 2687 KSP Residual norm 2.930121135955e-03 2688 KSP Residual norm 2.785150249971e-03 2689 KSP Residual norm 2.926861935826e-03 2690 KSP Residual norm 3.256586448558e-03 2691 KSP Residual norm 3.394495732335e-03 2692 KSP Residual norm 3.676170240234e-03 2693 KSP Residual norm 4.464070401612e-03 2694 KSP Residual norm 4.925003235818e-03 2695 KSP Residual norm 4.657972593757e-03 2696 KSP Residual norm 4.520776031390e-03 2697 KSP Residual norm 4.826169301751e-03 2698 KSP Residual norm 4.705226785431e-03 2699 KSP Residual norm 4.845834495802e-03 2700 KSP Residual norm 5.450568165750e-03 2701 KSP Residual norm 5.405685886439e-03 2702 KSP Residual norm 5.064404720592e-03 2703 KSP Residual norm 5.159041773853e-03 2704 KSP Residual norm 5.396253694145e-03 2705 KSP Residual norm 5.087777378594e-03 2706 KSP Residual norm 4.416701744863e-03 2707 KSP Residual norm 4.283464780264e-03 2708 KSP Residual norm 4.227776586809e-03 2709 KSP Residual norm 3.816430510896e-03 2710 KSP Residual norm 3.507188080857e-03 2711 KSP Residual norm 3.664141787644e-03 2712 KSP Residual norm 3.882078097164e-03 2713 KSP Residual norm 3.558005800540e-03 2714 KSP Residual norm 3.362556333894e-03 2715 KSP Residual norm 3.381412266198e-03 2716 KSP Residual norm 3.272343545117e-03 2717 KSP Residual norm 3.098166203496e-03 2718 KSP Residual norm 2.918170899895e-03 2719 KSP Residual norm 2.805589012223e-03 2720 KSP Residual norm 2.617461682809e-03 2721 KSP Residual norm 2.498203821414e-03 2722 KSP Residual norm 2.555703948928e-03 2723 KSP Residual norm 2.704929910323e-03 2724 KSP Residual norm 2.704804499672e-03 2725 KSP Residual norm 2.422352305371e-03 2726 KSP Residual norm 2.146010215001e-03 2727 KSP Residual norm 2.157191475484e-03 2728 KSP Residual norm 2.145336994135e-03 2729 KSP Residual norm 2.086378289458e-03 2730 KSP Residual norm 2.229778840444e-03 2731 KSP Residual norm 
2.535817164615e-03 2732 KSP Residual norm 2.632680974580e-03 2733 KSP Residual norm 2.488486653901e-03 2734 KSP Residual norm 2.374380582123e-03 2735 KSP Residual norm 2.430738511771e-03 2736 KSP Residual norm 2.420248713523e-03 2737 KSP Residual norm 2.302847190148e-03 2738 KSP Residual norm 2.254438668835e-03 2739 KSP Residual norm 2.206384051676e-03 2740 KSP Residual norm 2.026611506674e-03 2741 KSP Residual norm 1.863862978796e-03 2742 KSP Residual norm 1.791918577784e-03 2743 KSP Residual norm 1.589158171238e-03 2744 KSP Residual norm 1.448351414910e-03 2745 KSP Residual norm 1.462011618822e-03 2746 KSP Residual norm 1.546262244036e-03 2747 KSP Residual norm 1.457974955196e-03 2748 KSP Residual norm 1.358665370080e-03 2749 KSP Residual norm 1.485787041264e-03 2750 KSP Residual norm 1.649416402076e-03 2751 KSP Residual norm 1.520063285568e-03 2752 KSP Residual norm 1.394521339503e-03 2753 KSP Residual norm 1.542474992449e-03 2754 KSP Residual norm 1.617270768548e-03 2755 KSP Residual norm 1.484129864690e-03 2756 KSP Residual norm 1.489982256038e-03 2757 KSP Residual norm 1.634619136561e-03 2758 KSP Residual norm 1.576841752283e-03 2759 KSP Residual norm 1.484849375883e-03 2760 KSP Residual norm 1.588611441559e-03 2761 KSP Residual norm 1.788271668052e-03 2762 KSP Residual norm 1.838564643069e-03 2763 KSP Residual norm 1.956549529214e-03 2764 KSP Residual norm 2.153577292642e-03 2765 KSP Residual norm 2.158929997251e-03 2766 KSP Residual norm 2.004385674663e-03 2767 KSP Residual norm 2.025979711910e-03 2768 KSP Residual norm 2.272282762743e-03 2769 KSP Residual norm 2.482726499242e-03 2770 KSP Residual norm 2.703743415817e-03 2771 KSP Residual norm 3.014984680178e-03 2772 KSP Residual norm 3.362472240067e-03 2773 KSP Residual norm 3.569907893456e-03 2774 KSP Residual norm 3.793544712662e-03 2775 KSP Residual norm 3.971139276993e-03 2776 KSP Residual norm 4.087569126960e-03 2777 KSP Residual norm 4.363456089764e-03 2778 KSP Residual norm 4.625571421248e-03 2779 
KSP Residual norm 4.624976548773e-03 2780 KSP Residual norm 4.718849386844e-03 2781 KSP Residual norm 4.786949244693e-03 2782 KSP Residual norm 4.771666421559e-03 2783 KSP Residual norm 5.104231978217e-03 2784 KSP Residual norm 5.734907457343e-03 2785 KSP Residual norm 5.764605173383e-03 2786 KSP Residual norm 5.426953091590e-03 2787 KSP Residual norm 5.580034553178e-03 2788 KSP Residual norm 5.949693184172e-03 2789 KSP Residual norm 5.996671739248e-03 2790 KSP Residual norm 6.157706620644e-03 2791 KSP Residual norm 6.848408423874e-03 2792 KSP Residual norm 7.066338896982e-03 2793 KSP Residual norm 6.587699824354e-03 2794 KSP Residual norm 6.594254657823e-03 2795 KSP Residual norm 7.284894664177e-03 2796 KSP Residual norm 7.388794406292e-03 2797 KSP Residual norm 6.742009621402e-03 2798 KSP Residual norm 6.669693683210e-03 2799 KSP Residual norm 6.280792363793e-03 2800 KSP Residual norm 5.548612251379e-03 2801 KSP Residual norm 5.358466425664e-03 2802 KSP Residual norm 5.617501994186e-03 2803 KSP Residual norm 5.319539313742e-03 2804 KSP Residual norm 4.781893806291e-03 2805 KSP Residual norm 4.741739705150e-03 2806 KSP Residual norm 4.981281036051e-03 2807 KSP Residual norm 4.723399945724e-03 2808 KSP Residual norm 4.341339983627e-03 2809 KSP Residual norm 4.651093018266e-03 2810 KSP Residual norm 5.040994656518e-03 2811 KSP Residual norm 4.479552064666e-03 2812 KSP Residual norm 3.830957288486e-03 2813 KSP Residual norm 3.533215940491e-03 2814 KSP Residual norm 3.438896124903e-03 2815 KSP Residual norm 3.205565080589e-03 2816 KSP Residual norm 3.178743406165e-03 2817 KSP Residual norm 3.240179887624e-03 2818 KSP Residual norm 3.158195637160e-03 2819 KSP Residual norm 2.892701103103e-03 2820 KSP Residual norm 2.797142641246e-03 2821 KSP Residual norm 2.779326882773e-03 2822 KSP Residual norm 2.617507251360e-03 2823 KSP Residual norm 2.350097638705e-03 2824 KSP Residual norm 2.096293170896e-03 2825 KSP Residual norm 1.948274586444e-03 2826 KSP Residual norm 
 2827 KSP Residual norm 1.738739744696e-03
 2828 KSP Residual norm 1.678668895603e-03
 [... roughly 1000 lines of KSP residual-norm monitor output elided:
 over iterations 2829-3822 the residual norm oscillates between about
 2.9e-04 and 1.9e-02 without converging ...]
 3823 KSP Residual norm 7.242315061307e-03
KSP Residual norm 7.198365334206e-03 3825 KSP Residual norm 7.905688770395e-03 3826 KSP Residual norm 8.467086227498e-03 3827 KSP Residual norm 8.994802372687e-03 3828 KSP Residual norm 9.159103811113e-03 3829 KSP Residual norm 9.321363538906e-03 3830 KSP Residual norm 9.340719550397e-03 3831 KSP Residual norm 8.794202141717e-03 3832 KSP Residual norm 7.320236022828e-03 3833 KSP Residual norm 6.202776925164e-03 3834 KSP Residual norm 6.002133205176e-03 3835 KSP Residual norm 5.774001141672e-03 3836 KSP Residual norm 5.522115330165e-03 3837 KSP Residual norm 5.750920916907e-03 3838 KSP Residual norm 5.784424597201e-03 3839 KSP Residual norm 5.504244278726e-03 3840 KSP Residual norm 5.786508402931e-03 3841 KSP Residual norm 6.258709456382e-03 3842 KSP Residual norm 6.184804909771e-03 3843 KSP Residual norm 6.356004268985e-03 3844 KSP Residual norm 6.602123566606e-03 3845 KSP Residual norm 6.629844664195e-03 3846 KSP Residual norm 6.591228941087e-03 3847 KSP Residual norm 6.622254373335e-03 3848 KSP Residual norm 7.022275135718e-03 3849 KSP Residual norm 7.837290866996e-03 3850 KSP Residual norm 8.123428859581e-03 3851 KSP Residual norm 7.749126144592e-03 3852 KSP Residual norm 8.163895683028e-03 3853 KSP Residual norm 9.515605767568e-03 3854 KSP Residual norm 1.001839354636e-02 3855 KSP Residual norm 9.104095295020e-03 3856 KSP Residual norm 8.774792035458e-03 3857 KSP Residual norm 9.217738089275e-03 3858 KSP Residual norm 1.018684571783e-02 3859 KSP Residual norm 9.748869663756e-03 3860 KSP Residual norm 7.794010844671e-03 3861 KSP Residual norm 6.623638961256e-03 3862 KSP Residual norm 6.355345934030e-03 3863 KSP Residual norm 5.763380439604e-03 3864 KSP Residual norm 4.581589857017e-03 3865 KSP Residual norm 4.204658458253e-03 3866 KSP Residual norm 4.358757553644e-03 3867 KSP Residual norm 4.372774461814e-03 3868 KSP Residual norm 4.075135303573e-03 3869 KSP Residual norm 3.988355680697e-03 3870 KSP Residual norm 4.074966854814e-03 3871 KSP Residual norm 
4.428383285782e-03 3872 KSP Residual norm 4.992694123191e-03 3873 KSP Residual norm 5.345483223032e-03 3874 KSP Residual norm 5.907896699672e-03 3875 KSP Residual norm 6.791550695225e-03 3876 KSP Residual norm 7.453043324368e-03 3877 KSP Residual norm 7.316237666160e-03 3878 KSP Residual norm 7.638203490970e-03 3879 KSP Residual norm 8.189926544721e-03 3880 KSP Residual norm 8.172873192297e-03 3881 KSP Residual norm 8.202362329638e-03 3882 KSP Residual norm 7.875925066871e-03 3883 KSP Residual norm 7.302825340613e-03 3884 KSP Residual norm 6.769260297802e-03 3885 KSP Residual norm 5.824290141855e-03 3886 KSP Residual norm 4.628515099291e-03 3887 KSP Residual norm 4.344626413698e-03 3888 KSP Residual norm 4.438663569125e-03 3889 KSP Residual norm 3.802257686884e-03 3890 KSP Residual norm 3.095255544821e-03 3891 KSP Residual norm 2.848210691569e-03 3892 KSP Residual norm 2.657250095577e-03 3893 KSP Residual norm 2.276320456704e-03 3894 KSP Residual norm 2.037450393309e-03 3895 KSP Residual norm 1.915467200636e-03 3896 KSP Residual norm 1.719587131875e-03 3897 KSP Residual norm 1.524609344566e-03 3898 KSP Residual norm 1.290825461981e-03 3899 KSP Residual norm 1.168883320575e-03 3900 KSP Residual norm 1.283331231701e-03 3901 KSP Residual norm 1.436126987057e-03 3902 KSP Residual norm 1.361955006227e-03 3903 KSP Residual norm 1.338175783573e-03 3904 KSP Residual norm 1.570638634894e-03 3905 KSP Residual norm 1.713508983336e-03 3906 KSP Residual norm 1.677892560676e-03 3907 KSP Residual norm 1.813736764789e-03 3908 KSP Residual norm 2.242120276841e-03 3909 KSP Residual norm 2.698144730117e-03 3910 KSP Residual norm 2.943518756740e-03 3911 KSP Residual norm 2.811756037698e-03 3912 KSP Residual norm 2.699136252558e-03 3913 KSP Residual norm 2.913649753289e-03 3914 KSP Residual norm 2.953715188845e-03 3915 KSP Residual norm 2.652624157403e-03 3916 KSP Residual norm 2.673256732648e-03 3917 KSP Residual norm 2.724974376132e-03 3918 KSP Residual norm 2.333190412935e-03 3919 
KSP Residual norm 1.956168059339e-03 3920 KSP Residual norm 1.871657576280e-03 3921 KSP Residual norm 1.899763659179e-03 3922 KSP Residual norm 1.997132487780e-03 3923 KSP Residual norm 2.058263607911e-03 3924 KSP Residual norm 1.903797645321e-03 3925 KSP Residual norm 1.710810284625e-03 3926 KSP Residual norm 1.557951987260e-03 3927 KSP Residual norm 1.380954604580e-03 3928 KSP Residual norm 1.273116266406e-03 3929 KSP Residual norm 1.281841353544e-03 3930 KSP Residual norm 1.277594278924e-03 3931 KSP Residual norm 1.277687865662e-03 3932 KSP Residual norm 1.353717396913e-03 3933 KSP Residual norm 1.402807119549e-03 3934 KSP Residual norm 1.422277591573e-03 3935 KSP Residual norm 1.625924173011e-03 3936 KSP Residual norm 1.975493126500e-03 3937 KSP Residual norm 2.196794890192e-03 3938 KSP Residual norm 2.341207215937e-03 3939 KSP Residual norm 2.430487228105e-03 3940 KSP Residual norm 2.495149065372e-03 3941 KSP Residual norm 2.778120305213e-03 3942 KSP Residual norm 3.124408930176e-03 3943 KSP Residual norm 3.190882529742e-03 3944 KSP Residual norm 3.373692722048e-03 3945 KSP Residual norm 3.709821475184e-03 3946 KSP Residual norm 3.491173160141e-03 3947 KSP Residual norm 3.063171306949e-03 3948 KSP Residual norm 2.800498250288e-03 3949 KSP Residual norm 2.768468359212e-03 3950 KSP Residual norm 2.789753714034e-03 3951 KSP Residual norm 2.738210884861e-03 3952 KSP Residual norm 2.465213574489e-03 3953 KSP Residual norm 2.265157643342e-03 3954 KSP Residual norm 2.358747748701e-03 3955 KSP Residual norm 2.411573013505e-03 3956 KSP Residual norm 2.089312592337e-03 3957 KSP Residual norm 1.933501287631e-03 3958 KSP Residual norm 2.108368215422e-03 3959 KSP Residual norm 2.219814652133e-03 3960 KSP Residual norm 2.110738251193e-03 3961 KSP Residual norm 2.234926751875e-03 3962 KSP Residual norm 2.593577306467e-03 3963 KSP Residual norm 2.983545200723e-03 3964 KSP Residual norm 3.078357910477e-03 3965 KSP Residual norm 2.966701504541e-03 3966 KSP Residual norm 
3.149638391658e-03 3967 KSP Residual norm 3.445184167452e-03 3968 KSP Residual norm 3.453035160513e-03 3969 KSP Residual norm 3.502134926831e-03 3970 KSP Residual norm 4.085379176779e-03 3971 KSP Residual norm 4.590664868093e-03 3972 KSP Residual norm 4.469800844813e-03 3973 KSP Residual norm 4.237203899081e-03 3974 KSP Residual norm 4.128879619177e-03 3975 KSP Residual norm 4.125836613557e-03 3976 KSP Residual norm 4.214714507964e-03 3977 KSP Residual norm 4.105466834480e-03 3978 KSP Residual norm 3.934598316084e-03 3979 KSP Residual norm 3.912392761648e-03 3980 KSP Residual norm 3.798478103772e-03 3981 KSP Residual norm 3.562448082973e-03 3982 KSP Residual norm 3.545680883614e-03 3983 KSP Residual norm 3.645794420826e-03 3984 KSP Residual norm 3.592630361608e-03 3985 KSP Residual norm 3.553548998729e-03 3986 KSP Residual norm 3.700141927027e-03 3987 KSP Residual norm 3.567627859690e-03 3988 KSP Residual norm 3.390844827108e-03 3989 KSP Residual norm 3.281308568522e-03 3990 KSP Residual norm 2.991696242542e-03 3991 KSP Residual norm 2.792869991820e-03 3992 KSP Residual norm 3.010884760411e-03 3993 KSP Residual norm 3.227496452664e-03 3994 KSP Residual norm 3.412383234711e-03 3995 KSP Residual norm 3.795193937763e-03 3996 KSP Residual norm 4.149179729550e-03 3997 KSP Residual norm 4.539427431375e-03 3998 KSP Residual norm 5.416436361368e-03 3999 KSP Residual norm 6.300588368756e-03 4000 KSP Residual norm 6.167787442207e-03 4001 KSP Residual norm 6.292404625557e-03 4002 KSP Residual norm 7.008852289372e-03 4003 KSP Residual norm 7.565498299997e-03 4004 KSP Residual norm 8.120854469308e-03 4005 KSP Residual norm 8.713611599056e-03 4006 KSP Residual norm 9.677399641609e-03 4007 KSP Residual norm 1.054689757176e-02 4008 KSP Residual norm 1.087582928849e-02 4009 KSP Residual norm 1.007451883091e-02 4010 KSP Residual norm 1.005418117457e-02 4011 KSP Residual norm 1.124097769836e-02 4012 KSP Residual norm 1.148649707407e-02 4013 KSP Residual norm 9.941201966828e-03 4014 
KSP Residual norm 8.901971081700e-03 4015 KSP Residual norm 8.696142532189e-03 4016 KSP Residual norm 8.404181169403e-03 4017 KSP Residual norm 7.741560127934e-03 4018 KSP Residual norm 6.870162833530e-03 4019 KSP Residual norm 6.409642477895e-03 4020 KSP Residual norm 6.069328086156e-03 4021 KSP Residual norm 5.776827413134e-03 4022 KSP Residual norm 5.670138057287e-03 4023 KSP Residual norm 6.239748496973e-03 4024 KSP Residual norm 6.364078567745e-03 4025 KSP Residual norm 5.480279527492e-03 4026 KSP Residual norm 5.420611891364e-03 4027 KSP Residual norm 6.693178523687e-03 4028 KSP Residual norm 7.573505928781e-03 4029 KSP Residual norm 7.355062725531e-03 4030 KSP Residual norm 7.865815366378e-03 4031 KSP Residual norm 8.913608852850e-03 4032 KSP Residual norm 9.353830422719e-03 4033 KSP Residual norm 9.445000231771e-03 4034 KSP Residual norm 9.764659241521e-03 4035 KSP Residual norm 1.100159122518e-02 4036 KSP Residual norm 1.285288256792e-02 4037 KSP Residual norm 1.335029056582e-02 4038 KSP Residual norm 1.294587675911e-02 4039 KSP Residual norm 1.422094898257e-02 4040 KSP Residual norm 1.628215393699e-02 4041 KSP Residual norm 1.696945076695e-02 4042 KSP Residual norm 1.694930718735e-02 4043 KSP Residual norm 1.843978945611e-02 4044 KSP Residual norm 2.052299648808e-02 4045 KSP Residual norm 2.211748841535e-02 4046 KSP Residual norm 2.321705907275e-02 4047 KSP Residual norm 2.341689257375e-02 4048 KSP Residual norm 2.416178335527e-02 4049 KSP Residual norm 2.599295943495e-02 4050 KSP Residual norm 2.514273558437e-02 4051 KSP Residual norm 2.373749571765e-02 4052 KSP Residual norm 2.496239301966e-02 4053 KSP Residual norm 2.384572486980e-02 4054 KSP Residual norm 2.189786972011e-02 4055 KSP Residual norm 2.172818225751e-02 4056 KSP Residual norm 2.205485921263e-02 4057 KSP Residual norm 2.086284956444e-02 4058 KSP Residual norm 1.978949412410e-02 4059 KSP Residual norm 1.903694302453e-02 4060 KSP Residual norm 1.756156298591e-02 4061 KSP Residual norm 
1.650577386275e-02 4062 KSP Residual norm 1.612937501544e-02 4063 KSP Residual norm 1.558395950881e-02 4064 KSP Residual norm 1.577750699603e-02 4065 KSP Residual norm 1.655934045560e-02 4066 KSP Residual norm 1.473611853106e-02 4067 KSP Residual norm 1.377019185090e-02 4068 KSP Residual norm 1.564529179969e-02 4069 KSP Residual norm 1.774505414733e-02 4070 KSP Residual norm 1.868732589035e-02 4071 KSP Residual norm 1.991885818140e-02 4072 KSP Residual norm 2.012862143662e-02 4073 KSP Residual norm 1.988132788479e-02 4074 KSP Residual norm 2.115103418971e-02 4075 KSP Residual norm 2.358979444836e-02 4076 KSP Residual norm 2.611719633359e-02 4077 KSP Residual norm 3.035047492284e-02 4078 KSP Residual norm 3.352971782032e-02 4079 KSP Residual norm 3.578598417072e-02 4080 KSP Residual norm 4.263841989420e-02 4081 KSP Residual norm 5.165886197902e-02 4082 KSP Residual norm 5.486201128347e-02 4083 KSP Residual norm 5.049910298019e-02 4084 KSP Residual norm 4.605553120815e-02 4085 KSP Residual norm 4.490494642830e-02 4086 KSP Residual norm 4.829641408881e-02 4087 KSP Residual norm 5.501933354121e-02 4088 KSP Residual norm 5.762181496335e-02 4089 KSP Residual norm 6.184507763720e-02 4090 KSP Residual norm 6.799001535709e-02 4091 KSP Residual norm 6.699676586533e-02 4092 KSP Residual norm 6.137143947571e-02 4093 KSP Residual norm 6.145208768691e-02 4094 KSP Residual norm 6.447849143416e-02 4095 KSP Residual norm 6.677785223841e-02 4096 KSP Residual norm 6.994893194717e-02 4097 KSP Residual norm 7.259122470004e-02 4098 KSP Residual norm 6.628765269480e-02 4099 KSP Residual norm 5.769623044857e-02 4100 KSP Residual norm 4.954700349837e-02 4101 KSP Residual norm 4.471952485641e-02 4102 KSP Residual norm 4.250378170916e-02 4103 KSP Residual norm 4.024537134283e-02 4104 KSP Residual norm 3.712590771737e-02 4105 KSP Residual norm 3.736567911800e-02 4106 KSP Residual norm 3.892325200756e-02 4107 KSP Residual norm 3.618833648615e-02 4108 KSP Residual norm 3.274608263635e-02 4109 
KSP Residual norm 3.302857805750e-02 4110 KSP Residual norm 3.368772472420e-02 4111 KSP Residual norm 3.084212384729e-02 4112 KSP Residual norm 2.668617404434e-02 4113 KSP Residual norm 2.604814110785e-02 4114 KSP Residual norm 2.984441792959e-02 4115 KSP Residual norm 3.794117643623e-02 4116 KSP Residual norm 4.452640277485e-02 4117 KSP Residual norm 4.683760157063e-02 4118 KSP Residual norm 4.627721780100e-02 4119 KSP Residual norm 4.750187095074e-02 4120 KSP Residual norm 5.246150669565e-02 4121 KSP Residual norm 6.162377530446e-02 4122 KSP Residual norm 6.940348416890e-02 4123 KSP Residual norm 6.691640581601e-02 4124 KSP Residual norm 6.090565988371e-02 4125 KSP Residual norm 5.826128057062e-02 4126 KSP Residual norm 5.637134138895e-02 4127 KSP Residual norm 5.479174409528e-02 4128 KSP Residual norm 5.077100516913e-02 4129 KSP Residual norm 4.710560934884e-02 4130 KSP Residual norm 4.689111810416e-02 4131 KSP Residual norm 4.582858436072e-02 4132 KSP Residual norm 4.018768421611e-02 4133 KSP Residual norm 3.983785178650e-02 4134 KSP Residual norm 3.979030025495e-02 4135 KSP Residual norm 3.514106000032e-02 4136 KSP Residual norm 3.181214922977e-02 4137 KSP Residual norm 3.383059959436e-02 4138 KSP Residual norm 3.435143887120e-02 4139 KSP Residual norm 3.201574560649e-02 4140 KSP Residual norm 3.025203926137e-02 4141 KSP Residual norm 2.955297935694e-02 4142 KSP Residual norm 3.005511786720e-02 4143 KSP Residual norm 3.134659999092e-02 4144 KSP Residual norm 3.152611526486e-02 4145 KSP Residual norm 3.378896641577e-02 4146 KSP Residual norm 3.767631864871e-02 4147 KSP Residual norm 3.872142644142e-02 4148 KSP Residual norm 3.788041737293e-02 4149 KSP Residual norm 4.003507550931e-02 4150 KSP Residual norm 4.352619292239e-02 4151 KSP Residual norm 4.722291075145e-02 4152 KSP Residual norm 5.312430242758e-02 4153 KSP Residual norm 6.384037240444e-02 4154 KSP Residual norm 6.976262480136e-02 4155 KSP Residual norm 6.727404192307e-02 4156 KSP Residual norm 
6.278491467126e-02 4157 KSP Residual norm 5.903721072657e-02 4158 KSP Residual norm 5.770156694826e-02 4159 KSP Residual norm 5.719171927231e-02 4160 KSP Residual norm 5.405639928914e-02 4161 KSP Residual norm 5.104723515948e-02 4162 KSP Residual norm 5.246997457170e-02 4163 KSP Residual norm 5.085264261434e-02 4164 KSP Residual norm 4.639547771841e-02 4165 KSP Residual norm 4.424276902582e-02 4166 KSP Residual norm 3.894890310919e-02 4167 KSP Residual norm 3.188992989941e-02 4168 KSP Residual norm 2.891759071865e-02 4169 KSP Residual norm 2.911735950990e-02 4170 KSP Residual norm 2.737932630824e-02 4171 KSP Residual norm 2.513876186543e-02 4172 KSP Residual norm 2.334986682740e-02 4173 KSP Residual norm 2.168115377523e-02 4174 KSP Residual norm 2.196663323481e-02 4175 KSP Residual norm 2.301565649075e-02 4176 KSP Residual norm 2.151719060006e-02 4177 KSP Residual norm 2.006323971901e-02 4178 KSP Residual norm 2.055958248099e-02 4179 KSP Residual norm 2.085541329820e-02 4180 KSP Residual norm 1.971004760518e-02 4181 KSP Residual norm 1.928401803922e-02 4182 KSP Residual norm 1.961270292296e-02 4183 KSP Residual norm 2.032814605531e-02 4184 KSP Residual norm 2.084056987718e-02 4185 KSP Residual norm 1.921762997018e-02 4186 KSP Residual norm 1.847233155789e-02 4187 KSP Residual norm 2.049672581148e-02 4188 KSP Residual norm 2.389381826180e-02 4189 KSP Residual norm 2.589106108971e-02 4190 KSP Residual norm 2.829041676912e-02 4191 KSP Residual norm 3.107395419930e-02 4192 KSP Residual norm 3.313222908529e-02 4193 KSP Residual norm 3.574395005143e-02 4194 KSP Residual norm 3.763611194066e-02 4195 KSP Residual norm 3.783113059850e-02 4196 KSP Residual norm 3.810303732131e-02 4197 KSP Residual norm 3.910627755148e-02 4198 KSP Residual norm 4.087449554290e-02 4199 KSP Residual norm 4.133407095921e-02 4200 KSP Residual norm 3.912211956128e-02 4201 KSP Residual norm 3.584684092597e-02 4202 KSP Residual norm 3.410416417466e-02 4203 KSP Residual norm 3.089408696511e-02 4204 
KSP Residual norm 2.721438390337e-02 4205 KSP Residual norm 2.542328046610e-02 4206 KSP Residual norm 2.534751439496e-02 4207 KSP Residual norm 2.201163504458e-02 4208 KSP Residual norm 1.897173081031e-02 4209 KSP Residual norm 1.778612566789e-02 4210 KSP Residual norm 1.658467259507e-02 4211 KSP Residual norm 1.463353687131e-02 4212 KSP Residual norm 1.372266039403e-02 4213 KSP Residual norm 1.386600833626e-02 4214 KSP Residual norm 1.413474215759e-02 4215 KSP Residual norm 1.364982865091e-02 4216 KSP Residual norm 1.220735256835e-02 4217 KSP Residual norm 1.056331911208e-02 4218 KSP Residual norm 1.088532116905e-02 4219 KSP Residual norm 1.122182851744e-02 4220 KSP Residual norm 1.024576915002e-02 4221 KSP Residual norm 9.685094919381e-03 4222 KSP Residual norm 1.020841898353e-02 4223 KSP Residual norm 1.026862157750e-02 4224 KSP Residual norm 9.442588629808e-03 4225 KSP Residual norm 8.810335436026e-03 4226 KSP Residual norm 8.722293627830e-03 4227 KSP Residual norm 9.293531504949e-03 4228 KSP Residual norm 9.822152110870e-03 4229 KSP Residual norm 9.889328955851e-03 4230 KSP Residual norm 1.070451223944e-02 4231 KSP Residual norm 1.238185370013e-02 4232 KSP Residual norm 1.378262595931e-02 4233 KSP Residual norm 1.535885577032e-02 4234 KSP Residual norm 1.790240334794e-02 4235 KSP Residual norm 1.878800842548e-02 4236 KSP Residual norm 1.710013687941e-02 4237 KSP Residual norm 1.594068750413e-02 4238 KSP Residual norm 1.655386570933e-02 4239 KSP Residual norm 1.749197146271e-02 4240 KSP Residual norm 1.952041724866e-02 4241 KSP Residual norm 2.144960774538e-02 4242 KSP Residual norm 2.157263928537e-02 4243 KSP Residual norm 2.121099531783e-02 4244 KSP Residual norm 2.009938524786e-02 4245 KSP Residual norm 1.813194249856e-02 4246 KSP Residual norm 1.801936846260e-02 4247 KSP Residual norm 1.971323604789e-02 4248 KSP Residual norm 1.983009954623e-02 4249 KSP Residual norm 1.892251611939e-02 4250 KSP Residual norm 1.889829507085e-02 4251 KSP Residual norm 
1.867747261733e-02 4252 KSP Residual norm 1.792211688059e-02 4253 KSP Residual norm 1.674841349386e-02 4254 KSP Residual norm 1.585124779596e-02 4255 KSP Residual norm 1.599284123319e-02 4256 KSP Residual norm 1.576600388785e-02 4257 KSP Residual norm 1.491558449498e-02 4258 KSP Residual norm 1.443848922019e-02 4259 KSP Residual norm 1.393131558042e-02 4260 KSP Residual norm 1.243906245472e-02 4261 KSP Residual norm 1.001290372198e-02 4262 KSP Residual norm 8.511398912990e-03 4263 KSP Residual norm 8.066213770473e-03 4264 KSP Residual norm 8.227340391243e-03 4265 KSP Residual norm 8.500394900963e-03 4266 KSP Residual norm 9.014187567342e-03 4267 KSP Residual norm 9.462171200366e-03 4268 KSP Residual norm 9.889216057286e-03 4269 KSP Residual norm 8.953715920124e-03 4270 KSP Residual norm 7.793467920104e-03 4271 KSP Residual norm 7.605625436814e-03 4272 KSP Residual norm 8.237896802035e-03 4273 KSP Residual norm 8.962565770242e-03 4274 KSP Residual norm 9.700766071067e-03 4275 KSP Residual norm 1.057777171600e-02 4276 KSP Residual norm 1.163363620762e-02 4277 KSP Residual norm 1.208131610918e-02 4278 KSP Residual norm 1.201757014741e-02 4279 KSP Residual norm 1.250389070910e-02 4280 KSP Residual norm 1.272933054374e-02 4281 KSP Residual norm 1.216689327075e-02 4282 KSP Residual norm 1.193828250704e-02 4283 KSP Residual norm 1.246689184820e-02 4284 KSP Residual norm 1.312474726255e-02 4285 KSP Residual norm 1.263541419629e-02 4286 KSP Residual norm 1.172921671573e-02 4287 KSP Residual norm 1.248007088743e-02 4288 KSP Residual norm 1.407592826539e-02 4289 KSP Residual norm 1.413448717270e-02 4290 KSP Residual norm 1.357751255638e-02 4291 KSP Residual norm 1.452968722840e-02 4292 KSP Residual norm 1.584177590728e-02 4293 KSP Residual norm 1.465715531433e-02 4294 KSP Residual norm 1.333612525778e-02 4295 KSP Residual norm 1.384375993868e-02 4296 KSP Residual norm 1.386915622517e-02 4297 KSP Residual norm 1.261838066562e-02 4298 KSP Residual norm 1.175311334406e-02 4299 
KSP Residual norm 1.215236500646e-02 4300 KSP Residual norm 1.255994524421e-02 4301 KSP Residual norm 1.124460072384e-02 4302 KSP Residual norm 9.343985657655e-03 4303 KSP Residual norm 8.335202935980e-03 4304 KSP Residual norm 8.285002658333e-03 4305 KSP Residual norm 8.067162735187e-03 4306 KSP Residual norm 7.348408965037e-03 4307 KSP Residual norm 7.251382815039e-03 4308 KSP Residual norm 7.724153703956e-03 4309 KSP Residual norm 8.165541009594e-03 4310 KSP Residual norm 8.161988205972e-03 4311 KSP Residual norm 8.837952909069e-03 4312 KSP Residual norm 9.636726196004e-03 4313 KSP Residual norm 1.063023191083e-02 4314 KSP Residual norm 1.091507928602e-02 4315 KSP Residual norm 1.148995199751e-02 4316 KSP Residual norm 1.256403293086e-02 4317 KSP Residual norm 1.295631262711e-02 4318 KSP Residual norm 1.236584792169e-02 4319 KSP Residual norm 1.206236777988e-02 4320 KSP Residual norm 1.263655891020e-02 4321 KSP Residual norm 1.304027979574e-02 4322 KSP Residual norm 1.334171959374e-02 4323 KSP Residual norm 1.373594325953e-02 4324 KSP Residual norm 1.329683296616e-02 4325 KSP Residual norm 1.372400313264e-02 4326 KSP Residual norm 1.516615141962e-02 4327 KSP Residual norm 1.673829022112e-02 4328 KSP Residual norm 1.889700559647e-02 4329 KSP Residual norm 2.206670894884e-02 4330 KSP Residual norm 2.430567542455e-02 4331 KSP Residual norm 2.599881652863e-02 4332 KSP Residual norm 2.939128723687e-02 4333 KSP Residual norm 3.303847575911e-02 4334 KSP Residual norm 3.261819884770e-02 4335 KSP Residual norm 3.227755508146e-02 4336 KSP Residual norm 3.516514611817e-02 4337 KSP Residual norm 3.714248843602e-02 4338 KSP Residual norm 3.687957608601e-02 4339 KSP Residual norm 3.547144948854e-02 4340 KSP Residual norm 3.538730000684e-02 4341 KSP Residual norm 3.747209848009e-02 4342 KSP Residual norm 4.033719367847e-02 4343 KSP Residual norm 4.188069048659e-02 4344 KSP Residual norm 4.027148882804e-02 4345 KSP Residual norm 3.822077958758e-02 4346 KSP Residual norm 
3.653243560561e-02 4347 KSP Residual norm 3.511659944011e-02 4348 KSP Residual norm 3.206975127852e-02 4349 KSP Residual norm 3.086545559654e-02 4350 KSP Residual norm 3.127029182767e-02 4351 KSP Residual norm 3.044384193361e-02 4352 KSP Residual norm 2.846068960966e-02 4353 KSP Residual norm 2.777564701555e-02 4354 KSP Residual norm 2.569979107924e-02 4355 KSP Residual norm 2.331487808072e-02 4356 KSP Residual norm 2.253010482072e-02 4357 KSP Residual norm 2.216228329378e-02 4358 KSP Residual norm 2.043967396656e-02 4359 KSP Residual norm 2.023111342518e-02 4360 KSP Residual norm 2.187629590988e-02 4361 KSP Residual norm 2.184819232415e-02 4362 KSP Residual norm 1.822781641288e-02 4363 KSP Residual norm 1.558867590643e-02 4364 KSP Residual norm 1.575311061178e-02 4365 KSP Residual norm 1.809224181664e-02 4366 KSP Residual norm 2.130859643515e-02 4367 KSP Residual norm 2.376103624940e-02 4368 KSP Residual norm 2.502147297115e-02 4369 KSP Residual norm 2.623400155944e-02 4370 KSP Residual norm 2.739120320655e-02 4371 KSP Residual norm 2.942773841794e-02 4372 KSP Residual norm 3.200238216774e-02 4373 KSP Residual norm 3.338058966096e-02 4374 KSP Residual norm 3.313429368699e-02 4375 KSP Residual norm 3.461235426972e-02 4376 KSP Residual norm 3.811517365282e-02 4377 KSP Residual norm 4.046572050369e-02 4378 KSP Residual norm 4.134439617517e-02 4379 KSP Residual norm 4.562675841668e-02 4380 KSP Residual norm 5.212114202472e-02 4381 KSP Residual norm 5.456690035163e-02 4382 KSP Residual norm 5.225156155629e-02 4383 KSP Residual norm 5.154045850229e-02 4384 KSP Residual norm 5.583443611844e-02 4385 KSP Residual norm 6.290982861831e-02 4386 KSP Residual norm 6.450934490892e-02 4387 KSP Residual norm 6.225312631384e-02 4388 KSP Residual norm 6.204275106502e-02 4389 KSP Residual norm 6.709831924677e-02 4390 KSP Residual norm 7.303918792821e-02 4391 KSP Residual norm 7.607221642580e-02 4392 KSP Residual norm 7.785470924369e-02 4393 KSP Residual norm 7.826606828851e-02 4394 
KSP Residual norm 7.967619320243e-02 4395 KSP Residual norm 7.984488955252e-02 4396 KSP Residual norm 7.594015846163e-02 4397 KSP Residual norm 7.448891051246e-02 4398 KSP Residual norm 7.395679493806e-02 4399 KSP Residual norm 7.079387810796e-02 4400 KSP Residual norm 6.814797450049e-02 4401 KSP Residual norm 6.878511005462e-02 4402 KSP Residual norm 6.821471316216e-02 4403 KSP Residual norm 6.129683459287e-02 4404 KSP Residual norm 5.513161157759e-02 4405 KSP Residual norm 5.400916771858e-02 4406 KSP Residual norm 5.713751137240e-02 4407 KSP Residual norm 5.508343444529e-02 4408 KSP Residual norm 4.958965746627e-02 4409 KSP Residual norm 4.482339821928e-02 4410 KSP Residual norm 4.083577232812e-02 4411 KSP Residual norm 3.957885546103e-02 4412 KSP Residual norm 3.967240796630e-02 4413 KSP Residual norm 3.728369645287e-02 4414 KSP Residual norm 3.226500874638e-02 4415 KSP Residual norm 2.658508602181e-02 4416 KSP Residual norm 2.243037461516e-02 4417 KSP Residual norm 2.118619526210e-02 4418 KSP Residual norm 2.038447161234e-02 4419 KSP Residual norm 1.902572524747e-02 4420 KSP Residual norm 1.756000590113e-02 4421 KSP Residual norm 1.694984169246e-02 4422 KSP Residual norm 1.623223727461e-02 4423 KSP Residual norm 1.484945348301e-02 4424 KSP Residual norm 1.425405980546e-02 4425 KSP Residual norm 1.470653094556e-02 4426 KSP Residual norm 1.458935889012e-02 4427 KSP Residual norm 1.293215686369e-02 4428 KSP Residual norm 1.158375529529e-02 4429 KSP Residual norm 1.145287243611e-02 4430 KSP Residual norm 1.096277925257e-02 4431 KSP Residual norm 1.022076337818e-02 4432 KSP Residual norm 1.032267774439e-02 4433 KSP Residual norm 1.236991410043e-02 4434 KSP Residual norm 1.503856744922e-02 4435 KSP Residual norm 1.615662952662e-02 4436 KSP Residual norm 1.770620500533e-02 4437 KSP Residual norm 1.956199501807e-02 4438 KSP Residual norm 2.015816510449e-02 4439 KSP Residual norm 1.951267658699e-02 4440 KSP Residual norm 1.843100893997e-02 4441 KSP Residual norm 
[-ksp_monitor output truncated: KSP residual norms for iterations ~4441-5438, oscillating between roughly 5e-04 and 6e-02 without clear convergence]
KSP Residual norm 7.278807566020e-04 5440 KSP Residual norm 8.085586887535e-04 5441 KSP Residual norm 7.630429361587e-04 5442 KSP Residual norm 6.206752290149e-04 5443 KSP Residual norm 5.819511475547e-04 5444 KSP Residual norm 6.465257952358e-04 5445 KSP Residual norm 6.454210708622e-04 5446 KSP Residual norm 5.561588402338e-04 5447 KSP Residual norm 4.881485648730e-04 5448 KSP Residual norm 5.141944068032e-04 5449 KSP Residual norm 5.888141774876e-04 5450 KSP Residual norm 5.735795430384e-04 5451 KSP Residual norm 5.252693912947e-04 5452 KSP Residual norm 5.472216968971e-04 5453 KSP Residual norm 6.429222011012e-04 5454 KSP Residual norm 7.053246569235e-04 5455 KSP Residual norm 7.429112339675e-04 5456 KSP Residual norm 7.693895427094e-04 5457 KSP Residual norm 8.491570316152e-04 5458 KSP Residual norm 9.660910274535e-04 5459 KSP Residual norm 9.911439508787e-04 5460 KSP Residual norm 9.367592708760e-04 5461 KSP Residual norm 9.912179534390e-04 5462 KSP Residual norm 1.168263072062e-03 5463 KSP Residual norm 1.256266202974e-03 5464 KSP Residual norm 1.171014256605e-03 5465 KSP Residual norm 1.132362106124e-03 5466 KSP Residual norm 1.166902839267e-03 5467 KSP Residual norm 1.182462127042e-03 5468 KSP Residual norm 1.070222512522e-03 5469 KSP Residual norm 9.206785642296e-04 5470 KSP Residual norm 8.413939349152e-04 5471 KSP Residual norm 8.537473414134e-04 5472 KSP Residual norm 9.202916667921e-04 5473 KSP Residual norm 8.716198936471e-04 5474 KSP Residual norm 8.713846114770e-04 5475 KSP Residual norm 9.529167027262e-04 5476 KSP Residual norm 9.668177988798e-04 5477 KSP Residual norm 8.555935437774e-04 5478 KSP Residual norm 7.819021412871e-04 5479 KSP Residual norm 8.279334404567e-04 5480 KSP Residual norm 8.632508766114e-04 5481 KSP Residual norm 7.624863590967e-04 5482 KSP Residual norm 7.082102483952e-04 5483 KSP Residual norm 6.948532033834e-04 5484 KSP Residual norm 6.398521462701e-04 5485 KSP Residual norm 5.842358819130e-04 5486 KSP Residual norm 
5.534344985165e-04 5487 KSP Residual norm 5.157037674218e-04 5488 KSP Residual norm 5.159158713989e-04 5489 KSP Residual norm 5.823786140097e-04 5490 KSP Residual norm 5.981525457366e-04 5491 KSP Residual norm 5.570890062575e-04 5492 KSP Residual norm 5.450117473878e-04 5493 KSP Residual norm 5.295188269093e-04 5494 KSP Residual norm 4.938334956021e-04 5495 KSP Residual norm 4.886598351553e-04 5496 KSP Residual norm 5.163247130083e-04 5497 KSP Residual norm 4.899156276800e-04 5498 KSP Residual norm 4.760163279581e-04 5499 KSP Residual norm 5.104929275361e-04 5500 KSP Residual norm 5.518958695104e-04 5501 KSP Residual norm 5.204581735775e-04 5502 KSP Residual norm 4.702311113589e-04 5503 KSP Residual norm 4.631875978934e-04 5504 KSP Residual norm 5.069550315656e-04 5505 KSP Residual norm 5.428603301611e-04 5506 KSP Residual norm 5.157784070885e-04 5507 KSP Residual norm 4.556783214684e-04 5508 KSP Residual norm 4.245485377656e-04 5509 KSP Residual norm 4.268921172213e-04 5510 KSP Residual norm 4.283720223248e-04 5511 KSP Residual norm 3.867115304653e-04 5512 KSP Residual norm 3.352428401407e-04 5513 KSP Residual norm 3.174533887163e-04 5514 KSP Residual norm 2.873558066900e-04 5515 KSP Residual norm 2.543423152381e-04 5516 KSP Residual norm 2.205920585448e-04 5517 KSP Residual norm 1.904484367096e-04 5518 KSP Residual norm 1.762030537920e-04 5519 KSP Residual norm 1.713669432590e-04 5520 KSP Residual norm 1.484616200034e-04 5521 KSP Residual norm 1.291725557462e-04 5522 KSP Residual norm 1.279656720388e-04 5523 KSP Residual norm 1.339253399541e-04 5524 KSP Residual norm 1.130742007400e-04 5525 KSP Residual norm 9.226543504901e-05 5526 KSP Residual norm 9.337781091670e-05 5527 KSP Residual norm 1.033740271140e-04 5528 KSP Residual norm 1.077084271398e-04 5529 KSP Residual norm 1.078378465288e-04 5530 KSP Residual norm 1.142127592458e-04 5531 KSP Residual norm 1.296306433532e-04 5532 KSP Residual norm 1.397546422793e-04 5533 KSP Residual norm 1.360697136744e-04 5534 
KSP Residual norm 1.330607527380e-04 5535 KSP Residual norm 1.282371457806e-04 5536 KSP Residual norm 1.300163013936e-04 5537 KSP Residual norm 1.403428137837e-04 5538 KSP Residual norm 1.366756756029e-04 5539 KSP Residual norm 1.142203581770e-04 5540 KSP Residual norm 1.058979001250e-04 5541 KSP Residual norm 1.061069452013e-04 5542 KSP Residual norm 1.025637608431e-04 5543 KSP Residual norm 9.764269999651e-05 5544 KSP Residual norm 9.671340362633e-05 5545 KSP Residual norm 1.025095953929e-04 5546 KSP Residual norm 1.078139515394e-04 5547 KSP Residual norm 1.150143576937e-04 5548 KSP Residual norm 1.202196098788e-04 5549 KSP Residual norm 1.276574943103e-04 5550 KSP Residual norm 1.341955162129e-04 5551 KSP Residual norm 1.496811108075e-04 5552 KSP Residual norm 1.696865048115e-04 5553 KSP Residual norm 1.805843626317e-04 5554 KSP Residual norm 1.773557321230e-04 5555 KSP Residual norm 1.740569548714e-04 5556 KSP Residual norm 1.893071345861e-04 5557 KSP Residual norm 2.000699092322e-04 5558 KSP Residual norm 1.976560690861e-04 5559 KSP Residual norm 1.968881124422e-04 5560 KSP Residual norm 2.015241556695e-04 5561 KSP Residual norm 2.136191320420e-04 5562 KSP Residual norm 2.041253067977e-04 5563 KSP Residual norm 1.817563216214e-04 5564 KSP Residual norm 1.820620737751e-04 5565 KSP Residual norm 2.040498486297e-04 5566 KSP Residual norm 2.191985137190e-04 5567 KSP Residual norm 2.110098065809e-04 5568 KSP Residual norm 1.936365651161e-04 5569 KSP Residual norm 1.950353541804e-04 5570 KSP Residual norm 2.361640077098e-04 5571 KSP Residual norm 2.909394547983e-04 5572 KSP Residual norm 3.174936992635e-04 5573 KSP Residual norm 3.128069202910e-04 5574 KSP Residual norm 2.963798382496e-04 5575 KSP Residual norm 2.977631698650e-04 5576 KSP Residual norm 3.074171672751e-04 5577 KSP Residual norm 3.124346464784e-04 5578 KSP Residual norm 3.147092932601e-04 5579 KSP Residual norm 3.289911755945e-04 5580 KSP Residual norm 3.401372379534e-04 5581 KSP Residual norm 
3.459055044411e-04 5582 KSP Residual norm 3.494742390363e-04 5583 KSP Residual norm 3.241751533875e-04 5584 KSP Residual norm 2.984716177779e-04 5585 KSP Residual norm 2.841187218740e-04 5586 KSP Residual norm 2.500963203614e-04 5587 KSP Residual norm 1.985559588665e-04 5588 KSP Residual norm 1.813063461890e-04 5589 KSP Residual norm 1.884995846394e-04 5590 KSP Residual norm 1.862124949968e-04 5591 KSP Residual norm 1.576137634920e-04 5592 KSP Residual norm 1.314333115333e-04 5593 KSP Residual norm 1.233170540821e-04 5594 KSP Residual norm 1.256715269944e-04 5595 KSP Residual norm 1.274156428359e-04 5596 KSP Residual norm 1.304975747973e-04 5597 KSP Residual norm 1.403750917018e-04 5598 KSP Residual norm 1.518813122260e-04 5599 KSP Residual norm 1.594566833030e-04 5600 KSP Residual norm 1.607368647173e-04 5601 KSP Residual norm 1.512590127040e-04 5602 KSP Residual norm 1.503473571558e-04 5603 KSP Residual norm 1.593245697035e-04 5604 KSP Residual norm 1.564282073915e-04 5605 KSP Residual norm 1.528019132185e-04 5606 KSP Residual norm 1.554991897247e-04 5607 KSP Residual norm 1.505493314950e-04 5608 KSP Residual norm 1.494931849597e-04 5609 KSP Residual norm 1.543089634330e-04 5610 KSP Residual norm 1.525547365678e-04 5611 KSP Residual norm 1.443624500493e-04 5612 KSP Residual norm 1.413876022153e-04 5613 KSP Residual norm 1.340625797578e-04 5614 KSP Residual norm 1.190953314077e-04 5615 KSP Residual norm 1.092086393545e-04 5616 KSP Residual norm 1.013878003574e-04 5617 KSP Residual norm 8.985011747366e-05 5618 KSP Residual norm 7.803686905695e-05 5619 KSP Residual norm 6.968710138343e-05 Residual norms for fieldsplit_FE_split_ solve. 
0 KSP Residual norm 6.981540039732e-05 1 KSP Residual norm 5.486740819847e-05 2 KSP Residual norm 4.859958377030e-05 3 KSP Residual norm 4.830776818665e-05 4 KSP Residual norm 4.680815415132e-05 5 KSP Residual norm 4.931038847020e-05 6 KSP Residual norm 5.311601221565e-05 7 KSP Residual norm 5.445362946290e-05 8 KSP Residual norm 5.501992703276e-05 9 KSP Residual norm 5.614607086184e-05 10 KSP Residual norm 5.818742397702e-05 11 KSP Residual norm 6.076860407413e-05 12 KSP Residual norm 6.209344716986e-05 13 KSP Residual norm 6.276921190143e-05 14 KSP Residual norm 6.575248070602e-05 15 KSP Residual norm 7.027771801762e-05 16 KSP Residual norm 6.876504852303e-05 17 KSP Residual norm 6.235650944687e-05 18 KSP Residual norm 5.823000331989e-05 19 KSP Residual norm 5.720617267618e-05 20 KSP Residual norm 5.333689341912e-05 21 KSP Residual norm 4.633418134281e-05 22 KSP Residual norm 4.179723268121e-05 23 KSP Residual norm 3.985994954437e-05 24 KSP Residual norm 3.768033487646e-05 25 KSP Residual norm 3.470051378485e-05 26 KSP Residual norm 3.325204563555e-05 27 KSP Residual norm 3.180228468573e-05 28 KSP Residual norm 3.169827245146e-05 29 KSP Residual norm 3.222653194243e-05 30 KSP Residual norm 3.103397420772e-05 31 KSP Residual norm 2.872270528153e-05 32 KSP Residual norm 2.979862025627e-05 33 KSP Residual norm 3.346975052952e-05 34 KSP Residual norm 3.501609175377e-05 35 KSP Residual norm 3.641338990485e-05 36 KSP Residual norm 4.039317497089e-05 37 KSP Residual norm 4.295748454991e-05 38 KSP Residual norm 4.373961253986e-05 39 KSP Residual norm 4.319140632739e-05 40 KSP Residual norm 4.380248212390e-05 41 KSP Residual norm 4.532938293215e-05 42 KSP Residual norm 4.559413519122e-05 43 KSP Residual norm 4.731476781739e-05 44 KSP Residual norm 4.775883847507e-05 45 KSP Residual norm 4.545024483077e-05 46 KSP Residual norm 4.373314157959e-05 47 KSP Residual norm 4.137897107505e-05 48 KSP Residual norm 3.546177537398e-05 49 KSP Residual norm 3.052751623339e-05 50 KSP 
Residual norm 2.864117038043e-05 51 KSP Residual norm 2.702872123912e-05 52 KSP Residual norm 2.628189014355e-05 53 KSP Residual norm 2.670640399291e-05 54 KSP Residual norm 2.499126125214e-05 55 KSP Residual norm 2.266614064588e-05 56 KSP Residual norm 2.252612290890e-05 57 KSP Residual norm 2.380232527474e-05 58 KSP Residual norm 2.479900394659e-05 59 KSP Residual norm 2.594797782441e-05 60 KSP Residual norm 2.794271226180e-05 61 KSP Residual norm 3.102034979668e-05 62 KSP Residual norm 3.509207964089e-05 63 KSP Residual norm 3.604481329240e-05 64 KSP Residual norm 3.592628672047e-05 65 KSP Residual norm 3.757751616012e-05 66 KSP Residual norm 3.921466117532e-05 67 KSP Residual norm 4.089934520970e-05 68 KSP Residual norm 4.300472982162e-05 69 KSP Residual norm 4.373578651627e-05 70 KSP Residual norm 4.367946591543e-05 71 KSP Residual norm 4.412167599913e-05 72 KSP Residual norm 4.534000742787e-05 73 KSP Residual norm 4.462618450013e-05 74 KSP Residual norm 4.441574051785e-05 75 KSP Residual norm 4.395960448337e-05 76 KSP Residual norm 4.409090830245e-05 77 KSP Residual norm 4.420761732308e-05 78 KSP Residual norm 4.371366661465e-05 79 KSP Residual norm 4.322759109329e-05 80 KSP Residual norm 4.352260878504e-05 81 KSP Residual norm 4.368352336874e-05 82 KSP Residual norm 4.258447314602e-05 83 KSP Residual norm 4.056184545740e-05 84 KSP Residual norm 4.056368411442e-05 85 KSP Residual norm 4.275537722706e-05 86 KSP Residual norm 4.878647073801e-05 87 KSP Residual norm 5.714141406613e-05 88 KSP Residual norm 6.315168874028e-05 89 KSP Residual norm 6.417905869732e-05 90 KSP Residual norm 6.430159427937e-05 91 KSP Residual norm 7.101589182977e-05 92 KSP Residual norm 7.515258571149e-05 93 KSP Residual norm 7.381088175387e-05 94 KSP Residual norm 8.006995216430e-05 95 KSP Residual norm 9.105174606062e-05 96 KSP Residual norm 8.795394773827e-05 97 KSP Residual norm 8.160834868630e-05 98 KSP Residual norm 8.716168381588e-05 99 KSP Residual norm 9.314636077655e-05 100 
KSP Residual norm 9.034722539252e-05 101 KSP Residual norm 8.560227248158e-05 102 KSP Residual norm 8.736253425890e-05 103 KSP Residual norm 9.078448680835e-05 104 KSP Residual norm 9.446778199984e-05 105 KSP Residual norm 9.803841442608e-05 106 KSP Residual norm 1.019319395582e-04 107 KSP Residual norm 1.025935703686e-04 108 KSP Residual norm 1.064706043334e-04 109 KSP Residual norm 1.141888451782e-04 110 KSP Residual norm 1.171344768412e-04 111 KSP Residual norm 1.191838088932e-04 112 KSP Residual norm 1.299193798544e-04 113 KSP Residual norm 1.402202562321e-04 114 KSP Residual norm 1.292692066741e-04 115 KSP Residual norm 1.178776502619e-04 116 KSP Residual norm 1.237283041002e-04 117 KSP Residual norm 1.376675931523e-04 118 KSP Residual norm 1.421727334801e-04 119 KSP Residual norm 1.307850143471e-04 120 KSP Residual norm 1.170874317479e-04 121 KSP Residual norm 1.131956759002e-04 122 KSP Residual norm 1.088834413724e-04 123 KSP Residual norm 9.868907877203e-05 124 KSP Residual norm 8.904238360157e-05 125 KSP Residual norm 8.658080221285e-05 126 KSP Residual norm 8.423323391111e-05 127 KSP Residual norm 7.785928265843e-05 128 KSP Residual norm 7.144743125951e-05 129 KSP Residual norm 6.853230521148e-05 130 KSP Residual norm 6.878755852414e-05 131 KSP Residual norm 6.549341206408e-05 132 KSP Residual norm 6.536029796748e-05 133 KSP Residual norm 6.941465520394e-05 134 KSP Residual norm 7.441192350479e-05 135 KSP Residual norm 8.054270903251e-05 136 KSP Residual norm 9.257619604795e-05 137 KSP Residual norm 9.972863209624e-05 138 KSP Residual norm 1.025576511532e-04 139 KSP Residual norm 1.073858880987e-04 140 KSP Residual norm 1.069917174942e-04 141 KSP Residual norm 9.905061128405e-05 142 KSP Residual norm 9.712138881971e-05 143 KSP Residual norm 9.896887251036e-05 144 KSP Residual norm 9.180869356224e-05 145 KSP Residual norm 8.412875895159e-05 146 KSP Residual norm 8.080421028975e-05 147 KSP Residual norm 7.547965549861e-05 148 KSP Residual norm 
6.898876902162e-05 149 KSP Residual norm 6.365742257829e-05 150 KSP Residual norm 6.364652228967e-05 151 KSP Residual norm 6.535403973594e-05 152 KSP Residual norm 6.200913940850e-05 153 KSP Residual norm 5.564244648251e-05 154 KSP Residual norm 5.518789581924e-05 155 KSP Residual norm 5.750074436045e-05 156 KSP Residual norm 5.661867421387e-05 157 KSP Residual norm 5.339139172252e-05 158 KSP Residual norm 5.030394807929e-05 159 KSP Residual norm 4.937510171395e-05 160 KSP Residual norm 5.226516349696e-05 161 KSP Residual norm 5.404719112849e-05 162 KSP Residual norm 5.304980719571e-05 163 KSP Residual norm 5.574957126148e-05 164 KSP Residual norm 6.312792655771e-05 165 KSP Residual norm 6.654596382397e-05 166 KSP Residual norm 6.297605018290e-05 167 KSP Residual norm 5.723317277165e-05 168 KSP Residual norm 5.516045984997e-05 169 KSP Residual norm 5.392574351997e-05 170 KSP Residual norm 5.072531195418e-05 171 KSP Residual norm 5.074515168635e-05 172 KSP Residual norm 5.592105315714e-05 173 KSP Residual norm 5.789242817196e-05 174 KSP Residual norm 5.564290007552e-05 175 KSP Residual norm 5.438634017329e-05 176 KSP Residual norm 5.616757442794e-05 177 KSP Residual norm 5.755111896661e-05 178 KSP Residual norm 5.923299575820e-05 179 KSP Residual norm 6.014988172065e-05 180 KSP Residual norm 6.272223927318e-05 181 KSP Residual norm 6.651678381283e-05 182 KSP Residual norm 6.472927027704e-05 183 KSP Residual norm 6.134890024015e-05 184 KSP Residual norm 6.470110659792e-05 185 KSP Residual norm 6.976501966588e-05 186 KSP Residual norm 6.969283534078e-05 187 KSP Residual norm 7.425567388575e-05 188 KSP Residual norm 7.864596253136e-05 189 KSP Residual norm 7.567975844604e-05 190 KSP Residual norm 7.557980116617e-05 191 KSP Residual norm 7.877600682610e-05 192 KSP Residual norm 7.576281015868e-05 193 KSP Residual norm 7.417612264368e-05 194 KSP Residual norm 7.386006862731e-05 195 KSP Residual norm 7.440492828970e-05 196 KSP Residual norm 7.777177324894e-05 197 KSP 
Residual norm 8.103157109878e-05 198 KSP Residual norm 8.795069003035e-05 199 KSP Residual norm 1.001705442765e-04 200 KSP Residual norm 1.070640980872e-04 201 KSP Residual norm 1.091939217250e-04 202 KSP Residual norm 1.191048783588e-04 203 KSP Residual norm 1.362256171204e-04 204 KSP Residual norm 1.467818182996e-04 205 KSP Residual norm 1.521056078547e-04 206 KSP Residual norm 1.487056681147e-04 207 KSP Residual norm 1.464713212261e-04 208 KSP Residual norm 1.446364144174e-04 209 KSP Residual norm 1.453399521497e-04 210 KSP Residual norm 1.457471847081e-04 211 KSP Residual norm 1.467129700839e-04 212 KSP Residual norm 1.415441306955e-04 213 KSP Residual norm 1.396784650154e-04 214 KSP Residual norm 1.429506359483e-04 215 KSP Residual norm 1.428022622147e-04 216 KSP Residual norm 1.365525434288e-04 217 KSP Residual norm 1.337416897171e-04 218 KSP Residual norm 1.295690094990e-04 219 KSP Residual norm 1.205129995938e-04 220 KSP Residual norm 1.128107184535e-04 221 KSP Residual norm 1.128162828788e-04 222 KSP Residual norm 1.156396769951e-04 223 KSP Residual norm 1.128400982738e-04 224 KSP Residual norm 1.137479574675e-04 225 KSP Residual norm 1.089566876535e-04 226 KSP Residual norm 1.050973423642e-04 227 KSP Residual norm 1.086783600564e-04 228 KSP Residual norm 1.129662218027e-04 229 KSP Residual norm 1.130203710386e-04 230 KSP Residual norm 1.090406054488e-04 231 KSP Residual norm 1.103750635270e-04 232 KSP Residual norm 1.136551399950e-04 233 KSP Residual norm 1.189195823106e-04 234 KSP Residual norm 1.215317691788e-04 235 KSP Residual norm 1.193538311037e-04 236 KSP Residual norm 1.201928148860e-04 237 KSP Residual norm 1.240644246175e-04 238 KSP Residual norm 1.282986527693e-04 239 KSP Residual norm 1.303025128916e-04 240 KSP Residual norm 1.227406426576e-04 241 KSP Residual norm 1.185856347069e-04 242 KSP Residual norm 1.228699755900e-04 243 KSP Residual norm 1.200612617097e-04 244 KSP Residual norm 1.179486758917e-04 245 KSP Residual norm 
1.245980860231e-04 246 KSP Residual norm 1.325733764489e-04 247 KSP Residual norm 1.250404886213e-04 248 KSP Residual norm 1.194344593488e-04 249 KSP Residual norm 1.170213487575e-04 250 KSP Residual norm 1.134312347772e-04 251 KSP Residual norm 1.124860971529e-04 252 KSP Residual norm 1.137678303459e-04 253 KSP Residual norm 1.089741563950e-04 254 KSP Residual norm 1.111617773424e-04 255 KSP Residual norm 1.189820969039e-04 256 KSP Residual norm 1.171459703106e-04 257 KSP Residual norm 1.134877679278e-04 258 KSP Residual norm 1.174578677376e-04 259 KSP Residual norm 1.253270758101e-04 260 KSP Residual norm 1.295104635228e-04 261 KSP Residual norm 1.279150133006e-04 262 KSP Residual norm 1.240762848631e-04 263 KSP Residual norm 1.208483296083e-04 264 KSP Residual norm 1.166138220603e-04 265 KSP Residual norm 1.217832261034e-04 266 KSP Residual norm 1.336237809167e-04 267 KSP Residual norm 1.390163179027e-04 268 KSP Residual norm 1.421694296531e-04 269 KSP Residual norm 1.518133902495e-04 270 KSP Residual norm 1.529033806052e-04 271 KSP Residual norm 1.487241957244e-04 272 KSP Residual norm 1.526422933183e-04 273 KSP Residual norm 1.516377833397e-04 274 KSP Residual norm 1.460440099024e-04 275 KSP Residual norm 1.521252242812e-04 276 KSP Residual norm 1.562423693229e-04 277 KSP Residual norm 1.431880642221e-04 278 KSP Residual norm 1.385607213070e-04 279 KSP Residual norm 1.498265566529e-04 280 KSP Residual norm 1.596850217203e-04 281 KSP Residual norm 1.565381776562e-04 282 KSP Residual norm 1.519228874756e-04 283 KSP Residual norm 1.521934344572e-04 284 KSP Residual norm 1.522772565531e-04 285 KSP Residual norm 1.460086868820e-04 286 KSP Residual norm 1.495345138364e-04 287 KSP Residual norm 1.561662095947e-04 288 KSP Residual norm 1.639801631004e-04 289 KSP Residual norm 1.836971875830e-04 290 KSP Residual norm 1.941349179982e-04 291 KSP Residual norm 1.954577705834e-04 292 KSP Residual norm 2.007723397681e-04 293 KSP Residual norm 2.192303732935e-04 294 KSP 
Residual norm 2.408649813461e-04 295 KSP Residual norm 2.509203477993e-04 296 KSP Residual norm 2.728004395497e-04 297 KSP Residual norm 3.123383574209e-04 298 KSP Residual norm 3.231014687907e-04 299 KSP Residual norm 3.038796238126e-04 300 KSP Residual norm 2.805372286204e-04 301 KSP Residual norm 2.699920560761e-04 302 KSP Residual norm 2.611119685481e-04 303 KSP Residual norm 2.504051786545e-04 304 KSP Residual norm 2.438836142113e-04 305 KSP Residual norm 2.512304350183e-04 306 KSP Residual norm 2.720949286874e-04 307 KSP Residual norm 2.870895322639e-04 308 KSP Residual norm 2.938975084962e-04 309 KSP Residual norm 3.017332467858e-04 310 KSP Residual norm 3.082482922637e-04 311 KSP Residual norm 3.130650847360e-04 312 KSP Residual norm 3.257954879731e-04 313 KSP Residual norm 3.307800922468e-04 314 KSP Residual norm 3.252185256537e-04 315 KSP Residual norm 3.447381487181e-04 316 KSP Residual norm 3.691035511895e-04 317 KSP Residual norm 3.592722434409e-04 318 KSP Residual norm 3.461533727324e-04 319 KSP Residual norm 3.496598609985e-04 320 KSP Residual norm 3.422559497886e-04 321 KSP Residual norm 3.513331786751e-04 322 KSP Residual norm 3.678215591646e-04 323 KSP Residual norm 3.837901176974e-04 324 KSP Residual norm 4.049954785378e-04 325 KSP Residual norm 4.267723759839e-04 326 KSP Residual norm 4.357579054319e-04 327 KSP Residual norm 4.206431529214e-04 328 KSP Residual norm 3.960210422309e-04 329 KSP Residual norm 3.896464535641e-04 330 KSP Residual norm 3.956519812624e-04 331 KSP Residual norm 4.048227578024e-04 332 KSP Residual norm 4.049626060228e-04 333 KSP Residual norm 3.931605132306e-04 334 KSP Residual norm 3.755642935064e-04 335 KSP Residual norm 3.661374572092e-04 336 KSP Residual norm 3.765297357572e-04 337 KSP Residual norm 3.877284874775e-04 338 KSP Residual norm 4.066083243211e-04 339 KSP Residual norm 4.307021980872e-04 340 KSP Residual norm 4.194967482851e-04 341 KSP Residual norm 3.703442337360e-04 342 KSP Residual norm 
3.479060139866e-04 343 KSP Residual norm 3.259674347278e-04 344 KSP Residual norm 3.175548728963e-04 345 KSP Residual norm 3.213586928771e-04 346 KSP Residual norm 3.250807460760e-04 347 KSP Residual norm 3.251819131296e-04 348 KSP Residual norm 3.313944592753e-04 349 KSP Residual norm 3.441929583600e-04 350 KSP Residual norm 3.472720691450e-04 351 KSP Residual norm 3.498702948256e-04 352 KSP Residual norm 3.537478107137e-04 353 KSP Residual norm 3.568281368181e-04 354 KSP Residual norm 3.695141812939e-04 355 KSP Residual norm 3.792590697228e-04 356 KSP Residual norm 3.752083064642e-04 357 KSP Residual norm 3.679782681858e-04 358 KSP Residual norm 3.805879751134e-04 359 KSP Residual norm 3.903897745976e-04 360 KSP Residual norm 3.867058955608e-04 361 KSP Residual norm 3.743382017322e-04 362 KSP Residual norm 3.610323721376e-04 363 KSP Residual norm 3.305268164179e-04 364 KSP Residual norm 3.098322171523e-04 365 KSP Residual norm 3.171597509563e-04 366 KSP Residual norm 3.296285788558e-04 367 KSP Residual norm 3.259517986141e-04 368 KSP Residual norm 3.208566354772e-04 369 KSP Residual norm 3.092635598126e-04 370 KSP Residual norm 2.963806886599e-04 371 KSP Residual norm 2.930370626108e-04 372 KSP Residual norm 2.801431006457e-04 373 KSP Residual norm 2.599193126052e-04 374 KSP Residual norm 2.485546338738e-04 375 KSP Residual norm 2.564488862802e-04 376 KSP Residual norm 2.561275758804e-04 377 KSP Residual norm 2.475474939625e-04 378 KSP Residual norm 2.400266434543e-04 379 KSP Residual norm 2.476348723327e-04 380 KSP Residual norm 2.694160138385e-04 381 KSP Residual norm 2.857610375647e-04 382 KSP Residual norm 2.922524550624e-04 383 KSP Residual norm 2.848353280463e-04 384 KSP Residual norm 2.673547985660e-04 385 KSP Residual norm 2.701172032734e-04 386 KSP Residual norm 2.695675447549e-04 387 KSP Residual norm 2.625265920549e-04 388 KSP Residual norm 2.581236031158e-04 389 KSP Residual norm 2.558491000489e-04 390 KSP Residual norm 2.556638787165e-04 391 KSP 
Residual norm 2.709510403060e-04 392 KSP Residual norm 2.878236899729e-04 393 KSP Residual norm 2.782178148903e-04 394 KSP Residual norm 2.528341448842e-04 395 KSP Residual norm 2.397778461477e-04 396 KSP Residual norm 2.354049002805e-04 397 KSP Residual norm 2.306441416892e-04 398 KSP Residual norm 2.288412364906e-04 399 KSP Residual norm 2.317318059214e-04 400 KSP Residual norm 2.400107978774e-04 401 KSP Residual norm 2.259629392734e-04 402 KSP Residual norm 2.077477682248e-04 403 KSP Residual norm 2.005458393285e-04 404 KSP Residual norm 1.977491471838e-04 405 KSP Residual norm 1.905765654879e-04 406 KSP Residual norm 1.884021912140e-04 407 KSP Residual norm 1.988611879625e-04 408 KSP Residual norm 2.076327027260e-04 409 KSP Residual norm 2.140217100206e-04 410 KSP Residual norm 2.127525474722e-04 411 KSP Residual norm 2.203505907684e-04 412 KSP Residual norm 2.170720887913e-04 413 KSP Residual norm 2.069253730006e-04 414 KSP Residual norm 2.147078693605e-04 415 KSP Residual norm 2.327129291436e-04 416 KSP Residual norm 2.519852947981e-04 417 KSP Residual norm 2.636144153323e-04 418 KSP Residual norm 2.746072723283e-04 419 KSP Residual norm 2.818369827043e-04 420 KSP Residual norm 2.909302251349e-04 421 KSP Residual norm 3.073147904541e-04 422 KSP Residual norm 3.095441312158e-04 423 KSP Residual norm 2.977255900737e-04 424 KSP Residual norm 2.799193331487e-04 425 KSP Residual norm 2.749290483331e-04 426 KSP Residual norm 2.874606134329e-04 427 KSP Residual norm 2.902477130763e-04 428 KSP Residual norm 2.757604890217e-04 429 KSP Residual norm 2.792537966592e-04 430 KSP Residual norm 3.103314820006e-04 431 KSP Residual norm 3.385389022806e-04 432 KSP Residual norm 3.626659268954e-04 433 KSP Residual norm 3.818882153868e-04 434 KSP Residual norm 4.048385232856e-04 435 KSP Residual norm 4.197062525627e-04 436 KSP Residual norm 4.228482750532e-04 437 KSP Residual norm 4.111899475869e-04 438 KSP Residual norm 4.086774546365e-04 439 KSP Residual norm 
[KSP monitor output trimmed: iterations 440 through 1449 of lines of the form "NNN KSP Residual norm X.XXXe-0Y". The residual norms oscillate between roughly 1e-05 and 8e-04 without a steady decrease, i.e. the Krylov solve is stagnating rather than converging over these ~1000 iterations.]
KSP Residual norm 1.082794841576e-04 1450 KSP Residual norm 1.008186278689e-04 1451 KSP Residual norm 9.378596340103e-05 1452 KSP Residual norm 9.674802899286e-05 1453 KSP Residual norm 1.123931447308e-04 1454 KSP Residual norm 1.315393961540e-04 1455 KSP Residual norm 1.402711342360e-04 1456 KSP Residual norm 1.458315545479e-04 1457 KSP Residual norm 1.552995703385e-04 1458 KSP Residual norm 1.657152336409e-04 1459 KSP Residual norm 1.593341671613e-04 1460 KSP Residual norm 1.563471525558e-04 1461 KSP Residual norm 1.588853876837e-04 1462 KSP Residual norm 1.524736891738e-04 1463 KSP Residual norm 1.353066547178e-04 1464 KSP Residual norm 1.204972649653e-04 1465 KSP Residual norm 1.260967620843e-04 1466 KSP Residual norm 1.506863238403e-04 1467 KSP Residual norm 1.666997387382e-04 1468 KSP Residual norm 1.553635940061e-04 1469 KSP Residual norm 1.338149034534e-04 1470 KSP Residual norm 1.220913441558e-04 1471 KSP Residual norm 1.360335358119e-04 1472 KSP Residual norm 1.672880813558e-04 1473 KSP Residual norm 1.949455011127e-04 1474 KSP Residual norm 2.062393512148e-04 1475 KSP Residual norm 1.959926439237e-04 1476 KSP Residual norm 1.871835412853e-04 1477 KSP Residual norm 1.981082422628e-04 1478 KSP Residual norm 2.167921875555e-04 1479 KSP Residual norm 2.304144613608e-04 1480 KSP Residual norm 2.347299849131e-04 1481 KSP Residual norm 2.106318475696e-04 1482 KSP Residual norm 1.821753317710e-04 1483 KSP Residual norm 1.681929636346e-04 1484 KSP Residual norm 1.747095087648e-04 1485 KSP Residual norm 1.889446093134e-04 1486 KSP Residual norm 1.952797529263e-04 1487 KSP Residual norm 1.918553147387e-04 1488 KSP Residual norm 2.059418996607e-04 1489 KSP Residual norm 2.404820817266e-04 1490 KSP Residual norm 2.777832517328e-04 1491 KSP Residual norm 3.050169049100e-04 1492 KSP Residual norm 2.878531889854e-04 1493 KSP Residual norm 2.517050364202e-04 1494 KSP Residual norm 2.275447937233e-04 1495 KSP Residual norm 2.156996084019e-04 1496 KSP Residual norm 
2.076764828401e-04 1497 KSP Residual norm 1.976869922882e-04 1498 KSP Residual norm 1.820992488223e-04 1499 KSP Residual norm 1.614057800434e-04 1500 KSP Residual norm 1.480127461511e-04 1501 KSP Residual norm 1.469893306410e-04 1502 KSP Residual norm 1.522980439938e-04 1503 KSP Residual norm 1.636522251079e-04 1504 KSP Residual norm 1.776184032761e-04 1505 KSP Residual norm 2.083523362157e-04 1506 KSP Residual norm 2.496504701915e-04 1507 KSP Residual norm 2.812215206849e-04 1508 KSP Residual norm 2.784790988798e-04 1509 KSP Residual norm 2.402844999152e-04 1510 KSP Residual norm 1.820074625126e-04 1511 KSP Residual norm 1.490031511287e-04 1512 KSP Residual norm 1.431029846309e-04 1513 KSP Residual norm 1.500457008235e-04 1514 KSP Residual norm 1.550159470974e-04 1515 KSP Residual norm 1.583062812327e-04 1516 KSP Residual norm 1.616413890558e-04 1517 KSP Residual norm 1.563324738356e-04 1518 KSP Residual norm 1.451139630464e-04 1519 KSP Residual norm 1.392818441419e-04 1520 KSP Residual norm 1.496392387252e-04 1521 KSP Residual norm 1.593663216286e-04 1522 KSP Residual norm 1.568803775052e-04 1523 KSP Residual norm 1.523350116505e-04 1524 KSP Residual norm 1.557883093996e-04 1525 KSP Residual norm 1.510357333691e-04 1526 KSP Residual norm 1.354069034118e-04 1527 KSP Residual norm 1.222972372484e-04 1528 KSP Residual norm 1.241395747719e-04 1529 KSP Residual norm 1.343532970782e-04 1530 KSP Residual norm 1.487204881163e-04 1531 KSP Residual norm 1.519821471776e-04 1532 KSP Residual norm 1.444061384137e-04 1533 KSP Residual norm 1.430969233363e-04 1534 KSP Residual norm 1.526761921572e-04 1535 KSP Residual norm 1.669274030769e-04 1536 KSP Residual norm 1.753079202012e-04 1537 KSP Residual norm 1.704909620126e-04 1538 KSP Residual norm 1.614814505517e-04 1539 KSP Residual norm 1.588237564899e-04 1540 KSP Residual norm 1.598958270834e-04 1541 KSP Residual norm 1.632644581602e-04 1542 KSP Residual norm 1.581711728837e-04 1543 KSP Residual norm 1.487987729990e-04 1544 
KSP Residual norm 1.304004131744e-04 1545 KSP Residual norm 1.132937840594e-04 1546 KSP Residual norm 1.032210581236e-04 1547 KSP Residual norm 1.044585401425e-04 1548 KSP Residual norm 1.167320961009e-04 1549 KSP Residual norm 1.293767937912e-04 1550 KSP Residual norm 1.357861360011e-04 1551 KSP Residual norm 1.392792584694e-04 1552 KSP Residual norm 1.432964482430e-04 1553 KSP Residual norm 1.552514086572e-04 1554 KSP Residual norm 1.694185875703e-04 1555 KSP Residual norm 1.636651967820e-04 1556 KSP Residual norm 1.359970825612e-04 1557 KSP Residual norm 1.123689241499e-04 1558 KSP Residual norm 1.006181345949e-04 1559 KSP Residual norm 1.069220379838e-04 1560 KSP Residual norm 1.272952471881e-04 1561 KSP Residual norm 1.736081543488e-04 1562 KSP Residual norm 2.217745622057e-04 1563 KSP Residual norm 2.278621758242e-04 1564 KSP Residual norm 1.990014994625e-04 1565 KSP Residual norm 1.689266518213e-04 1566 KSP Residual norm 1.681749525813e-04 1567 KSP Residual norm 1.902581122152e-04 1568 KSP Residual norm 2.040611199586e-04 1569 KSP Residual norm 2.073179254296e-04 1570 KSP Residual norm 2.217986778912e-04 1571 KSP Residual norm 2.416908640486e-04 1572 KSP Residual norm 2.648648797321e-04 1573 KSP Residual norm 2.584896671241e-04 1574 KSP Residual norm 2.185625821977e-04 1575 KSP Residual norm 1.786009216062e-04 1576 KSP Residual norm 1.537224593758e-04 1577 KSP Residual norm 1.365470057744e-04 1578 KSP Residual norm 1.254963311650e-04 1579 KSP Residual norm 1.190705830343e-04 1580 KSP Residual norm 1.232265708413e-04 1581 KSP Residual norm 1.405358292976e-04 1582 KSP Residual norm 1.530863860904e-04 1583 KSP Residual norm 1.589668759977e-04 1584 KSP Residual norm 1.641396148775e-04 1585 KSP Residual norm 1.833865995439e-04 1586 KSP Residual norm 2.117121101827e-04 1587 KSP Residual norm 2.140415485374e-04 1588 KSP Residual norm 1.794579738421e-04 1589 KSP Residual norm 1.406288970165e-04 1590 KSP Residual norm 1.127013677563e-04 1591 KSP Residual norm 
1.044696342361e-04 1592 KSP Residual norm 1.108871008629e-04 1593 KSP Residual norm 1.231787173259e-04 1594 KSP Residual norm 1.222547816366e-04 1595 KSP Residual norm 9.991868931381e-05 1596 KSP Residual norm 7.639041456275e-05 1597 KSP Residual norm 6.937262226086e-05 1598 KSP Residual norm 8.023027884463e-05 1599 KSP Residual norm 1.078155202894e-04 1600 KSP Residual norm 1.427796813838e-04 1601 KSP Residual norm 1.658982453230e-04 1602 KSP Residual norm 1.575573636873e-04 1603 KSP Residual norm 1.266828454222e-04 1604 KSP Residual norm 9.709971641665e-05 1605 KSP Residual norm 8.256699101284e-05 1606 KSP Residual norm 7.980579307520e-05 1607 KSP Residual norm 8.855231505994e-05 1608 KSP Residual norm 1.015907908585e-04 1609 KSP Residual norm 1.179296948967e-04 1610 KSP Residual norm 1.379364910157e-04 1611 KSP Residual norm 1.420515360933e-04 1612 KSP Residual norm 1.255570229919e-04 1613 KSP Residual norm 1.117613901852e-04 1614 KSP Residual norm 1.039501697027e-04 1615 KSP Residual norm 9.364902609745e-05 1616 KSP Residual norm 8.607081978396e-05 1617 KSP Residual norm 8.909996006316e-05 1618 KSP Residual norm 1.059468430216e-04 1619 KSP Residual norm 1.350505931912e-04 1620 KSP Residual norm 1.635870538549e-04 1621 KSP Residual norm 1.763923518578e-04 1622 KSP Residual norm 1.660306548737e-04 1623 KSP Residual norm 1.510005028229e-04 1624 KSP Residual norm 1.422726016876e-04 1625 KSP Residual norm 1.457664446175e-04 1626 KSP Residual norm 1.638357915378e-04 1627 KSP Residual norm 1.845446520661e-04 1628 KSP Residual norm 1.791976802638e-04 1629 KSP Residual norm 1.623879267795e-04 1630 KSP Residual norm 1.502923875717e-04 1631 KSP Residual norm 1.418223077944e-04 1632 KSP Residual norm 1.273254725453e-04 1633 KSP Residual norm 1.184729548926e-04 1634 KSP Residual norm 1.092059167818e-04 1635 KSP Residual norm 9.048865879719e-05 1636 KSP Residual norm 7.292817620989e-05 1637 KSP Residual norm 6.099989036409e-05 1638 KSP Residual norm 6.111855687351e-05 1639 
KSP Residual norm 7.591741259007e-05 1640 KSP Residual norm 1.081490129047e-04 1641 KSP Residual norm 1.357003835668e-04 1642 KSP Residual norm 1.329226104717e-04 1643 KSP Residual norm 1.014392111524e-04 1644 KSP Residual norm 6.962606003180e-05 1645 KSP Residual norm 5.186712928949e-05 1646 KSP Residual norm 4.255776787865e-05 1647 KSP Residual norm 4.344753395959e-05 1648 KSP Residual norm 5.412434154219e-05 1649 KSP Residual norm 7.552633147029e-05 1650 KSP Residual norm 1.005940613207e-04 1651 KSP Residual norm 1.144593357437e-04 1652 KSP Residual norm 1.080392133403e-04 1653 KSP Residual norm 8.518440784148e-05 1654 KSP Residual norm 5.881334259044e-05 1655 KSP Residual norm 4.817967366779e-05 1656 KSP Residual norm 5.299823534953e-05 1657 KSP Residual norm 7.136286729876e-05 1658 KSP Residual norm 1.042100877155e-04 1659 KSP Residual norm 1.313373841183e-04 1660 KSP Residual norm 1.201537153493e-04 1661 KSP Residual norm 9.018660112639e-05 1662 KSP Residual norm 6.532562492553e-05 1663 KSP Residual norm 5.649260300789e-05 1664 KSP Residual norm 6.538773675498e-05 1665 KSP Residual norm 9.596419339071e-05 1666 KSP Residual norm 1.513165968446e-04 1667 KSP Residual norm 1.951951397578e-04 1668 KSP Residual norm 1.694458127844e-04 1669 KSP Residual norm 1.133807508646e-04 1670 KSP Residual norm 7.023736465041e-05 1671 KSP Residual norm 4.806665406104e-05 1672 KSP Residual norm 4.578182051396e-05 1673 KSP Residual norm 6.055870487089e-05 1674 KSP Residual norm 9.834221404856e-05 1675 KSP Residual norm 1.550282956194e-04 1676 KSP Residual norm 1.813010438116e-04 1677 KSP Residual norm 1.335388506703e-04 1678 KSP Residual norm 8.104089675768e-05 1679 KSP Residual norm 6.041968601952e-05 1680 KSP Residual norm 6.218671383638e-05 1681 KSP Residual norm 8.430476290168e-05 1682 KSP Residual norm 1.339875990970e-04 1683 KSP Residual norm 2.050686414191e-04 1684 KSP Residual norm 2.486792303236e-04 1685 KSP Residual norm 2.158259204076e-04 1686 KSP Residual norm 
1.468145103665e-04 1687 KSP Residual norm 9.535695331496e-05 1688 KSP Residual norm 6.914309899951e-05 1689 KSP Residual norm 5.874336791455e-05 1690 KSP Residual norm 6.147096968782e-05 1691 KSP Residual norm 8.132244892159e-05 1692 KSP Residual norm 1.211144554351e-04 1693 KSP Residual norm 1.598899522199e-04 1694 KSP Residual norm 1.501443249046e-04 1695 KSP Residual norm 9.928425049333e-05 1696 KSP Residual norm 5.789109514265e-05 1697 KSP Residual norm 3.735862717785e-05 1698 KSP Residual norm 3.032327245044e-05 1699 KSP Residual norm 3.112193254455e-05 1700 KSP Residual norm 4.178863830514e-05 1701 KSP Residual norm 6.309117034803e-05 1702 KSP Residual norm 9.068646119005e-05 1703 KSP Residual norm 1.000307139127e-04 1704 KSP Residual norm 7.392841452253e-05 1705 KSP Residual norm 4.501865928935e-05 1706 KSP Residual norm 3.235679583790e-05 1707 KSP Residual norm 2.978819542827e-05 1708 KSP Residual norm 3.573129340262e-05 1709 KSP Residual norm 4.951885946483e-05 1710 KSP Residual norm 6.445601060555e-05 1711 KSP Residual norm 6.957085192254e-05 1712 KSP Residual norm 6.756587302994e-05 1713 KSP Residual norm 7.056625677718e-05 1714 KSP Residual norm 8.152279203535e-05 1715 KSP Residual norm 8.710776724179e-05 1716 KSP Residual norm 7.475694487381e-05 1717 KSP Residual norm 5.502459078031e-05 1718 KSP Residual norm 4.076192456932e-05 1719 KSP Residual norm 3.675018119838e-05 1720 KSP Residual norm 4.153785514217e-05 1721 KSP Residual norm 5.335676678338e-05 1722 KSP Residual norm 6.271637557330e-05 1723 KSP Residual norm 6.465377859297e-05 1724 KSP Residual norm 7.024060712657e-05 1725 KSP Residual norm 8.547622571908e-05 1726 KSP Residual norm 1.013047982263e-04 1727 KSP Residual norm 8.958349464017e-05 1728 KSP Residual norm 6.368058211490e-05 1729 KSP Residual norm 4.879809280622e-05 1730 KSP Residual norm 4.810344287796e-05 1731 KSP Residual norm 5.200450512227e-05 1732 KSP Residual norm 4.570440552338e-05 1733 KSP Residual norm 3.434197560012e-05 1734 
KSP Residual norm 2.645040785699e-05 1735 KSP Residual norm 2.654846690444e-05 1736 KSP Residual norm 3.305476665633e-05 1737 KSP Residual norm 4.028937077267e-05 1738 KSP Residual norm 4.229884310405e-05 1739 KSP Residual norm 4.057767977870e-05 1740 KSP Residual norm 4.368169145254e-05 1741 KSP Residual norm 5.806124660954e-05 1742 KSP Residual norm 7.728286596288e-05 1743 KSP Residual norm 7.466040085248e-05 1744 KSP Residual norm 5.912087719640e-05 1745 KSP Residual norm 4.979550758481e-05 1746 KSP Residual norm 4.833362620949e-05 1747 KSP Residual norm 4.897350052364e-05 1748 KSP Residual norm 4.342477208681e-05 1749 KSP Residual norm 3.587110145625e-05 1750 KSP Residual norm 3.397927997564e-05 1751 KSP Residual norm 3.933673634705e-05 1752 KSP Residual norm 4.703086101383e-05 1753 KSP Residual norm 4.639498957625e-05 1754 KSP Residual norm 4.572118456320e-05 1755 KSP Residual norm 5.489192113200e-05 1756 KSP Residual norm 7.384741667165e-05 1757 KSP Residual norm 8.733764008860e-05 1758 KSP Residual norm 8.285208753563e-05 1759 KSP Residual norm 7.455755746713e-05 1760 KSP Residual norm 7.516681918448e-05 1761 KSP Residual norm 8.345968991202e-05 1762 KSP Residual norm 7.756991733274e-05 1763 KSP Residual norm 5.657991132344e-05 1764 KSP Residual norm 4.132539021751e-05 1765 KSP Residual norm 3.858953252024e-05 1766 KSP Residual norm 4.431505280786e-05 1767 KSP Residual norm 4.545302359569e-05 1768 KSP Residual norm 3.628828495200e-05 1769 KSP Residual norm 2.845420725659e-05 1770 KSP Residual norm 2.888470948136e-05 1771 KSP Residual norm 3.660504080600e-05 1772 KSP Residual norm 4.421476425188e-05 1773 KSP Residual norm 4.420981539998e-05 1774 KSP Residual norm 4.276369895633e-05 1775 KSP Residual norm 4.826420755097e-05 1776 KSP Residual norm 5.839814153108e-05 1777 KSP Residual norm 5.663278126738e-05 1778 KSP Residual norm 4.261978869464e-05 1779 KSP Residual norm 3.752374746703e-05 1780 KSP Residual norm 3.995229950895e-05 1781 KSP Residual norm 
4.159296217179e-05 1782 KSP Residual norm 3.428943603040e-05 1783 KSP Residual norm 2.536232949874e-05 1784 KSP Residual norm 2.199648361586e-05 1785 KSP Residual norm 2.267630924666e-05 1786 KSP Residual norm 2.421920235551e-05 1787 KSP Residual norm 2.339083498189e-05 1788 KSP Residual norm 2.272224102939e-05 1789 KSP Residual norm 2.660438795093e-05 1790 KSP Residual norm 3.595247854564e-05 1791 KSP Residual norm 4.758542959452e-05 1792 KSP Residual norm 5.231398759863e-05 1793 KSP Residual norm 5.271887525154e-05 1794 KSP Residual norm 5.868998967500e-05 1795 KSP Residual norm 7.422291394157e-05 1796 KSP Residual norm 9.114275099629e-05 1797 KSP Residual norm 8.692163969323e-05 1798 KSP Residual norm 7.071703549949e-05 1799 KSP Residual norm 6.541829758150e-05 1800 KSP Residual norm 6.978241190142e-05 1801 KSP Residual norm 6.906909186209e-05 1802 KSP Residual norm 5.605318832255e-05 1803 KSP Residual norm 4.480870705028e-05 1804 KSP Residual norm 4.347940087147e-05 1805 KSP Residual norm 4.736810910380e-05 1806 KSP Residual norm 4.317126895758e-05 1807 KSP Residual norm 3.526708138919e-05 1808 KSP Residual norm 3.393518839314e-05 1809 KSP Residual norm 4.120538523084e-05 1810 KSP Residual norm 5.169442746598e-05 1811 KSP Residual norm 5.879284948617e-05 1812 KSP Residual norm 6.287013666618e-05 1813 KSP Residual norm 7.713185133217e-05 1814 KSP Residual norm 1.112265969136e-04 1815 KSP Residual norm 1.550364230427e-04 1816 KSP Residual norm 1.775803614424e-04 1817 KSP Residual norm 1.689456419749e-04 1818 KSP Residual norm 1.611915196464e-04 1819 KSP Residual norm 1.667037248994e-04 1820 KSP Residual norm 1.651580195153e-04 1821 KSP Residual norm 1.365388912510e-04 1822 KSP Residual norm 1.156999832836e-04 1823 KSP Residual norm 1.124165045960e-04 1824 KSP Residual norm 1.059263494255e-04 1825 KSP Residual norm 8.274956283356e-05 1826 KSP Residual norm 6.532002243282e-05 1827 KSP Residual norm 6.311501169009e-05 1828 KSP Residual norm 6.545118562832e-05 1829 
KSP Residual norm 5.703999539170e-05 1830 KSP Residual norm 4.731194771401e-05 1831 KSP Residual norm 4.686571231988e-05 1832 KSP Residual norm 5.236015613828e-05 1833 KSP Residual norm 5.533145806754e-05 1834 KSP Residual norm 5.268374892611e-05 1835 KSP Residual norm 5.017147725298e-05 1836 KSP Residual norm 5.587819234823e-05 1837 KSP Residual norm 7.184426848658e-05 1838 KSP Residual norm 8.586911616072e-05 1839 KSP Residual norm 8.662716532534e-05 1840 KSP Residual norm 8.501490651984e-05 1841 KSP Residual norm 9.771812233991e-05 1842 KSP Residual norm 1.130250867607e-04 1843 KSP Residual norm 1.085616268621e-04 1844 KSP Residual norm 9.566848962302e-05 1845 KSP Residual norm 9.311966192110e-05 1846 KSP Residual norm 9.415670116004e-05 1847 KSP Residual norm 8.804659977003e-05 1848 KSP Residual norm 6.770294909616e-05 1849 KSP Residual norm 5.221084181353e-05 1850 KSP Residual norm 4.670579353667e-05 1851 KSP Residual norm 4.819512615688e-05 1852 KSP Residual norm 4.671396435292e-05 1853 KSP Residual norm 4.177480955398e-05 1854 KSP Residual norm 4.307178743888e-05 1855 KSP Residual norm 5.238237414427e-05 1856 KSP Residual norm 6.060702757200e-05 1857 KSP Residual norm 6.080620170073e-05 1858 KSP Residual norm 6.513203584982e-05 1859 KSP Residual norm 7.888627391129e-05 1860 KSP Residual norm 8.950704156141e-05 1861 KSP Residual norm 8.657831426233e-05 1862 KSP Residual norm 7.908055852432e-05 1863 KSP Residual norm 8.635175305052e-05 1864 KSP Residual norm 1.085206528166e-04 1865 KSP Residual norm 1.162913065919e-04 1866 KSP Residual norm 1.036266156916e-04 1867 KSP Residual norm 9.665031486237e-05 1868 KSP Residual norm 9.907376720976e-05 1869 KSP Residual norm 9.835602457695e-05 1870 KSP Residual norm 8.500364052670e-05 1871 KSP Residual norm 7.199253004762e-05 1872 KSP Residual norm 6.870825248643e-05 1873 KSP Residual norm 6.981851502818e-05 1874 KSP Residual norm 6.265131822843e-05 1875 KSP Residual norm 5.030669315022e-05 1876 KSP Residual norm 
4.274925354472e-05 1877 KSP Residual norm 3.858647977075e-05 1878 KSP Residual norm 3.368296519471e-05 1879 KSP Residual norm 2.742275165445e-05 1880 KSP Residual norm 2.480060200444e-05 1881 KSP Residual norm 2.621683999505e-05 1882 KSP Residual norm 2.906286740443e-05 1883 KSP Residual norm 3.045752112834e-05 1884 KSP Residual norm 2.835651055679e-05 1885 KSP Residual norm 2.897567110330e-05 1886 KSP Residual norm 3.568567374638e-05 1887 KSP Residual norm 4.561462244462e-05 1888 KSP Residual norm 5.122453661449e-05 1889 KSP Residual norm 5.267770286835e-05 1890 KSP Residual norm 6.169881299403e-05 1891 KSP Residual norm 8.057187791296e-05 1892 KSP Residual norm 9.695035535993e-05 1893 KSP Residual norm 9.659469095191e-05 1894 KSP Residual norm 9.409280604400e-05 1895 KSP Residual norm 9.530707040243e-05 1896 KSP Residual norm 9.261842866598e-05 1897 KSP Residual norm 8.347545296760e-05 1898 KSP Residual norm 8.109444004373e-05 1899 KSP Residual norm 8.520713265851e-05 1900 KSP Residual norm 8.622939836032e-05 1901 KSP Residual norm 7.780378310796e-05 1902 KSP Residual norm 6.765595192081e-05 1903 KSP Residual norm 6.376711525302e-05 1904 KSP Residual norm 6.172628285816e-05 1905 KSP Residual norm 5.517623474333e-05 1906 KSP Residual norm 4.933415197155e-05 1907 KSP Residual norm 4.757817972698e-05 1908 KSP Residual norm 4.567076444637e-05 1909 KSP Residual norm 3.968241143714e-05 1910 KSP Residual norm 3.519611121404e-05 1911 KSP Residual norm 3.656706326849e-05 1912 KSP Residual norm 4.030027837565e-05 1913 KSP Residual norm 3.932992145155e-05 1914 KSP Residual norm 3.816054655325e-05 1915 KSP Residual norm 4.298641288979e-05 1916 KSP Residual norm 5.200898421600e-05 1917 KSP Residual norm 6.127075880237e-05 1918 KSP Residual norm 6.502246812138e-05 1919 KSP Residual norm 7.107140710760e-05 1920 KSP Residual norm 8.105909459055e-05 1921 KSP Residual norm 8.772438805153e-05 1922 KSP Residual norm 8.515018063269e-05 1923 KSP Residual norm 8.374388275368e-05 1924 
KSP Residual norm 8.999425409334e-05 1925 KSP Residual norm 8.848562701116e-05 1926 KSP Residual norm 7.672014127488e-05 1927 KSP Residual norm 6.882125930869e-05 1928 KSP Residual norm 6.510731367965e-05 1929 KSP Residual norm 5.853207715381e-05 1930 KSP Residual norm 4.833575303195e-05 1931 KSP Residual norm 4.079152330499e-05 1932 KSP Residual norm 3.804418085596e-05 1933 KSP Residual norm 3.671583238456e-05 1934 KSP Residual norm 3.249094460221e-05 1935 KSP Residual norm 2.818968930180e-05 1936 KSP Residual norm 2.455600339593e-05 1937 KSP Residual norm 2.268549583469e-05 1938 KSP Residual norm 2.139072822561e-05 1939 KSP Residual norm 2.119210349094e-05 1940 KSP Residual norm 2.327074715160e-05 1941 KSP Residual norm 2.744985516810e-05 1942 KSP Residual norm 3.342901872368e-05 1943 KSP Residual norm 3.833776466491e-05 1944 KSP Residual norm 4.304442245814e-05 1945 KSP Residual norm 5.041967301334e-05 1946 KSP Residual norm 6.234009014930e-05 1947 KSP Residual norm 7.080241494044e-05 1948 KSP Residual norm 7.578848186134e-05 1949 KSP Residual norm 8.136803806799e-05 1950 KSP Residual norm 9.295436501126e-05 1951 KSP Residual norm 9.545426905819e-05 1952 KSP Residual norm 9.204960379493e-05 1953 KSP Residual norm 9.703106775261e-05 1954 KSP Residual norm 1.164813164361e-04 1955 KSP Residual norm 1.254300731039e-04 1956 KSP Residual norm 1.110164960759e-04 1957 KSP Residual norm 1.021566459320e-04 1958 KSP Residual norm 1.089447203592e-04 1959 KSP Residual norm 1.139642834252e-04 1960 KSP Residual norm 1.017424042626e-04 1961 KSP Residual norm 8.504830286679e-05 1962 KSP Residual norm 7.719094350895e-05 1963 KSP Residual norm 7.000339913686e-05 1964 KSP Residual norm 5.745591888795e-05 1965 KSP Residual norm 5.054898401809e-05 1966 KSP Residual norm 4.926736925454e-05 1967 KSP Residual norm 4.775600314902e-05 1968 KSP Residual norm 4.187795205101e-05 1969 KSP Residual norm 3.637303144558e-05 1970 KSP Residual norm 3.557163246377e-05 1971 KSP Residual norm 
3.601075010292e-05 1972 KSP Residual norm 3.289158733166e-05 1973 KSP Residual norm 3.013364732973e-05 1974 KSP Residual norm 3.057845718720e-05 1975 KSP Residual norm 3.159062772902e-05 1976 KSP Residual norm 2.954281674491e-05 1977 KSP Residual norm 2.970520393219e-05 1978 KSP Residual norm 3.446007545808e-05 1979 KSP Residual norm 4.154158078645e-05 1980 KSP Residual norm 4.823108719157e-05 1981 KSP Residual norm 5.514371122818e-05 1982 KSP Residual norm 6.620848858541e-05 1983 KSP Residual norm 7.361787614819e-05 1984 KSP Residual norm 7.404957519912e-05 1985 KSP Residual norm 7.587769129775e-05 1986 KSP Residual norm 8.718819638297e-05 1987 KSP Residual norm 1.045491937153e-04 1988 KSP Residual norm 1.160223589894e-04 1989 KSP Residual norm 1.153089114790e-04 1990 KSP Residual norm 1.188733983327e-04 1991 KSP Residual norm 1.345853673388e-04 1992 KSP Residual norm 1.425507026857e-04 1993 KSP Residual norm 1.339668054978e-04 1994 KSP Residual norm 1.270642878899e-04 1995 KSP Residual norm 1.253926551094e-04 1996 KSP Residual norm 1.170403933972e-04 1997 KSP Residual norm 1.015752348934e-04 1998 KSP Residual norm 8.655967488148e-05 1999 KSP Residual norm 7.728845790511e-05 2000 KSP Residual norm 7.345547586614e-05 2001 KSP Residual norm 7.084963534688e-05 2002 KSP Residual norm 6.591356720823e-05 2003 KSP Residual norm 6.330718105201e-05 2004 KSP Residual norm 6.113178853395e-05 2005 KSP Residual norm 5.778816266795e-05 2006 KSP Residual norm 4.935234420663e-05 2007 KSP Residual norm 4.468794671712e-05 2008 KSP Residual norm 4.837779230756e-05 2009 KSP Residual norm 5.527663814967e-05 2010 KSP Residual norm 5.641785227988e-05 2011 KSP Residual norm 5.675474362412e-05 2012 KSP Residual norm 6.419625000284e-05 2013 KSP Residual norm 7.582155513883e-05 2014 KSP Residual norm 8.537570986489e-05 2015 KSP Residual norm 9.482398182569e-05 2016 KSP Residual norm 1.063625895562e-04 2017 KSP Residual norm 1.154623872140e-04 2018 KSP Residual norm 1.200249333478e-04 2019 
KSP Residual norm 1.294331926198e-04 2020 KSP Residual norm 1.419912941039e-04 2021 KSP Residual norm 1.540929176854e-04 2022 KSP Residual norm 1.624175872529e-04 2023 KSP Residual norm 1.690292092113e-04 2024 KSP Residual norm 1.716680626458e-04 2025 KSP Residual norm 1.730790655306e-04 2026 KSP Residual norm 1.546063992232e-04 2027 KSP Residual norm 1.300177384175e-04 2028 KSP Residual norm 1.201083071643e-04 2029 KSP Residual norm 1.108198331435e-04 2030 KSP Residual norm 9.367379458212e-05 2031 KSP Residual norm 7.728334801680e-05 2032 KSP Residual norm 7.364801558827e-05 2033 KSP Residual norm 7.191968224978e-05 2034 KSP Residual norm 6.430768689345e-05 2035 KSP Residual norm 5.584741309398e-05 2036 KSP Residual norm 5.342822046168e-05 2037 KSP Residual norm 5.351870488026e-05 2038 KSP Residual norm 4.897636571757e-05 2039 KSP Residual norm 4.348010597180e-05 2040 KSP Residual norm 4.066036071986e-05 2041 KSP Residual norm 3.978628058137e-05 2042 KSP Residual norm 3.702065976097e-05 2043 KSP Residual norm 3.384567728620e-05 2044 KSP Residual norm 3.563232110428e-05 2045 KSP Residual norm 4.013756038586e-05 2046 KSP Residual norm 4.361737137094e-05 2047 KSP Residual norm 4.617186659379e-05 2048 KSP Residual norm 5.132887443457e-05 2049 KSP Residual norm 6.034816287815e-05 2050 KSP Residual norm 7.117012923850e-05 2051 KSP Residual norm 8.045395778169e-05 2052 KSP Residual norm 9.094204680949e-05 2053 KSP Residual norm 1.033210559251e-04 2054 KSP Residual norm 1.113550273736e-04 2055 KSP Residual norm 1.125618312664e-04 2056 KSP Residual norm 1.217997308288e-04 2057 KSP Residual norm 1.406297800758e-04 2058 KSP Residual norm 1.564996743460e-04 2059 KSP Residual norm 1.641353733608e-04 2060 KSP Residual norm 1.786071321103e-04 2061 KSP Residual norm 2.020418732653e-04 2062 KSP Residual norm 2.317258208946e-04 2063 KSP Residual norm 2.325249005256e-04 2064 KSP Residual norm 2.135091490829e-04 2065 KSP Residual norm 2.099512160108e-04 2066 KSP Residual norm 
[... -ksp_monitor residual history, truncated mid-entry at both ends of this excerpt ...]
2067 KSP Residual norm 2.382322596208e-04
2068 KSP Residual norm 2.249036626476e-04
2069 KSP Residual norm 2.089898703397e-04
[... iterations 2070-3060 elided; the residual norm oscillates between roughly 1.1e-05 and 4.2e-04 without settling ...]
3061 KSP Residual norm 1.501208439504e-05
3062 KSP Residual norm 1.339173994152e-05
3063 KSP Residual norm 1.080930988218e-05
[... output continues past iteration 3064 ...]
KSP Residual norm 9.813573807538e-06 3065 KSP Residual norm 9.883940737722e-06 3066 KSP Residual norm 9.211893052572e-06 3067 KSP Residual norm 8.590908346866e-06 3068 KSP Residual norm 9.081093351400e-06 3069 KSP Residual norm 9.791337341847e-06 3070 KSP Residual norm 9.539758973048e-06 3071 KSP Residual norm 9.861434959441e-06 3072 KSP Residual norm 1.084083441213e-05 3073 KSP Residual norm 1.112667554158e-05 3074 KSP Residual norm 1.104646643839e-05 3075 KSP Residual norm 1.081057403508e-05 3076 KSP Residual norm 9.796739700224e-06 3077 KSP Residual norm 8.585495211359e-06 3078 KSP Residual norm 8.002938914736e-06 3079 KSP Residual norm 7.697294313625e-06 3080 KSP Residual norm 7.243160524573e-06 3081 KSP Residual norm 7.183865939244e-06 3082 KSP Residual norm 8.266783893935e-06 3083 KSP Residual norm 8.975129795981e-06 3084 KSP Residual norm 8.991923271717e-06 3085 KSP Residual norm 1.047708169981e-05 3086 KSP Residual norm 1.275011934179e-05 3087 KSP Residual norm 1.277720469884e-05 3088 KSP Residual norm 1.288553363823e-05 3089 KSP Residual norm 1.418076540844e-05 3090 KSP Residual norm 1.448583501389e-05 3091 KSP Residual norm 1.351758169845e-05 3092 KSP Residual norm 1.391327805145e-05 3093 KSP Residual norm 1.529933903951e-05 3094 KSP Residual norm 1.518593837087e-05 3095 KSP Residual norm 1.492448860072e-05 3096 KSP Residual norm 1.524907396152e-05 3097 KSP Residual norm 1.499006363869e-05 3098 KSP Residual norm 1.452586619339e-05 3099 KSP Residual norm 1.525659444367e-05 3100 KSP Residual norm 1.675232381260e-05 3101 KSP Residual norm 1.669833884423e-05 3102 KSP Residual norm 1.578215365756e-05 3103 KSP Residual norm 1.579335071994e-05 3104 KSP Residual norm 1.623841047098e-05 3105 KSP Residual norm 1.716417691683e-05 3106 KSP Residual norm 1.827476532089e-05 3107 KSP Residual norm 2.040807457722e-05 3108 KSP Residual norm 2.369192074687e-05 3109 KSP Residual norm 2.437257458675e-05 3110 KSP Residual norm 2.387742127635e-05 3111 KSP Residual norm 
2.579632231242e-05 3112 KSP Residual norm 2.903855810130e-05 3113 KSP Residual norm 2.745761500017e-05 3114 KSP Residual norm 2.325259707086e-05 3115 KSP Residual norm 2.261747764403e-05 3116 KSP Residual norm 2.583994486062e-05 3117 KSP Residual norm 2.867448580901e-05 3118 KSP Residual norm 3.034871760602e-05 3119 KSP Residual norm 3.261381535396e-05 3120 KSP Residual norm 3.127784419734e-05 3121 KSP Residual norm 2.868294135036e-05 3122 KSP Residual norm 3.115919334022e-05 3123 KSP Residual norm 3.943449309256e-05 3124 KSP Residual norm 4.490933789997e-05 3125 KSP Residual norm 4.894636167729e-05 3126 KSP Residual norm 5.624776004675e-05 3127 KSP Residual norm 5.912974540235e-05 3128 KSP Residual norm 5.194683337825e-05 3129 KSP Residual norm 5.216651363126e-05 3130 KSP Residual norm 5.532558719964e-05 3131 KSP Residual norm 5.208773563512e-05 3132 KSP Residual norm 5.096883663527e-05 3133 KSP Residual norm 5.459308472875e-05 3134 KSP Residual norm 5.549600580888e-05 3135 KSP Residual norm 5.442146066883e-05 3136 KSP Residual norm 5.753276455022e-05 3137 KSP Residual norm 6.122030735468e-05 3138 KSP Residual norm 5.918485535130e-05 3139 KSP Residual norm 5.903654630292e-05 3140 KSP Residual norm 5.730854297744e-05 3141 KSP Residual norm 5.225286365771e-05 3142 KSP Residual norm 4.958983958548e-05 3143 KSP Residual norm 4.827273652584e-05 3144 KSP Residual norm 4.225511949034e-05 3145 KSP Residual norm 3.530478041503e-05 3146 KSP Residual norm 3.309154804151e-05 3147 KSP Residual norm 3.276815863822e-05 3148 KSP Residual norm 3.364697694738e-05 3149 KSP Residual norm 3.780043499432e-05 3150 KSP Residual norm 4.128172651046e-05 3151 KSP Residual norm 3.941605247750e-05 3152 KSP Residual norm 3.894883755593e-05 3153 KSP Residual norm 3.869275314011e-05 3154 KSP Residual norm 3.126337937533e-05 3155 KSP Residual norm 2.334903861474e-05 3156 KSP Residual norm 2.042505202755e-05 3157 KSP Residual norm 2.148044813468e-05 3158 KSP Residual norm 2.422204005642e-05 3159 
KSP Residual norm 2.610177872487e-05 3160 KSP Residual norm 2.508883477346e-05 3161 KSP Residual norm 2.187083388936e-05 3162 KSP Residual norm 1.928064401712e-05 3163 KSP Residual norm 1.926132293728e-05 3164 KSP Residual norm 2.153059634358e-05 3165 KSP Residual norm 2.414313715744e-05 3166 KSP Residual norm 2.493579864012e-05 3167 KSP Residual norm 2.439520861236e-05 3168 KSP Residual norm 2.454965597216e-05 3169 KSP Residual norm 2.496085820912e-05 3170 KSP Residual norm 2.439405789278e-05 3171 KSP Residual norm 2.392612885477e-05 3172 KSP Residual norm 2.359211783333e-05 3173 KSP Residual norm 2.264111993776e-05 3174 KSP Residual norm 1.998509448380e-05 3175 KSP Residual norm 1.884280353401e-05 3176 KSP Residual norm 1.905853898637e-05 3177 KSP Residual norm 1.765828869807e-05 3178 KSP Residual norm 1.538484423549e-05 3179 KSP Residual norm 1.434760139186e-05 3180 KSP Residual norm 1.453820858146e-05 3181 KSP Residual norm 1.494354641437e-05 3182 KSP Residual norm 1.598534158227e-05 3183 KSP Residual norm 1.670999027868e-05 3184 KSP Residual norm 1.506112017854e-05 3185 KSP Residual norm 1.397175695129e-05 3186 KSP Residual norm 1.417044338567e-05 3187 KSP Residual norm 1.341752509942e-05 3188 KSP Residual norm 1.254078954016e-05 3189 KSP Residual norm 1.288702429768e-05 3190 KSP Residual norm 1.255092814294e-05 3191 KSP Residual norm 1.112101093033e-05 3192 KSP Residual norm 1.097985344038e-05 3193 KSP Residual norm 1.272532997403e-05 3194 KSP Residual norm 1.441222895620e-05 3195 KSP Residual norm 1.525656540614e-05 3196 KSP Residual norm 1.580556524554e-05 3197 KSP Residual norm 1.592224549017e-05 3198 KSP Residual norm 1.486261759542e-05 3199 KSP Residual norm 1.395479839347e-05 3200 KSP Residual norm 1.302747504831e-05 3201 KSP Residual norm 1.097204366467e-05 3202 KSP Residual norm 8.751434733207e-06 3203 KSP Residual norm 7.915740592375e-06 3204 KSP Residual norm 8.422449099951e-06 3205 KSP Residual norm 9.666861226815e-06 3206 KSP Residual norm 
1.138276834129e-05 3207 KSP Residual norm 1.260937701832e-05 3208 KSP Residual norm 1.135573036472e-05 3209 KSP Residual norm 8.784882730022e-06 3210 KSP Residual norm 7.179270154009e-06 3211 KSP Residual norm 7.052948792399e-06 3212 KSP Residual norm 8.160581408284e-06 3213 KSP Residual norm 9.491411441814e-06 3214 KSP Residual norm 9.212056199399e-06 3215 KSP Residual norm 8.035896772880e-06 3216 KSP Residual norm 7.436754549615e-06 3217 KSP Residual norm 7.147880590288e-06 3218 KSP Residual norm 6.875806495411e-06 3219 KSP Residual norm 6.728258016853e-06 3220 KSP Residual norm 6.558028300309e-06 3221 KSP Residual norm 5.596744166408e-06 3222 KSP Residual norm 5.037074779112e-06 3223 KSP Residual norm 5.502589648588e-06 3224 KSP Residual norm 6.178138032325e-06 3225 KSP Residual norm 6.436072389192e-06 3226 KSP Residual norm 7.493447008679e-06 3227 KSP Residual norm 8.680384493769e-06 3228 KSP Residual norm 7.832977153327e-06 3229 KSP Residual norm 6.119780465391e-06 3230 KSP Residual norm 4.877503371345e-06 3231 KSP Residual norm 4.263596073727e-06 3232 KSP Residual norm 4.403648905265e-06 3233 KSP Residual norm 4.942259066004e-06 3234 KSP Residual norm 5.202680539426e-06 3235 KSP Residual norm 5.376429931578e-06 3236 KSP Residual norm 6.660289734611e-06 3237 KSP Residual norm 8.549417156545e-06 3238 KSP Residual norm 9.296715176002e-06 3239 KSP Residual norm 9.110451043419e-06 3240 KSP Residual norm 7.624156801182e-06 3241 KSP Residual norm 5.381948612026e-06 3242 KSP Residual norm 4.268577383838e-06 3243 KSP Residual norm 4.720818114124e-06 3244 KSP Residual norm 6.619795533018e-06 3245 KSP Residual norm 9.088521429318e-06 3246 KSP Residual norm 1.240980759084e-05 3247 KSP Residual norm 1.493786363687e-05 3248 KSP Residual norm 1.365507525432e-05 3249 KSP Residual norm 1.101837794999e-05 3250 KSP Residual norm 9.916952448789e-06 3251 KSP Residual norm 9.555267853466e-06 3252 KSP Residual norm 8.696571482657e-06 3253 KSP Residual norm 8.719881297786e-06 3254 
KSP Residual norm 1.019029529513e-05 3255 KSP Residual norm 1.240712557133e-05 3256 KSP Residual norm 1.464078065964e-05 3257 KSP Residual norm 1.625956236838e-05 3258 KSP Residual norm 1.610353819909e-05 3259 KSP Residual norm 1.547738080799e-05 3260 KSP Residual norm 1.421341552492e-05 3261 KSP Residual norm 1.188115606805e-05 3262 KSP Residual norm 1.104307095892e-05 3263 KSP Residual norm 1.285298530045e-05 3264 KSP Residual norm 1.687043171361e-05 3265 KSP Residual norm 2.166183060300e-05 3266 KSP Residual norm 2.244302630276e-05 3267 KSP Residual norm 2.011604266414e-05 3268 KSP Residual norm 1.708896515367e-05 3269 KSP Residual norm 1.501541074298e-05 3270 KSP Residual norm 1.494924928148e-05 3271 KSP Residual norm 1.459871621442e-05 3272 KSP Residual norm 1.426206996600e-05 3273 KSP Residual norm 1.472903432766e-05 3274 KSP Residual norm 1.491289689752e-05 3275 KSP Residual norm 1.467911953497e-05 3276 KSP Residual norm 1.590957422828e-05 3277 KSP Residual norm 2.106420504973e-05 3278 KSP Residual norm 2.875038594849e-05 3279 KSP Residual norm 3.441361421966e-05 3280 KSP Residual norm 3.569826386577e-05 3281 KSP Residual norm 3.037877045442e-05 3282 KSP Residual norm 2.358718078041e-05 3283 KSP Residual norm 2.188059602841e-05 3284 KSP Residual norm 2.534623660961e-05 3285 KSP Residual norm 2.876319439782e-05 3286 KSP Residual norm 3.034671641283e-05 3287 KSP Residual norm 3.074901201380e-05 3288 KSP Residual norm 2.695282094531e-05 3289 KSP Residual norm 2.122606938440e-05 3290 KSP Residual norm 1.924344105616e-05 3291 KSP Residual norm 2.288533317665e-05 3292 KSP Residual norm 2.994098958874e-05 3293 KSP Residual norm 3.741701031079e-05 3294 KSP Residual norm 4.696887899960e-05 3295 KSP Residual norm 5.233033704070e-05 3296 KSP Residual norm 4.383546463489e-05 3297 KSP Residual norm 3.442419415246e-05 3298 KSP Residual norm 2.550768685969e-05 3299 KSP Residual norm 1.993359380915e-05 3300 KSP Residual norm 2.126529708047e-05 3301 KSP Residual norm 
2.842237069288e-05 3302 KSP Residual norm 3.698428061978e-05 3303 KSP Residual norm 4.205286673081e-05 3304 KSP Residual norm 3.890628598452e-05 3305 KSP Residual norm 2.851784591067e-05 3306 KSP Residual norm 2.302999464729e-05 3307 KSP Residual norm 2.453385600675e-05 3308 KSP Residual norm 3.136492081822e-05 3309 KSP Residual norm 4.222079531108e-05 3310 KSP Residual norm 5.165795234999e-05 3311 KSP Residual norm 5.131479034441e-05 3312 KSP Residual norm 3.994790820639e-05 3313 KSP Residual norm 2.747011898276e-05 3314 KSP Residual norm 2.154710418857e-05 3315 KSP Residual norm 2.126710001026e-05 3316 KSP Residual norm 2.593124076821e-05 3317 KSP Residual norm 3.332746393310e-05 3318 KSP Residual norm 4.102663241242e-05 3319 KSP Residual norm 4.141289324964e-05 3320 KSP Residual norm 3.341756274927e-05 3321 KSP Residual norm 2.510928691592e-05 3322 KSP Residual norm 1.861623686504e-05 3323 KSP Residual norm 1.573029943743e-05 3324 KSP Residual norm 1.701207366240e-05 3325 KSP Residual norm 2.219130301786e-05 3326 KSP Residual norm 2.865259537642e-05 3327 KSP Residual norm 3.277459980101e-05 3328 KSP Residual norm 2.967207316529e-05 3329 KSP Residual norm 2.012967446442e-05 3330 KSP Residual norm 1.323779610640e-05 3331 KSP Residual norm 1.170058764125e-05 3332 KSP Residual norm 1.506874039563e-05 3333 KSP Residual norm 2.187773957932e-05 3334 KSP Residual norm 2.807532045097e-05 3335 KSP Residual norm 3.476178423142e-05 3336 KSP Residual norm 3.832318442393e-05 3337 KSP Residual norm 3.105535569413e-05 3338 KSP Residual norm 1.964748765249e-05 3339 KSP Residual norm 1.183344877743e-05 3340 KSP Residual norm 7.696038616628e-06 3341 KSP Residual norm 6.285333628082e-06 3342 KSP Residual norm 6.925560461326e-06 3343 KSP Residual norm 9.918522521907e-06 3344 KSP Residual norm 1.549464330447e-05 3345 KSP Residual norm 2.131135161626e-05 3346 KSP Residual norm 2.015989632038e-05 3347 KSP Residual norm 1.290775905438e-05 3348 KSP Residual norm 8.034820805598e-06 3349 
KSP Residual norm 5.705912965930e-06 3350 KSP Residual norm 4.651391981281e-06 3351 KSP Residual norm 4.547029847071e-06 3352 KSP Residual norm 5.679968419010e-06 3353 KSP Residual norm 8.448900024363e-06 3354 KSP Residual norm 1.050633558129e-05 3355 KSP Residual norm 9.679200792397e-06 3356 KSP Residual norm 8.169027424337e-06 3357 KSP Residual norm 6.648521428254e-06 3358 KSP Residual norm 5.710510271592e-06 3359 KSP Residual norm 5.175795269693e-06 3360 KSP Residual norm 4.302373918401e-06 3361 KSP Residual norm 3.629528180964e-06 3362 KSP Residual norm 3.824357258986e-06 3363 KSP Residual norm 4.945732969127e-06 3364 KSP Residual norm 6.423387205401e-06 3365 KSP Residual norm 6.920275517319e-06 3366 KSP Residual norm 6.198857464093e-06 3367 KSP Residual norm 4.626085229782e-06 3368 KSP Residual norm 3.829072136821e-06 3369 KSP Residual norm 3.809912689597e-06 3370 KSP Residual norm 3.814821592034e-06 3371 KSP Residual norm 3.202989443671e-06 3372 KSP Residual norm 2.667076448618e-06 3373 KSP Residual norm 2.468835979289e-06 3374 KSP Residual norm 2.570826036892e-06 3375 KSP Residual norm 3.173043705035e-06 3376 KSP Residual norm 4.039890917904e-06 3377 KSP Residual norm 4.068178823920e-06 3378 KSP Residual norm 3.556608311169e-06 3379 KSP Residual norm 3.028959222301e-06 3380 KSP Residual norm 2.728996180852e-06 3381 KSP Residual norm 2.893470883108e-06 3382 KSP Residual norm 3.286525091617e-06 3383 KSP Residual norm 3.156415463214e-06 3384 KSP Residual norm 2.530071511509e-06 3385 KSP Residual norm 2.128653649856e-06 3386 KSP Residual norm 2.245452443505e-06 3387 KSP Residual norm 2.916259437187e-06 3388 KSP Residual norm 3.915548600624e-06 3389 KSP Residual norm 5.046566855111e-06 3390 KSP Residual norm 5.791378848346e-06 3391 KSP Residual norm 6.258952687663e-06 3392 KSP Residual norm 6.568897043088e-06 3393 KSP Residual norm 6.726933143907e-06 3394 KSP Residual norm 7.124626489960e-06 3395 KSP Residual norm 6.910813941941e-06 3396 KSP Residual norm 
6.039878504467e-06 3397 KSP Residual norm 5.002458912617e-06 3398 KSP Residual norm 4.399671683077e-06 3399 KSP Residual norm 4.213431829465e-06 3400 KSP Residual norm 4.348157904986e-06 3401 KSP Residual norm 4.300414991952e-06 3402 KSP Residual norm 4.468360130142e-06 3403 KSP Residual norm 5.244109098856e-06 3404 KSP Residual norm 6.285041015688e-06 3405 KSP Residual norm 7.421522913903e-06 3406 KSP Residual norm 9.008240670641e-06 3407 KSP Residual norm 9.923541717810e-06 3408 KSP Residual norm 9.298882641464e-06 3409 KSP Residual norm 8.883188000123e-06 3410 KSP Residual norm 9.040650892792e-06 3411 KSP Residual norm 8.743943345825e-06 3412 KSP Residual norm 8.432430559586e-06 3413 KSP Residual norm 7.726172984659e-06 3414 KSP Residual norm 6.849683706682e-06 3415 KSP Residual norm 6.774303701723e-06 3416 KSP Residual norm 8.088012311917e-06 3417 KSP Residual norm 9.432954227549e-06 3418 KSP Residual norm 9.851059693386e-06 3419 KSP Residual norm 1.046597391891e-05 3420 KSP Residual norm 1.151646125394e-05 3421 KSP Residual norm 1.296510878071e-05 3422 KSP Residual norm 1.457289850243e-05 3423 KSP Residual norm 1.531635524505e-05 3424 KSP Residual norm 1.457343287216e-05 3425 KSP Residual norm 1.565590880706e-05 3426 KSP Residual norm 1.836349518612e-05 3427 KSP Residual norm 1.883899065806e-05 3428 KSP Residual norm 1.648177077744e-05 3429 KSP Residual norm 1.454872874720e-05 3430 KSP Residual norm 1.336455981563e-05 3431 KSP Residual norm 1.369876216032e-05 3432 KSP Residual norm 1.563155986076e-05 3433 KSP Residual norm 1.773513146678e-05 3434 KSP Residual norm 1.889918217112e-05 3435 KSP Residual norm 1.920089108455e-05 3436 KSP Residual norm 2.073556856208e-05 3437 KSP Residual norm 2.401413776553e-05 3438 KSP Residual norm 2.914853612921e-05 3439 KSP Residual norm 2.980671549895e-05 3440 KSP Residual norm 2.298654031174e-05 3441 KSP Residual norm 1.781925990964e-05 3442 KSP Residual norm 1.788686395393e-05 3443 KSP Residual norm 2.124456605243e-05 3444 
KSP Residual norm 2.268130375391e-05 3445 KSP Residual norm 2.015333115727e-05 3446 KSP Residual norm 1.621876867064e-05 3447 KSP Residual norm 1.467543589451e-05 3448 KSP Residual norm 1.638064733787e-05 3449 KSP Residual norm 1.950538486800e-05 3450 KSP Residual norm 1.981411821023e-05 3451 KSP Residual norm 1.746599466306e-05 3452 KSP Residual norm 1.643555313935e-05 3453 KSP Residual norm 1.734889796195e-05 3454 KSP Residual norm 2.025451986627e-05 3455 KSP Residual norm 2.246522395773e-05 3456 KSP Residual norm 1.952559143006e-05 3457 KSP Residual norm 1.535533671459e-05 3458 KSP Residual norm 1.408359496695e-05 3459 KSP Residual norm 1.434326783109e-05 3460 KSP Residual norm 1.331776505808e-05 3461 KSP Residual norm 1.041632225792e-05 3462 KSP Residual norm 8.392585985888e-06 3463 KSP Residual norm 7.557702263398e-06 3464 KSP Residual norm 7.767591900393e-06 3465 KSP Residual norm 8.657962758448e-06 3466 KSP Residual norm 9.201941132156e-06 3467 KSP Residual norm 9.661589386537e-06 3468 KSP Residual norm 1.099193006597e-05 3469 KSP Residual norm 1.296742697896e-05 3470 KSP Residual norm 1.470399126726e-05 3471 KSP Residual norm 1.515942185649e-05 3472 KSP Residual norm 1.373776795099e-05 3473 KSP Residual norm 1.143048495215e-05 3474 KSP Residual norm 1.050086772088e-05 3475 KSP Residual norm 1.115458501625e-05 3476 KSP Residual norm 1.205072713730e-05 3477 KSP Residual norm 1.058380332615e-05 3478 KSP Residual norm 8.284614409351e-06 3479 KSP Residual norm 6.481384082508e-06 3480 KSP Residual norm 5.485943661615e-06 3481 KSP Residual norm 5.580044440494e-06 3482 KSP Residual norm 6.497775991127e-06 3483 KSP Residual norm 6.439384241457e-06 3484 KSP Residual norm 5.678117764058e-06 3485 KSP Residual norm 5.961734635828e-06 3486 KSP Residual norm 7.502834229859e-06 3487 KSP Residual norm 8.761081406637e-06 3488 KSP Residual norm 8.757232523749e-06 3489 KSP Residual norm 8.415591899829e-06 3490 KSP Residual norm 8.455132779463e-06 3491 KSP Residual norm 
8.779996404614e-06 3492 KSP Residual norm 7.862788956484e-06 3493 KSP Residual norm 5.782450410104e-06 3494 KSP Residual norm 4.886546422831e-06 3495 KSP Residual norm 4.890724197692e-06 3496 KSP Residual norm 4.518768043227e-06 3497 KSP Residual norm 3.731502415144e-06 3498 KSP Residual norm 3.044744448705e-06 3499 KSP Residual norm 2.723397336297e-06 3500 KSP Residual norm 2.770578151290e-06 3501 KSP Residual norm 3.159166450596e-06 3502 KSP Residual norm 3.688241189002e-06 3503 KSP Residual norm 3.704320985747e-06 3504 KSP Residual norm 3.428489675987e-06 3505 KSP Residual norm 3.664874349453e-06 3506 KSP Residual norm 4.352063266174e-06 3507 KSP Residual norm 5.095564988528e-06 3508 KSP Residual norm 4.804007198632e-06 3509 KSP Residual norm 3.944664147018e-06 3510 KSP Residual norm 3.456089538284e-06 3511 KSP Residual norm 3.353091755052e-06 3512 KSP Residual norm 3.118295620289e-06 3513 KSP Residual norm 2.634647287076e-06 3514 KSP Residual norm 2.215318825704e-06 3515 KSP Residual norm 2.107536549470e-06 3516 KSP Residual norm 2.301761198864e-06 3517 KSP Residual norm 2.634306633637e-06 3518 KSP Residual norm 2.833486049921e-06 3519 KSP Residual norm 2.819906321408e-06 3520 KSP Residual norm 2.862448622959e-06 3521 KSP Residual norm 3.103444214171e-06 3522 KSP Residual norm 3.425086911027e-06 3523 KSP Residual norm 3.052250060194e-06 3524 KSP Residual norm 2.485378804797e-06 3525 KSP Residual norm 2.329194879576e-06 3526 KSP Residual norm 2.559072393455e-06 3527 KSP Residual norm 2.691869934022e-06 3528 KSP Residual norm 2.330669891274e-06 3529 KSP Residual norm 1.849617027315e-06 3530 KSP Residual norm 1.735085388705e-06 3531 KSP Residual norm 1.898541178296e-06 3532 KSP Residual norm 1.966065807046e-06 3533 KSP Residual norm 1.754529346136e-06 3534 KSP Residual norm 1.641007933640e-06 3535 KSP Residual norm 1.888080792190e-06 3536 KSP Residual norm 2.241713842927e-06 3537 KSP Residual norm 2.129589016449e-06 3538 KSP Residual norm 1.715606109555e-06 3539 
KSP Residual norm 1.441516077675e-06 3540 KSP Residual norm 1.446329978727e-06 3541 KSP Residual norm 1.579618228880e-06 3542 KSP Residual norm 1.333454780343e-06 3543 KSP Residual norm 9.645580401823e-07 3544 KSP Residual norm 8.439710765374e-07 3545 KSP Residual norm 9.192985179820e-07 3546 KSP Residual norm 9.582221181273e-07 3547 KSP Residual norm 7.676738408684e-07 3548 KSP Residual norm 6.271246004834e-07 3549 KSP Residual norm 6.349349447433e-07 3550 KSP Residual norm 7.710621406193e-07 3551 KSP Residual norm 9.320923312000e-07 3552 KSP Residual norm 9.396954030959e-07 3553 KSP Residual norm 9.271868753915e-07 3554 KSP Residual norm 9.959760553364e-07 3555 KSP Residual norm 1.102413847981e-06 3556 KSP Residual norm 1.074699634744e-06 3557 KSP Residual norm 9.528623251440e-07 3558 KSP Residual norm 9.344857084589e-07 3559 KSP Residual norm 9.156164586489e-07 3560 KSP Residual norm 8.528582496735e-07 3561 KSP Residual norm 7.610264110662e-07 3562 KSP Residual norm 6.756343575947e-07 3563 KSP Residual norm 6.388532147174e-07 3564 KSP Residual norm 6.034252364984e-07 3565 KSP Residual norm 5.062399809483e-07 3566 KSP Residual norm 4.151200989298e-07 3567 KSP Residual norm 4.032793815091e-07 3568 KSP Residual norm 4.245988852464e-07 3569 KSP Residual norm 3.825914344821e-07 3570 KSP Residual norm 3.218384993194e-07 3571 KSP Residual norm 3.109639768267e-07 3572 KSP Residual norm 3.480219430295e-07 3573 KSP Residual norm 4.223271752347e-07 3574 KSP Residual norm 4.733245503122e-07 3575 KSP Residual norm 4.443357596441e-07 3576 KSP Residual norm 4.706528308369e-07 3577 KSP Residual norm 5.868780622195e-07 3578 KSP Residual norm 6.320335883680e-07 3579 KSP Residual norm 5.410744182738e-07 3580 KSP Residual norm 4.611244183073e-07 3581 KSP Residual norm 4.562796351967e-07 3582 KSP Residual norm 4.840517271442e-07 3583 KSP Residual norm 4.846836465960e-07 3584 KSP Residual norm 4.118916078670e-07 3585 KSP Residual norm 3.385529841682e-07 3586 KSP Residual norm 
3.394438641770e-07 3587 KSP Residual norm 3.878073162125e-07 3588 KSP Residual norm 3.855427723957e-07 3589 KSP Residual norm 3.763532974457e-07 3590 KSP Residual norm 4.249176531873e-07 3591 KSP Residual norm 4.951792588903e-07 3592 KSP Residual norm 5.230653504428e-07 3593 KSP Residual norm 5.166776135398e-07 3594 KSP Residual norm 4.885661548536e-07 3595 KSP Residual norm 5.089913209592e-07 3596 KSP Residual norm 6.105817958582e-07 3597 KSP Residual norm 6.744076293441e-07 3598 KSP Residual norm 6.570618383792e-07 3599 KSP Residual norm 6.803180224173e-07 3600 KSP Residual norm 7.821022050445e-07 3601 KSP Residual norm 8.480486496661e-07 3602 KSP Residual norm 8.030118290118e-07 3603 KSP Residual norm 7.183263359764e-07 3604 KSP Residual norm 6.783240521457e-07 3605 KSP Residual norm 6.976403169247e-07 3606 KSP Residual norm 6.205059463396e-07 3607 KSP Residual norm 4.719339261862e-07 3608 KSP Residual norm 4.068526599426e-07 3609 KSP Residual norm 4.272043268057e-07 3610 KSP Residual norm 4.385974004489e-07 3611 KSP Residual norm 4.057838074545e-07 3612 KSP Residual norm 4.025013261647e-07 3613 KSP Residual norm 4.284246321281e-07 3614 KSP Residual norm 4.758598998461e-07 3615 KSP Residual norm 5.371053557119e-07 3616 KSP Residual norm 5.488740312323e-07 3617 KSP Residual norm 5.891733783876e-07 3618 KSP Residual norm 7.418999387560e-07 3619 KSP Residual norm 8.853355059951e-07 3620 KSP Residual norm 9.462025140680e-07 3621 KSP Residual norm 1.054787551529e-06 3622 KSP Residual norm 1.248844041606e-06 3623 KSP Residual norm 1.531262547759e-06 3624 KSP Residual norm 1.849772865716e-06 3625 KSP Residual norm 2.033684609760e-06 3626 KSP Residual norm 2.152610442800e-06 3627 KSP Residual norm 2.464354850171e-06 3628 KSP Residual norm 3.002453322179e-06 3629 KSP Residual norm 3.083836558514e-06 3630 KSP Residual norm 2.606788998609e-06 3631 KSP Residual norm 2.387570103142e-06 3632 KSP Residual norm 2.474155408846e-06 3633 KSP Residual norm 2.695110143048e-06 3634 
KSP Residual norm 2.761896678164e-06 3635 KSP Residual norm 2.638706373999e-06 3636 KSP Residual norm 2.714194588856e-06 3637 KSP Residual norm 3.172702504316e-06 3638 KSP Residual norm 3.588080613719e-06 3639 KSP Residual norm 3.351696232698e-06 3640 KSP Residual norm 3.299169626506e-06 3641 KSP Residual norm 3.826184395197e-06 3642 KSP Residual norm 4.273069367044e-06 3643 KSP Residual norm 4.441033436856e-06 3644 KSP Residual norm 4.525507794219e-06 3645 KSP Residual norm 4.622571681548e-06 3646 KSP Residual norm 4.781589235052e-06 3647 KSP Residual norm 4.913356228468e-06 3648 KSP Residual norm 4.588498902656e-06 3649 KSP Residual norm 4.444980124599e-06 3650 KSP Residual norm 5.056860726362e-06 3651 KSP Residual norm 5.686530029130e-06 3652 KSP Residual norm 5.092280320456e-06 3653 KSP Residual norm 4.837444826180e-06 3654 KSP Residual norm 5.286485565352e-06 3655 KSP Residual norm 5.154481267991e-06 3656 KSP Residual norm 4.762643924650e-06 3657 KSP Residual norm 4.585610017958e-06 3658 KSP Residual norm 4.360509794952e-06 3659 KSP Residual norm 4.264795850216e-06 3660 KSP Residual norm 4.530978009758e-06 3661 KSP Residual norm 4.443603117495e-06 3662 KSP Residual norm 4.170560435769e-06 3663 KSP Residual norm 4.225170560361e-06 3664 KSP Residual norm 4.422941253743e-06 3665 KSP Residual norm 4.392045373930e-06 3666 KSP Residual norm 4.592818157570e-06 3667 KSP Residual norm 4.780611599568e-06 3668 KSP Residual norm 4.716905451106e-06 3669 KSP Residual norm 4.937769765162e-06 3670 KSP Residual norm 5.304756241094e-06 3671 KSP Residual norm 5.191910998404e-06 3672 KSP Residual norm 4.753438357897e-06 3673 KSP Residual norm 4.269504706704e-06 3674 KSP Residual norm 3.875836261946e-06 3675 KSP Residual norm 3.758530621443e-06 3676 KSP Residual norm 3.542483040917e-06 3677 KSP Residual norm 2.917160506852e-06 3678 KSP Residual norm 2.466230878100e-06 3679 KSP Residual norm 2.401609746696e-06 3680 KSP Residual norm 2.323964570817e-06 3681 KSP Residual norm 
[-ksp_monitor output condensed: iterations ~3682 through 4679, with KSP residual norms oscillating between roughly 7e-07 and 2.3e-05 — no monotone decrease and no convergence over this stretch.]
KSP Residual norm 4.728605193092e-06 4680 KSP Residual norm 4.570097225955e-06 4681 KSP Residual norm 4.626729948628e-06 4682 KSP Residual norm 4.980426814831e-06 4683 KSP Residual norm 5.124973281689e-06 4684 KSP Residual norm 5.060773701544e-06 4685 KSP Residual norm 5.250812546812e-06 4686 KSP Residual norm 5.502027367380e-06 4687 KSP Residual norm 5.276685793024e-06 4688 KSP Residual norm 4.417570476167e-06 4689 KSP Residual norm 3.633348973905e-06 4690 KSP Residual norm 3.102810783248e-06 4691 KSP Residual norm 3.128606860652e-06 4692 KSP Residual norm 3.588772395799e-06 4693 KSP Residual norm 4.017092734693e-06 4694 KSP Residual norm 4.303003680295e-06 4695 KSP Residual norm 4.365565950108e-06 4696 KSP Residual norm 4.448694399364e-06 4697 KSP Residual norm 4.015027136523e-06 4698 KSP Residual norm 3.499674814751e-06 4699 KSP Residual norm 3.212832178720e-06 4700 KSP Residual norm 3.055100417737e-06 4701 KSP Residual norm 3.119231869446e-06 4702 KSP Residual norm 3.395893519119e-06 4703 KSP Residual norm 3.480165727773e-06 4704 KSP Residual norm 3.026151022904e-06 4705 KSP Residual norm 2.554903848687e-06 4706 KSP Residual norm 2.280303175951e-06 4707 KSP Residual norm 2.240414324431e-06 4708 KSP Residual norm 2.419206783838e-06 4709 KSP Residual norm 2.404179086963e-06 4710 KSP Residual norm 2.260143684334e-06 4711 KSP Residual norm 2.125300499261e-06 4712 KSP Residual norm 2.065449145861e-06 4713 KSP Residual norm 2.152351013140e-06 4714 KSP Residual norm 2.237572232798e-06 4715 KSP Residual norm 2.196208032702e-06 4716 KSP Residual norm 2.109189473815e-06 4717 KSP Residual norm 2.149380585786e-06 4718 KSP Residual norm 2.294435180638e-06 4719 KSP Residual norm 2.465510537764e-06 4720 KSP Residual norm 2.673931112141e-06 4721 KSP Residual norm 2.842140270408e-06 4722 KSP Residual norm 2.942547305071e-06 4723 KSP Residual norm 2.945007316110e-06 4724 KSP Residual norm 3.060425905198e-06 4725 KSP Residual norm 2.928588700870e-06 4726 KSP Residual norm 
2.859712551335e-06 4727 KSP Residual norm 2.919937782686e-06 4728 KSP Residual norm 2.841410518816e-06 4729 KSP Residual norm 2.704259932883e-06 4730 KSP Residual norm 2.639921174211e-06 4731 KSP Residual norm 2.683962521434e-06 4732 KSP Residual norm 2.862650868540e-06 4733 KSP Residual norm 3.156904312309e-06 4734 KSP Residual norm 3.338653271594e-06 4735 KSP Residual norm 3.377200032088e-06 4736 KSP Residual norm 3.538262257498e-06 4737 KSP Residual norm 4.180187575970e-06 4738 KSP Residual norm 4.979104831434e-06 4739 KSP Residual norm 5.189254157701e-06 4740 KSP Residual norm 4.646720181478e-06 4741 KSP Residual norm 4.069516442494e-06 4742 KSP Residual norm 3.740268358331e-06 4743 KSP Residual norm 3.620246546513e-06 4744 KSP Residual norm 3.798491047346e-06 4745 KSP Residual norm 4.297078104392e-06 4746 KSP Residual norm 4.824745428101e-06 4747 KSP Residual norm 4.897749121755e-06 4748 KSP Residual norm 4.346170242500e-06 4749 KSP Residual norm 3.875151387980e-06 4750 KSP Residual norm 3.694255981652e-06 4751 KSP Residual norm 3.575833468475e-06 4752 KSP Residual norm 3.426777905916e-06 4753 KSP Residual norm 3.401514566784e-06 4754 KSP Residual norm 3.301125380960e-06 4755 KSP Residual norm 2.975226002593e-06 4756 KSP Residual norm 2.713920001261e-06 4757 KSP Residual norm 2.732163428554e-06 4758 KSP Residual norm 2.866192493327e-06 4759 KSP Residual norm 2.837133478929e-06 4760 KSP Residual norm 2.762229370665e-06 4761 KSP Residual norm 2.709325906925e-06 4762 KSP Residual norm 2.592704181166e-06 4763 KSP Residual norm 2.401611584238e-06 4764 KSP Residual norm 2.284458330486e-06 4765 KSP Residual norm 2.067522880357e-06 4766 KSP Residual norm 1.794173146654e-06 4767 KSP Residual norm 1.638987433803e-06 4768 KSP Residual norm 1.616644120241e-06 4769 KSP Residual norm 1.765276011609e-06 4770 KSP Residual norm 1.939186940518e-06 4771 KSP Residual norm 2.054326582129e-06 4772 KSP Residual norm 2.206600306076e-06 4773 KSP Residual norm 2.216511256535e-06 4774 
KSP Residual norm 1.947687138485e-06 4775 KSP Residual norm 1.660864978845e-06 4776 KSP Residual norm 1.447567712612e-06 4777 KSP Residual norm 1.342806784992e-06 4778 KSP Residual norm 1.425754510661e-06 4779 KSP Residual norm 1.619495185807e-06 4780 KSP Residual norm 1.920995600812e-06 4781 KSP Residual norm 2.227640921019e-06 4782 KSP Residual norm 2.311592322089e-06 4783 KSP Residual norm 2.419022845539e-06 4784 KSP Residual norm 2.652356778770e-06 4785 KSP Residual norm 2.967144490413e-06 4786 KSP Residual norm 3.098277189045e-06 4787 KSP Residual norm 3.166379282489e-06 4788 KSP Residual norm 3.205165006833e-06 4789 KSP Residual norm 3.005299631558e-06 4790 KSP Residual norm 2.775364152338e-06 4791 KSP Residual norm 2.728405120530e-06 4792 KSP Residual norm 2.876916984430e-06 4793 KSP Residual norm 2.979855974209e-06 4794 KSP Residual norm 3.139757981776e-06 4795 KSP Residual norm 3.380326600043e-06 4796 KSP Residual norm 3.702672679125e-06 4797 KSP Residual norm 3.747539367100e-06 4798 KSP Residual norm 3.789335025805e-06 4799 KSP Residual norm 4.340347431035e-06 4800 KSP Residual norm 4.868264686609e-06 4801 KSP Residual norm 4.762308747922e-06 4802 KSP Residual norm 4.780595296062e-06 4803 KSP Residual norm 5.405135265920e-06 4804 KSP Residual norm 6.023856209637e-06 4805 KSP Residual norm 5.676294440857e-06 4806 KSP Residual norm 4.646489356669e-06 4807 KSP Residual norm 3.613496919622e-06 4808 KSP Residual norm 3.218712899497e-06 4809 KSP Residual norm 3.635138111744e-06 4810 KSP Residual norm 4.479517680781e-06 4811 KSP Residual norm 5.091947045290e-06 4812 KSP Residual norm 5.208999993187e-06 4813 KSP Residual norm 4.668400279765e-06 4814 KSP Residual norm 4.118793941203e-06 4815 KSP Residual norm 4.062076557165e-06 4816 KSP Residual norm 4.484271977201e-06 4817 KSP Residual norm 5.621739443869e-06 4818 KSP Residual norm 6.810404113878e-06 4819 KSP Residual norm 6.946803138675e-06 4820 KSP Residual norm 6.176076620899e-06 4821 KSP Residual norm 
5.699099996922e-06 4822 KSP Residual norm 6.055108596592e-06 4823 KSP Residual norm 6.991401831183e-06 4824 KSP Residual norm 7.471201895190e-06 4825 KSP Residual norm 6.548336738413e-06 4826 KSP Residual norm 4.989613750761e-06 4827 KSP Residual norm 3.863417462846e-06 4828 KSP Residual norm 3.111152576912e-06 4829 KSP Residual norm 2.612419750530e-06 4830 KSP Residual norm 2.551531034681e-06 4831 KSP Residual norm 2.798638196212e-06 4832 KSP Residual norm 2.960548850026e-06 4833 KSP Residual norm 2.921788796787e-06 4834 KSP Residual norm 2.856582378235e-06 4835 KSP Residual norm 2.997410597606e-06 4836 KSP Residual norm 3.421493924306e-06 4837 KSP Residual norm 3.879724291071e-06 4838 KSP Residual norm 4.206758533246e-06 4839 KSP Residual norm 4.391614738641e-06 4840 KSP Residual norm 4.503699907786e-06 4841 KSP Residual norm 3.984033606775e-06 4842 KSP Residual norm 3.255453129734e-06 4843 KSP Residual norm 2.691789121694e-06 4844 KSP Residual norm 2.529869916316e-06 4845 KSP Residual norm 2.518515950284e-06 4846 KSP Residual norm 2.547826262875e-06 4847 KSP Residual norm 2.575005713821e-06 4848 KSP Residual norm 2.777223001965e-06 4849 KSP Residual norm 2.983151059572e-06 4850 KSP Residual norm 3.125125021740e-06 4851 KSP Residual norm 2.980563465931e-06 4852 KSP Residual norm 2.817243981615e-06 4853 KSP Residual norm 2.929850106055e-06 4854 KSP Residual norm 3.560071434750e-06 4855 KSP Residual norm 4.876399936007e-06 4856 KSP Residual norm 6.391466529659e-06 4857 KSP Residual norm 7.200309443960e-06 4858 KSP Residual norm 7.548833114582e-06 4859 KSP Residual norm 7.155971000020e-06 4860 KSP Residual norm 5.977675213753e-06 4861 KSP Residual norm 5.146164823420e-06 4862 KSP Residual norm 4.612369158120e-06 4863 KSP Residual norm 4.188482375645e-06 4864 KSP Residual norm 4.072748503590e-06 4865 KSP Residual norm 4.405284358532e-06 4866 KSP Residual norm 5.078307304852e-06 4867 KSP Residual norm 5.931705439416e-06 4868 KSP Residual norm 7.725369062149e-06 4869 
KSP Residual norm 9.325294853198e-06 4870 KSP Residual norm 9.501574624619e-06 4871 KSP Residual norm 8.604731387774e-06 4872 KSP Residual norm 8.419361787674e-06 4873 KSP Residual norm 8.980012388244e-06 4874 KSP Residual norm 9.352931949619e-06 4875 KSP Residual norm 8.887465800553e-06 4876 KSP Residual norm 8.345800155569e-06 4877 KSP Residual norm 8.350750150883e-06 4878 KSP Residual norm 9.030199104255e-06 4879 KSP Residual norm 9.965189094465e-06 4880 KSP Residual norm 1.164371457342e-05 4881 KSP Residual norm 1.263735865043e-05 4882 KSP Residual norm 1.165467046475e-05 4883 KSP Residual norm 9.457543792158e-06 4884 KSP Residual norm 7.160322190847e-06 4885 KSP Residual norm 6.332136485700e-06 4886 KSP Residual norm 6.303715156550e-06 4887 KSP Residual norm 7.037161653788e-06 4888 KSP Residual norm 8.835463845288e-06 4889 KSP Residual norm 1.054456788397e-05 4890 KSP Residual norm 9.638764794574e-06 4891 KSP Residual norm 7.253864680898e-06 4892 KSP Residual norm 5.491015728626e-06 4893 KSP Residual norm 4.510572566338e-06 4894 KSP Residual norm 4.474795592332e-06 4895 KSP Residual norm 5.338490002365e-06 4896 KSP Residual norm 6.529419411095e-06 4897 KSP Residual norm 7.008236283437e-06 4898 KSP Residual norm 5.928560493854e-06 4899 KSP Residual norm 4.679065249776e-06 4900 KSP Residual norm 3.909221132438e-06 4901 KSP Residual norm 3.552489922867e-06 4902 KSP Residual norm 3.218240596195e-06 4903 KSP Residual norm 3.048051253281e-06 4904 KSP Residual norm 3.129283130238e-06 4905 KSP Residual norm 3.631050115105e-06 4906 KSP Residual norm 4.844259978891e-06 4907 KSP Residual norm 6.603975724988e-06 4908 KSP Residual norm 8.477529854999e-06 4909 KSP Residual norm 8.928415419210e-06 4910 KSP Residual norm 7.551853704631e-06 4911 KSP Residual norm 5.973522173760e-06 4912 KSP Residual norm 4.966446129656e-06 4913 KSP Residual norm 4.289595385358e-06 4914 KSP Residual norm 4.088756586678e-06 4915 KSP Residual norm 4.353518740192e-06 4916 KSP Residual norm 
5.036840140768e-06 4917 KSP Residual norm 5.600069449388e-06 4918 KSP Residual norm 5.877277105197e-06 4919 KSP Residual norm 6.292516962723e-06 4920 KSP Residual norm 7.224161690643e-06 4921 KSP Residual norm 8.453928949680e-06 4922 KSP Residual norm 8.621060319463e-06 4923 KSP Residual norm 7.261867530016e-06 4924 KSP Residual norm 6.216235782566e-06 4925 KSP Residual norm 5.835893612035e-06 4926 KSP Residual norm 5.772699749449e-06 4927 KSP Residual norm 6.134134078385e-06 4928 KSP Residual norm 6.973449886435e-06 4929 KSP Residual norm 7.514874707776e-06 4930 KSP Residual norm 7.134120608450e-06 4931 KSP Residual norm 7.157914024235e-06 4932 KSP Residual norm 8.415946029380e-06 4933 KSP Residual norm 9.456756912058e-06 4934 KSP Residual norm 8.139480966330e-06 4935 KSP Residual norm 6.288215010761e-06 4936 KSP Residual norm 5.785781486311e-06 4937 KSP Residual norm 6.718167896787e-06 4938 KSP Residual norm 8.738489233119e-06 4939 KSP Residual norm 1.193406596239e-05 4940 KSP Residual norm 1.492171650905e-05 4941 KSP Residual norm 1.488897505201e-05 4942 KSP Residual norm 1.235072737215e-05 4943 KSP Residual norm 9.168751883405e-06 4944 KSP Residual norm 7.894172403725e-06 4945 KSP Residual norm 8.691145206079e-06 4946 KSP Residual norm 1.107398131843e-05 4947 KSP Residual norm 1.338337006198e-05 4948 KSP Residual norm 1.280777844831e-05 4949 KSP Residual norm 1.067625190165e-05 4950 KSP Residual norm 9.010288817394e-06 4951 KSP Residual norm 8.874108801539e-06 4952 KSP Residual norm 1.024918464827e-05 4953 KSP Residual norm 1.362780314128e-05 4954 KSP Residual norm 1.799165647994e-05 4955 KSP Residual norm 1.871957505211e-05 4956 KSP Residual norm 1.536016364199e-05 4957 KSP Residual norm 1.164134678280e-05 4958 KSP Residual norm 8.867152607965e-06 4959 KSP Residual norm 7.150849026146e-06 4960 KSP Residual norm 6.788763347845e-06 4961 KSP Residual norm 7.666047862180e-06 4962 KSP Residual norm 9.021011266396e-06 4963 KSP Residual norm 1.102022417338e-05 4964 
KSP Residual norm 1.144538506363e-05 4965 KSP Residual norm 8.211447888093e-06 4966 KSP Residual norm 4.899355112742e-06 4967 KSP Residual norm 3.438790404352e-06 4968 KSP Residual norm 3.032970766247e-06 4969 KSP Residual norm 3.179742290810e-06 4970 KSP Residual norm 3.897029665913e-06 4971 KSP Residual norm 4.950588738404e-06 4972 KSP Residual norm 5.887070820936e-06 4973 KSP Residual norm 5.641542932183e-06 4974 KSP Residual norm 4.154254006254e-06 4975 KSP Residual norm 3.044149900135e-06 4976 KSP Residual norm 2.677809996279e-06 4977 KSP Residual norm 2.732551496355e-06 4978 KSP Residual norm 3.079826042950e-06 4979 KSP Residual norm 3.917796141755e-06 4980 KSP Residual norm 5.228185231160e-06 4981 KSP Residual norm 5.970615788319e-06 4982 KSP Residual norm 5.694607625326e-06 4983 KSP Residual norm 4.682075549566e-06 4984 KSP Residual norm 3.468053099142e-06 4985 KSP Residual norm 2.824688618057e-06 4986 KSP Residual norm 2.905797535203e-06 4987 KSP Residual norm 3.583169323461e-06 4988 KSP Residual norm 4.625067073184e-06 4989 KSP Residual norm 5.387843448413e-06 4990 KSP Residual norm 5.217286857151e-06 4991 KSP Residual norm 4.802694825993e-06 4992 KSP Residual norm 5.014564279131e-06 4993 KSP Residual norm 5.129509961194e-06 4994 KSP Residual norm 4.748480436946e-06 4995 KSP Residual norm 4.952461806835e-06 4996 KSP Residual norm 5.869607290622e-06 4997 KSP Residual norm 7.390858115626e-06 4998 KSP Residual norm 9.629378633441e-06 4999 KSP Residual norm 1.159422889741e-05 5000 KSP Residual norm 1.103918245739e-05 5001 KSP Residual norm 8.113460826154e-06 5002 KSP Residual norm 5.769684183448e-06 5003 KSP Residual norm 4.465459942665e-06 5004 KSP Residual norm 4.134869960992e-06 5005 KSP Residual norm 4.510158556462e-06 5006 KSP Residual norm 5.437779672051e-06 5007 KSP Residual norm 6.663368262236e-06 5008 KSP Residual norm 7.710894973869e-06 5009 KSP Residual norm 7.526246003320e-06 5010 KSP Residual norm 6.518820923108e-06 5011 KSP Residual norm 
5.479348712502e-06 5012 KSP Residual norm 4.976495731943e-06 5013 KSP Residual norm 4.717156186048e-06 5014 KSP Residual norm 4.353470150187e-06 5015 KSP Residual norm 3.697353729805e-06 5016 KSP Residual norm 3.136102532853e-06 5017 KSP Residual norm 2.928213390023e-06 5018 KSP Residual norm 3.109316596306e-06 5019 KSP Residual norm 3.535209195011e-06 5020 KSP Residual norm 3.982876431802e-06 5021 KSP Residual norm 4.066037044859e-06 5022 KSP Residual norm 3.568848765267e-06 5023 KSP Residual norm 3.276706364446e-06 5024 KSP Residual norm 3.511082136006e-06 5025 KSP Residual norm 3.890449064001e-06 5026 KSP Residual norm 4.092649244055e-06 5027 KSP Residual norm 3.827682909221e-06 5028 KSP Residual norm 3.215588665982e-06 5029 KSP Residual norm 2.786934285280e-06 5030 KSP Residual norm 2.581325204354e-06 5031 KSP Residual norm 2.758549405434e-06 5032 KSP Residual norm 3.321201135465e-06 5033 KSP Residual norm 4.463325392394e-06 5034 KSP Residual norm 5.848537914513e-06 5035 KSP Residual norm 6.779365561803e-06 5036 KSP Residual norm 7.253908158929e-06 5037 KSP Residual norm 7.245348693572e-06 5038 KSP Residual norm 6.991476705677e-06 5039 KSP Residual norm 6.673392851305e-06 5040 KSP Residual norm 6.502903043145e-06 5041 KSP Residual norm 6.054206185407e-06 5042 KSP Residual norm 5.652495160388e-06 5043 KSP Residual norm 5.826491811416e-06 5044 KSP Residual norm 6.988210688674e-06 5045 KSP Residual norm 7.841658967917e-06 5046 KSP Residual norm 7.568405167749e-06 5047 KSP Residual norm 7.420859651622e-06 5048 KSP Residual norm 8.310590416574e-06 5049 KSP Residual norm 9.895248717692e-06 5050 KSP Residual norm 1.065252285188e-05 5051 KSP Residual norm 9.442467824085e-06 5052 KSP Residual norm 6.995892545410e-06 5053 KSP Residual norm 5.605114535207e-06 5054 KSP Residual norm 5.386586186855e-06 5055 KSP Residual norm 5.475584768235e-06 5056 KSP Residual norm 5.639982202183e-06 5057 KSP Residual norm 6.334020755917e-06 5058 KSP Residual norm 7.290974284418e-06 5059 
KSP Residual norm 7.712843822409e-06 5060 KSP Residual norm 7.933016657078e-06 5061 KSP Residual norm 8.815505891790e-06 5062 KSP Residual norm 1.037240472395e-05 5063 KSP Residual norm 1.082371785000e-05 5064 KSP Residual norm 1.013194238412e-05 5065 KSP Residual norm 9.676777369365e-06 5066 KSP Residual norm 1.030048024033e-05 5067 KSP Residual norm 1.056314605811e-05 5068 KSP Residual norm 9.758221631567e-06 5069 KSP Residual norm 8.779766842617e-06 5070 KSP Residual norm 8.198975980773e-06 5071 KSP Residual norm 7.690571485566e-06 5072 KSP Residual norm 7.052183595688e-06 5073 KSP Residual norm 6.555750352606e-06 5074 KSP Residual norm 6.902660987877e-06 5075 KSP Residual norm 8.199346484916e-06 5076 KSP Residual norm 9.747146720780e-06 5077 KSP Residual norm 1.121550203070e-05 5078 KSP Residual norm 1.199731762364e-05 5079 KSP Residual norm 1.038530974534e-05 5080 KSP Residual norm 8.005379962562e-06 5081 KSP Residual norm 6.547428210286e-06 5082 KSP Residual norm 5.697079130409e-06 5083 KSP Residual norm 5.020668735041e-06 5084 KSP Residual norm 4.690895884031e-06 5085 KSP Residual norm 4.553681130640e-06 5086 KSP Residual norm 4.189605658545e-06 5087 KSP Residual norm 3.765868818170e-06 5088 KSP Residual norm 3.892572593241e-06 5089 KSP Residual norm 5.028909739819e-06 5090 KSP Residual norm 6.564763965087e-06 5091 KSP Residual norm 6.582620915340e-06 5092 KSP Residual norm 6.121205507721e-06 5093 KSP Residual norm 6.196025753108e-06 5094 KSP Residual norm 6.288598360232e-06 5095 KSP Residual norm 6.298320393044e-06 5096 KSP Residual norm 6.440485835464e-06 5097 KSP Residual norm 6.265151820647e-06 5098 KSP Residual norm 5.212115937788e-06 5099 KSP Residual norm 4.530571405790e-06 5100 KSP Residual norm 4.545492073676e-06 5101 KSP Residual norm 5.078065928224e-06 5102 KSP Residual norm 5.832244305748e-06 5103 KSP Residual norm 6.158572741714e-06 5104 KSP Residual norm 5.849955623442e-06 5105 KSP Residual norm 5.472251971599e-06 5106 KSP Residual norm 
6.116985768585e-06 5107 KSP Residual norm 7.625141357575e-06 5108 KSP Residual norm 8.457041656584e-06 5109 KSP Residual norm 7.566198246034e-06 5110 KSP Residual norm 6.539373056293e-06 5111 KSP Residual norm 6.124409022165e-06 5112 KSP Residual norm 6.168449227270e-06 5113 KSP Residual norm 6.143188745986e-06 5114 KSP Residual norm 5.548608530280e-06 5115 KSP Residual norm 4.546556440953e-06 5116 KSP Residual norm 4.083782088424e-06 5117 KSP Residual norm 3.960083128934e-06 5118 KSP Residual norm 3.778898028742e-06 5119 KSP Residual norm 3.446171195594e-06 5120 KSP Residual norm 3.285006522825e-06 5121 KSP Residual norm 3.617230674729e-06 5122 KSP Residual norm 4.224563523916e-06 5123 KSP Residual norm 4.848480383080e-06 5124 KSP Residual norm 5.328471514985e-06 5125 KSP Residual norm 5.841635283873e-06 5126 KSP Residual norm 6.006966813415e-06 5127 KSP Residual norm 6.047903869068e-06 5128 KSP Residual norm 6.655420309169e-06 5129 KSP Residual norm 7.490375850430e-06 5130 KSP Residual norm 7.725182173792e-06 5131 KSP Residual norm 6.835372701123e-06 5132 KSP Residual norm 5.672927996960e-06 5133 KSP Residual norm 4.707761799845e-06 5134 KSP Residual norm 4.648403170918e-06 5135 KSP Residual norm 5.519856986586e-06 5136 KSP Residual norm 6.735858831513e-06 5137 KSP Residual norm 7.702234846345e-06 5138 KSP Residual norm 7.639691893133e-06 5139 KSP Residual norm 7.257711249566e-06 5140 KSP Residual norm 7.873127325778e-06 5141 KSP Residual norm 9.717431001995e-06 5142 KSP Residual norm 1.172031199915e-05 5143 KSP Residual norm 1.238680126900e-05 5144 KSP Residual norm 1.269367419071e-05 5145 KSP Residual norm 1.416544927436e-05 5146 KSP Residual norm 1.539832792391e-05 5147 KSP Residual norm 1.485161114097e-05 5148 KSP Residual norm 1.297136588694e-05 5149 KSP Residual norm 1.189573821532e-05 5150 KSP Residual norm 1.102158639368e-05 5151 KSP Residual norm 1.056036924833e-05 5152 KSP Residual norm 9.417130288301e-06 5153 KSP Residual norm 7.493147052307e-06 5154 
KSP Residual norm 6.746297791974e-06 5155 KSP Residual norm 7.554606661726e-06 5156 KSP Residual norm 8.767830620654e-06 5157 KSP Residual norm 8.823961755105e-06 5158 KSP Residual norm 8.699489309667e-06 5159 KSP Residual norm 9.669359737108e-06 5160 KSP Residual norm 1.033042786227e-05 5161 KSP Residual norm 1.009753469800e-05 5162 KSP Residual norm 1.005430036467e-05 5163 KSP Residual norm 9.749340579294e-06 5164 KSP Residual norm 8.431182794856e-06 5165 KSP Residual norm 6.691274150551e-06 5166 KSP Residual norm 5.883360859804e-06 5167 KSP Residual norm 6.205930693377e-06 5168 KSP Residual norm 7.278574937625e-06 5169 KSP Residual norm 8.226170762326e-06 5170 KSP Residual norm 8.462716737282e-06 5171 KSP Residual norm 9.272017691695e-06 5172 KSP Residual norm 1.211168328934e-05 5173 KSP Residual norm 1.505130024907e-05 5174 KSP Residual norm 1.401866509493e-05 5175 KSP Residual norm 1.065177276386e-05 5176 KSP Residual norm 8.555037126333e-06 5177 KSP Residual norm 7.852004668878e-06 5178 KSP Residual norm 7.861151477086e-06 5179 KSP Residual norm 8.437539604167e-06 5180 KSP Residual norm 8.000790547168e-06 5181 KSP Residual norm 7.337693139389e-06 5182 KSP Residual norm 7.635424642211e-06 5183 KSP Residual norm 8.696148831364e-06 5184 KSP Residual norm 9.384165662391e-06 5185 KSP Residual norm 9.319144951824e-06 5186 KSP Residual norm 8.876564429561e-06 5187 KSP Residual norm 8.659292761739e-06 5188 KSP Residual norm 9.329339728702e-06 5189 KSP Residual norm 1.087811537110e-05 5190 KSP Residual norm 1.235427023826e-05 5191 KSP Residual norm 1.182872048890e-05 5192 KSP Residual norm 1.122709275608e-05 5193 KSP Residual norm 1.200607232686e-05 5194 KSP Residual norm 1.379343316334e-05 5195 KSP Residual norm 1.495591994394e-05 5196 KSP Residual norm 1.481950511353e-05 5197 KSP Residual norm 1.426849192876e-05 5198 KSP Residual norm 1.525642894131e-05 5199 KSP Residual norm 1.842031320552e-05 5200 KSP Residual norm 2.088147680253e-05 5201 KSP Residual norm 
1.963994964993e-05 5202 KSP Residual norm 1.807501595523e-05 5203 KSP Residual norm 1.716196245069e-05 5204 KSP Residual norm 1.628989629117e-05 5205 KSP Residual norm 1.623322499560e-05 5206 KSP Residual norm 1.584812819824e-05 5207 KSP Residual norm 1.452348256431e-05 5208 KSP Residual norm 1.325089739661e-05 5209 KSP Residual norm 1.220548461777e-05 5210 KSP Residual norm 1.232059376322e-05 5211 KSP Residual norm 1.275234238616e-05 5212 KSP Residual norm 1.207394571244e-05 5213 KSP Residual norm 1.162946657974e-05 5214 KSP Residual norm 1.303835307122e-05 5215 KSP Residual norm 1.554584798859e-05 5216 KSP Residual norm 1.801561006151e-05 5217 KSP Residual norm 1.896659819865e-05 5218 KSP Residual norm 1.800946774205e-05 5219 KSP Residual norm 1.746193164334e-05 5220 KSP Residual norm 1.861044294687e-05 5221 KSP Residual norm 1.887464472709e-05 5222 KSP Residual norm 1.651312817488e-05 5223 KSP Residual norm 1.389112595654e-05 5224 KSP Residual norm 1.308809119504e-05 5225 KSP Residual norm 1.357545491391e-05 5226 KSP Residual norm 1.399150527652e-05 5227 KSP Residual norm 1.253911172307e-05 5228 KSP Residual norm 1.088746556343e-05 5229 KSP Residual norm 1.150061337583e-05 5230 KSP Residual norm 1.424066271143e-05 5231 KSP Residual norm 1.743322968541e-05 5232 KSP Residual norm 1.874201508677e-05 5233 KSP Residual norm 1.826386830567e-05 5234 KSP Residual norm 1.715889534125e-05 5235 KSP Residual norm 1.773258909346e-05 5236 KSP Residual norm 1.879840743439e-05 5237 KSP Residual norm 1.952767517280e-05 5238 KSP Residual norm 1.936323565914e-05 5239 KSP Residual norm 1.686365983775e-05 5240 KSP Residual norm 1.512768295192e-05 5241 KSP Residual norm 1.604296205474e-05 5242 KSP Residual norm 1.781036834601e-05 5243 KSP Residual norm 1.686071031469e-05 5244 KSP Residual norm 1.535698858318e-05 5245 KSP Residual norm 1.438129467170e-05 5246 KSP Residual norm 1.408751520884e-05 5247 KSP Residual norm 1.439955825575e-05 5248 KSP Residual norm 1.542793017451e-05 5249 
KSP Residual norm 1.631316757151e-05 5250 KSP Residual norm 1.641818903694e-05 5251 KSP Residual norm 1.659501574888e-05 5252 KSP Residual norm 1.787840946378e-05 5253 KSP Residual norm 1.851721480257e-05 5254 KSP Residual norm 1.647814503054e-05 5255 KSP Residual norm 1.390252172833e-05 5256 KSP Residual norm 1.401922999706e-05 5257 KSP Residual norm 1.552161840468e-05 5258 KSP Residual norm 1.500311561444e-05 5259 KSP Residual norm 1.221440231445e-05 5260 KSP Residual norm 9.870760158073e-06 5261 KSP Residual norm 9.415039076535e-06 5262 KSP Residual norm 1.144499110962e-05 5263 KSP Residual norm 1.423640171445e-05 5264 KSP Residual norm 1.439628736300e-05 5265 KSP Residual norm 1.383387004506e-05 5266 KSP Residual norm 1.501675973053e-05 5267 KSP Residual norm 1.714450575061e-05 5268 KSP Residual norm 1.815020937663e-05 5269 KSP Residual norm 1.607268071322e-05 5270 KSP Residual norm 1.307822931932e-05 5271 KSP Residual norm 1.162347835007e-05 5272 KSP Residual norm 1.222183274596e-05 5273 KSP Residual norm 1.302798007952e-05 5274 KSP Residual norm 1.249208780501e-05 5275 KSP Residual norm 1.219410033126e-05 5276 KSP Residual norm 1.366015481101e-05 5277 KSP Residual norm 1.668156952035e-05 5278 KSP Residual norm 1.901897036897e-05 5279 KSP Residual norm 1.958258115880e-05 5280 KSP Residual norm 1.971344972876e-05 5281 KSP Residual norm 2.196584524485e-05 5282 KSP Residual norm 2.653950926230e-05 5283 KSP Residual norm 3.002892132456e-05 5284 KSP Residual norm 2.706744857327e-05 5285 KSP Residual norm 2.345036269688e-05 5286 KSP Residual norm 2.409680807816e-05 5287 KSP Residual norm 2.632096241107e-05 5288 KSP Residual norm 2.461217000940e-05 5289 KSP Residual norm 2.323490585554e-05 5290 KSP Residual norm 2.408795096474e-05 5291 KSP Residual norm 2.398501489650e-05 5292 KSP Residual norm 2.127585024156e-05 5293 KSP Residual norm 1.973755948805e-05 5294 KSP Residual norm 1.986590834033e-05 5295 KSP Residual norm 2.058700096123e-05 5296 KSP Residual norm 
[KSP monitor output elided: iterations 5297 through 6293, with residual norms oscillating between roughly 3e-07 and 1.4e-04 rather than converging monotonically.]
KSP Residual norm 5.471513171313e-07 6295 KSP Residual norm 5.271663865540e-07 6296 KSP Residual norm 4.721587317034e-07 6297 KSP Residual norm 4.497928064897e-07 6298 KSP Residual norm 4.483047746957e-07 6299 KSP Residual norm 4.673073053402e-07 6300 KSP Residual norm 5.446891881007e-07 6301 KSP Residual norm 6.418818852161e-07 6302 KSP Residual norm 6.429118196541e-07 6303 KSP Residual norm 5.738634053431e-07 6304 KSP Residual norm 5.274462265722e-07 6305 KSP Residual norm 5.477501014999e-07 6306 KSP Residual norm 6.009321493012e-07 6307 KSP Residual norm 5.602291527368e-07 6308 KSP Residual norm 5.097902502607e-07 6309 KSP Residual norm 5.227295482773e-07 6310 KSP Residual norm 5.231921770761e-07 6311 KSP Residual norm 5.193533448924e-07 6312 KSP Residual norm 5.303133887726e-07 6313 KSP Residual norm 5.538205668096e-07 6314 KSP Residual norm 5.980964288526e-07 6315 KSP Residual norm 5.910984694443e-07 6316 KSP Residual norm 5.065642923694e-07 6317 KSP Residual norm 4.577736390202e-07 6318 KSP Residual norm 4.582705686672e-07 6319 KSP Residual norm 4.350673266952e-07 6320 KSP Residual norm 3.526619107630e-07 6321 KSP Residual norm 2.931298862947e-07 6322 KSP Residual norm 2.729729671243e-07 6323 KSP Residual norm 2.504681920557e-07 6324 KSP Residual norm 2.426653358934e-07 6325 KSP Residual norm 2.640835584745e-07 6326 KSP Residual norm 3.019080161690e-07 6327 KSP Residual norm 3.061128156656e-07 6328 KSP Residual norm 3.091441485783e-07 6329 KSP Residual norm 3.352028294424e-07 6330 KSP Residual norm 3.358706220189e-07 6331 KSP Residual norm 3.485393523240e-07 6332 KSP Residual norm 3.727648800641e-07 6333 KSP Residual norm 3.759082683508e-07 6334 KSP Residual norm 3.818279626156e-07 6335 KSP Residual norm 4.218993941558e-07 6336 KSP Residual norm 4.344682534579e-07 6337 KSP Residual norm 3.656257248123e-07 6338 KSP Residual norm 3.385624452825e-07 6339 KSP Residual norm 3.877039070152e-07 6340 KSP Residual norm 4.935951751207e-07 6341 KSP Residual norm 
6.569144767093e-07 6342 KSP Residual norm 7.660848531008e-07 6343 KSP Residual norm 7.934984653229e-07 6344 KSP Residual norm 7.675424668406e-07 6345 KSP Residual norm 8.377962840044e-07 6346 KSP Residual norm 9.545422865608e-07 6347 KSP Residual norm 1.057496088882e-06 6348 KSP Residual norm 9.883710828163e-07 6349 KSP Residual norm 9.107235950616e-07 6350 KSP Residual norm 9.268906938325e-07 6351 KSP Residual norm 1.051090392880e-06 6352 KSP Residual norm 1.145183384474e-06 6353 KSP Residual norm 1.058134302179e-06 6354 KSP Residual norm 9.014728553726e-07 6355 KSP Residual norm 8.724238298098e-07 6356 KSP Residual norm 8.785293802982e-07 6357 KSP Residual norm 8.761702757471e-07 6358 KSP Residual norm 1.038441501526e-06 6359 KSP Residual norm 1.274765200598e-06 6360 KSP Residual norm 1.265325712153e-06 6361 KSP Residual norm 1.128333599573e-06 6362 KSP Residual norm 1.009829087768e-06 6363 KSP Residual norm 9.459916949749e-07 6364 KSP Residual norm 9.570452792994e-07 6365 KSP Residual norm 1.016066291095e-06 6366 KSP Residual norm 1.017760336923e-06 6367 KSP Residual norm 9.808817971808e-07 6368 KSP Residual norm 9.318995809997e-07 6369 KSP Residual norm 8.742829944314e-07 6370 KSP Residual norm 7.739606841477e-07 6371 KSP Residual norm 6.954261149272e-07 6372 KSP Residual norm 7.252391747903e-07 6373 KSP Residual norm 7.356799731202e-07 6374 KSP Residual norm 7.122346949730e-07 6375 KSP Residual norm 8.700041933654e-07 6376 KSP Residual norm 1.309737473034e-06 6377 KSP Residual norm 1.662607349196e-06 6378 KSP Residual norm 1.660696405667e-06 6379 KSP Residual norm 1.572580005056e-06 6380 KSP Residual norm 1.437223844487e-06 6381 KSP Residual norm 1.235833472296e-06 6382 KSP Residual norm 1.253437425101e-06 6383 KSP Residual norm 1.360610622598e-06 6384 KSP Residual norm 1.201697931476e-06 6385 KSP Residual norm 9.286365966349e-07 6386 KSP Residual norm 8.050433100219e-07 6387 KSP Residual norm 8.165869035553e-07 6388 KSP Residual norm 8.794784161120e-07 6389 
KSP Residual norm 8.958610985378e-07 6390 KSP Residual norm 8.927986017434e-07 6391 KSP Residual norm 8.601085910450e-07 6392 KSP Residual norm 8.209399908810e-07 6393 KSP Residual norm 7.650475479517e-07 6394 KSP Residual norm 6.755650749524e-07 6395 KSP Residual norm 6.309199963230e-07 6396 KSP Residual norm 6.119978403458e-07 6397 KSP Residual norm 5.463252455487e-07 6398 KSP Residual norm 4.664170726255e-07 6399 KSP Residual norm 4.133908923676e-07 6400 KSP Residual norm 4.035225900687e-07 6401 KSP Residual norm 4.150741015445e-07 6402 KSP Residual norm 4.236628903813e-07 6403 KSP Residual norm 4.641201780473e-07 6404 KSP Residual norm 4.850726263279e-07 6405 KSP Residual norm 4.317305700374e-07 6406 KSP Residual norm 3.941400281745e-07 6407 KSP Residual norm 3.639894440264e-07 6408 KSP Residual norm 3.550010380782e-07 6409 KSP Residual norm 3.782837092357e-07 6410 KSP Residual norm 3.899330605412e-07 6411 KSP Residual norm 3.588732395886e-07 6412 KSP Residual norm 3.278552930119e-07 6413 KSP Residual norm 2.950659243915e-07 6414 KSP Residual norm 2.665135783277e-07 6415 KSP Residual norm 2.612875782367e-07 6416 KSP Residual norm 2.719226911927e-07 6417 KSP Residual norm 2.738727738220e-07 6418 KSP Residual norm 2.938431035966e-07 6419 KSP Residual norm 3.430435262522e-07 6420 KSP Residual norm 3.965901655360e-07 6421 KSP Residual norm 4.427360817797e-07 6422 KSP Residual norm 4.335946254100e-07 6423 KSP Residual norm 4.089456189178e-07 6424 KSP Residual norm 3.878119410296e-07 6425 KSP Residual norm 3.461972258353e-07 6426 KSP Residual norm 3.243312221466e-07 6427 KSP Residual norm 3.491674965006e-07 6428 KSP Residual norm 3.776879508520e-07 6429 KSP Residual norm 4.524322451519e-07 6430 KSP Residual norm 5.483594836085e-07 6431 KSP Residual norm 5.733378424481e-07 6432 KSP Residual norm 5.327321240402e-07 6433 KSP Residual norm 4.782191508858e-07 6434 KSP Residual norm 4.107190346814e-07 6435 KSP Residual norm 3.845455050237e-07 6436 KSP Residual norm 
4.077341644874e-07 6437 KSP Residual norm 4.402708696954e-07 6438 KSP Residual norm 4.269029442970e-07 6439 KSP Residual norm 4.163612319024e-07 6440 KSP Residual norm 4.635362670375e-07 6441 KSP Residual norm 5.408700879165e-07 6442 KSP Residual norm 5.661962438069e-07 6443 KSP Residual norm 5.480154694337e-07 6444 KSP Residual norm 5.431861895114e-07 6445 KSP Residual norm 5.023287783328e-07 6446 KSP Residual norm 4.308621096541e-07 6447 KSP Residual norm 3.652212931539e-07 6448 KSP Residual norm 3.455206081438e-07 6449 KSP Residual norm 3.899431001830e-07 6450 KSP Residual norm 4.353716613212e-07 6451 KSP Residual norm 4.429191743152e-07 6452 KSP Residual norm 4.669285566468e-07 6453 KSP Residual norm 5.032267202951e-07 6454 KSP Residual norm 5.401002508381e-07 6455 KSP Residual norm 5.923452132371e-07 6456 KSP Residual norm 6.156796942614e-07 6457 KSP Residual norm 5.023873492445e-07 6458 KSP Residual norm 4.098589249157e-07 6459 KSP Residual norm 4.063047300452e-07 6460 KSP Residual norm 4.774444002876e-07 6461 KSP Residual norm 6.027807269102e-07 6462 KSP Residual norm 7.081376696352e-07 6463 KSP Residual norm 6.811177988892e-07 6464 KSP Residual norm 6.447612999089e-07 6465 KSP Residual norm 6.758705720612e-07 6466 KSP Residual norm 7.702373296061e-07 6467 KSP Residual norm 9.085967717883e-07 6468 KSP Residual norm 1.078216178031e-06 6469 KSP Residual norm 1.282219806708e-06 6470 KSP Residual norm 1.453948358168e-06 6471 KSP Residual norm 1.362399350142e-06 6472 KSP Residual norm 1.203571202395e-06 6473 KSP Residual norm 1.258929645483e-06 6474 KSP Residual norm 1.487083260489e-06 6475 KSP Residual norm 1.645774021633e-06 6476 KSP Residual norm 1.817354409772e-06 6477 KSP Residual norm 1.914833147807e-06 6478 KSP Residual norm 1.814116053321e-06 6479 KSP Residual norm 1.921619699210e-06 6480 KSP Residual norm 2.225373682415e-06 6481 KSP Residual norm 2.284626561128e-06 6482 KSP Residual norm 2.303082462630e-06 6483 KSP Residual norm 2.487318750616e-06 6484 
KSP Residual norm 2.799410186991e-06 6485 KSP Residual norm 3.082026006130e-06 6486 KSP Residual norm 3.490225571515e-06 6487 KSP Residual norm 3.695684083933e-06 6488 KSP Residual norm 3.547551056691e-06 6489 KSP Residual norm 3.288697778579e-06 6490 KSP Residual norm 3.253980862495e-06 6491 KSP Residual norm 3.625463830435e-06 6492 KSP Residual norm 4.303710132026e-06 6493 KSP Residual norm 5.276677491431e-06 6494 KSP Residual norm 6.119632787806e-06 6495 KSP Residual norm 5.952085434105e-06 6496 KSP Residual norm 5.272290046261e-06 6497 KSP Residual norm 5.252152549508e-06 6498 KSP Residual norm 5.568805438562e-06 6499 KSP Residual norm 5.916828534667e-06 6500 KSP Residual norm 5.935638543313e-06 6501 KSP Residual norm 5.679869652339e-06 6502 KSP Residual norm 5.362173261480e-06 6503 KSP Residual norm 5.176646890969e-06 6504 KSP Residual norm 5.012903150866e-06 6505 KSP Residual norm 5.452348248820e-06 6506 KSP Residual norm 7.043846038944e-06 6507 KSP Residual norm 8.125638665320e-06 6508 KSP Residual norm 7.545775595277e-06 6509 KSP Residual norm 6.736123743073e-06 6510 KSP Residual norm 6.007392221309e-06 6511 KSP Residual norm 4.992766628093e-06 6512 KSP Residual norm 3.933053182416e-06 6513 KSP Residual norm 3.371860366765e-06 6514 KSP Residual norm 3.636531506059e-06 6515 KSP Residual norm 4.049261808414e-06 6516 KSP Residual norm 4.147158951520e-06 6517 KSP Residual norm 4.377587302811e-06 6518 KSP Residual norm 4.626976755291e-06 6519 KSP Residual norm 4.155196693767e-06 6520 KSP Residual norm 3.431080684784e-06 6521 KSP Residual norm 3.069358954856e-06 6522 KSP Residual norm 3.114707735473e-06 6523 KSP Residual norm 3.738380157111e-06 6524 KSP Residual norm 5.002404920306e-06 6525 KSP Residual norm 6.456060454222e-06 6526 KSP Residual norm 7.069232004555e-06 6527 KSP Residual norm 6.036502593663e-06 6528 KSP Residual norm 4.670445885839e-06 6529 KSP Residual norm 4.213087777487e-06 6530 KSP Residual norm 4.698086358194e-06 6531 KSP Residual norm 
6.091005397835e-06 6532 KSP Residual norm 8.463870423912e-06 6533 KSP Residual norm 1.164852925514e-05 6534 KSP Residual norm 1.503976182330e-05 6535 KSP Residual norm 1.768418226181e-05 6536 KSP Residual norm 1.657826336905e-05 6537 KSP Residual norm 1.297312955031e-05 6538 KSP Residual norm 1.096611991307e-05 6539 KSP Residual norm 1.091745835092e-05 6540 KSP Residual norm 1.086101082767e-05 6541 KSP Residual norm 9.979342139445e-06 6542 KSP Residual norm 8.805349034561e-06 6543 KSP Residual norm 8.774213426229e-06 6544 KSP Residual norm 9.799128533711e-06 6545 KSP Residual norm 9.942236203370e-06 6546 KSP Residual norm 9.247492304707e-06 6547 KSP Residual norm 8.336162402891e-06 6548 KSP Residual norm 6.583289844006e-06 6549 KSP Residual norm 5.370615485679e-06 6550 KSP Residual norm 5.426101013011e-06 6551 KSP Residual norm 6.338012993154e-06 6552 KSP Residual norm 7.214910601911e-06 6553 KSP Residual norm 6.882295353635e-06 6554 KSP Residual norm 5.786641027609e-06 6555 KSP Residual norm 5.574949475327e-06 6556 KSP Residual norm 5.832691625224e-06 6557 KSP Residual norm 5.378836581653e-06 6558 KSP Residual norm 4.655270024854e-06 6559 KSP Residual norm 4.161180032462e-06 6560 KSP Residual norm 3.488194818668e-06 6561 KSP Residual norm 2.855454545643e-06 6562 KSP Residual norm 2.556251893986e-06 6563 KSP Residual norm 2.739938520416e-06 6564 KSP Residual norm 3.488584541423e-06 6565 KSP Residual norm 4.272282951704e-06 6566 KSP Residual norm 4.133694892783e-06 6567 KSP Residual norm 3.352153587961e-06 6568 KSP Residual norm 2.802260125366e-06 6569 KSP Residual norm 2.380756108678e-06 6570 KSP Residual norm 2.119745619573e-06 6571 KSP Residual norm 2.173099026336e-06 6572 KSP Residual norm 2.419929520031e-06 6573 KSP Residual norm 2.683893231275e-06 6574 KSP Residual norm 3.026100260062e-06 6575 KSP Residual norm 3.141986656121e-06 6576 KSP Residual norm 3.047974011258e-06 6577 KSP Residual norm 2.835623366711e-06 6578 KSP Residual norm 2.330604085445e-06 6579 
KSP Residual norm 1.702488309527e-06 6580 KSP Residual norm 1.361915293592e-06 6581 KSP Residual norm 1.329333593207e-06 6582 KSP Residual norm 1.551935079600e-06 6583 KSP Residual norm 1.976250999687e-06 6584 KSP Residual norm 2.322721161211e-06 6585 KSP Residual norm 2.239741723428e-06 6586 KSP Residual norm 2.054609179497e-06 6587 KSP Residual norm 2.073128435766e-06 6588 KSP Residual norm 2.306230221339e-06 6589 KSP Residual norm 2.643567629850e-06 6590 KSP Residual norm 2.968773932912e-06 6591 KSP Residual norm 2.951126612809e-06 6592 KSP Residual norm 2.933317034566e-06 6593 KSP Residual norm 3.146571340576e-06 6594 KSP Residual norm 3.740933621166e-06 6595 KSP Residual norm 4.727032623993e-06 6596 KSP Residual norm 5.618033614885e-06 6597 KSP Residual norm 6.085286032396e-06 6598 KSP Residual norm 6.137086483318e-06 6599 KSP Residual norm 5.362419496497e-06 6600 KSP Residual norm 4.095244956105e-06 6601 KSP Residual norm 3.615967398964e-06 6602 KSP Residual norm 4.380404751580e-06 6603 KSP Residual norm 6.059230428234e-06 6604 KSP Residual norm 7.185373333013e-06 6605 KSP Residual norm 8.320081129831e-06 6606 KSP Residual norm 9.808972742735e-06 6607 KSP Residual norm 1.114409918032e-05 6608 KSP Residual norm 1.169621248075e-05 6609 KSP Residual norm 1.238085205447e-05 6610 KSP Residual norm 1.334922873702e-05 6611 KSP Residual norm 1.557009789532e-05 6612 KSP Residual norm 1.836639221466e-05 6613 KSP Residual norm 2.117357429923e-05 6614 KSP Residual norm 2.290953654721e-05 6615 KSP Residual norm 2.297552309835e-05 6616 KSP Residual norm 1.812376416333e-05 6617 KSP Residual norm 1.306591416800e-05 6618 KSP Residual norm 9.144831632555e-06 6619 KSP Residual norm 5.867136223229e-06 6620 KSP Residual norm 4.519592289942e-06 6621 KSP Residual norm 4.773251526751e-06 6622 KSP Residual norm 6.357201477886e-06 6623 KSP Residual norm 9.071141108326e-06 6624 KSP Residual norm 1.200886015081e-05 6625 KSP Residual norm 1.335374322040e-05 6626 KSP Residual norm 
1.166972572181e-05 6627 KSP Residual norm 9.469259405639e-06 6628 KSP Residual norm 7.867880026589e-06 6629 KSP Residual norm 7.148318013719e-06 6630 KSP Residual norm 7.941631307980e-06 6631 KSP Residual norm 1.095399848723e-05 6632 KSP Residual norm 1.269831536566e-05 6633 KSP Residual norm 1.166030556887e-05 6634 KSP Residual norm 9.946358018498e-06 6635 KSP Residual norm 8.905881699893e-06 6636 KSP Residual norm 9.178573800420e-06 6637 KSP Residual norm 1.101729494667e-05 6638 KSP Residual norm 1.223260889330e-05 6639 KSP Residual norm 1.238270795120e-05 6640 KSP Residual norm 1.329543699464e-05 6641 KSP Residual norm 1.427573911345e-05 6642 KSP Residual norm 1.388470219904e-05 6643 KSP Residual norm 1.321066339522e-05 6644 KSP Residual norm 1.427781320863e-05 6645 KSP Residual norm 1.678676766855e-05 6646 KSP Residual norm 1.828711848748e-05 6647 KSP Residual norm 1.794034891510e-05 6648 KSP Residual norm 1.621408633770e-05 6649 KSP Residual norm 1.267460849314e-05 6650 KSP Residual norm 1.008607870930e-05 6651 KSP Residual norm 9.385919454817e-06 6652 KSP Residual norm 8.786443985421e-06 6653 KSP Residual norm 7.587023500817e-06 6654 KSP Residual norm 6.614216274233e-06 6655 KSP Residual norm 5.837286796526e-06 6656 KSP Residual norm 5.838385689975e-06 6657 KSP Residual norm 7.033157692158e-06 6658 KSP Residual norm 9.078611364514e-06 6659 KSP Residual norm 1.040220968915e-05 6660 KSP Residual norm 1.042114849271e-05 6661 KSP Residual norm 8.851604394952e-06 6662 KSP Residual norm 6.177059201453e-06 6663 KSP Residual norm 4.257618992377e-06 6664 KSP Residual norm 3.431190627451e-06 6665 KSP Residual norm 2.996993710013e-06 6666 KSP Residual norm 2.624505451887e-06 6667 KSP Residual norm 2.559319804493e-06 6668 KSP Residual norm 2.851280763754e-06 6669 KSP Residual norm 3.097716855338e-06 6670 KSP Residual norm 2.994036044599e-06 6671 KSP Residual norm 2.942570790846e-06 6672 KSP Residual norm 2.695394857269e-06 6673 KSP Residual norm 2.330828191411e-06 6674 
KSP Residual norm 2.240425957789e-06 6675 KSP Residual norm 2.220792693582e-06 6676 KSP Residual norm 2.006762493251e-06 6677 KSP Residual norm 1.737636310397e-06 6678 KSP Residual norm 1.529860731062e-06 6679 KSP Residual norm 1.452431511323e-06 6680 KSP Residual norm 1.450799857764e-06 6681 KSP Residual norm 1.423577979714e-06 6682 KSP Residual norm 1.313425297293e-06 6683 KSP Residual norm 1.298414357061e-06 6684 KSP Residual norm 1.370882022878e-06 6685 KSP Residual norm 1.338896796665e-06 6686 KSP Residual norm 1.304140065198e-06 6687 KSP Residual norm 1.344936581831e-06 6688 KSP Residual norm 1.351499383490e-06 6689 KSP Residual norm 1.349607949115e-06 6690 KSP Residual norm 1.490656859932e-06 6691 KSP Residual norm 1.763991880261e-06 6692 KSP Residual norm 2.032509610570e-06 6693 KSP Residual norm 1.822523765048e-06 6694 KSP Residual norm 1.513191857086e-06 6695 KSP Residual norm 1.444246887402e-06 6696 KSP Residual norm 1.615306191272e-06 6697 KSP Residual norm 1.666610699592e-06 6698 KSP Residual norm 1.476347584497e-06 6699 KSP Residual norm 1.306213776415e-06 6700 KSP Residual norm 1.350960136403e-06 6701 KSP Residual norm 1.517610220609e-06 6702 KSP Residual norm 1.678378025187e-06 6703 KSP Residual norm 1.844134600893e-06 6704 KSP Residual norm 1.767052217614e-06 6705 KSP Residual norm 1.415094837577e-06 6706 KSP Residual norm 1.152284154901e-06 6707 KSP Residual norm 1.056819277912e-06 6708 KSP Residual norm 1.025303322665e-06 6709 KSP Residual norm 1.046427360246e-06 6710 KSP Residual norm 1.067144309663e-06 6711 KSP Residual norm 1.002011102303e-06 6712 KSP Residual norm 8.411216349052e-07 6713 KSP Residual norm 6.211537014588e-07 6714 KSP Residual norm 4.857100625600e-07 6715 KSP Residual norm 4.733123203108e-07 6716 KSP Residual norm 5.160395362539e-07 6717 KSP Residual norm 5.840294788590e-07 6718 KSP Residual norm 6.350169780973e-07 6719 KSP Residual norm 6.057821727552e-07 6720 KSP Residual norm 5.843909348521e-07 6721 KSP Residual norm 
5.870284875070e-07 6722 KSP Residual norm 5.891865286007e-07 6723 KSP Residual norm 6.016887955255e-07 6724 KSP Residual norm 5.788259613774e-07 6725 KSP Residual norm 4.703629718590e-07 6726 KSP Residual norm 3.841806232261e-07 6727 KSP Residual norm 3.858233112481e-07 6728 KSP Residual norm 4.188237664181e-07 6729 KSP Residual norm 4.254974214811e-07 6730 KSP Residual norm 4.217723264430e-07 6731 KSP Residual norm 4.334853544073e-07 6732 KSP Residual norm 4.717642251073e-07 6733 KSP Residual norm 5.204185776885e-07 6734 KSP Residual norm 5.395769023579e-07 6735 KSP Residual norm 5.373538349540e-07 6736 KSP Residual norm 5.830986518148e-07 6737 KSP Residual norm 6.524846034779e-07 6738 KSP Residual norm 6.994354768747e-07 6739 KSP Residual norm 7.625331162864e-07 6740 KSP Residual norm 7.548369733101e-07 6741 KSP Residual norm 6.765919371895e-07 6742 KSP Residual norm 6.935519504027e-07 6743 KSP Residual norm 8.337381465247e-07 6744 KSP Residual norm 8.997082440405e-07 6745 KSP Residual norm 7.657881212629e-07 6746 KSP Residual norm 6.307120198647e-07 6747 KSP Residual norm 5.849740003627e-07 6748 KSP Residual norm 6.143722261097e-07 6749 KSP Residual norm 6.257246721979e-07 6750 KSP Residual norm 5.918744269781e-07 6751 KSP Residual norm 5.947744685625e-07 6752 KSP Residual norm 6.271904032836e-07 6753 KSP Residual norm 5.614655861926e-07 6754 KSP Residual norm 4.556843669160e-07 6755 KSP Residual norm 4.379113005478e-07 6756 KSP Residual norm 4.617346846598e-07 6757 KSP Residual norm 4.584315583348e-07 6758 KSP Residual norm 4.800651068004e-07 6759 KSP Residual norm 5.622972047444e-07 6760 KSP Residual norm 5.975454953659e-07 6761 KSP Residual norm 6.153982352088e-07 6762 KSP Residual norm 6.423895209871e-07 6763 KSP Residual norm 6.097660070361e-07 6764 KSP Residual norm 5.544947489951e-07 6765 KSP Residual norm 5.340462578322e-07 6766 KSP Residual norm 4.970999368944e-07 6767 KSP Residual norm 4.404433530368e-07 6768 KSP Residual norm 3.790987030243e-07 6769 
KSP Residual norm 3.269040326141e-07 6770 KSP Residual norm 2.998573747686e-07 6771 KSP Residual norm 2.840787499022e-07 6772 KSP Residual norm 2.813250784708e-07 6773 KSP Residual norm 2.842116059637e-07 6774 KSP Residual norm 3.017198015302e-07 6775 KSP Residual norm 3.145541463400e-07 6776 KSP Residual norm 3.101136746635e-07 6777 KSP Residual norm 3.393676687243e-07 6778 KSP Residual norm 3.926551973832e-07 6779 KSP Residual norm 3.921321961914e-07 6780 KSP Residual norm 3.393442988392e-07 6781 KSP Residual norm 3.151231337632e-07 6782 KSP Residual norm 3.353308459016e-07 6783 KSP Residual norm 3.325077905975e-07 6784 KSP Residual norm 2.857379639833e-07 6785 KSP Residual norm 2.492540038559e-07 6786 KSP Residual norm 2.437626967129e-07 6787 KSP Residual norm 2.416988058239e-07 6788 KSP Residual norm 2.404581908445e-07 6789 KSP Residual norm 2.439876678438e-07 6790 KSP Residual norm 2.612133728482e-07 6791 KSP Residual norm 2.968943601329e-07 6792 KSP Residual norm 3.214257405215e-07 6793 KSP Residual norm 3.369217143396e-07 6794 KSP Residual norm 3.487393417235e-07 6795 KSP Residual norm 3.884276033478e-07 6796 KSP Residual norm 4.437204658280e-07 6797 KSP Residual norm 4.564014703318e-07 6798 KSP Residual norm 4.381130941475e-07 6799 KSP Residual norm 4.293632526846e-07 6800 KSP Residual norm 4.127003357639e-07 6801 KSP Residual norm 4.543922499320e-07 6802 KSP Residual norm 5.534901820968e-07 6803 KSP Residual norm 5.617498079988e-07 6804 KSP Residual norm 5.316756259514e-07 6805 KSP Residual norm 5.586285486300e-07 6806 KSP Residual norm 5.878574341116e-07 6807 KSP Residual norm 6.176195699203e-07 6808 KSP Residual norm 6.699973713749e-07 6809 KSP Residual norm 6.834683406623e-07 6810 KSP Residual norm 7.047100913327e-07 6811 KSP Residual norm 7.683923729736e-07 6812 KSP Residual norm 8.216543617353e-07 6813 KSP Residual norm 8.929326230207e-07 6814 KSP Residual norm 9.421865141884e-07 6815 KSP Residual norm 8.100916229406e-07 6816 KSP Residual norm 
6.936381674114e-07 6817 KSP Residual norm 7.108463279164e-07 6818 KSP Residual norm 7.650673696821e-07 6819 KSP Residual norm 7.944938532719e-07 6820 KSP Residual norm 8.567023897731e-07 6821 KSP Residual norm 9.177910969013e-07 6822 KSP Residual norm 9.640965117381e-07 6823 KSP Residual norm 1.043724766784e-06 6824 KSP Residual norm 1.123336709193e-06 6825 KSP Residual norm 1.070717679282e-06 6826 KSP Residual norm 9.220585049278e-07 6827 KSP Residual norm 8.055958295105e-07 6828 KSP Residual norm 7.441105527280e-07 6829 KSP Residual norm 7.581001051878e-07 6830 KSP Residual norm 7.271303600875e-07 6831 KSP Residual norm 6.514314794771e-07 6832 KSP Residual norm 6.361709198155e-07 6833 KSP Residual norm 6.603690065721e-07 6834 KSP Residual norm 6.568973683527e-07 6835 KSP Residual norm 6.389990764880e-07 6836 KSP Residual norm 6.140769420995e-07 6837 KSP Residual norm 6.126496775830e-07 6838 KSP Residual norm 6.542619744968e-07 6839 KSP Residual norm 6.915217125869e-07 6840 KSP Residual norm 6.908895775180e-07 6841 KSP Residual norm 6.834781418859e-07 6842 KSP Residual norm 6.491098534682e-07 6843 KSP Residual norm 6.104729611287e-07 6844 KSP Residual norm 5.475421772278e-07 6845 KSP Residual norm 5.094513820383e-07 6846 KSP Residual norm 5.188992294829e-07 6847 KSP Residual norm 5.130653354818e-07 6848 KSP Residual norm 4.743805601088e-07 6849 KSP Residual norm 4.072649861230e-07 6850 KSP Residual norm 3.587735096846e-07 6851 KSP Residual norm 3.874964143557e-07 6852 KSP Residual norm 4.434871659896e-07 6853 KSP Residual norm 4.391144552194e-07 6854 KSP Residual norm 4.564405613687e-07 6855 KSP Residual norm 5.248520066387e-07 6856 KSP Residual norm 5.510349167838e-07 6857 KSP Residual norm 5.649309460664e-07 6858 KSP Residual norm 5.549868654521e-07 6859 KSP Residual norm 4.939096866625e-07 6860 KSP Residual norm 4.451697998912e-07 6861 KSP Residual norm 4.197381238377e-07 6862 KSP Residual norm 3.852029804015e-07 6863 KSP Residual norm 3.735665303614e-07 6864 
KSP Residual norm 3.731898235927e-07 6865 KSP Residual norm 3.681618677067e-07 6866 KSP Residual norm 4.049666316262e-07 6867 KSP Residual norm 4.908565354795e-07 6868 KSP Residual norm 5.557903872167e-07 6869 KSP Residual norm 6.149272972553e-07 6870 KSP Residual norm 6.705359883947e-07 6871 KSP Residual norm 6.886944879713e-07 6872 KSP Residual norm 7.272732788909e-07 6873 KSP Residual norm 8.354109733404e-07 6874 KSP Residual norm 8.461638919534e-07 6875 KSP Residual norm 6.957789206705e-07 6876 KSP Residual norm 5.810462785831e-07 6877 KSP Residual norm 6.121022939447e-07 6878 KSP Residual norm 7.041961735215e-07 6879 KSP Residual norm 7.119938234273e-07 6880 KSP Residual norm 6.336869649012e-07 6881 KSP Residual norm 5.823027255643e-07 6882 KSP Residual norm 6.332105346374e-07 6883 KSP Residual norm 7.719184493072e-07 6884 KSP Residual norm 8.414924355228e-07 6885 KSP Residual norm 7.937693586857e-07 6886 KSP Residual norm 7.875945041115e-07 6887 KSP Residual norm 7.782124993486e-07 6888 KSP Residual norm 7.410734072886e-07 6889 KSP Residual norm 7.343551180159e-07 6890 KSP Residual norm 7.517788791179e-07 6891 KSP Residual norm 7.107448425107e-07 6892 KSP Residual norm 6.957818871198e-07 6893 KSP Residual norm 6.609499104601e-07 6894 KSP Residual norm 5.553024160866e-07 6895 KSP Residual norm 4.841493854281e-07 6896 KSP Residual norm 4.325711564758e-07 6897 KSP Residual norm 3.688393042728e-07 6898 KSP Residual norm 3.498375879648e-07 6899 KSP Residual norm 3.526090815221e-07 6900 KSP Residual norm 3.532697575106e-07 6901 KSP Residual norm 3.405838717125e-07 6902 KSP Residual norm 3.192031702110e-07 6903 KSP Residual norm 2.999780640028e-07 6904 KSP Residual norm 2.944494582545e-07 6905 KSP Residual norm 3.109547155351e-07 6906 KSP Residual norm 3.366456604374e-07 6907 KSP Residual norm 3.098451728381e-07 6908 KSP Residual norm 2.628040158125e-07 6909 KSP Residual norm 2.404863184852e-07 6910 KSP Residual norm 2.364380930648e-07 6911 KSP Residual norm 
[KSP residual-norm monitor output elided: iterations 6911-7909. The residual norm oscillates without converging, fluctuating between roughly 3e-08 and 8e-06 over the entire range (e.g. 2.33e-07 at iteration 6912, dipping to ~3.3e-08 near iteration 7167, rising to ~8.1e-06 near iteration 7433, and ending at 1.67e-07 at iteration 7908).]
KSP Residual norm 1.688202170661e-07 7910 KSP Residual norm 1.687753982599e-07 7911 KSP Residual norm 1.540603074341e-07 7912 KSP Residual norm 1.433976715364e-07 7913 KSP Residual norm 1.441885388914e-07 7914 KSP Residual norm 1.354919259712e-07 7915 KSP Residual norm 1.214584483221e-07 7916 KSP Residual norm 1.134572104274e-07 7917 KSP Residual norm 1.154910483308e-07 7918 KSP Residual norm 1.322883548829e-07 7919 KSP Residual norm 1.377479981038e-07 7920 KSP Residual norm 1.232153654558e-07 7921 KSP Residual norm 1.162538811398e-07 7922 KSP Residual norm 1.285562659148e-07 7923 KSP Residual norm 1.460850487169e-07 7924 KSP Residual norm 1.808964028303e-07 7925 KSP Residual norm 2.164464361009e-07 7926 KSP Residual norm 2.065840948205e-07 7927 KSP Residual norm 1.840979636849e-07 7928 KSP Residual norm 1.919529282356e-07 7929 KSP Residual norm 2.315411553482e-07 7930 KSP Residual norm 2.928671128214e-07 7931 KSP Residual norm 3.214906371225e-07 7932 KSP Residual norm 3.200007280974e-07 7933 KSP Residual norm 3.097761538709e-07 7934 KSP Residual norm 2.846148098205e-07 7935 KSP Residual norm 2.455667408874e-07 7936 KSP Residual norm 2.076685197628e-07 7937 KSP Residual norm 1.970372148185e-07 7938 KSP Residual norm 1.907694917871e-07 7939 KSP Residual norm 1.638082937903e-07 7940 KSP Residual norm 1.318547075624e-07 7941 KSP Residual norm 1.139763334807e-07 7942 KSP Residual norm 1.091558105570e-07 7943 KSP Residual norm 1.108382419994e-07 7944 KSP Residual norm 1.185986605639e-07 7945 KSP Residual norm 1.231322493752e-07 7946 KSP Residual norm 1.350694442136e-07 7947 KSP Residual norm 1.668124241152e-07 7948 KSP Residual norm 1.697737705990e-07 7949 KSP Residual norm 1.481791571280e-07 7950 KSP Residual norm 1.336122602693e-07 7951 KSP Residual norm 1.278035161891e-07 7952 KSP Residual norm 1.364812274164e-07 7953 KSP Residual norm 1.682003365398e-07 7954 KSP Residual norm 1.922424843523e-07 7955 KSP Residual norm 1.938892763032e-07 7956 KSP Residual norm 
1.897401422072e-07 7957 KSP Residual norm 1.818309041194e-07 7958 KSP Residual norm 1.734056056482e-07 7959 KSP Residual norm 1.814756398668e-07 7960 KSP Residual norm 2.135273397600e-07 7961 KSP Residual norm 2.333176468566e-07 7962 KSP Residual norm 2.276991968038e-07 7963 KSP Residual norm 2.029040313011e-07 7964 KSP Residual norm 1.755336410683e-07 7965 KSP Residual norm 1.791923592825e-07 7966 KSP Residual norm 2.088339759696e-07 7967 KSP Residual norm 2.230199098983e-07 7968 KSP Residual norm 2.071167771478e-07 7969 KSP Residual norm 1.875318598556e-07 7970 KSP Residual norm 1.619419996492e-07 7971 KSP Residual norm 1.483905675752e-07 7972 KSP Residual norm 1.547490085129e-07 7973 KSP Residual norm 1.609190570457e-07 7974 KSP Residual norm 1.550612493565e-07 7975 KSP Residual norm 1.539941746848e-07 7976 KSP Residual norm 1.602703531411e-07 7977 KSP Residual norm 1.511264754163e-07 7978 KSP Residual norm 1.235449701877e-07 7979 KSP Residual norm 9.546949724247e-08 7980 KSP Residual norm 7.598790560702e-08 7981 KSP Residual norm 7.647244723344e-08 7982 KSP Residual norm 8.865460014319e-08 7983 KSP Residual norm 9.125079128923e-08 7984 KSP Residual norm 8.913462237708e-08 7985 KSP Residual norm 1.023975703858e-07 7986 KSP Residual norm 1.251663369631e-07 7987 KSP Residual norm 1.397533666415e-07 7988 KSP Residual norm 1.434533474053e-07 7989 KSP Residual norm 1.461275716979e-07 7990 KSP Residual norm 1.591826370687e-07 7991 KSP Residual norm 1.724244877498e-07 7992 KSP Residual norm 1.678368835412e-07 7993 KSP Residual norm 1.455405188043e-07 7994 KSP Residual norm 1.303234111468e-07 7995 KSP Residual norm 1.306510863790e-07 7996 KSP Residual norm 1.283368283856e-07 7997 KSP Residual norm 1.204277188283e-07 7998 KSP Residual norm 1.268576701252e-07 7999 KSP Residual norm 1.469297128612e-07 8000 KSP Residual norm 1.671116107791e-07 8001 KSP Residual norm 1.753537815876e-07 8002 KSP Residual norm 1.772431238478e-07 8003 KSP Residual norm 1.565539391204e-07 8004 
KSP Residual norm 1.140322963539e-07 8005 KSP Residual norm 7.811559555552e-08 8006 KSP Residual norm 6.104447658556e-08 8007 KSP Residual norm 6.094287522417e-08 8008 KSP Residual norm 7.281655256732e-08 8009 KSP Residual norm 8.587077797006e-08 8010 KSP Residual norm 8.320379101793e-08 8011 KSP Residual norm 7.668827992595e-08 8012 KSP Residual norm 7.603426475381e-08 8013 KSP Residual norm 8.683839764122e-08 8014 KSP Residual norm 1.022330715911e-07 8015 KSP Residual norm 1.165681124277e-07 8016 KSP Residual norm 1.303699478073e-07 8017 KSP Residual norm 1.454563834765e-07 8018 KSP Residual norm 1.493475154903e-07 8019 KSP Residual norm 1.396388561676e-07 8020 KSP Residual norm 1.382074658829e-07 8021 KSP Residual norm 1.390864715770e-07 8022 KSP Residual norm 1.186968857752e-07 8023 KSP Residual norm 9.826574517770e-08 8024 KSP Residual norm 9.161894053450e-08 8025 KSP Residual norm 9.710172397342e-08 8026 KSP Residual norm 1.196314925050e-07 8027 KSP Residual norm 1.454906301402e-07 8028 KSP Residual norm 1.340285048430e-07 8029 KSP Residual norm 1.139588517959e-07 8030 KSP Residual norm 1.091091309305e-07 8031 KSP Residual norm 1.100497165458e-07 8032 KSP Residual norm 1.156913601001e-07 8033 KSP Residual norm 1.158972527887e-07 8034 KSP Residual norm 1.012568422382e-07 8035 KSP Residual norm 8.717569410330e-08 8036 KSP Residual norm 8.054284322170e-08 8037 KSP Residual norm 7.359767944540e-08 8038 KSP Residual norm 7.089659969333e-08 8039 KSP Residual norm 7.652456475297e-08 8040 KSP Residual norm 8.231454629192e-08 8041 KSP Residual norm 8.379555337591e-08 8042 KSP Residual norm 9.250916621889e-08 8043 KSP Residual norm 1.139626887290e-07 8044 KSP Residual norm 1.373060794601e-07 8045 KSP Residual norm 1.417046967103e-07 8046 KSP Residual norm 1.268522689459e-07 8047 KSP Residual norm 1.108684525052e-07 8048 KSP Residual norm 1.054133741146e-07 8049 KSP Residual norm 1.124949466187e-07 8050 KSP Residual norm 1.341757007942e-07 8051 KSP Residual norm 
1.615135499510e-07 8052 KSP Residual norm 1.997756557535e-07 8053 KSP Residual norm 2.036753504143e-07 8054 KSP Residual norm 1.711877698983e-07 8055 KSP Residual norm 1.670618866395e-07 8056 KSP Residual norm 1.927210219069e-07 8057 KSP Residual norm 2.201400353626e-07 8058 KSP Residual norm 2.425169146844e-07 8059 KSP Residual norm 2.394827120585e-07 8060 KSP Residual norm 2.345349567835e-07 8061 KSP Residual norm 2.472882243308e-07 8062 KSP Residual norm 2.698750271744e-07 8063 KSP Residual norm 2.709028252171e-07 8064 KSP Residual norm 2.460326686723e-07 8065 KSP Residual norm 2.160467864260e-07 8066 KSP Residual norm 1.866177255059e-07 8067 KSP Residual norm 1.756026331405e-07 8068 KSP Residual norm 1.761011557913e-07 8069 KSP Residual norm 1.667368543158e-07 8070 KSP Residual norm 1.542804187234e-07 8071 KSP Residual norm 1.483758101109e-07 8072 KSP Residual norm 1.365275472760e-07 8073 KSP Residual norm 1.160282431619e-07 8074 KSP Residual norm 1.038107865688e-07 8075 KSP Residual norm 1.051844319387e-07 8076 KSP Residual norm 1.092140814882e-07 8077 KSP Residual norm 1.212890097750e-07 8078 KSP Residual norm 1.331292993083e-07 8079 KSP Residual norm 1.362942419934e-07 8080 KSP Residual norm 1.420593966625e-07 8081 KSP Residual norm 1.552848425667e-07 8082 KSP Residual norm 1.732088645981e-07 8083 KSP Residual norm 1.900681814089e-07 8084 KSP Residual norm 2.105976939310e-07 8085 KSP Residual norm 2.307996312757e-07 8086 KSP Residual norm 2.296348242639e-07 8087 KSP Residual norm 2.218764084774e-07 8088 KSP Residual norm 2.315513531708e-07 8089 KSP Residual norm 2.654618479296e-07 8090 KSP Residual norm 3.033057518761e-07 8091 KSP Residual norm 3.134767579334e-07 8092 KSP Residual norm 3.345140306933e-07 8093 KSP Residual norm 3.558332724883e-07 8094 KSP Residual norm 3.222785626229e-07 8095 KSP Residual norm 2.839742900738e-07 8096 KSP Residual norm 2.884488628695e-07 8097 KSP Residual norm 3.141681803454e-07 8098 KSP Residual norm 3.268323944666e-07 8099 
KSP Residual norm 3.313589995010e-07 8100 KSP Residual norm 3.440781076779e-07 8101 KSP Residual norm 3.878638853320e-07 8102 KSP Residual norm 4.483814078548e-07 8103 KSP Residual norm 4.955122521795e-07 8104 KSP Residual norm 4.240396262148e-07 8105 KSP Residual norm 3.567755842840e-07 8106 KSP Residual norm 3.183332215489e-07 8107 KSP Residual norm 2.776523250698e-07 8108 KSP Residual norm 2.615391851424e-07 8109 KSP Residual norm 2.533708188430e-07 8110 KSP Residual norm 2.445258384858e-07 8111 KSP Residual norm 2.525561435896e-07 8112 KSP Residual norm 2.875298513083e-07 8113 KSP Residual norm 3.318420441954e-07 8114 KSP Residual norm 3.520479791208e-07 8115 KSP Residual norm 3.537100577215e-07 8116 KSP Residual norm 3.787168144664e-07 8117 KSP Residual norm 3.989284133489e-07 8118 KSP Residual norm 4.047857892943e-07 8119 KSP Residual norm 3.787039293600e-07 8120 KSP Residual norm 3.476380507194e-07 8121 KSP Residual norm 3.473457549216e-07 8122 KSP Residual norm 3.595491458881e-07 8123 KSP Residual norm 3.161135371187e-07 8124 KSP Residual norm 2.572531837543e-07 8125 KSP Residual norm 2.518397959188e-07 8126 KSP Residual norm 2.681978699763e-07 8127 KSP Residual norm 2.556508712073e-07 8128 KSP Residual norm 2.369677709949e-07 8129 KSP Residual norm 2.187386686073e-07 8130 KSP Residual norm 1.961445381468e-07 8131 KSP Residual norm 1.741003596786e-07 8132 KSP Residual norm 1.509379091122e-07 8133 KSP Residual norm 1.402535742283e-07 8134 KSP Residual norm 1.446935363471e-07 8135 KSP Residual norm 1.426270809981e-07 8136 KSP Residual norm 1.394290549272e-07 8137 KSP Residual norm 1.482642674560e-07 8138 KSP Residual norm 1.664610632052e-07 8139 KSP Residual norm 1.946620990213e-07 8140 KSP Residual norm 2.059751991649e-07 8141 KSP Residual norm 1.854985569190e-07 8142 KSP Residual norm 1.499449194792e-07 8143 KSP Residual norm 1.260383314018e-07 8144 KSP Residual norm 1.208548393115e-07 8145 KSP Residual norm 1.197863170633e-07 8146 KSP Residual norm 
1.165889835339e-07 8147 KSP Residual norm 1.215780667108e-07 8148 KSP Residual norm 1.269448372734e-07 8149 KSP Residual norm 1.294170690179e-07 8150 KSP Residual norm 1.445149057012e-07 8151 KSP Residual norm 1.777319392239e-07 8152 KSP Residual norm 1.933530880859e-07 8153 KSP Residual norm 1.920263944173e-07 8154 KSP Residual norm 1.923992395845e-07 8155 KSP Residual norm 1.895717394226e-07 8156 KSP Residual norm 2.017880179412e-07 8157 KSP Residual norm 2.077886389624e-07 8158 KSP Residual norm 2.008259590381e-07 8159 KSP Residual norm 2.109835535903e-07 8160 KSP Residual norm 2.168115519092e-07 8161 KSP Residual norm 1.987845075219e-07 8162 KSP Residual norm 1.818674140177e-07 8163 KSP Residual norm 1.794404144368e-07 8164 KSP Residual norm 1.799756363352e-07 8165 KSP Residual norm 1.574897935970e-07 8166 KSP Residual norm 1.395808984100e-07 8167 KSP Residual norm 1.341819021271e-07 8168 KSP Residual norm 1.308964137536e-07 8169 KSP Residual norm 1.381805565467e-07 8170 KSP Residual norm 1.559639692093e-07 8171 KSP Residual norm 1.553107986189e-07 8172 KSP Residual norm 1.149877105812e-07 8173 KSP Residual norm 7.601298376647e-08 8174 KSP Residual norm 6.067059448813e-08 8175 KSP Residual norm 6.233121800085e-08 8176 KSP Residual norm 7.101263683727e-08 8177 KSP Residual norm 7.586898328109e-08 8178 KSP Residual norm 8.123264274501e-08 8179 KSP Residual norm 9.650960285873e-08 8180 KSP Residual norm 1.044400738220e-07 8181 KSP Residual norm 9.760167627437e-08 8182 KSP Residual norm 8.890408625961e-08 8183 KSP Residual norm 7.120159679626e-08 8184 KSP Residual norm 5.643730011864e-08 8185 KSP Residual norm 5.380416923686e-08 8186 KSP Residual norm 5.923414711690e-08 8187 KSP Residual norm 6.811310711329e-08 8188 KSP Residual norm 7.907029114505e-08 8189 KSP Residual norm 8.779301345165e-08 8190 KSP Residual norm 8.437311924713e-08 8191 KSP Residual norm 7.421676267285e-08 8192 KSP Residual norm 6.942164441429e-08 8193 KSP Residual norm 7.621127613684e-08 8194 
KSP Residual norm 9.511735755401e-08 8195 KSP Residual norm 1.148240988239e-07 8196 KSP Residual norm 1.412723876400e-07 8197 KSP Residual norm 1.586434687392e-07 8198 KSP Residual norm 1.663285066785e-07 8199 KSP Residual norm 1.575386237028e-07 8200 KSP Residual norm 1.463020740937e-07 8201 KSP Residual norm 1.440141495552e-07 8202 KSP Residual norm 1.352373105979e-07 8203 KSP Residual norm 1.148568626842e-07 8204 KSP Residual norm 9.822149067879e-08 8205 KSP Residual norm 9.694527174724e-08 8206 KSP Residual norm 1.087085072858e-07 8207 KSP Residual norm 1.206499061173e-07 8208 KSP Residual norm 1.202718427106e-07 8209 KSP Residual norm 1.056427209528e-07 8210 KSP Residual norm 9.437572976817e-08 8211 KSP Residual norm 8.403274396125e-08 8212 KSP Residual norm 7.762608003983e-08 8213 KSP Residual norm 7.758515041912e-08 8214 KSP Residual norm 8.010721807318e-08 8215 KSP Residual norm 8.090906665125e-08 8216 KSP Residual norm 7.979117035351e-08 8217 KSP Residual norm 8.265067885007e-08 8218 KSP Residual norm 9.695677096518e-08 8219 KSP Residual norm 1.195296040435e-07 8220 KSP Residual norm 1.243758775694e-07 8221 KSP Residual norm 1.179957262768e-07 8222 KSP Residual norm 1.093729777648e-07 8223 KSP Residual norm 9.961961876537e-08 8224 KSP Residual norm 9.511784430671e-08 8225 KSP Residual norm 8.777166043946e-08 8226 KSP Residual norm 8.490926410784e-08 8227 KSP Residual norm 9.727262199837e-08 8228 KSP Residual norm 1.027310090408e-07 8229 KSP Residual norm 9.737536106588e-08 8230 KSP Residual norm 9.309925457419e-08 8231 KSP Residual norm 8.981188245750e-08 8232 KSP Residual norm 9.232002152399e-08 8233 KSP Residual norm 1.078565744450e-07 8234 KSP Residual norm 1.296724934650e-07 8235 KSP Residual norm 1.476976944983e-07 8236 KSP Residual norm 1.537952946545e-07 8237 KSP Residual norm 1.319568088519e-07 8238 KSP Residual norm 1.106302334569e-07 8239 KSP Residual norm 1.031548994982e-07 8240 KSP Residual norm 1.026618981486e-07 8241 KSP Residual norm 
1.042267280452e-07 8242 KSP Residual norm 1.002284136528e-07 8243 KSP Residual norm 9.358322106088e-08 8244 KSP Residual norm 8.419731402061e-08 8245 KSP Residual norm 6.989547765926e-08 8246 KSP Residual norm 6.343702587368e-08 8247 KSP Residual norm 6.336961062012e-08 8248 KSP Residual norm 6.312497386673e-08 8249 KSP Residual norm 5.762737073744e-08 8250 KSP Residual norm 4.721550186137e-08 8251 KSP Residual norm 4.113217813494e-08 8252 KSP Residual norm 4.026866469743e-08 8253 KSP Residual norm 4.266709749115e-08 8254 KSP Residual norm 4.410197852650e-08 8255 KSP Residual norm 4.383023694240e-08 8256 KSP Residual norm 4.151615264390e-08 8257 KSP Residual norm 3.340231393819e-08 8258 KSP Residual norm 2.456194205881e-08 8259 KSP Residual norm 1.975458161608e-08 8260 KSP Residual norm 1.991297408068e-08 8261 KSP Residual norm 2.434685261917e-08 8262 KSP Residual norm 2.610083285993e-08 8263 KSP Residual norm 2.551883123954e-08 8264 KSP Residual norm 2.402857926867e-08 8265 KSP Residual norm 2.006552677903e-08 8266 KSP Residual norm 1.798741393319e-08 8267 KSP Residual norm 1.863390123332e-08 8268 KSP Residual norm 1.926992568396e-08 8269 KSP Residual norm 1.870104147413e-08 8270 KSP Residual norm 1.836691600844e-08 8271 KSP Residual norm 1.948640221752e-08 8272 KSP Residual norm 2.110022367758e-08 8273 KSP Residual norm 2.138032707085e-08 8274 KSP Residual norm 2.016778881229e-08 8275 KSP Residual norm 1.936066762869e-08 8276 KSP Residual norm 1.970506982458e-08 8277 KSP Residual norm 1.966028883015e-08 8278 KSP Residual norm 2.020927080510e-08 8279 KSP Residual norm 2.276556968725e-08 8280 KSP Residual norm 2.657435749998e-08 8281 KSP Residual norm 2.956142607119e-08 8282 KSP Residual norm 3.034349058150e-08 8283 KSP Residual norm 3.205613116762e-08 8284 KSP Residual norm 3.878253489280e-08 8285 KSP Residual norm 4.440522410959e-08 8286 KSP Residual norm 4.193078550999e-08 8287 KSP Residual norm 3.730499531938e-08 8288 KSP Residual norm 3.388697139884e-08 8289 
KSP Residual norm 3.331211763528e-08 8290 KSP Residual norm 3.518153975680e-08 8291 KSP Residual norm 3.911463099701e-08 8292 KSP Residual norm 4.755996046227e-08 8293 KSP Residual norm 4.998956252225e-08 8294 KSP Residual norm 4.169998909042e-08 8295 KSP Residual norm 3.246486200574e-08 8296 KSP Residual norm 2.830009644268e-08 8297 KSP Residual norm 2.802905065427e-08 8298 KSP Residual norm 3.096063556875e-08 8299 KSP Residual norm 3.468881469928e-08 8300 KSP Residual norm 3.958806687273e-08 8301 KSP Residual norm 4.237659849484e-08 8302 KSP Residual norm 3.962901360314e-08 8303 KSP Residual norm 3.174762806688e-08 8304 KSP Residual norm 2.654510228998e-08 8305 KSP Residual norm 2.442821910470e-08 8306 KSP Residual norm 2.380910534733e-08 8307 KSP Residual norm 2.367634107360e-08 8308 KSP Residual norm 2.373618844011e-08 8309 KSP Residual norm 2.504609411018e-08 8310 KSP Residual norm 2.403606310737e-08 8311 KSP Residual norm 2.361349723221e-08 8312 KSP Residual norm 2.593261448389e-08 8313 KSP Residual norm 2.635959528567e-08 8314 KSP Residual norm 2.267503789115e-08 8315 KSP Residual norm 2.026199083606e-08 8316 KSP Residual norm 2.158244561341e-08 8317 KSP Residual norm 2.399009937714e-08 8318 KSP Residual norm 2.350534118721e-08 8319 KSP Residual norm 2.251341348129e-08 8320 KSP Residual norm 2.403727218036e-08 8321 KSP Residual norm 3.017923209658e-08 8322 KSP Residual norm 3.550911432396e-08 8323 KSP Residual norm 3.664888792608e-08 8324 KSP Residual norm 3.762029036968e-08 8325 KSP Residual norm 3.945169115973e-08 8326 KSP Residual norm 4.129361221284e-08 8327 KSP Residual norm 4.257422290255e-08 8328 KSP Residual norm 4.238874472628e-08 8329 KSP Residual norm 4.411135094167e-08 8330 KSP Residual norm 4.472287313990e-08 8331 KSP Residual norm 4.229951763496e-08 8332 KSP Residual norm 3.987296857135e-08 8333 KSP Residual norm 3.944912246029e-08 8334 KSP Residual norm 4.218273301985e-08 8335 KSP Residual norm 4.375191512212e-08 8336 KSP Residual norm 
4.266552268868e-08 8337 KSP Residual norm 4.142453633145e-08 8338 KSP Residual norm 3.607133895629e-08 8339 KSP Residual norm 3.193495604509e-08 8340 KSP Residual norm 3.231604594279e-08 8341 KSP Residual norm 3.447412052267e-08 8342 KSP Residual norm 3.530910150506e-08 8343 KSP Residual norm 3.572343898851e-08 8344 KSP Residual norm 3.591650277812e-08 8345 KSP Residual norm 3.472131218209e-08 8346 KSP Residual norm 3.218174644833e-08 8347 KSP Residual norm 2.761170384574e-08 8348 KSP Residual norm 2.330952777237e-08 8349 KSP Residual norm 2.033708013109e-08 8350 KSP Residual norm 1.778056712085e-08 8351 KSP Residual norm 1.445143221133e-08 8352 KSP Residual norm 1.152185072821e-08 8353 KSP Residual norm 9.406988566318e-09 8354 KSP Residual norm 8.137696006130e-09 8355 KSP Residual norm 7.665397480013e-09 8356 KSP Residual norm 7.993271988101e-09 8357 KSP Residual norm 8.650400336091e-09 8358 KSP Residual norm 8.920178332292e-09 8359 KSP Residual norm 8.700017091435e-09 8360 KSP Residual norm 8.840071300215e-09 8361 KSP Residual norm 9.465779262939e-09 8362 KSP Residual norm 1.044034516687e-08 8363 KSP Residual norm 1.119605029569e-08 8364 KSP Residual norm 1.232694360878e-08 8365 KSP Residual norm 1.372681545630e-08 8366 KSP Residual norm 1.477438840295e-08 8367 KSP Residual norm 1.444729448257e-08 8368 KSP Residual norm 1.247430958940e-08 8369 KSP Residual norm 1.132184869726e-08 8370 KSP Residual norm 1.229739471081e-08 8371 KSP Residual norm 1.363373172125e-08 8372 KSP Residual norm 1.394767074689e-08 8373 KSP Residual norm 1.434264963171e-08 8374 KSP Residual norm 1.580114494816e-08 8375 KSP Residual norm 1.763876616418e-08 8376 KSP Residual norm 1.716618873389e-08 8377 KSP Residual norm 1.639926572063e-08 8378 KSP Residual norm 1.839225702431e-08 8379 KSP Residual norm 2.188934161337e-08 8380 KSP Residual norm 2.321687326199e-08 8381 KSP Residual norm 2.206478484448e-08 8382 KSP Residual norm 2.219570589673e-08 8383 KSP Residual norm 2.407753536906e-08 8384 
KSP Residual norm 2.676880379990e-08 8385 KSP Residual norm 2.754731990861e-08 8386 KSP Residual norm 2.693970237854e-08 8387 KSP Residual norm 2.320835698519e-08 8388 KSP Residual norm 1.861307118737e-08 8389 KSP Residual norm 1.654698086189e-08 8390 KSP Residual norm 1.715183549771e-08 8391 KSP Residual norm 1.898931366610e-08 8392 KSP Residual norm 1.897974741648e-08 8393 KSP Residual norm 1.810451259311e-08 8394 KSP Residual norm 1.882063784389e-08 8395 KSP Residual norm 2.048551534722e-08 8396 KSP Residual norm 2.099894763288e-08 8397 KSP Residual norm 2.124414297958e-08 8398 KSP Residual norm 2.127293296109e-08 8399 KSP Residual norm 1.878565997689e-08 8400 KSP Residual norm 1.514479548060e-08 8401 KSP Residual norm 1.360572918165e-08 8402 KSP Residual norm 1.368400931682e-08 8403 KSP Residual norm 1.328973572162e-08 8404 KSP Residual norm 1.198536521624e-08 8405 KSP Residual norm 1.190355765915e-08 8406 KSP Residual norm 1.427664143322e-08 8407 KSP Residual norm 1.551621970355e-08 8408 KSP Residual norm 1.355142212447e-08 8409 KSP Residual norm 1.175822710583e-08 8410 KSP Residual norm 1.094808493432e-08 8411 KSP Residual norm 1.109341843599e-08 8412 KSP Residual norm 1.130288646501e-08 8413 KSP Residual norm 1.112641146277e-08 8414 KSP Residual norm 1.230854713695e-08 8415 KSP Residual norm 1.415195764033e-08 8416 KSP Residual norm 1.488960749268e-08 8417 KSP Residual norm 1.487305216294e-08 8418 KSP Residual norm 1.355237322937e-08 8419 KSP Residual norm 1.105176061857e-08 8420 KSP Residual norm 9.094660784816e-09 8421 KSP Residual norm 8.690647699884e-09 8422 KSP Residual norm 9.045488101692e-09 8423 KSP Residual norm 9.175694713105e-09 8424 KSP Residual norm 8.806563455271e-09 8425 KSP Residual norm 8.429603803819e-09 8426 KSP Residual norm 7.781537346549e-09 8427 KSP Residual norm 7.634704735917e-09 8428 KSP Residual norm 8.301041341438e-09 8429 KSP Residual norm 9.080219623757e-09 8430 KSP Residual norm 9.407473586697e-09 8431 KSP Residual norm 
9.703104311891e-09 8432 KSP Residual norm 9.686751005158e-09 8433 KSP Residual norm 8.650031885896e-09 8434 KSP Residual norm 7.661911598803e-09 8435 KSP Residual norm 7.208642038801e-09 8436 KSP Residual norm 6.543867576988e-09 8437 KSP Residual norm 5.562299099623e-09 8438 KSP Residual norm 5.082116294954e-09 8439 KSP Residual norm 5.176891810989e-09 8440 KSP Residual norm 4.600020792388e-09 8441 KSP Residual norm 3.309238346058e-09 8442 KSP Residual norm 2.772899903223e-09 8443 KSP Residual norm 3.211277016778e-09 8444 KSP Residual norm 4.123665107737e-09 8445 KSP Residual norm 4.884569121277e-09 8446 KSP Residual norm 5.155925820269e-09 8447 KSP Residual norm 4.955884245656e-09 8448 KSP Residual norm 4.632054120753e-09 8449 KSP Residual norm 4.773839219961e-09 8450 KSP Residual norm 4.863446277112e-09 8451 KSP Residual norm 4.649036937677e-09 8452 KSP Residual norm 4.701927965135e-09 8453 KSP Residual norm 4.838145326785e-09 8454 KSP Residual norm 4.620895045863e-09 8455 KSP Residual norm 4.438555814468e-09 8456 KSP Residual norm 4.713172493984e-09 8457 KSP Residual norm 5.114067866779e-09 8458 KSP Residual norm 5.319442534046e-09 8459 KSP Residual norm 5.428258771877e-09 8460 KSP Residual norm 5.870510989293e-09 8461 KSP Residual norm 6.226939158978e-09 8462 KSP Residual norm 6.475616687609e-09 8463 KSP Residual norm 7.059575186950e-09 8464 KSP Residual norm 8.546562774206e-09 8465 KSP Residual norm 1.097187768387e-08 8466 KSP Residual norm 1.358928613579e-08 8467 KSP Residual norm 1.513514855188e-08 8468 KSP Residual norm 1.567051114533e-08 8469 KSP Residual norm 1.516976511259e-08 8470 KSP Residual norm 1.480890333726e-08 8471 KSP Residual norm 1.601608950574e-08 8472 KSP Residual norm 1.851200281858e-08 8473 KSP Residual norm 1.875675087744e-08 8474 KSP Residual norm 1.754703776620e-08 8475 KSP Residual norm 1.795735102988e-08 8476 KSP Residual norm 1.904076093650e-08 8477 KSP Residual norm 1.892206588956e-08 8478 KSP Residual norm 1.685818130446e-08 8479 
KSP Residual norm 1.498358954663e-08 8480 KSP Residual norm 1.527819060121e-08 8481 KSP Residual norm 1.631027769306e-08 8482 KSP Residual norm 1.636461330665e-08 8483 KSP Residual norm 1.652288424446e-08 8484 KSP Residual norm 1.720989876561e-08 8485 KSP Residual norm 1.633509243624e-08 8486 KSP Residual norm 1.527548704162e-08 8487 KSP Residual norm 1.634725339633e-08 8488 KSP Residual norm 1.882016253273e-08 8489 KSP Residual norm 1.884036837716e-08 8490 KSP Residual norm 1.764590719930e-08 8491 KSP Residual norm 1.699476453828e-08 8492 KSP Residual norm 1.559457790853e-08 8493 KSP Residual norm 1.447008489702e-08 8494 KSP Residual norm 1.494168280585e-08 8495 KSP Residual norm 1.654161101039e-08 8496 KSP Residual norm 1.531738931568e-08 8497 KSP Residual norm 1.295849812908e-08 8498 KSP Residual norm 1.216503665022e-08 8499 KSP Residual norm 1.286245412142e-08 8500 KSP Residual norm 1.217426284241e-08 8501 KSP Residual norm 9.713911176753e-09 8502 KSP Residual norm 8.036724862629e-09 8503 KSP Residual norm 7.630826177360e-09 8504 KSP Residual norm 7.455183699549e-09 8505 KSP Residual norm 7.211140963642e-09 8506 KSP Residual norm 7.252201224395e-09 8507 KSP Residual norm 7.471180708799e-09 8508 KSP Residual norm 7.705647022067e-09 8509 KSP Residual norm 7.686544938263e-09 8510 KSP Residual norm 7.618336133820e-09 8511 KSP Residual norm 7.762326564906e-09 8512 KSP Residual norm 7.436114625215e-09 8513 KSP Residual norm 6.709607085936e-09 8514 KSP Residual norm 6.485611623914e-09 8515 KSP Residual norm 6.978433717476e-09 8516 KSP Residual norm 6.918586373201e-09 8517 KSP Residual norm 6.109042023633e-09 8518 KSP Residual norm 5.651557496077e-09 8519 KSP Residual norm 5.638029748563e-09 8520 KSP Residual norm 5.926311005288e-09 8521 KSP Residual norm 6.866620082785e-09 8522 KSP Residual norm 8.301611236704e-09 8523 KSP Residual norm 8.584752844559e-09 8524 KSP Residual norm 8.199015324435e-09 8525 KSP Residual norm 8.303146812502e-09 8526 KSP Residual norm 
[KSP convergence-monitor output condensed; the attached trace begins and
ends mid-line. Iterations 8527 through 9227 of the first solve show the
residual norm oscillating between roughly 6.4e-10 and 5.1e-08 without
steady decrease, ending at

  9227 KSP Residual norm 6.356330396652e-10

A new solve then begins:

    1 KSP Residual norm 1.849649014371e+02
  Residual norms for fieldsplit_FE_split_ solve.
    0 KSP Residual norm 1.027182122453e-05

Over iterations 0 through 293 this fieldsplit_FE_split_ solve reduces the
residual norm from 1.027182122453e-05 to 6.840063569219e-09, with a
transient rise to about 1.2e-07 around iterations 265-270, before the
trace is cut off at iteration 294.]
Residual norm 6.452530492864e-09 295 KSP Residual norm 6.389602001607e-09 296 KSP Residual norm 6.360200975562e-09 297 KSP Residual norm 6.297155859788e-09 298 KSP Residual norm 5.708872790872e-09 299 KSP Residual norm 4.841477192084e-09 300 KSP Residual norm 4.498586927442e-09 301 KSP Residual norm 4.442895790317e-09 302 KSP Residual norm 4.464361027179e-09 303 KSP Residual norm 4.194037706882e-09 304 KSP Residual norm 3.937500911657e-09 305 KSP Residual norm 3.781863635271e-09 306 KSP Residual norm 3.917849883064e-09 307 KSP Residual norm 3.803545205399e-09 308 KSP Residual norm 3.575620156785e-09 309 KSP Residual norm 3.219568643605e-09 310 KSP Residual norm 3.126060864842e-09 311 KSP Residual norm 3.313203994886e-09 312 KSP Residual norm 3.636566666435e-09 313 KSP Residual norm 3.964845689608e-09 314 KSP Residual norm 3.815086572108e-09 315 KSP Residual norm 3.398287547657e-09 316 KSP Residual norm 3.155270497494e-09 317 KSP Residual norm 3.120531941015e-09 318 KSP Residual norm 3.072529914245e-09 319 KSP Residual norm 2.875715794189e-09 320 KSP Residual norm 2.633338659307e-09 321 KSP Residual norm 2.486013309889e-09 322 KSP Residual norm 2.564768273032e-09 323 KSP Residual norm 2.549370175215e-09 324 KSP Residual norm 2.281623714518e-09 325 KSP Residual norm 2.050492141000e-09 326 KSP Residual norm 1.855284575961e-09 327 KSP Residual norm 1.780505753230e-09 328 KSP Residual norm 1.672576359477e-09 329 KSP Residual norm 1.591052539076e-09 330 KSP Residual norm 1.543471457247e-09 331 KSP Residual norm 1.592723876594e-09 332 KSP Residual norm 1.603492920762e-09 333 KSP Residual norm 1.574349854565e-09 334 KSP Residual norm 1.633182713633e-09 335 KSP Residual norm 1.723932629493e-09 336 KSP Residual norm 1.818169620960e-09 337 KSP Residual norm 1.710486648312e-09 338 KSP Residual norm 1.608096129330e-09 339 KSP Residual norm 1.587834958353e-09 340 KSP Residual norm 1.566214069409e-09 341 KSP Residual norm 1.492325745182e-09 342 KSP Residual norm 
1.311296843707e-09 343 KSP Residual norm 1.201421360760e-09 344 KSP Residual norm 1.242474044972e-09 345 KSP Residual norm 1.303015100243e-09 346 KSP Residual norm 1.301463807014e-09 347 KSP Residual norm 1.295320567357e-09 348 KSP Residual norm 1.386355882704e-09 349 KSP Residual norm 1.489909370944e-09 350 KSP Residual norm 1.570403083463e-09 351 KSP Residual norm 1.618101985373e-09 352 KSP Residual norm 1.685805874585e-09 353 KSP Residual norm 1.747971773746e-09 354 KSP Residual norm 1.817920635600e-09 355 KSP Residual norm 1.838629812996e-09 356 KSP Residual norm 1.837717242627e-09 357 KSP Residual norm 1.922683898295e-09 358 KSP Residual norm 1.908712791625e-09 359 KSP Residual norm 1.849908180510e-09 360 KSP Residual norm 1.856780413631e-09 361 KSP Residual norm 1.842267171653e-09 362 KSP Residual norm 1.746348805008e-09 363 KSP Residual norm 1.749518357920e-09 364 KSP Residual norm 1.939475822199e-09 365 KSP Residual norm 1.990414477599e-09 366 KSP Residual norm 1.980692673368e-09 367 KSP Residual norm 1.886053185584e-09 368 KSP Residual norm 1.781946469507e-09 369 KSP Residual norm 1.774261751797e-09 370 KSP Residual norm 1.804078271165e-09 371 KSP Residual norm 1.745686706072e-09 372 KSP Residual norm 1.731852936549e-09 373 KSP Residual norm 1.754926617127e-09 374 KSP Residual norm 1.779046260348e-09 375 KSP Residual norm 1.745327121629e-09 376 KSP Residual norm 1.771959236451e-09 377 KSP Residual norm 1.827974420788e-09 378 KSP Residual norm 1.849788084290e-09 379 KSP Residual norm 1.941969624488e-09 380 KSP Residual norm 2.014370599936e-09 381 KSP Residual norm 2.030828617532e-09 382 KSP Residual norm 1.905719257138e-09 383 KSP Residual norm 1.863775147410e-09 384 KSP Residual norm 1.941510721061e-09 385 KSP Residual norm 1.974216393942e-09 386 KSP Residual norm 1.931772303021e-09 387 KSP Residual norm 1.899391183354e-09 388 KSP Residual norm 1.886673499425e-09 389 KSP Residual norm 1.878559022648e-09 390 KSP Residual norm 1.837604105003e-09 391 KSP 
Residual norm 1.740366082192e-09 392 KSP Residual norm 1.708352566906e-09 393 KSP Residual norm 1.787417448975e-09 394 KSP Residual norm 1.816237088262e-09 395 KSP Residual norm 1.691668465071e-09 396 KSP Residual norm 1.560944830824e-09 397 KSP Residual norm 1.515334735527e-09 398 KSP Residual norm 1.568661395616e-09 399 KSP Residual norm 1.548240848714e-09 400 KSP Residual norm 1.511697795135e-09 401 KSP Residual norm 1.487092179622e-09 402 KSP Residual norm 1.478493260143e-09 403 KSP Residual norm 1.538102653313e-09 404 KSP Residual norm 1.609378381870e-09 405 KSP Residual norm 1.664672082005e-09 406 KSP Residual norm 1.686998752671e-09 407 KSP Residual norm 1.595392958096e-09 408 KSP Residual norm 1.516896889051e-09 409 KSP Residual norm 1.511085515858e-09 410 KSP Residual norm 1.626436837978e-09 411 KSP Residual norm 1.606467539436e-09 412 KSP Residual norm 1.629402219182e-09 413 KSP Residual norm 1.721446700057e-09 414 KSP Residual norm 1.818687216231e-09 415 KSP Residual norm 1.805534605090e-09 416 KSP Residual norm 1.754658310712e-09 417 KSP Residual norm 1.773616942646e-09 418 KSP Residual norm 1.941262952136e-09 419 KSP Residual norm 1.968575814969e-09 420 KSP Residual norm 1.935639538944e-09 421 KSP Residual norm 1.953700286188e-09 422 KSP Residual norm 2.020298940926e-09 423 KSP Residual norm 1.999535508313e-09 424 KSP Residual norm 2.114232391751e-09 425 KSP Residual norm 2.020817403734e-09 426 KSP Residual norm 1.881705180294e-09 427 KSP Residual norm 1.882372586623e-09 428 KSP Residual norm 1.935463919864e-09 429 KSP Residual norm 1.949854098796e-09 430 KSP Residual norm 1.927157285058e-09 431 KSP Residual norm 1.920417265886e-09 432 KSP Residual norm 2.010685319396e-09 433 KSP Residual norm 2.078288135969e-09 434 KSP Residual norm 2.114180100144e-09 435 KSP Residual norm 2.012030914193e-09 436 KSP Residual norm 1.997518451721e-09 437 KSP Residual norm 1.956631368086e-09 438 KSP Residual norm 1.847118497113e-09 439 KSP Residual norm 
1.730297426468e-09 440 KSP Residual norm 1.748800420223e-09 441 KSP Residual norm 1.828205065436e-09 442 KSP Residual norm 1.760760033573e-09 443 KSP Residual norm 1.724366337066e-09 444 KSP Residual norm 1.757107798255e-09 445 KSP Residual norm 1.831884364702e-09 446 KSP Residual norm 1.804891154466e-09 447 KSP Residual norm 1.716546566462e-09 448 KSP Residual norm 1.618879910544e-09 449 KSP Residual norm 1.579585771131e-09 450 KSP Residual norm 1.581266394147e-09 451 KSP Residual norm 1.553069056692e-09 452 KSP Residual norm 1.510694874612e-09 453 KSP Residual norm 1.450687087999e-09 454 KSP Residual norm 1.418031117917e-09 455 KSP Residual norm 1.381905481894e-09 456 KSP Residual norm 1.336416913484e-09 457 KSP Residual norm 1.291136597738e-09 458 KSP Residual norm 1.296581485943e-09 459 KSP Residual norm 1.304381643340e-09 460 KSP Residual norm 1.320574612072e-09 461 KSP Residual norm 1.395516588061e-09 462 KSP Residual norm 1.412209599932e-09 463 KSP Residual norm 1.400047814931e-09 464 KSP Residual norm 1.353600507813e-09 465 KSP Residual norm 1.416640083739e-09 466 KSP Residual norm 1.445244139346e-09 467 KSP Residual norm 1.433493018296e-09 468 KSP Residual norm 1.435449995915e-09 469 KSP Residual norm 1.429070518235e-09 470 KSP Residual norm 1.388899906388e-09 471 KSP Residual norm 1.395259291383e-09 472 KSP Residual norm 1.392606153754e-09 473 KSP Residual norm 1.396971891755e-09 474 KSP Residual norm 1.382745441360e-09 475 KSP Residual norm 1.408922664777e-09 476 KSP Residual norm 1.411789277377e-09 477 KSP Residual norm 1.361735404197e-09 478 KSP Residual norm 1.296947723973e-09 479 KSP Residual norm 1.240461882888e-09 480 KSP Residual norm 1.248819724338e-09 481 KSP Residual norm 1.322112290182e-09 482 KSP Residual norm 1.343629557443e-09 483 KSP Residual norm 1.324575543601e-09 484 KSP Residual norm 1.309845665022e-09 485 KSP Residual norm 1.314171697703e-09 486 KSP Residual norm 1.297011033591e-09 487 KSP Residual norm 1.286772693697e-09 488 KSP 
Residual norm 1.329444773478e-09 489 KSP Residual norm 1.335802971685e-09 490 KSP Residual norm 1.324402725640e-09 491 KSP Residual norm 1.394280527801e-09 492 KSP Residual norm 1.461658436416e-09 493 KSP Residual norm 1.495261891135e-09 494 KSP Residual norm 1.547817326381e-09 495 KSP Residual norm 1.532839879649e-09 496 KSP Residual norm 1.570858598946e-09 497 KSP Residual norm 1.641705524032e-09 498 KSP Residual norm 1.671529718867e-09 499 KSP Residual norm 1.772495074632e-09 500 KSP Residual norm 1.847029248055e-09 501 KSP Residual norm 1.766738060761e-09 502 KSP Residual norm 1.781542344002e-09 503 KSP Residual norm 1.735146745630e-09 504 KSP Residual norm 1.681687631135e-09 505 KSP Residual norm 1.727289790522e-09 506 KSP Residual norm 1.763230239574e-09 507 KSP Residual norm 1.795057298585e-09 508 KSP Residual norm 1.692534511967e-09 509 KSP Residual norm 1.532249455094e-09 510 KSP Residual norm 1.524484968248e-09 511 KSP Residual norm 1.614981719405e-09 512 KSP Residual norm 1.644732322839e-09 513 KSP Residual norm 1.648727655330e-09 514 KSP Residual norm 1.603155602400e-09 515 KSP Residual norm 1.610134372803e-09 516 KSP Residual norm 1.689652383961e-09 517 KSP Residual norm 1.652238248612e-09 518 KSP Residual norm 1.545180754492e-09 519 KSP Residual norm 1.595513579393e-09 520 KSP Residual norm 1.683866217938e-09 521 KSP Residual norm 1.763739358542e-09 522 KSP Residual norm 1.772000950785e-09 523 KSP Residual norm 1.847047086745e-09 524 KSP Residual norm 1.894853851552e-09 525 KSP Residual norm 1.988406619670e-09 526 KSP Residual norm 1.946964922346e-09 527 KSP Residual norm 1.854597647178e-09 528 KSP Residual norm 1.859158597920e-09 529 KSP Residual norm 1.873074309863e-09 530 KSP Residual norm 1.841854149780e-09 531 KSP Residual norm 1.820360993445e-09 532 KSP Residual norm 1.765709064136e-09 533 KSP Residual norm 1.743639470192e-09 534 KSP Residual norm 1.700118978897e-09 535 KSP Residual norm 1.686261786531e-09 536 KSP Residual norm 
1.736933884590e-09 537 KSP Residual norm 1.714856623027e-09 538 KSP Residual norm 1.712580333166e-09 539 KSP Residual norm 1.772859236532e-09 540 KSP Residual norm 1.755407084707e-09 541 KSP Residual norm 1.768766278526e-09 542 KSP Residual norm 1.765845696674e-09 543 KSP Residual norm 1.775162125958e-09 544 KSP Residual norm 1.814609910488e-09 545 KSP Residual norm 1.880652692903e-09 546 KSP Residual norm 1.929178971764e-09 547 KSP Residual norm 1.934747440650e-09 548 KSP Residual norm 1.891117997319e-09 549 KSP Residual norm 1.915937837428e-09 550 KSP Residual norm 1.937989831489e-09 551 KSP Residual norm 2.001012117016e-09 552 KSP Residual norm 2.035968118293e-09 553 KSP Residual norm 1.945679174279e-09 554 KSP Residual norm 1.976732083516e-09 555 KSP Residual norm 2.116237603035e-09 556 KSP Residual norm 2.178943630802e-09 557 KSP Residual norm 2.172060610074e-09 558 KSP Residual norm 2.096427640437e-09 559 KSP Residual norm 2.008266586912e-09 560 KSP Residual norm 2.061231834099e-09 561 KSP Residual norm 2.205176081615e-09 562 KSP Residual norm 2.320317704500e-09 563 KSP Residual norm 2.354305445855e-09 564 KSP Residual norm 2.425625924464e-09 565 KSP Residual norm 2.428922668942e-09 566 KSP Residual norm 2.481481736987e-09 567 KSP Residual norm 2.558544706905e-09 568 KSP Residual norm 2.534676686647e-09 569 KSP Residual norm 2.575790935864e-09 570 KSP Residual norm 2.643759551321e-09 571 KSP Residual norm 2.636357543635e-09 572 KSP Residual norm 2.672432286967e-09 573 KSP Residual norm 2.712348359281e-09 574 KSP Residual norm 2.814931661634e-09 575 KSP Residual norm 2.795225099939e-09 576 KSP Residual norm 2.856138873093e-09 577 KSP Residual norm 2.896582884594e-09 578 KSP Residual norm 2.813751269116e-09 579 KSP Residual norm 2.838895549463e-09 580 KSP Residual norm 3.166547856785e-09 581 KSP Residual norm 3.467591793548e-09 582 KSP Residual norm 3.468463407237e-09 583 KSP Residual norm 3.455049867302e-09 584 KSP Residual norm 3.490160079207e-09 585 KSP 
Residual norm 3.679438172608e-09 586 KSP Residual norm 3.794138702879e-09 587 KSP Residual norm 3.916754271143e-09 588 KSP Residual norm 3.969463178263e-09 589 KSP Residual norm 4.049059306507e-09 590 KSP Residual norm 4.161015123219e-09 591 KSP Residual norm 4.314138131830e-09 592 KSP Residual norm 4.622943377089e-09 593 KSP Residual norm 4.760455162437e-09 594 KSP Residual norm 4.665593699675e-09 595 KSP Residual norm 4.701358342161e-09 596 KSP Residual norm 4.756247348908e-09 597 KSP Residual norm 4.529691109948e-09 598 KSP Residual norm 4.231837687517e-09 599 KSP Residual norm 4.108313481190e-09 600 KSP Residual norm 4.115730756272e-09 601 KSP Residual norm 3.938768705167e-09 602 KSP Residual norm 3.657550920846e-09 603 KSP Residual norm 3.418436273245e-09 604 KSP Residual norm 3.301056252721e-09 605 KSP Residual norm 3.307907612331e-09 606 KSP Residual norm 3.318421058876e-09 607 KSP Residual norm 3.334708126964e-09 608 KSP Residual norm 3.357935341797e-09 609 KSP Residual norm 3.382178946176e-09 610 KSP Residual norm 3.500952993405e-09 611 KSP Residual norm 3.621225863202e-09 612 KSP Residual norm 3.807464983169e-09 613 KSP Residual norm 4.042123944953e-09 614 KSP Residual norm 4.139638937863e-09 615 KSP Residual norm 4.237157279987e-09 616 KSP Residual norm 4.132400549353e-09 617 KSP Residual norm 3.891694551034e-09 618 KSP Residual norm 3.719060748535e-09 619 KSP Residual norm 3.769538284941e-09 620 KSP Residual norm 3.862205365366e-09 621 KSP Residual norm 3.687129391416e-09 622 KSP Residual norm 3.274085792912e-09 623 KSP Residual norm 3.004073308853e-09 624 KSP Residual norm 2.847476307893e-09 625 KSP Residual norm 2.862846211184e-09 626 KSP Residual norm 2.833158068211e-09 627 KSP Residual norm 2.892860164804e-09 628 KSP Residual norm 2.923237474514e-09 629 KSP Residual norm 2.878428780279e-09 630 KSP Residual norm 2.883813062194e-09 631 KSP Residual norm 2.915949789670e-09 632 KSP Residual norm 2.875344149110e-09 633 KSP Residual norm 
2.929668481231e-09 634 KSP Residual norm 2.999723144163e-09 635 KSP Residual norm 2.967399948344e-09 636 KSP Residual norm 2.978206024806e-09 637 KSP Residual norm 3.065811028516e-09 638 KSP Residual norm 3.154641432697e-09 639 KSP Residual norm 3.174640088809e-09 640 KSP Residual norm 3.241171659702e-09 641 KSP Residual norm 3.513875375327e-09 642 KSP Residual norm 3.872306217510e-09 643 KSP Residual norm 4.360757665916e-09 644 KSP Residual norm 4.690253449849e-09 645 KSP Residual norm 4.762568488373e-09 646 KSP Residual norm 4.927776012132e-09 647 KSP Residual norm 5.232553271582e-09 648 KSP Residual norm 5.472989166388e-09 649 KSP Residual norm 5.656851351546e-09 650 KSP Residual norm 5.697974630712e-09 651 KSP Residual norm 5.785925250061e-09 652 KSP Residual norm 5.766662930042e-09 653 KSP Residual norm 5.592418716831e-09 654 KSP Residual norm 5.652438840848e-09 655 KSP Residual norm 5.720758885642e-09 656 KSP Residual norm 5.544726700086e-09 657 KSP Residual norm 5.429352228380e-09 658 KSP Residual norm 5.429515264316e-09 659 KSP Residual norm 5.489269956176e-09 660 KSP Residual norm 5.712827764565e-09 661 KSP Residual norm 6.087666923820e-09 662 KSP Residual norm 6.278362274492e-09 663 KSP Residual norm 6.301831288228e-09 664 KSP Residual norm 5.904205849963e-09 665 KSP Residual norm 5.779884902777e-09 666 KSP Residual norm 6.166206300841e-09 667 KSP Residual norm 6.628546109875e-09 668 KSP Residual norm 7.369527158813e-09 669 KSP Residual norm 7.834466616015e-09 670 KSP Residual norm 7.706044513704e-09 671 KSP Residual norm 7.189880506311e-09 672 KSP Residual norm 6.867685071342e-09 673 KSP Residual norm 6.998974946100e-09 674 KSP Residual norm 7.272049542534e-09 675 KSP Residual norm 7.413805979531e-09 676 KSP Residual norm 7.400097954362e-09 677 KSP Residual norm 7.463072280556e-09 678 KSP Residual norm 7.505407311348e-09 679 KSP Residual norm 7.519288955187e-09 680 KSP Residual norm 7.958281843376e-09 681 KSP Residual norm 8.503716660846e-09 682 KSP 
Residual norm 8.797293228028e-09 683 KSP Residual norm 8.302001441191e-09 684 KSP Residual norm 7.986093430824e-09 685 KSP Residual norm 7.954561302759e-09 686 KSP Residual norm 8.212198244218e-09 687 KSP Residual norm 8.201991741836e-09 688 KSP Residual norm 7.909094237038e-09 689 KSP Residual norm 7.905508907252e-09 690 KSP Residual norm 7.574673743162e-09 691 KSP Residual norm 7.048564390762e-09 692 KSP Residual norm 6.679912990284e-09 693 KSP Residual norm 6.559162221948e-09 694 KSP Residual norm 6.901596439668e-09 695 KSP Residual norm 6.784534083983e-09 696 KSP Residual norm 6.653239702631e-09 697 KSP Residual norm 6.990473567042e-09 698 KSP Residual norm 7.302923907033e-09 699 KSP Residual norm 6.911087553228e-09 700 KSP Residual norm 6.523847038279e-09 701 KSP Residual norm 6.577503533026e-09 702 KSP Residual norm 6.601448221557e-09 703 KSP Residual norm 6.478319254699e-09 704 KSP Residual norm 6.200524824596e-09 705 KSP Residual norm 5.978691426175e-09 706 KSP Residual norm 6.025466334544e-09 707 KSP Residual norm 6.460135262803e-09 708 KSP Residual norm 6.612230713785e-09 709 KSP Residual norm 6.513059031597e-09 710 KSP Residual norm 6.335966030477e-09 711 KSP Residual norm 6.245890934433e-09 712 KSP Residual norm 5.955798653406e-09 713 KSP Residual norm 5.855595656728e-09 714 KSP Residual norm 6.029673220099e-09 715 KSP Residual norm 6.081193375711e-09 716 KSP Residual norm 6.145075528393e-09 717 KSP Residual norm 6.420983429738e-09 718 KSP Residual norm 6.707696634223e-09 719 KSP Residual norm 6.919081626264e-09 720 KSP Residual norm 6.932654590679e-09 721 KSP Residual norm 6.674365261677e-09 722 KSP Residual norm 6.684395882579e-09 723 KSP Residual norm 7.031641131260e-09 724 KSP Residual norm 7.244008138092e-09 725 KSP Residual norm 6.985369198773e-09 726 KSP Residual norm 6.877293912545e-09 727 KSP Residual norm 6.656048546880e-09 728 KSP Residual norm 6.465602711445e-09 729 KSP Residual norm 6.124479811739e-09 730 KSP Residual norm 
5.717349545493e-09 731 KSP Residual norm 5.837934988084e-09 732 KSP Residual norm 5.972927388556e-09 733 KSP Residual norm 6.227047643661e-09 734 KSP Residual norm 6.472940051759e-09 735 KSP Residual norm 6.500052735720e-09 736 KSP Residual norm 6.228413412324e-09 737 KSP Residual norm 6.203070149011e-09 738 KSP Residual norm 6.045043224275e-09 739 KSP Residual norm 6.217800535565e-09 740 KSP Residual norm 6.272435645284e-09 741 KSP Residual norm 6.093501440885e-09 742 KSP Residual norm 5.868387675671e-09 743 KSP Residual norm 5.941150492818e-09 744 KSP Residual norm 6.257994600326e-09 745 KSP Residual norm 6.559466775819e-09 746 KSP Residual norm 6.646013125951e-09 747 KSP Residual norm 6.603083012038e-09 748 KSP Residual norm 6.869798917114e-09 749 KSP Residual norm 7.035848305198e-09 750 KSP Residual norm 6.724702795544e-09 751 KSP Residual norm 6.643039288536e-09 752 KSP Residual norm 6.877580654451e-09 753 KSP Residual norm 7.372500871887e-09 754 KSP Residual norm 7.287842891225e-09 755 KSP Residual norm 6.854182835958e-09 756 KSP Residual norm 6.608884099750e-09 757 KSP Residual norm 6.543541585624e-09 758 KSP Residual norm 6.859615233234e-09 759 KSP Residual norm 7.577449098830e-09 760 KSP Residual norm 7.953014484535e-09 761 KSP Residual norm 8.026613977031e-09 762 KSP Residual norm 7.827001819223e-09 763 KSP Residual norm 7.545541474815e-09 764 KSP Residual norm 7.117909723248e-09 765 KSP Residual norm 6.653285401663e-09 766 KSP Residual norm 6.572883257089e-09 767 KSP Residual norm 6.748422974851e-09 768 KSP Residual norm 7.066382058042e-09 769 KSP Residual norm 7.559923939180e-09 770 KSP Residual norm 8.357225213639e-09 771 KSP Residual norm 9.015058935006e-09 772 KSP Residual norm 9.236946944050e-09 773 KSP Residual norm 8.704063459155e-09 774 KSP Residual norm 8.235715120280e-09 775 KSP Residual norm 8.163466859000e-09 776 KSP Residual norm 7.895208198763e-09 777 KSP Residual norm 7.569488982129e-09 778 KSP Residual norm 7.565997464462e-09 779 KSP 
Residual norm 7.453386151749e-09 780 KSP Residual norm 7.169651648109e-09 781 KSP Residual norm 7.143581355048e-09 782 KSP Residual norm 7.282046900589e-09 783 KSP Residual norm 7.157581279652e-09 784 KSP Residual norm 7.230461011824e-09 785 KSP Residual norm 7.878183064383e-09 786 KSP Residual norm 8.060948501929e-09 787 KSP Residual norm 7.912577052218e-09 788 KSP Residual norm 7.948835121168e-09 789 KSP Residual norm 7.674276302470e-09 790 KSP Residual norm 7.706182467257e-09 791 KSP Residual norm 7.388719541897e-09 792 KSP Residual norm 7.688982310654e-09 793 KSP Residual norm 8.020141760739e-09 794 KSP Residual norm 8.162208765609e-09 795 KSP Residual norm 8.034774510079e-09 796 KSP Residual norm 7.836893773661e-09 797 KSP Residual norm 7.547212125466e-09 798 KSP Residual norm 7.668103085871e-09 799 KSP Residual norm 7.623166606802e-09 800 KSP Residual norm 7.448196207566e-09 801 KSP Residual norm 7.091278411865e-09 802 KSP Residual norm 6.998064038304e-09 803 KSP Residual norm 7.160226734579e-09 804 KSP Residual norm 7.443078968953e-09 805 KSP Residual norm 7.369587397843e-09 806 KSP Residual norm 7.202525538865e-09 807 KSP Residual norm 6.962021415456e-09 808 KSP Residual norm 6.681119302584e-09 809 KSP Residual norm 6.785368950271e-09 810 KSP Residual norm 6.851824936260e-09 811 KSP Residual norm 6.565067688156e-09 812 KSP Residual norm 6.359528559300e-09 813 KSP Residual norm 6.202258875461e-09 814 KSP Residual norm 6.040993476528e-09 815 KSP Residual norm 6.048505449348e-09 816 KSP Residual norm 6.125282224091e-09 817 KSP Residual norm 6.190690894085e-09 818 KSP Residual norm 6.173396419131e-09 819 KSP Residual norm 5.797081982935e-09 820 KSP Residual norm 5.780784627591e-09 821 KSP Residual norm 5.908891550852e-09 822 KSP Residual norm 6.278057324239e-09 823 KSP Residual norm 6.129600776127e-09 824 KSP Residual norm 5.983984436644e-09 825 KSP Residual norm 5.706130836202e-09 826 KSP Residual norm 5.514456245658e-09 827 KSP Residual norm 
5.443952121390e-09 828 KSP Residual norm 5.679143660625e-09 829 KSP Residual norm 5.607722564238e-09 830 KSP Residual norm 5.527711112136e-09 831 KSP Residual norm 5.540718014405e-09 832 KSP Residual norm 5.636554453756e-09 833 KSP Residual norm 5.602696964501e-09 834 KSP Residual norm 5.587850079021e-09 835 KSP Residual norm 5.601421004204e-09 836 KSP Residual norm 5.513532911406e-09 837 KSP Residual norm 5.464122957114e-09 838 KSP Residual norm 5.492864528095e-09 839 KSP Residual norm 5.728267078389e-09 840 KSP Residual norm 5.658137947643e-09 841 KSP Residual norm 5.382615240830e-09 842 KSP Residual norm 5.084257134948e-09 843 KSP Residual norm 4.800761393363e-09 844 KSP Residual norm 4.809139612126e-09 845 KSP Residual norm 4.863267747940e-09 846 KSP Residual norm 4.788287281942e-09 847 KSP Residual norm 4.805361198429e-09 848 KSP Residual norm 4.872644208108e-09 849 KSP Residual norm 5.004821109516e-09 850 KSP Residual norm 5.226925462836e-09 851 KSP Residual norm 5.408527552589e-09 852 KSP Residual norm 5.668341959668e-09 853 KSP Residual norm 5.825110482612e-09 854 KSP Residual norm 5.719827720569e-09 855 KSP Residual norm 5.647673996373e-09 856 KSP Residual norm 5.695930630044e-09 857 KSP Residual norm 5.786161739978e-09 858 KSP Residual norm 5.718465017172e-09 859 KSP Residual norm 5.474379755098e-09 860 KSP Residual norm 5.100927461531e-09 861 KSP Residual norm 5.182803528228e-09 862 KSP Residual norm 5.216240079042e-09 863 KSP Residual norm 5.183988210785e-09 864 KSP Residual norm 5.008909032486e-09 865 KSP Residual norm 4.826226353690e-09 866 KSP Residual norm 4.914528211049e-09 867 KSP Residual norm 5.231291841512e-09 868 KSP Residual norm 5.316169185419e-09 869 KSP Residual norm 5.400183546490e-09 870 KSP Residual norm 5.366977587361e-09 871 KSP Residual norm 5.341670596335e-09 872 KSP Residual norm 5.303468774687e-09 873 KSP Residual norm 5.366740194011e-09 874 KSP Residual norm 5.296933732000e-09 875 KSP Residual norm 5.288750013770e-09 876 KSP 
Residual norm 5.444363031954e-09 877 KSP Residual norm 5.784263576158e-09 878 KSP Residual norm 5.948011993179e-09 879 KSP Residual norm 6.115145666760e-09 880 KSP Residual norm 6.468587599719e-09 881 KSP Residual norm 6.923520186849e-09 882 KSP Residual norm 6.858575958634e-09 883 KSP Residual norm 6.530469286925e-09 884 KSP Residual norm 6.292450555488e-09 885 KSP Residual norm 6.613749509452e-09 886 KSP Residual norm 7.260143027649e-09 887 KSP Residual norm 7.724309477503e-09 888 KSP Residual norm 7.562182819475e-09 889 KSP Residual norm 7.258173186873e-09 890 KSP Residual norm 7.248820531580e-09 891 KSP Residual norm 7.057351368755e-09 892 KSP Residual norm 6.912152860904e-09 893 KSP Residual norm 7.151675702628e-09 894 KSP Residual norm 7.567601172563e-09 895 KSP Residual norm 7.974144326828e-09 896 KSP Residual norm 7.954801860110e-09 897 KSP Residual norm 8.106295496802e-09 898 KSP Residual norm 8.521052347560e-09 899 KSP Residual norm 9.062756710290e-09 900 KSP Residual norm 8.858680189932e-09 901 KSP Residual norm 8.275959964339e-09 902 KSP Residual norm 7.798679446082e-09 903 KSP Residual norm 7.735865714005e-09 904 KSP Residual norm 7.692675569565e-09 905 KSP Residual norm 7.877351922282e-09 906 KSP Residual norm 7.848158383283e-09 907 KSP Residual norm 7.412279192010e-09 908 KSP Residual norm 7.128091487987e-09 909 KSP Residual norm 7.143322128602e-09 910 KSP Residual norm 7.198009345265e-09 911 KSP Residual norm 6.993940937968e-09 912 KSP Residual norm 6.821657925321e-09 913 KSP Residual norm 6.517358067203e-09 914 KSP Residual norm 5.981662338732e-09 915 KSP Residual norm 5.688795473476e-09 916 KSP Residual norm 5.509153758326e-09 917 KSP Residual norm 5.677921543509e-09 918 KSP Residual norm 5.716800765156e-09 919 KSP Residual norm 5.638726241970e-09 920 KSP Residual norm 5.878812403449e-09 921 KSP Residual norm 6.114511948264e-09 922 KSP Residual norm 6.240447909717e-09 923 KSP Residual norm 6.251944017577e-09 924 KSP Residual norm 
6.023715038662e-09 925 KSP Residual norm 5.693535878019e-09 926 KSP Residual norm 5.525011521074e-09 927 KSP Residual norm 5.681150465276e-09 928 KSP Residual norm 5.935861345955e-09 929 KSP Residual norm 6.159510250080e-09 930 KSP Residual norm 6.556862156855e-09 931 KSP Residual norm 6.934405621634e-09 932 KSP Residual norm 7.247255478605e-09 933 KSP Residual norm 7.405281862255e-09 934 KSP Residual norm 7.847098478143e-09 935 KSP Residual norm 8.233361159953e-09 936 KSP Residual norm 8.785756248873e-09 937 KSP Residual norm 8.968568396860e-09 938 KSP Residual norm 8.808067495405e-09 939 KSP Residual norm 8.643926487272e-09 940 KSP Residual norm 9.083129511502e-09 941 KSP Residual norm 9.679285377657e-09 942 KSP Residual norm 9.811706021424e-09 943 KSP Residual norm 9.667016485816e-09 944 KSP Residual norm 9.725162049022e-09 945 KSP Residual norm 1.068902760110e-08 946 KSP Residual norm 1.103863427705e-08 947 KSP Residual norm 1.017363934353e-08 948 KSP Residual norm 9.320665616680e-09 949 KSP Residual norm 9.051704918474e-09 950 KSP Residual norm 9.239669744355e-09 951 KSP Residual norm 8.957637299840e-09 952 KSP Residual norm 8.161234593837e-09 953 KSP Residual norm 7.564960336925e-09 954 KSP Residual norm 7.353806885156e-09 955 KSP Residual norm 7.367097546397e-09 956 KSP Residual norm 7.259549067035e-09 957 KSP Residual norm 7.325747974757e-09 958 KSP Residual norm 7.499519634152e-09 959 KSP Residual norm 7.437087940069e-09 960 KSP Residual norm 7.384386309173e-09 961 KSP Residual norm 7.496377330740e-09 962 KSP Residual norm 7.762435909619e-09 963 KSP Residual norm 7.908523824425e-09 964 KSP Residual norm 7.659805582899e-09 965 KSP Residual norm 6.832397920205e-09 966 KSP Residual norm 6.519845374949e-09 967 KSP Residual norm 6.574461012132e-09 968 KSP Residual norm 6.511372659672e-09 969 KSP Residual norm 6.583506389026e-09 970 KSP Residual norm 6.874484606155e-09 971 KSP Residual norm 7.012007818853e-09 972 KSP Residual norm 7.118177107805e-09 973 KSP 
Residual norm 7.256487171561e-09
 974 KSP Residual norm 7.573526048223e-09
 975 KSP Residual norm 7.974617082430e-09
 976 KSP Residual norm 7.906185386514e-09
 977 KSP Residual norm 7.823359314357e-09
 978 KSP Residual norm 7.917015528232e-09
[... roughly 940 further -ksp_monitor entries elided: through iteration 1923 the residual norm never converges, oscillating between about 2.5e-09 and 3.7e-08 (low point 2.543488281363e-09 near iteration 1912, peak 3.745170469893e-08 near iteration 1706) ...]
 1921 KSP Residual norm 6.558677929064e-09
 1922 KSP Residual norm 7.448496029930e-09
 1923 KSP Residual norm 8.309276460357e-09
 1924
KSP Residual norm 8.888394508602e-09 1925 KSP Residual norm 9.260993804675e-09 1926 KSP Residual norm 9.398818802763e-09 1927 KSP Residual norm 9.167733030243e-09 1928 KSP Residual norm 8.694452036454e-09 1929 KSP Residual norm 8.118493730918e-09 1930 KSP Residual norm 7.552453726393e-09 1931 KSP Residual norm 6.690468117701e-09 1932 KSP Residual norm 5.906787021661e-09 1933 KSP Residual norm 5.475548056650e-09 1934 KSP Residual norm 5.101915525627e-09 1935 KSP Residual norm 4.372814211622e-09 1936 KSP Residual norm 3.635411946770e-09 1937 KSP Residual norm 3.474617295726e-09 1938 KSP Residual norm 3.691623873856e-09 1939 KSP Residual norm 3.769104859578e-09 1940 KSP Residual norm 3.654357596247e-09 1941 KSP Residual norm 3.676225696378e-09 1942 KSP Residual norm 4.060283810478e-09 1943 KSP Residual norm 4.794706100032e-09 1944 KSP Residual norm 5.124706513180e-09 1945 KSP Residual norm 4.973925947713e-09 1946 KSP Residual norm 5.341217972474e-09 1947 KSP Residual norm 6.724200294589e-09 1948 KSP Residual norm 7.956478285450e-09 1949 KSP Residual norm 8.457839054913e-09 1950 KSP Residual norm 8.967832999552e-09 1951 KSP Residual norm 1.079873418093e-08 1952 KSP Residual norm 1.188594306097e-08 1953 KSP Residual norm 1.187881125415e-08 1954 KSP Residual norm 1.186042252355e-08 1955 KSP Residual norm 1.215056506464e-08 1956 KSP Residual norm 1.189277288364e-08 1957 KSP Residual norm 1.102180231854e-08 1958 KSP Residual norm 1.056733835946e-08 1959 KSP Residual norm 1.070061030016e-08 1960 KSP Residual norm 1.053021970473e-08 1961 KSP Residual norm 9.302144668995e-09 1962 KSP Residual norm 8.563188146430e-09 1963 KSP Residual norm 8.636311626926e-09 1964 KSP Residual norm 8.225365528182e-09 1965 KSP Residual norm 6.776914402484e-09 1966 KSP Residual norm 5.856719971272e-09 1967 KSP Residual norm 5.674287606106e-09 1968 KSP Residual norm 5.405078139958e-09 1969 KSP Residual norm 4.400452615927e-09 1970 KSP Residual norm 3.686095705298e-09 1971 KSP Residual norm 
3.423827820756e-09 1972 KSP Residual norm 3.212476288681e-09 1973 KSP Residual norm 2.812927100868e-09 1974 KSP Residual norm 2.545718902593e-09 1975 KSP Residual norm 2.685895353071e-09 1976 KSP Residual norm 3.047570960645e-09 1977 KSP Residual norm 3.139469952608e-09 1978 KSP Residual norm 3.290853250926e-09 1979 KSP Residual norm 3.797355271334e-09 1980 KSP Residual norm 4.220123069091e-09 1981 KSP Residual norm 4.573388248561e-09 1982 KSP Residual norm 4.799374598505e-09 1983 KSP Residual norm 5.168404604099e-09 1984 KSP Residual norm 6.125216462392e-09 1985 KSP Residual norm 7.355668791331e-09 1986 KSP Residual norm 8.492608513183e-09 1987 KSP Residual norm 9.785696118936e-09 1988 KSP Residual norm 1.142568762660e-08 1989 KSP Residual norm 1.267567918020e-08 1990 KSP Residual norm 1.296256317284e-08 1991 KSP Residual norm 1.278542028330e-08 1992 KSP Residual norm 1.384185237478e-08 1993 KSP Residual norm 1.492501923916e-08 1994 KSP Residual norm 1.498357150428e-08 1995 KSP Residual norm 1.459317975598e-08 1996 KSP Residual norm 1.530320681307e-08 1997 KSP Residual norm 1.586709210965e-08 1998 KSP Residual norm 1.455087112376e-08 1999 KSP Residual norm 1.210461945035e-08 2000 KSP Residual norm 1.088544041600e-08 2001 KSP Residual norm 1.015678590302e-08 2002 KSP Residual norm 8.785680491739e-09 2003 KSP Residual norm 7.162297936646e-09 2004 KSP Residual norm 6.219081471319e-09 2005 KSP Residual norm 6.158646998047e-09 2006 KSP Residual norm 5.883414720073e-09 2007 KSP Residual norm 5.058300095333e-09 2008 KSP Residual norm 4.282198234298e-09 2009 KSP Residual norm 4.011222031996e-09 2010 KSP Residual norm 3.860601041274e-09 2011 KSP Residual norm 3.343380168272e-09 2012 KSP Residual norm 2.982887267319e-09 2013 KSP Residual norm 3.041729383032e-09 2014 KSP Residual norm 3.321134508816e-09 2015 KSP Residual norm 3.554899694626e-09 2016 KSP Residual norm 3.851531701994e-09 2017 KSP Residual norm 4.262354490114e-09 2018 KSP Residual norm 4.956661540559e-09 2019 
KSP Residual norm 5.407005925179e-09 2020 KSP Residual norm 5.319152303475e-09 2021 KSP Residual norm 5.647028584698e-09 2022 KSP Residual norm 6.814017830980e-09 2023 KSP Residual norm 7.988386234640e-09 2024 KSP Residual norm 8.002221121661e-09 2025 KSP Residual norm 8.100794945750e-09 2026 KSP Residual norm 8.669333119823e-09 2027 KSP Residual norm 9.139603961619e-09 2028 KSP Residual norm 8.850704165089e-09 2029 KSP Residual norm 9.216507414717e-09 2030 KSP Residual norm 9.623212278569e-09 2031 KSP Residual norm 8.980714559382e-09 2032 KSP Residual norm 7.533323551446e-09 2033 KSP Residual norm 6.805500572549e-09 2034 KSP Residual norm 6.248699676593e-09 2035 KSP Residual norm 5.832948034853e-09 2036 KSP Residual norm 5.439983993243e-09 2037 KSP Residual norm 5.233831436763e-09 2038 KSP Residual norm 5.165909444091e-09 2039 KSP Residual norm 4.871990250703e-09 2040 KSP Residual norm 4.412149176292e-09 2041 KSP Residual norm 3.995011545730e-09 2042 KSP Residual norm 3.950119802286e-09 2043 KSP Residual norm 3.736957978864e-09 2044 KSP Residual norm 3.371533629831e-09 2045 KSP Residual norm 3.213106028831e-09 2046 KSP Residual norm 3.371732022504e-09 2047 KSP Residual norm 3.570045121255e-09 2048 KSP Residual norm 3.643938053927e-09 2049 KSP Residual norm 3.728190288317e-09 2050 KSP Residual norm 3.964004855891e-09 2051 KSP Residual norm 4.420555370104e-09 2052 KSP Residual norm 4.737581481973e-09 2053 KSP Residual norm 5.107093954646e-09 2054 KSP Residual norm 5.835900836343e-09 2055 KSP Residual norm 6.209834941748e-09 2056 KSP Residual norm 6.285170592162e-09 2057 KSP Residual norm 6.411575279884e-09 2058 KSP Residual norm 6.797719534515e-09 2059 KSP Residual norm 7.195568796046e-09 2060 KSP Residual norm 7.783525041961e-09 2061 KSP Residual norm 8.491411039081e-09 2062 KSP Residual norm 8.960477378264e-09 2063 KSP Residual norm 9.874208707923e-09 2064 KSP Residual norm 1.080702145127e-08 2065 KSP Residual norm 1.101233769388e-08 2066 KSP Residual norm 
1.099306098986e-08 2067 KSP Residual norm 1.124803426964e-08 2068 KSP Residual norm 1.183694566938e-08 2069 KSP Residual norm 1.145792372067e-08 2070 KSP Residual norm 1.041547906245e-08 2071 KSP Residual norm 9.871020973997e-09 2072 KSP Residual norm 9.161857076316e-09 2073 KSP Residual norm 8.267421448092e-09 2074 KSP Residual norm 7.895001813094e-09 2075 KSP Residual norm 8.030575766766e-09 2076 KSP Residual norm 7.784006674563e-09 2077 KSP Residual norm 6.564235338202e-09 2078 KSP Residual norm 6.011576670459e-09 2079 KSP Residual norm 5.871462759477e-09 2080 KSP Residual norm 5.419567445641e-09 2081 KSP Residual norm 4.597519438500e-09 2082 KSP Residual norm 4.226957859381e-09 2083 KSP Residual norm 4.415885326075e-09 2084 KSP Residual norm 4.449295156468e-09 2085 KSP Residual norm 4.029722722723e-09 2086 KSP Residual norm 3.707480170466e-09 2087 KSP Residual norm 3.834906124398e-09 2088 KSP Residual norm 3.862734618636e-09 2089 KSP Residual norm 3.568600227124e-09 2090 KSP Residual norm 3.452225474400e-09 2091 KSP Residual norm 3.989421030181e-09 2092 KSP Residual norm 4.808559410741e-09 2093 KSP Residual norm 5.071308160787e-09 2094 KSP Residual norm 5.304399069072e-09 2095 KSP Residual norm 6.232520962593e-09 2096 KSP Residual norm 7.250939909301e-09 2097 KSP Residual norm 7.178354715506e-09 2098 KSP Residual norm 6.668509367606e-09 2099 KSP Residual norm 6.920186896191e-09 2100 KSP Residual norm 8.044591961235e-09 2101 KSP Residual norm 8.572338116056e-09 2102 KSP Residual norm 8.589921835449e-09 2103 KSP Residual norm 9.603182268822e-09 2104 KSP Residual norm 1.113859303748e-08 2105 KSP Residual norm 1.122691963643e-08 2106 KSP Residual norm 1.102081131919e-08 2107 KSP Residual norm 1.131855632082e-08 2108 KSP Residual norm 1.145658292058e-08 2109 KSP Residual norm 1.123578497315e-08 2110 KSP Residual norm 1.064087568512e-08 2111 KSP Residual norm 1.052421671550e-08 2112 KSP Residual norm 1.018539284527e-08 2113 KSP Residual norm 9.582476339267e-09 2114 
KSP Residual norm 9.047997301229e-09 2115 KSP Residual norm 8.149919442029e-09 2116 KSP Residual norm 7.413352469129e-09 2117 KSP Residual norm 6.762678437961e-09 2118 KSP Residual norm 6.234759641656e-09 2119 KSP Residual norm 6.021826477371e-09 2120 KSP Residual norm 5.906263593390e-09 2121 KSP Residual norm 5.670976330175e-09 2122 KSP Residual norm 5.135499528580e-09 2123 KSP Residual norm 4.651345894403e-09 2124 KSP Residual norm 4.567896368396e-09 2125 KSP Residual norm 4.379734816592e-09 2126 KSP Residual norm 3.988119161519e-09 2127 KSP Residual norm 3.645233406037e-09 2128 KSP Residual norm 3.594578588641e-09 2129 KSP Residual norm 3.799772163555e-09 2130 KSP Residual norm 3.874407248191e-09 2131 KSP Residual norm 3.899523925124e-09 2132 KSP Residual norm 3.941871180715e-09 2133 KSP Residual norm 4.220383101920e-09 2134 KSP Residual norm 4.638510413027e-09 2135 KSP Residual norm 4.921723051313e-09 2136 KSP Residual norm 5.415365280333e-09 2137 KSP Residual norm 6.197773697727e-09 2138 KSP Residual norm 6.519207022254e-09 2139 KSP Residual norm 6.524634021206e-09 2140 KSP Residual norm 7.188245941792e-09 2141 KSP Residual norm 9.104321152297e-09 2142 KSP Residual norm 1.013847489269e-08 2143 KSP Residual norm 1.013238294629e-08 2144 KSP Residual norm 1.100532121028e-08 2145 KSP Residual norm 1.256019105239e-08 2146 KSP Residual norm 1.328149455892e-08 2147 KSP Residual norm 1.376931111546e-08 2148 KSP Residual norm 1.459563897719e-08 2149 KSP Residual norm 1.470429193024e-08 2150 KSP Residual norm 1.375560507137e-08 2151 KSP Residual norm 1.267627228963e-08 2152 KSP Residual norm 1.237541171865e-08 2153 KSP Residual norm 1.300234164549e-08 2154 KSP Residual norm 1.340529952548e-08 2155 KSP Residual norm 1.289142213633e-08 2156 KSP Residual norm 1.227270539917e-08 2157 KSP Residual norm 1.220879997904e-08 2158 KSP Residual norm 1.102751780957e-08 2159 KSP Residual norm 9.346548841594e-09 2160 KSP Residual norm 8.714478472795e-09 2161 KSP Residual norm 
8.703613006859e-09 2162 KSP Residual norm 8.628193419787e-09 2163 KSP Residual norm 7.849825671412e-09 2164 KSP Residual norm 7.176800477640e-09 2165 KSP Residual norm 6.796051542640e-09 2166 KSP Residual norm 6.260209098082e-09 2167 KSP Residual norm 5.933056967721e-09 2168 KSP Residual norm 6.153632090377e-09 2169 KSP Residual norm 6.448354297310e-09 2170 KSP Residual norm 6.255303154254e-09 2171 KSP Residual norm 6.092975081473e-09 2172 KSP Residual norm 6.452634646636e-09 2173 KSP Residual norm 6.970074186049e-09 2174 KSP Residual norm 6.985377630371e-09 2175 KSP Residual norm 6.938485007048e-09 2176 KSP Residual norm 7.184606294804e-09 2177 KSP Residual norm 7.581414708776e-09 2178 KSP Residual norm 7.700585512259e-09 2179 KSP Residual norm 7.782657130356e-09 2180 KSP Residual norm 8.532754847073e-09 2181 KSP Residual norm 9.276343146556e-09 2182 KSP Residual norm 9.705554876659e-09 2183 KSP Residual norm 1.036653537425e-08 2184 KSP Residual norm 1.162089117302e-08 2185 KSP Residual norm 1.276264634329e-08 2186 KSP Residual norm 1.270274745815e-08 2187 KSP Residual norm 1.330431323139e-08 2188 KSP Residual norm 1.565994751065e-08 2189 KSP Residual norm 1.743337151094e-08 2190 KSP Residual norm 1.685584536665e-08 2191 KSP Residual norm 1.685732393210e-08 2192 KSP Residual norm 1.806024323263e-08 2193 KSP Residual norm 1.820978501429e-08 2194 KSP Residual norm 1.757941035548e-08 2195 KSP Residual norm 1.737878089742e-08 2196 KSP Residual norm 1.821200319246e-08 2197 KSP Residual norm 1.887672839494e-08 2198 KSP Residual norm 1.833388517475e-08 2199 KSP Residual norm 1.755218724790e-08 2200 KSP Residual norm 1.652514085015e-08 2201 KSP Residual norm 1.540189602546e-08 2202 KSP Residual norm 1.362227524045e-08 2203 KSP Residual norm 1.250178007269e-08 2204 KSP Residual norm 1.197898676209e-08 2205 KSP Residual norm 1.107428842686e-08 2206 KSP Residual norm 1.026194276605e-08 2207 KSP Residual norm 9.421250565185e-09 2208 KSP Residual norm 8.736419867530e-09 2209 
KSP Residual norm 7.570641010130e-09 2210 KSP Residual norm 6.458702895554e-09 2211 KSP Residual norm 5.977767379276e-09 2212 KSP Residual norm 5.854562933165e-09 2213 KSP Residual norm 5.629457453704e-09 2214 KSP Residual norm 5.136084482492e-09 2215 KSP Residual norm 4.941347486066e-09 2216 KSP Residual norm 4.778514083511e-09 2217 KSP Residual norm 4.564253784304e-09 2218 KSP Residual norm 4.296953423563e-09 2219 KSP Residual norm 4.183040046050e-09 2220 KSP Residual norm 4.055234767666e-09 2221 KSP Residual norm 4.000812566048e-09 2222 KSP Residual norm 4.055458569808e-09 2223 KSP Residual norm 4.176607568961e-09 2224 KSP Residual norm 4.236486273663e-09 2225 KSP Residual norm 4.423720668725e-09 2226 KSP Residual norm 4.344304158398e-09 2227 KSP Residual norm 4.266936521673e-09 2228 KSP Residual norm 4.313535237041e-09 2229 KSP Residual norm 4.546054737770e-09 2230 KSP Residual norm 4.622964529090e-09 2231 KSP Residual norm 4.804454582027e-09 2232 KSP Residual norm 5.115850555569e-09 2233 KSP Residual norm 5.834416680875e-09 2234 KSP Residual norm 6.689995442789e-09 2235 KSP Residual norm 7.098021519377e-09 2236 KSP Residual norm 7.533528438990e-09 2237 KSP Residual norm 8.502148216529e-09 2238 KSP Residual norm 9.757832100791e-09 2239 KSP Residual norm 1.035498832046e-08 2240 KSP Residual norm 1.068832314800e-08 2241 KSP Residual norm 1.156280626256e-08 2242 KSP Residual norm 1.233987441610e-08 2243 KSP Residual norm 1.271097415990e-08 2244 KSP Residual norm 1.334160456664e-08 2245 KSP Residual norm 1.405762190009e-08 2246 KSP Residual norm 1.496343494878e-08 2247 KSP Residual norm 1.581882857281e-08 2248 KSP Residual norm 1.737848290642e-08 2249 KSP Residual norm 1.840533082126e-08 2250 KSP Residual norm 1.866635029912e-08 2251 KSP Residual norm 1.898010812871e-08 2252 KSP Residual norm 1.852337708487e-08 2253 KSP Residual norm 1.825786239476e-08 2254 KSP Residual norm 1.786171860744e-08 2255 KSP Residual norm 1.701521787917e-08 2256 KSP Residual norm 
1.706541593836e-08 2257 KSP Residual norm 1.710391270987e-08 2258 KSP Residual norm 1.643045265157e-08 2259 KSP Residual norm 1.514080342739e-08 2260 KSP Residual norm 1.491368249483e-08 2261 KSP Residual norm 1.505806019867e-08 2262 KSP Residual norm 1.385720256023e-08 2263 KSP Residual norm 1.284049729924e-08 2264 KSP Residual norm 1.292276488384e-08 2265 KSP Residual norm 1.319321029002e-08 2266 KSP Residual norm 1.224451181118e-08 2267 KSP Residual norm 1.065966854373e-08 2268 KSP Residual norm 9.787363081609e-09 2269 KSP Residual norm 9.122735711913e-09 2270 KSP Residual norm 8.165035569607e-09 2271 KSP Residual norm 7.237998127129e-09 2272 KSP Residual norm 6.517751187211e-09 2273 KSP Residual norm 6.344590541439e-09 2274 KSP Residual norm 6.340678306741e-09 2275 KSP Residual norm 5.715836233839e-09 2276 KSP Residual norm 5.397662857791e-09 2277 KSP Residual norm 5.230339854715e-09 2278 KSP Residual norm 4.586690577107e-09 2279 KSP Residual norm 3.964709291929e-09 2280 KSP Residual norm 3.949355277585e-09 2281 KSP Residual norm 4.327578375786e-09 2282 KSP Residual norm 4.296132073917e-09 2283 KSP Residual norm 4.106768919770e-09 2284 KSP Residual norm 4.286676700930e-09 2285 KSP Residual norm 4.869388691933e-09 2286 KSP Residual norm 4.973304533827e-09 2287 KSP Residual norm 4.584311545845e-09 2288 KSP Residual norm 4.461168235661e-09 2289 KSP Residual norm 4.854803468229e-09 2290 KSP Residual norm 5.415828792883e-09 2291 KSP Residual norm 5.920845157786e-09 2292 KSP Residual norm 6.796133099936e-09 2293 KSP Residual norm 7.551207915952e-09 2294 KSP Residual norm 7.734265715194e-09 2295 KSP Residual norm 8.006329828344e-09 2296 KSP Residual norm 9.407041749222e-09 2297 KSP Residual norm 1.034294380601e-08 2298 KSP Residual norm 1.023796060750e-08 2299 KSP Residual norm 1.086984507408e-08 2300 KSP Residual norm 1.307877565763e-08 2301 KSP Residual norm 1.448597116570e-08 2302 KSP Residual norm 1.507716576250e-08 2303 KSP Residual norm 1.683366414771e-08 2304 
KSP Residual norm 1.970119004799e-08 2305 KSP Residual norm 2.174094043351e-08 2306 KSP Residual norm 2.180344378516e-08 2307 KSP Residual norm 2.136448330890e-08 2308 KSP Residual norm 2.181920651456e-08 2309 KSP Residual norm 2.379090777135e-08 2310 KSP Residual norm 2.551817412914e-08 2311 KSP Residual norm 2.674366462544e-08 2312 KSP Residual norm 2.743131747632e-08 2313 KSP Residual norm 2.733983139363e-08 2314 KSP Residual norm 2.721171356242e-08 2315 KSP Residual norm 2.749040538779e-08 2316 KSP Residual norm 2.956209040306e-08 2317 KSP Residual norm 3.060062594230e-08 2318 KSP Residual norm 2.874799849491e-08 2319 KSP Residual norm 2.665268146674e-08 2320 KSP Residual norm 2.772857280795e-08 2321 KSP Residual norm 2.990654125322e-08 2322 KSP Residual norm 2.952899520359e-08 2323 KSP Residual norm 2.792293548103e-08 2324 KSP Residual norm 2.748598367564e-08 2325 KSP Residual norm 2.699478973869e-08 2326 KSP Residual norm 2.643128884756e-08 2327 KSP Residual norm 2.511313506779e-08 2328 KSP Residual norm 2.276669574923e-08 2329 KSP Residual norm 2.054602062372e-08 2330 KSP Residual norm 1.778330730040e-08 2331 KSP Residual norm 1.621541250870e-08 2332 KSP Residual norm 1.637764654673e-08 2333 KSP Residual norm 1.708994064404e-08 2334 KSP Residual norm 1.651742693394e-08 2335 KSP Residual norm 1.500281755858e-08 2336 KSP Residual norm 1.480045593165e-08 2337 KSP Residual norm 1.484276302076e-08 2338 KSP Residual norm 1.351462542533e-08 2339 KSP Residual norm 1.191827402502e-08 2340 KSP Residual norm 1.168662864764e-08 2341 KSP Residual norm 1.156797835628e-08 2342 KSP Residual norm 1.099678036576e-08 2343 KSP Residual norm 1.041450375091e-08 2344 KSP Residual norm 1.028811312833e-08 2345 KSP Residual norm 1.043648171379e-08 2346 KSP Residual norm 1.065689900500e-08 2347 KSP Residual norm 1.061214361887e-08 2348 KSP Residual norm 1.064871968811e-08 2349 KSP Residual norm 1.097582498907e-08 2350 KSP Residual norm 1.096078399204e-08 2351 KSP Residual norm 
1.088517211304e-08 2352 KSP Residual norm 1.094875043189e-08 2353 KSP Residual norm 1.135711333287e-08 2354 KSP Residual norm 1.141560937711e-08 2355 KSP Residual norm 1.180195398114e-08 2356 KSP Residual norm 1.229408989707e-08 2357 KSP Residual norm 1.264918363694e-08 2358 KSP Residual norm 1.364164231703e-08 2359 KSP Residual norm 1.391387796191e-08 2360 KSP Residual norm 1.363171861209e-08 2361 KSP Residual norm 1.432132433201e-08 2362 KSP Residual norm 1.651146219121e-08 2363 KSP Residual norm 1.739455026269e-08 2364 KSP Residual norm 1.786970005355e-08 2365 KSP Residual norm 1.927386266125e-08 2366 KSP Residual norm 1.944002539024e-08 2367 KSP Residual norm 1.825303313704e-08 2368 KSP Residual norm 1.845942173573e-08 2369 KSP Residual norm 2.064613582774e-08 2370 KSP Residual norm 2.163600731320e-08 2371 KSP Residual norm 2.066319842332e-08 2372 KSP Residual norm 1.972281547162e-08 2373 KSP Residual norm 2.022046954332e-08 2374 KSP Residual norm 2.156094129891e-08 2375 KSP Residual norm 2.134747485529e-08 2376 KSP Residual norm 2.146466009581e-08 2377 KSP Residual norm 2.134242298498e-08 2378 KSP Residual norm 1.963694682345e-08 2379 KSP Residual norm 1.788889710446e-08 2380 KSP Residual norm 1.735742274775e-08 2381 KSP Residual norm 1.724287054888e-08 2382 KSP Residual norm 1.657333204418e-08 2383 KSP Residual norm 1.601977523985e-08 2384 KSP Residual norm 1.592980339719e-08 2385 KSP Residual norm 1.643528353038e-08 2386 KSP Residual norm 1.524305740226e-08 2387 KSP Residual norm 1.281738325880e-08 2388 KSP Residual norm 1.181102707146e-08 2389 KSP Residual norm 1.169762846482e-08 2390 KSP Residual norm 1.088331115373e-08 2391 KSP Residual norm 9.926596869260e-09 2392 KSP Residual norm 1.046136112309e-08 2393 KSP Residual norm 1.154364345489e-08 2394 KSP Residual norm 1.128720253874e-08 2395 KSP Residual norm 1.049168851742e-08 2396 KSP Residual norm 1.027055660276e-08 2397 KSP Residual norm 9.769471007648e-09 2398 KSP Residual norm 9.050582904842e-09 2399 
KSP Residual norm 8.767274646178e-09 2400 KSP Residual norm 8.191786148679e-09 2401 KSP Residual norm 7.420475335849e-09 2402 KSP Residual norm 6.882142194668e-09 2403 KSP Residual norm 6.682470226477e-09 2404 KSP Residual norm 6.928923087287e-09 2405 KSP Residual norm 7.340393361661e-09 2406 KSP Residual norm 7.249095540000e-09 2407 KSP Residual norm 6.733587081173e-09 2408 KSP Residual norm 6.819910980198e-09 2409 KSP Residual norm 6.952100736657e-09 2410 KSP Residual norm 6.714877154834e-09 2411 KSP Residual norm 6.345565949842e-09 2412 KSP Residual norm 6.283651581953e-09 2413 KSP Residual norm 6.166382733402e-09 2414 KSP Residual norm 6.083828042828e-09 2415 KSP Residual norm 6.426227761535e-09 2416 KSP Residual norm 6.809718841112e-09 2417 KSP Residual norm 7.311895519664e-09 2418 KSP Residual norm 7.885695040613e-09 2419 KSP Residual norm 8.293295987256e-09 2420 KSP Residual norm 8.904912306156e-09 2421 KSP Residual norm 9.151088014953e-09 2422 KSP Residual norm 9.488850887938e-09 2423 KSP Residual norm 9.949153052234e-09 2424 KSP Residual norm 1.035318592256e-08 2425 KSP Residual norm 1.129525959125e-08 2426 KSP Residual norm 1.248408113736e-08 2427 KSP Residual norm 1.272650576370e-08 2428 KSP Residual norm 1.306691949963e-08 2429 KSP Residual norm 1.464659469576e-08 2430 KSP Residual norm 1.618687285881e-08 2431 KSP Residual norm 1.634874142497e-08 2432 KSP Residual norm 1.642309782360e-08 2433 KSP Residual norm 1.663658078417e-08 2434 KSP Residual norm 1.719226764242e-08 2435 KSP Residual norm 1.667708411286e-08 2436 KSP Residual norm 1.642638923839e-08 2437 KSP Residual norm 1.747457374319e-08 2438 KSP Residual norm 1.864204342869e-08 2439 KSP Residual norm 1.929847755246e-08 2440 KSP Residual norm 2.078650894668e-08 2441 KSP Residual norm 2.225887764060e-08 2442 KSP Residual norm 2.142787325635e-08 2443 KSP Residual norm 2.045771881845e-08 2444 KSP Residual norm 2.028908326380e-08 2445 KSP Residual norm 2.025756936723e-08 2446 KSP Residual norm 
2.000005424982e-08 2447 KSP Residual norm 2.015345692386e-08 2448 KSP Residual norm 1.985966609084e-08 2449 KSP Residual norm 1.949178423742e-08 2450 KSP Residual norm 1.926432884163e-08 2451 KSP Residual norm 1.904975476332e-08 2452 KSP Residual norm 1.832013783270e-08 2453 KSP Residual norm 1.797097355915e-08 2454 KSP Residual norm 1.730011121490e-08 2455 KSP Residual norm 1.538872307345e-08 2456 KSP Residual norm 1.335901658519e-08 2457 KSP Residual norm 1.327395760494e-08 2458 KSP Residual norm 1.400663273338e-08 2459 KSP Residual norm 1.284416949329e-08 2460 KSP Residual norm 1.129660109025e-08 2461 KSP Residual norm 1.050438029609e-08 2462 KSP Residual norm 1.009692800540e-08 2463 KSP Residual norm 9.121674194319e-09 2464 KSP Residual norm 8.183286029058e-09 2465 KSP Residual norm 8.044195697374e-09 2466 KSP Residual norm 7.933283195800e-09 2467 KSP Residual norm 7.750461889317e-09 2468 KSP Residual norm 7.971090596231e-09 2469 KSP Residual norm 8.010696023109e-09 2470 KSP Residual norm 7.359914869366e-09 2471 KSP Residual norm 6.860528063962e-09 2472 KSP Residual norm 6.503824121187e-09 2473 KSP Residual norm 6.548231468425e-09 2474 KSP Residual norm 6.669686341449e-09 2475 KSP Residual norm 6.323909462190e-09 2476 KSP Residual norm 5.851288299069e-09 2477 KSP Residual norm 5.953926704682e-09 2478 KSP Residual norm 6.193786534656e-09 2479 KSP Residual norm 6.382855695774e-09 2480 KSP Residual norm 6.671346108551e-09 2481 KSP Residual norm 7.442589185542e-09 2482 KSP Residual norm 7.337372968494e-09 2483 KSP Residual norm 6.445569232161e-09 2484 KSP Residual norm 6.312266940722e-09 2485 KSP Residual norm 7.071272824753e-09 2486 KSP Residual norm 7.247878160167e-09 2487 KSP Residual norm 7.541604235255e-09 2488 KSP Residual norm 8.742718413793e-09 2489 KSP Residual norm 9.975076976230e-09 2490 KSP Residual norm 9.818973442672e-09 2491 KSP Residual norm 9.588119286841e-09 2492 KSP Residual norm 1.006198416762e-08 2493 KSP Residual norm 1.023433506863e-08 2494 
KSP Residual norm 9.375473190107e-09 2495 KSP Residual norm 9.406593938183e-09 2496 KSP Residual norm 1.035641440449e-08 2497 KSP Residual norm 1.173675682404e-08 2498 KSP Residual norm 1.234265114038e-08 2499 KSP Residual norm 1.218622622784e-08 2500 KSP Residual norm 1.267436912264e-08 2501 KSP Residual norm 1.367715131961e-08 2502 KSP Residual norm 1.440018768183e-08 2503 KSP Residual norm 1.508619267072e-08 2504 KSP Residual norm 1.549639570931e-08 2505 KSP Residual norm 1.549556075005e-08 2506 KSP Residual norm 1.532075581523e-08 2507 KSP Residual norm 1.488645109230e-08 2508 KSP Residual norm 1.490285559522e-08 2509 KSP Residual norm 1.535874586409e-08 2510 KSP Residual norm 1.529893480935e-08 2511 KSP Residual norm 1.477956846158e-08 2512 KSP Residual norm 1.449415402558e-08 2513 KSP Residual norm 1.395941129514e-08 2514 KSP Residual norm 1.352716939284e-08 2515 KSP Residual norm 1.339741886346e-08 2516 KSP Residual norm 1.406782498654e-08 2517 KSP Residual norm 1.411843736185e-08 2518 KSP Residual norm 1.341304356648e-08 2519 KSP Residual norm 1.250833133895e-08 2520 KSP Residual norm 1.164073705047e-08 2521 KSP Residual norm 1.102281414615e-08 2522 KSP Residual norm 1.121542648851e-08 2523 KSP Residual norm 1.120181542176e-08 2524 KSP Residual norm 1.045407361062e-08 2525 KSP Residual norm 9.856428805003e-09 2526 KSP Residual norm 1.050294993783e-08 2527 KSP Residual norm 1.057898997227e-08 2528 KSP Residual norm 1.010643202692e-08 2529 KSP Residual norm 9.862866919683e-09 2530 KSP Residual norm 9.819560967018e-09 2531 KSP Residual norm 9.347930850288e-09 2532 KSP Residual norm 8.814108334421e-09 2533 KSP Residual norm 8.355099459461e-09 2534 KSP Residual norm 7.656266700119e-09 2535 KSP Residual norm 6.844999420530e-09 2536 KSP Residual norm 6.186101317347e-09 2537 KSP Residual norm 5.948199628343e-09 2538 KSP Residual norm 5.428610563762e-09 2539 KSP Residual norm 4.982983964608e-09 2540 KSP Residual norm 4.743777264316e-09 2541 KSP Residual norm 
4.439648638180e-09 2542 KSP Residual norm 4.195255132473e-09
2543 KSP Residual norm 4.065784517026e-09
[... -ksp_monitor output for iterations 2544-3537 omitted: the residual norms stagnate, oscillating between roughly 8.4e-10 and 2.8e-08 without converging ...]
3538 KSP Residual norm 5.093938232234e-09 3539
KSP Residual norm 7.531831956806e-09 3540 KSP Residual norm 1.003060138542e-08 3541 KSP Residual norm 9.071805650648e-09 3542 KSP Residual norm 7.813867608053e-09 3543 KSP Residual norm 8.660891144438e-09 3544 KSP Residual norm 1.083239957048e-08 3545 KSP Residual norm 1.060797959618e-08 3546 KSP Residual norm 8.051040216406e-09 3547 KSP Residual norm 6.483700880004e-09 3548 KSP Residual norm 6.171761629785e-09 3549 KSP Residual norm 6.432047785318e-09 3550 KSP Residual norm 6.224660810513e-09 3551 KSP Residual norm 5.525447723576e-09 3552 KSP Residual norm 5.515604098047e-09 3553 KSP Residual norm 6.654738665945e-09 3554 KSP Residual norm 7.856590179732e-09 3555 KSP Residual norm 8.306250699822e-09 3556 KSP Residual norm 8.889530172384e-09 3557 KSP Residual norm 1.040715392118e-08 3558 KSP Residual norm 1.271666228128e-08 3559 KSP Residual norm 1.482910151918e-08 3560 KSP Residual norm 1.399640681425e-08 3561 KSP Residual norm 1.148482692464e-08 3562 KSP Residual norm 1.011782386133e-08 3563 KSP Residual norm 1.033662916531e-08 3564 KSP Residual norm 1.003734807936e-08 3565 KSP Residual norm 8.330341997878e-09 3566 KSP Residual norm 6.896945348191e-09 3567 KSP Residual norm 6.236435503963e-09 3568 KSP Residual norm 6.020297506649e-09 3569 KSP Residual norm 5.575399864494e-09 3570 KSP Residual norm 5.000739651273e-09 3571 KSP Residual norm 5.229468815244e-09 3572 KSP Residual norm 6.767901056203e-09 3573 KSP Residual norm 8.664279885066e-09 3574 KSP Residual norm 8.947586303432e-09 3575 KSP Residual norm 9.028249028046e-09 3576 KSP Residual norm 1.046710807339e-08 3577 KSP Residual norm 1.325629425899e-08 3578 KSP Residual norm 1.552482970550e-08 3579 KSP Residual norm 1.541030603723e-08 3580 KSP Residual norm 1.502321697973e-08 3581 KSP Residual norm 1.572897171304e-08 3582 KSP Residual norm 1.592143150348e-08 3583 KSP Residual norm 1.463282285019e-08 3584 KSP Residual norm 1.215857856446e-08 3585 KSP Residual norm 1.043967230688e-08 3586 KSP Residual norm 
9.993546258971e-09 3587 KSP Residual norm 8.901126538423e-09 3588 KSP Residual norm 7.098600697915e-09 3589 KSP Residual norm 5.852469115337e-09 3590 KSP Residual norm 5.659826898317e-09 3591 KSP Residual norm 6.105513355409e-09 3592 KSP Residual norm 6.300337302379e-09 3593 KSP Residual norm 5.927908222259e-09 3594 KSP Residual norm 5.664654351998e-09 3595 KSP Residual norm 6.318917995181e-09 3596 KSP Residual norm 7.838197036978e-09 3597 KSP Residual norm 8.536136733933e-09 3598 KSP Residual norm 7.763976923173e-09 3599 KSP Residual norm 7.985879471433e-09 3600 KSP Residual norm 9.738862691594e-09 3601 KSP Residual norm 1.091695508309e-08 3602 KSP Residual norm 1.001104974036e-08 3603 KSP Residual norm 9.233867125388e-09 3604 KSP Residual norm 9.537585595872e-09 3605 KSP Residual norm 9.935544775921e-09 3606 KSP Residual norm 8.765739171866e-09 3607 KSP Residual norm 6.351142737339e-09 3608 KSP Residual norm 5.025162851112e-09 3609 KSP Residual norm 4.878278010713e-09 3610 KSP Residual norm 5.292002158799e-09 3611 KSP Residual norm 4.990771065089e-09 3612 KSP Residual norm 4.167520853998e-09 3613 KSP Residual norm 3.756224737184e-09 3614 KSP Residual norm 4.070223348321e-09 3615 KSP Residual norm 5.097976574503e-09 3616 KSP Residual norm 5.502139616920e-09 3617 KSP Residual norm 4.875598963190e-09 3618 KSP Residual norm 4.578059685077e-09 3619 KSP Residual norm 5.059461202823e-09 3620 KSP Residual norm 5.632415761366e-09 3621 KSP Residual norm 5.712537216145e-09 3622 KSP Residual norm 5.817794543408e-09 3623 KSP Residual norm 6.245394355653e-09 3624 KSP Residual norm 7.188889660444e-09 3625 KSP Residual norm 7.897495504395e-09 3626 KSP Residual norm 6.337194697730e-09 3627 KSP Residual norm 4.975859345504e-09 3628 KSP Residual norm 4.563651490302e-09 3629 KSP Residual norm 4.144989088426e-09 3630 KSP Residual norm 3.236740240349e-09 3631 KSP Residual norm 2.703936085570e-09 3632 KSP Residual norm 2.575225817969e-09 3633 KSP Residual norm 2.444870236042e-09 3634 
KSP Residual norm 2.477149010409e-09 3635 KSP Residual norm 2.586264969643e-09 3636 KSP Residual norm 2.714795375250e-09 3637 KSP Residual norm 3.049619731988e-09 3638 KSP Residual norm 3.322843944377e-09 3639 KSP Residual norm 3.013827671514e-09 3640 KSP Residual norm 2.777575793326e-09 3641 KSP Residual norm 3.127649029513e-09 3642 KSP Residual norm 3.669267842667e-09 3643 KSP Residual norm 3.808706261272e-09 3644 KSP Residual norm 3.755146904996e-09 3645 KSP Residual norm 3.750486986120e-09 3646 KSP Residual norm 3.843585330706e-09 3647 KSP Residual norm 3.811076409355e-09 3648 KSP Residual norm 3.220294997649e-09 3649 KSP Residual norm 2.314041764327e-09 3650 KSP Residual norm 1.818075811897e-09 3651 KSP Residual norm 1.677953281875e-09 3652 KSP Residual norm 1.617488234897e-09 3653 KSP Residual norm 1.517610583976e-09 3654 KSP Residual norm 1.506090675990e-09 3655 KSP Residual norm 1.504816607156e-09 3656 KSP Residual norm 1.535251724128e-09 3657 KSP Residual norm 1.593317822618e-09 3658 KSP Residual norm 1.604531477442e-09 3659 KSP Residual norm 1.686018335962e-09 3660 KSP Residual norm 2.008847516854e-09 3661 KSP Residual norm 2.256427498138e-09 3662 KSP Residual norm 2.218662480385e-09 3663 KSP Residual norm 2.349126109980e-09 3664 KSP Residual norm 2.815456879471e-09 3665 KSP Residual norm 3.105243592841e-09 3666 KSP Residual norm 3.111697946625e-09 3667 KSP Residual norm 2.986875519869e-09 3668 KSP Residual norm 2.738098545043e-09 3669 KSP Residual norm 2.503551675799e-09 3670 KSP Residual norm 2.312713419424e-09 3671 KSP Residual norm 1.964188495057e-09 3672 KSP Residual norm 1.681739997756e-09 3673 KSP Residual norm 1.591585888744e-09 3674 KSP Residual norm 1.500636319990e-09 3675 KSP Residual norm 1.308034015344e-09 3676 KSP Residual norm 1.229565721675e-09 3677 KSP Residual norm 1.207891801592e-09 3678 KSP Residual norm 1.146178320427e-09 3679 KSP Residual norm 1.202806594448e-09 3680 KSP Residual norm 1.344935981461e-09 3681 KSP Residual norm 
1.314064474627e-09 3682 KSP Residual norm 1.250809723212e-09 3683 KSP Residual norm 1.342232779997e-09 3684 KSP Residual norm 1.525539282968e-09 3685 KSP Residual norm 1.688057331076e-09 3686 KSP Residual norm 1.729490059005e-09 3687 KSP Residual norm 1.577498416576e-09 3688 KSP Residual norm 1.472874836476e-09 3689 KSP Residual norm 1.575346556740e-09 3690 KSP Residual norm 1.622605689175e-09 3691 KSP Residual norm 1.535992678461e-09 3692 KSP Residual norm 1.470848407673e-09 3693 KSP Residual norm 1.502047000898e-09 3694 KSP Residual norm 1.510651871516e-09 3695 KSP Residual norm 1.398937444999e-09 3696 KSP Residual norm 1.171234761594e-09 3697 KSP Residual norm 9.915466067651e-10 3698 KSP Residual norm 9.936758316105e-10 3699 KSP Residual norm 1.032032245054e-09 3700 KSP Residual norm 8.945241911377e-10 3701 KSP Residual norm 7.632597830874e-10 3702 KSP Residual norm 7.273934964478e-10 3703 KSP Residual norm 7.200169323034e-10 3704 KSP Residual norm 6.843857792227e-10 3705 KSP Residual norm 6.525055860551e-10 3706 KSP Residual norm 6.520652839039e-10 3707 KSP Residual norm 7.185286985055e-10 3708 KSP Residual norm 8.810604846284e-10 3709 KSP Residual norm 9.567504870817e-10 3710 KSP Residual norm 8.990483957396e-10 3711 KSP Residual norm 9.344793403503e-10 3712 KSP Residual norm 1.122997504736e-09 3713 KSP Residual norm 1.247726837693e-09 3714 KSP Residual norm 1.242987426230e-09 3715 KSP Residual norm 1.206685272823e-09 3716 KSP Residual norm 1.202928772893e-09 3717 KSP Residual norm 1.257492782641e-09 3718 KSP Residual norm 1.295793216627e-09 3719 KSP Residual norm 1.254096893974e-09 3720 KSP Residual norm 1.317976543469e-09 3721 KSP Residual norm 1.477217056374e-09 3722 KSP Residual norm 1.417778060759e-09 3723 KSP Residual norm 1.279049064543e-09 3724 KSP Residual norm 1.323701259263e-09 3725 KSP Residual norm 1.307753080391e-09 3726 KSP Residual norm 1.183826071489e-09 3727 KSP Residual norm 1.078327845374e-09 3728 KSP Residual norm 9.536633163961e-10 3729 
KSP Residual norm 8.901825635324e-10 3730 KSP Residual norm 8.999343130855e-10 3731 KSP Residual norm 8.890597721268e-10 3732 KSP Residual norm 8.843304223038e-10 3733 KSP Residual norm 1.000598872012e-09 3734 KSP Residual norm 1.037538219714e-09 3735 KSP Residual norm 9.754158718212e-10 3736 KSP Residual norm 9.880149344183e-10 3737 KSP Residual norm 1.017366680519e-09 3738 KSP Residual norm 9.549603006671e-10 3739 KSP Residual norm 9.258921550845e-10 3740 KSP Residual norm 1.027346761231e-09 3741 KSP Residual norm 1.166874318109e-09 3742 KSP Residual norm 1.213370029486e-09 3743 KSP Residual norm 1.184871608298e-09 3744 KSP Residual norm 1.174473860715e-09 3745 KSP Residual norm 1.282706697747e-09 3746 KSP Residual norm 1.420218538661e-09 3747 KSP Residual norm 1.213617751835e-09 3748 KSP Residual norm 1.012616242323e-09 3749 KSP Residual norm 1.007353910335e-09 3750 KSP Residual norm 1.056593527376e-09 3751 KSP Residual norm 9.580832383252e-10 3752 KSP Residual norm 8.219281961795e-10 3753 KSP Residual norm 7.745848997630e-10 3754 KSP Residual norm 7.931571443373e-10 3755 KSP Residual norm 8.467406308335e-10 3756 KSP Residual norm 8.745814946721e-10 3757 KSP Residual norm 8.909135899981e-10 3758 KSP Residual norm 1.054596979550e-09 3759 KSP Residual norm 1.438467378917e-09 3760 KSP Residual norm 1.592665138470e-09 3761 KSP Residual norm 1.654950136113e-09 3762 KSP Residual norm 1.863240466378e-09 3763 KSP Residual norm 2.196282931642e-09 3764 KSP Residual norm 2.421102402238e-09 3765 KSP Residual norm 2.628663441533e-09 3766 KSP Residual norm 2.835077530258e-09 3767 KSP Residual norm 3.175214114881e-09 3768 KSP Residual norm 3.697360871015e-09 3769 KSP Residual norm 3.913774246882e-09 3770 KSP Residual norm 3.550579905576e-09 3771 KSP Residual norm 3.304815922808e-09 3772 KSP Residual norm 3.351194014807e-09 3773 KSP Residual norm 3.177330698243e-09 3774 KSP Residual norm 3.028958586793e-09 3775 KSP Residual norm 3.006011160859e-09 3776 KSP Residual norm 
2.990140440619e-09 3777 KSP Residual norm 2.857406980637e-09 3778 KSP Residual norm 2.582408262295e-09 3779 KSP Residual norm 2.337385010000e-09 3780 KSP Residual norm 2.367977574793e-09 3781 KSP Residual norm 2.603199274005e-09 3782 KSP Residual norm 2.626986692675e-09 3783 KSP Residual norm 2.507646852925e-09 3784 KSP Residual norm 2.642769899751e-09 3785 KSP Residual norm 3.006809518600e-09 3786 KSP Residual norm 3.449837784267e-09 3787 KSP Residual norm 3.967858597187e-09 3788 KSP Residual norm 4.334255750080e-09 3789 KSP Residual norm 4.564948892290e-09 3790 KSP Residual norm 4.986634678652e-09 3791 KSP Residual norm 5.282722680059e-09 3792 KSP Residual norm 5.527223810294e-09 3793 KSP Residual norm 6.295891355624e-09 3794 KSP Residual norm 7.053334963000e-09 3795 KSP Residual norm 6.909173011517e-09 3796 KSP Residual norm 6.791535027020e-09 3797 KSP Residual norm 6.811888049361e-09 3798 KSP Residual norm 6.131671212365e-09 3799 KSP Residual norm 5.152257158523e-09 3800 KSP Residual norm 4.778158384516e-09 3801 KSP Residual norm 4.612516711446e-09 3802 KSP Residual norm 4.562565820156e-09 3803 KSP Residual norm 4.617156905796e-09 3804 KSP Residual norm 4.358636100480e-09 3805 KSP Residual norm 3.923004546151e-09 3806 KSP Residual norm 3.919094602564e-09 3807 KSP Residual norm 4.137582611235e-09 3808 KSP Residual norm 4.166762557996e-09 3809 KSP Residual norm 4.157836207038e-09 3810 KSP Residual norm 4.550592070789e-09 3811 KSP Residual norm 5.318499256785e-09 3812 KSP Residual norm 6.100259232099e-09 3813 KSP Residual norm 6.554542794443e-09 3814 KSP Residual norm 6.665500876471e-09 3815 KSP Residual norm 6.966151539805e-09 3816 KSP Residual norm 7.630082757005e-09 3817 KSP Residual norm 7.781720204336e-09 3818 KSP Residual norm 8.374231106959e-09 3819 KSP Residual norm 9.886597355096e-09 3820 KSP Residual norm 1.013796062841e-08 3821 KSP Residual norm 9.153183359250e-09 3822 KSP Residual norm 9.287889893112e-09 3823 KSP Residual norm 9.002107067915e-09 3824 
KSP Residual norm 7.771162647686e-09 3825 KSP Residual norm 6.611323733583e-09 3826 KSP Residual norm 6.193950774151e-09 3827 KSP Residual norm 6.006860989876e-09 3828 KSP Residual norm 5.548811001483e-09 3829 KSP Residual norm 4.599172770080e-09 3830 KSP Residual norm 3.842230780770e-09 3831 KSP Residual norm 3.622414122581e-09 3832 KSP Residual norm 3.540520346846e-09 3833 KSP Residual norm 3.393468599325e-09 3834 KSP Residual norm 3.331446904417e-09 3835 KSP Residual norm 3.514217766739e-09 3836 KSP Residual norm 3.400366281540e-09 3837 KSP Residual norm 3.648341368355e-09 3838 KSP Residual norm 4.473208234460e-09 3839 KSP Residual norm 5.249163562313e-09 3840 KSP Residual norm 5.378339602120e-09 3841 KSP Residual norm 5.335318247842e-09 3842 KSP Residual norm 5.401980040816e-09 3843 KSP Residual norm 5.490386757551e-09 3844 KSP Residual norm 6.083459229561e-09 3845 KSP Residual norm 6.846727677443e-09 3846 KSP Residual norm 7.646885057928e-09 3847 KSP Residual norm 8.071956101029e-09 3848 KSP Residual norm 7.697960396304e-09 3849 KSP Residual norm 6.772149457789e-09 3850 KSP Residual norm 6.757555864440e-09 3851 KSP Residual norm 6.974016978612e-09 3852 KSP Residual norm 6.227274736239e-09 3853 KSP Residual norm 5.481396616309e-09 3854 KSP Residual norm 5.168459942046e-09 3855 KSP Residual norm 4.953866468095e-09 3856 KSP Residual norm 4.176294030401e-09 3857 KSP Residual norm 3.652326452112e-09 3858 KSP Residual norm 3.531139316008e-09 3859 KSP Residual norm 3.543912231323e-09 3860 KSP Residual norm 3.661827914926e-09 3861 KSP Residual norm 3.492909356536e-09 3862 KSP Residual norm 3.275490068445e-09 3863 KSP Residual norm 3.590862719165e-09 3864 KSP Residual norm 3.816489981118e-09 3865 KSP Residual norm 3.531138746383e-09 3866 KSP Residual norm 3.391043727578e-09 3867 KSP Residual norm 3.676184742966e-09 3868 KSP Residual norm 4.067098670559e-09 3869 KSP Residual norm 4.410510232256e-09 3870 KSP Residual norm 4.692212412907e-09 3871 KSP Residual norm 
5.069891878057e-09 3872 KSP Residual norm 5.790013201131e-09 3873 KSP Residual norm 6.516195442822e-09 3874 KSP Residual norm 6.255156678025e-09 3875 KSP Residual norm 6.063038053539e-09 3876 KSP Residual norm 6.873342625158e-09 3877 KSP Residual norm 6.943824440307e-09 3878 KSP Residual norm 5.840117772191e-09 3879 KSP Residual norm 5.144276309692e-09 3880 KSP Residual norm 4.937580479937e-09 3881 KSP Residual norm 4.648807688539e-09 3882 KSP Residual norm 4.087635552319e-09 3883 KSP Residual norm 3.443168088894e-09 3884 KSP Residual norm 3.022006583162e-09 3885 KSP Residual norm 2.859607370810e-09 3886 KSP Residual norm 2.711318373182e-09 3887 KSP Residual norm 2.405511441447e-09 3888 KSP Residual norm 2.263807464443e-09 3889 KSP Residual norm 2.424374509329e-09 3890 KSP Residual norm 2.412882269713e-09 3891 KSP Residual norm 2.287460266215e-09 3892 KSP Residual norm 2.301876313823e-09 3893 KSP Residual norm 2.296806627701e-09 3894 KSP Residual norm 2.152629906513e-09 3895 KSP Residual norm 2.190917755703e-09 3896 KSP Residual norm 2.344507727148e-09 3897 KSP Residual norm 2.482478128543e-09 3898 KSP Residual norm 2.712795011790e-09 3899 KSP Residual norm 2.919811289920e-09 3900 KSP Residual norm 2.986213208333e-09 3901 KSP Residual norm 3.125605230841e-09 3902 KSP Residual norm 3.380979670104e-09 3903 KSP Residual norm 3.283101953680e-09 3904 KSP Residual norm 3.127916120918e-09 3905 KSP Residual norm 3.264692015038e-09 3906 KSP Residual norm 3.250029145900e-09 3907 KSP Residual norm 2.946166507740e-09 3908 KSP Residual norm 2.441775149444e-09 3909 KSP Residual norm 2.202470467250e-09 3910 KSP Residual norm 2.255373430765e-09 3911 KSP Residual norm 2.353184882674e-09 3912 KSP Residual norm 1.965759220991e-09 3913 KSP Residual norm 1.710185402939e-09 3914 KSP Residual norm 1.758919952194e-09 3915 KSP Residual norm 1.795062037273e-09 3916 KSP Residual norm 1.606618785435e-09 3917 KSP Residual norm 1.413826151756e-09 3918 KSP Residual norm 1.283914299032e-09 3919 
KSP Residual norm 1.232261291845e-09 3920 KSP Residual norm 1.270161260177e-09 3921 KSP Residual norm 1.331332352538e-09 3922 KSP Residual norm 1.355313865150e-09 3923 KSP Residual norm 1.448917738770e-09 3924 KSP Residual norm 1.422362403069e-09 3925 KSP Residual norm 1.323384522043e-09 3926 KSP Residual norm 1.400218158388e-09 3927 KSP Residual norm 1.656249119206e-09 3928 KSP Residual norm 1.643725725901e-09 3929 KSP Residual norm 1.545717747768e-09 3930 KSP Residual norm 1.788582198740e-09 3931 KSP Residual norm 2.181566592400e-09 3932 KSP Residual norm 2.474210455805e-09 3933 KSP Residual norm 2.796826555992e-09 3934 KSP Residual norm 3.083005629464e-09 3935 KSP Residual norm 3.066266695523e-09 3936 KSP Residual norm 3.105401664552e-09 3937 KSP Residual norm 3.256112899940e-09 3938 KSP Residual norm 3.370399409679e-09 3939 KSP Residual norm 3.834842269870e-09 3940 KSP Residual norm 4.501186120374e-09 3941 KSP Residual norm 4.250334618898e-09 3942 KSP Residual norm 3.809590443123e-09 3943 KSP Residual norm 3.756192953808e-09 3944 KSP Residual norm 3.544461085062e-09 3945 KSP Residual norm 3.223589182631e-09 3946 KSP Residual norm 3.239258085287e-09 3947 KSP Residual norm 3.360968668144e-09 3948 KSP Residual norm 3.261628998873e-09 3949 KSP Residual norm 3.290467660641e-09 3950 KSP Residual norm 3.330341012352e-09 3951 KSP Residual norm 3.239143428321e-09 3952 KSP Residual norm 3.277741365203e-09 3953 KSP Residual norm 3.101023282081e-09 3954 KSP Residual norm 2.892501125850e-09 3955 KSP Residual norm 2.856842261511e-09 3956 KSP Residual norm 2.718427935489e-09 3957 KSP Residual norm 2.344248405193e-09 3958 KSP Residual norm 2.086248636909e-09 3959 KSP Residual norm 2.184448011477e-09 3960 KSP Residual norm 2.415925399694e-09 3961 KSP Residual norm 2.576513553863e-09 3962 KSP Residual norm 2.737083658035e-09 3963 KSP Residual norm 2.965443467108e-09 3964 KSP Residual norm 3.164083481828e-09 3965 KSP Residual norm 3.510281077294e-09 3966 KSP Residual norm 
3.399060445216e-09 3967 KSP Residual norm 3.331824499354e-09 3968 KSP Residual norm 4.042252202875e-09 3969 KSP Residual norm 4.870044093273e-09 3970 KSP Residual norm 4.732398806993e-09 3971 KSP Residual norm 4.662942850983e-09 3972 KSP Residual norm 5.472474969545e-09 3973 KSP Residual norm 5.954313908438e-09 3974 KSP Residual norm 5.991476831585e-09 3975 KSP Residual norm 6.001363596620e-09 3976 KSP Residual norm 5.704183033542e-09 3977 KSP Residual norm 5.385095024192e-09 3978 KSP Residual norm 5.166172984365e-09 3979 KSP Residual norm 4.943023104709e-09 3980 KSP Residual norm 5.018177715546e-09 3981 KSP Residual norm 5.380343119999e-09 3982 KSP Residual norm 5.293265303117e-09 3983 KSP Residual norm 4.729796013721e-09 3984 KSP Residual norm 4.491842566742e-09 3985 KSP Residual norm 4.206419801015e-09 3986 KSP Residual norm 3.537659762079e-09 3987 KSP Residual norm 3.053300180235e-09 3988 KSP Residual norm 2.929047013735e-09 3989 KSP Residual norm 3.031029702849e-09 3990 KSP Residual norm 3.198866711459e-09 3991 KSP Residual norm 3.376940985965e-09 3992 KSP Residual norm 3.418156071235e-09 3993 KSP Residual norm 3.599086389464e-09 3994 KSP Residual norm 3.900519119845e-09 3995 KSP Residual norm 4.034070946923e-09 3996 KSP Residual norm 4.253185778860e-09 3997 KSP Residual norm 4.733176776867e-09 3998 KSP Residual norm 4.789128503963e-09 3999 KSP Residual norm 5.016136563041e-09 4000 KSP Residual norm 6.138581703231e-09 4001 KSP Residual norm 7.391652942867e-09 4002 KSP Residual norm 7.998761074459e-09 4003 KSP Residual norm 8.679932031793e-09 4004 KSP Residual norm 9.030328191917e-09 4005 KSP Residual norm 9.546904892416e-09 4006 KSP Residual norm 9.690104383683e-09 4007 KSP Residual norm 9.483808381788e-09 4008 KSP Residual norm 1.012588277637e-08 4009 KSP Residual norm 1.195623866828e-08 4010 KSP Residual norm 1.299482278386e-08 4011 KSP Residual norm 1.146707382361e-08 4012 KSP Residual norm 1.075923528113e-08 4013 KSP Residual norm 1.160376593462e-08 4014 
KSP Residual norm 1.198684568405e-08 4015 KSP Residual norm 1.135153309086e-08 4016 KSP Residual norm 1.082837345504e-08 4017 KSP Residual norm 1.031045035497e-08 4018 KSP Residual norm 1.052033547756e-08 4019 KSP Residual norm 1.121419895757e-08 4020 KSP Residual norm 1.075574081981e-08 4021 KSP Residual norm 9.633510824262e-09 4022 KSP Residual norm 9.145104141050e-09 4023 KSP Residual norm 8.862961841246e-09 4024 KSP Residual norm 8.449292454283e-09 4025 KSP Residual norm 8.648500752338e-09 4026 KSP Residual norm 8.690671206694e-09 4027 KSP Residual norm 8.056508664370e-09 4028 KSP Residual norm 7.550167290996e-09 4029 KSP Residual norm 7.347148765984e-09 4030 KSP Residual norm 7.720956633865e-09 4031 KSP Residual norm 8.762261865027e-09 4032 KSP Residual norm 9.789504732310e-09 4033 KSP Residual norm 1.054739089320e-08 4034 KSP Residual norm 1.202211959112e-08 4035 KSP Residual norm 1.330043382411e-08 4036 KSP Residual norm 1.384409930158e-08 4037 KSP Residual norm 1.450610028685e-08 4038 KSP Residual norm 1.525461548139e-08 4039 KSP Residual norm 1.617032057945e-08 4040 KSP Residual norm 1.850654071556e-08 4041 KSP Residual norm 2.256133336626e-08 4042 KSP Residual norm 2.536293554944e-08 4043 KSP Residual norm 2.655949217321e-08 4044 KSP Residual norm 2.804175255232e-08 4045 KSP Residual norm 2.832350176504e-08 4046 KSP Residual norm 2.678458985868e-08 4047 KSP Residual norm 2.592132192655e-08 4048 KSP Residual norm 2.428932682514e-08 4049 KSP Residual norm 2.289364976159e-08 4050 KSP Residual norm 2.386280178571e-08 4051 KSP Residual norm 2.559287999412e-08 4052 KSP Residual norm 2.551019264666e-08 4053 KSP Residual norm 2.562574769128e-08 4054 KSP Residual norm 2.597386692494e-08 4055 KSP Residual norm 2.249744794070e-08 4056 KSP Residual norm 1.908553073541e-08 4057 KSP Residual norm 1.694977829547e-08 4058 KSP Residual norm 1.520039098624e-08 4059 KSP Residual norm 1.419021852844e-08 4060 KSP Residual norm 1.505391318405e-08 4061 KSP Residual norm 
1.691624430843e-08 4062 KSP Residual norm 1.887752572332e-08 4063 KSP Residual norm 2.089335104529e-08 4064 KSP Residual norm 2.096062736842e-08 4065 KSP Residual norm 2.093531483198e-08 4066 KSP Residual norm 2.395871862416e-08 4067 KSP Residual norm 2.764262925708e-08 4068 KSP Residual norm 2.796000832214e-08 4069 KSP Residual norm 2.950163871848e-08 4070 KSP Residual norm 3.096548082627e-08 4071 KSP Residual norm 2.737353312829e-08 4072 KSP Residual norm 2.467622001632e-08 4073 KSP Residual norm 2.594441514126e-08 4074 KSP Residual norm 2.916291972136e-08 4075 KSP Residual norm 3.509932607141e-08 4076 KSP Residual norm 3.962877952113e-08 4077 KSP Residual norm 3.630341801036e-08 4078 KSP Residual norm 3.300351337923e-08 4079 KSP Residual norm 3.323608754991e-08 4080 KSP Residual norm 3.176039818243e-08 4081 KSP Residual norm 2.982314458706e-08 4082 KSP Residual norm 2.896449247494e-08 4083 KSP Residual norm 2.793509115658e-08 4084 KSP Residual norm 2.737195005100e-08 4085 KSP Residual norm 2.779980127325e-08 4086 KSP Residual norm 2.722947138431e-08 4087 KSP Residual norm 2.612623382793e-08 4088 KSP Residual norm 2.501790278787e-08 4089 KSP Residual norm 2.351751942911e-08 4090 KSP Residual norm 2.218826538225e-08 4091 KSP Residual norm 2.130630498099e-08 4092 KSP Residual norm 2.004207099690e-08 4093 KSP Residual norm 1.816738300576e-08 4094 KSP Residual norm 1.691446504243e-08 4095 KSP Residual norm 1.668342826023e-08 4096 KSP Residual norm 1.634621207380e-08 4097 KSP Residual norm 1.714037481271e-08 4098 KSP Residual norm 1.888614396702e-08 4099 KSP Residual norm 1.863132773296e-08 4100 KSP Residual norm 1.762413134112e-08 4101 KSP Residual norm 1.750019528592e-08 4102 KSP Residual norm 1.852630960347e-08 4103 KSP Residual norm 2.062065606814e-08 4104 KSP Residual norm 2.252875273729e-08 4105 KSP Residual norm 2.123433448712e-08 4106 KSP Residual norm 2.049951634218e-08 4107 KSP Residual norm 2.329117223251e-08 4108 KSP Residual norm 2.620288744294e-08 4109 
KSP Residual norm 2.685391666831e-08 4110 KSP Residual norm 2.686171126390e-08 4111 KSP Residual norm 2.870542605525e-08 4112 KSP Residual norm 3.094716380764e-08 4113 KSP Residual norm 3.216424756745e-08 4114 KSP Residual norm 3.107694630275e-08 4115 KSP Residual norm 2.957154947520e-08 4116 KSP Residual norm 2.907453110586e-08 4117 KSP Residual norm 2.754462213531e-08 4118 KSP Residual norm 2.579690912899e-08 4119 KSP Residual norm 2.635957030566e-08 4120 KSP Residual norm 2.690930515613e-08 4121 KSP Residual norm 2.469704710466e-08 4122 KSP Residual norm 2.276160595317e-08 4123 KSP Residual norm 2.239011260345e-08 4124 KSP Residual norm 1.947472082430e-08 4125 KSP Residual norm 1.634594843772e-08 4126 KSP Residual norm 1.488364202351e-08 4127 KSP Residual norm 1.438218002644e-08 4128 KSP Residual norm 1.382772194901e-08 4129 KSP Residual norm 1.234173421728e-08 4130 KSP Residual norm 1.047495750096e-08 4131 KSP Residual norm 9.085157241814e-09 4132 KSP Residual norm 8.136981267910e-09 4133 KSP Residual norm 7.285496427528e-09 4134 KSP Residual norm 7.286630350205e-09 4135 KSP Residual norm 7.946798028906e-09 4136 KSP Residual norm 7.572693487448e-09 4137 KSP Residual norm 7.334653980611e-09 4138 KSP Residual norm 8.189783149138e-09 4139 KSP Residual norm 8.691719981189e-09 4140 KSP Residual norm 8.124376114180e-09 4141 KSP Residual norm 8.056947204493e-09 4142 KSP Residual norm 8.606917507653e-09 4143 KSP Residual norm 9.058460095816e-09 4144 KSP Residual norm 1.028394190173e-08 4145 KSP Residual norm 1.118370605514e-08 4146 KSP Residual norm 1.206094568346e-08 4147 KSP Residual norm 1.396219885964e-08 4148 KSP Residual norm 1.483429894488e-08 4149 KSP Residual norm 1.452459617645e-08 4150 KSP Residual norm 1.588978837067e-08 4151 KSP Residual norm 1.768844989849e-08 4152 KSP Residual norm 1.760452549557e-08 4153 KSP Residual norm 1.700454262298e-08 4154 KSP Residual norm 1.702873449272e-08 4155 KSP Residual norm 1.625919412683e-08 4156 KSP Residual norm 
[... -ksp_monitor output condensed: roughly 1000 further iterations (4157 through 5154), with KSP residual norms fluctuating between about 1.9e-09 and 7.7e-08 without further convergence, e.g.:

 4157 KSP Residual norm 1.536250272407e-08
 ...
 5154 KSP Residual norm 2.009147992668e-08
...]
KSP Residual norm 2.002285361090e-08 5155 KSP Residual norm 2.033663607418e-08 5156 KSP Residual norm 2.000806508433e-08 5157 KSP Residual norm 1.919298530495e-08 5158 KSP Residual norm 1.893331231195e-08 5159 KSP Residual norm 1.994697124102e-08 5160 KSP Residual norm 2.025064752367e-08 5161 KSP Residual norm 1.843423505000e-08 5162 KSP Residual norm 1.727988771247e-08 5163 KSP Residual norm 1.928816724901e-08 5164 KSP Residual norm 2.368469769701e-08 5165 KSP Residual norm 2.852820226565e-08 5166 KSP Residual norm 3.112632993780e-08 5167 KSP Residual norm 3.264026792152e-08 5168 KSP Residual norm 3.594119730346e-08 5169 KSP Residual norm 4.113909968316e-08 5170 KSP Residual norm 4.550078146693e-08 5171 KSP Residual norm 4.905041552807e-08 5172 KSP Residual norm 4.972253374712e-08 5173 KSP Residual norm 4.845320194246e-08 5174 KSP Residual norm 5.094076024480e-08 5175 KSP Residual norm 5.667027143174e-08 5176 KSP Residual norm 5.344219432902e-08 5177 KSP Residual norm 4.444556102080e-08 5178 KSP Residual norm 3.815302257821e-08 5179 KSP Residual norm 3.477771238035e-08 5180 KSP Residual norm 3.442569468372e-08 5181 KSP Residual norm 3.729520586277e-08 5182 KSP Residual norm 3.657186954468e-08 5183 KSP Residual norm 3.327317723019e-08 5184 KSP Residual norm 3.473118929870e-08 5185 KSP Residual norm 4.102404415132e-08 5186 KSP Residual norm 5.033978926796e-08 5187 KSP Residual norm 5.260137925124e-08 5188 KSP Residual norm 4.588945144679e-08 5189 KSP Residual norm 3.961682638206e-08 5190 KSP Residual norm 3.649263373234e-08 5191 KSP Residual norm 3.352431495708e-08 5192 KSP Residual norm 3.090321109331e-08 5193 KSP Residual norm 3.038593633632e-08 5194 KSP Residual norm 2.862795616464e-08 5195 KSP Residual norm 2.712694686206e-08 5196 KSP Residual norm 2.898652871948e-08 5197 KSP Residual norm 3.312172447600e-08 5198 KSP Residual norm 3.561220746246e-08 5199 KSP Residual norm 3.436974015334e-08 5200 KSP Residual norm 3.340514669094e-08 5201 KSP Residual norm 
3.678849440370e-08 5202 KSP Residual norm 3.973087062860e-08 5203 KSP Residual norm 3.678225810301e-08 5204 KSP Residual norm 2.668639935272e-08 5205 KSP Residual norm 1.853486472886e-08 5206 KSP Residual norm 1.580071114004e-08 5207 KSP Residual norm 1.686418064116e-08 5208 KSP Residual norm 1.758813551771e-08 5209 KSP Residual norm 1.515614074033e-08 5210 KSP Residual norm 1.288319130983e-08 5211 KSP Residual norm 1.317391640092e-08 5212 KSP Residual norm 1.563337324921e-08 5213 KSP Residual norm 1.836910623640e-08 5214 KSP Residual norm 2.056642292708e-08 5215 KSP Residual norm 1.925031382118e-08 5216 KSP Residual norm 1.720975912107e-08 5217 KSP Residual norm 1.788815852889e-08 5218 KSP Residual norm 2.101536466453e-08 5219 KSP Residual norm 2.161086233809e-08 5220 KSP Residual norm 1.812037079142e-08 5221 KSP Residual norm 1.527979160653e-08 5222 KSP Residual norm 1.342501862572e-08 5223 KSP Residual norm 1.259688137080e-08 5224 KSP Residual norm 1.179364640373e-08 5225 KSP Residual norm 1.070759367784e-08 5226 KSP Residual norm 9.457439578089e-09 5227 KSP Residual norm 8.486357972886e-09 5228 KSP Residual norm 8.290822990554e-09 5229 KSP Residual norm 9.083640801084e-09 5230 KSP Residual norm 1.000637351468e-08 5231 KSP Residual norm 1.002361985055e-08 5232 KSP Residual norm 1.003527027386e-08 5233 KSP Residual norm 1.225113043607e-08 5234 KSP Residual norm 1.601564768659e-08 5235 KSP Residual norm 1.736761751154e-08 5236 KSP Residual norm 1.486255993590e-08 5237 KSP Residual norm 1.264623323407e-08 5238 KSP Residual norm 1.268508034831e-08 5239 KSP Residual norm 1.250520225390e-08 5240 KSP Residual norm 1.011529613086e-08 5241 KSP Residual norm 8.249633152909e-09 5242 KSP Residual norm 7.640723367700e-09 5243 KSP Residual norm 8.185096525785e-09 5244 KSP Residual norm 9.609117617589e-09 5245 KSP Residual norm 1.273140701005e-08 5246 KSP Residual norm 1.475681235338e-08 5247 KSP Residual norm 1.425556747858e-08 5248 KSP Residual norm 1.483417063846e-08 5249 
KSP Residual norm 1.809520173185e-08 5250 KSP Residual norm 2.116817791637e-08 5251 KSP Residual norm 1.985627844838e-08 5252 KSP Residual norm 1.641719245565e-08 5253 KSP Residual norm 1.471310814147e-08 5254 KSP Residual norm 1.547728334189e-08 5255 KSP Residual norm 1.593319138311e-08 5256 KSP Residual norm 1.385382624309e-08 5257 KSP Residual norm 1.119404791589e-08 5258 KSP Residual norm 9.799598156439e-09 5259 KSP Residual norm 1.029597943692e-08 5260 KSP Residual norm 1.170230403495e-08 5261 KSP Residual norm 1.184246092942e-08 5262 KSP Residual norm 1.118995834206e-08 5263 KSP Residual norm 1.209723012229e-08 5264 KSP Residual norm 1.572976451323e-08 5265 KSP Residual norm 2.033251134475e-08 5266 KSP Residual norm 2.239834017548e-08 5267 KSP Residual norm 2.383826011678e-08 5268 KSP Residual norm 2.611021208586e-08 5269 KSP Residual norm 3.046799874404e-08 5270 KSP Residual norm 3.238729515163e-08 5271 KSP Residual norm 2.703351629213e-08 5272 KSP Residual norm 2.049389663795e-08 5273 KSP Residual norm 1.668812317949e-08 5274 KSP Residual norm 1.616529540939e-08 5275 KSP Residual norm 1.514579955202e-08 5276 KSP Residual norm 1.146275410462e-08 5277 KSP Residual norm 8.824697577751e-09 5278 KSP Residual norm 8.524219536924e-09 5279 KSP Residual norm 9.477256818725e-09 5280 KSP Residual norm 9.891137539235e-09 5281 KSP Residual norm 9.277695621351e-09 5282 KSP Residual norm 8.402033626064e-09 5283 KSP Residual norm 8.385781790609e-09 5284 KSP Residual norm 9.354022238415e-09 5285 KSP Residual norm 1.077324267005e-08 5286 KSP Residual norm 1.236187170584e-08 5287 KSP Residual norm 1.394642291994e-08 5288 KSP Residual norm 1.506244159752e-08 5289 KSP Residual norm 1.507379796867e-08 5290 KSP Residual norm 1.446701595202e-08 5291 KSP Residual norm 1.295485047363e-08 5292 KSP Residual norm 1.063944757153e-08 5293 KSP Residual norm 8.182100034095e-09 5294 KSP Residual norm 6.384151057184e-09 5295 KSP Residual norm 6.005578865285e-09 5296 KSP Residual norm 
5.956673258174e-09 5297 KSP Residual norm 5.147616207091e-09 5298 KSP Residual norm 4.249563992660e-09 5299 KSP Residual norm 3.653482392368e-09 5300 KSP Residual norm 3.414677724441e-09 5301 KSP Residual norm 3.495775209757e-09 5302 KSP Residual norm 3.247562101001e-09 5303 KSP Residual norm 2.790333461525e-09 5304 KSP Residual norm 2.820342360928e-09 5305 KSP Residual norm 3.365912072226e-09 5306 KSP Residual norm 3.590845478438e-09 5307 KSP Residual norm 3.732368589552e-09 5308 KSP Residual norm 4.359455283687e-09 5309 KSP Residual norm 5.355016091407e-09 5310 KSP Residual norm 5.797584109377e-09 5311 KSP Residual norm 5.095928918586e-09 5312 KSP Residual norm 4.347574145616e-09 5313 KSP Residual norm 4.240336244211e-09 5314 KSP Residual norm 4.382981690610e-09 5315 KSP Residual norm 3.916259523109e-09 5316 KSP Residual norm 3.111055161120e-09 5317 KSP Residual norm 2.573428969216e-09 5318 KSP Residual norm 2.427586837119e-09 5319 KSP Residual norm 2.629512104524e-09 5320 KSP Residual norm 2.984688708083e-09 5321 KSP Residual norm 3.070314844570e-09 5322 KSP Residual norm 2.964760243063e-09 5323 KSP Residual norm 3.321250151048e-09 5324 KSP Residual norm 3.781034799418e-09 5325 KSP Residual norm 3.749972407214e-09 5326 KSP Residual norm 3.682448544110e-09 5327 KSP Residual norm 4.018279550388e-09 5328 KSP Residual norm 4.752849609874e-09 5329 KSP Residual norm 5.004416912220e-09 5330 KSP Residual norm 4.390491830602e-09 5331 KSP Residual norm 3.885204957264e-09 5332 KSP Residual norm 4.213007921542e-09 5333 KSP Residual norm 4.820467902833e-09 5334 KSP Residual norm 4.418578653671e-09 5335 KSP Residual norm 3.564515839121e-09 5336 KSP Residual norm 3.330067381858e-09 5337 KSP Residual norm 3.625641042392e-09 5338 KSP Residual norm 4.232034616018e-09 5339 KSP Residual norm 4.141857203394e-09 5340 KSP Residual norm 3.553180048494e-09 5341 KSP Residual norm 3.476263188203e-09 5342 KSP Residual norm 4.307176369541e-09 5343 KSP Residual norm 5.332259479532e-09 5344 
KSP Residual norm 5.654245885439e-09 5345 KSP Residual norm 5.572287615420e-09 5346 KSP Residual norm 5.631119773992e-09 5347 KSP Residual norm 6.377943354568e-09 5348 KSP Residual norm 7.078274948484e-09 5349 KSP Residual norm 6.244546244897e-09 5350 KSP Residual norm 5.337343886572e-09 5351 KSP Residual norm 5.055306282829e-09 5352 KSP Residual norm 5.429366643238e-09 5353 KSP Residual norm 5.768309355254e-09 5354 KSP Residual norm 5.645714697861e-09 5355 KSP Residual norm 5.312766365789e-09 5356 KSP Residual norm 5.166523623852e-09 5357 KSP Residual norm 5.809262700725e-09 5358 KSP Residual norm 7.389148226883e-09 5359 KSP Residual norm 8.442307929158e-09 5360 KSP Residual norm 7.777782279602e-09 5361 KSP Residual norm 6.931507684953e-09 5362 KSP Residual norm 6.878703689288e-09 5363 KSP Residual norm 7.186930506665e-09 5364 KSP Residual norm 7.031153323067e-09 5365 KSP Residual norm 6.659537111718e-09 5366 KSP Residual norm 6.552438945355e-09 5367 KSP Residual norm 6.254848238852e-09 5368 KSP Residual norm 5.744820335509e-09 5369 KSP Residual norm 5.791288067911e-09 5370 KSP Residual norm 5.418981306944e-09 5371 KSP Residual norm 4.594824485963e-09 5372 KSP Residual norm 4.488692006420e-09 5373 KSP Residual norm 4.861162386596e-09 5374 KSP Residual norm 4.493796104688e-09 5375 KSP Residual norm 4.276800642426e-09 5376 KSP Residual norm 4.739573310928e-09 5377 KSP Residual norm 5.068638550061e-09 5378 KSP Residual norm 5.090178993738e-09 5379 KSP Residual norm 5.199563220339e-09 5380 KSP Residual norm 5.357313869232e-09 5381 KSP Residual norm 5.553626258659e-09 5382 KSP Residual norm 5.620911772695e-09 5383 KSP Residual norm 5.159766906255e-09 5384 KSP Residual norm 5.244434465665e-09 5385 KSP Residual norm 5.848557469261e-09 5386 KSP Residual norm 5.783674046793e-09 5387 KSP Residual norm 4.475875754395e-09 5388 KSP Residual norm 3.505881141009e-09 5389 KSP Residual norm 3.148345082934e-09 5390 KSP Residual norm 3.140052767082e-09 5391 KSP Residual norm 
2.797662273665e-09 5392 KSP Residual norm 2.053696273210e-09 5393 KSP Residual norm 1.681068186430e-09 5394 KSP Residual norm 1.749306879109e-09 5395 KSP Residual norm 1.882328947971e-09 5396 KSP Residual norm 1.961876638273e-09 5397 KSP Residual norm 2.123724920126e-09 5398 KSP Residual norm 2.176649392262e-09 5399 KSP Residual norm 2.303329872114e-09 5400 KSP Residual norm 2.632300269578e-09 5401 KSP Residual norm 2.750900546691e-09 5402 KSP Residual norm 2.675619384512e-09 5403 KSP Residual norm 2.919486612801e-09 5404 KSP Residual norm 3.236854291377e-09 5405 KSP Residual norm 3.170728042157e-09 5406 KSP Residual norm 3.064542262822e-09 5407 KSP Residual norm 3.072213621733e-09 5408 KSP Residual norm 3.115991040150e-09 5409 KSP Residual norm 2.940906854004e-09 5410 KSP Residual norm 2.574771579917e-09 5411 KSP Residual norm 2.394058407618e-09 5412 KSP Residual norm 2.391180866390e-09 5413 KSP Residual norm 2.225869718608e-09 5414 KSP Residual norm 2.251853479309e-09 5415 KSP Residual norm 2.445822661166e-09 5416 KSP Residual norm 2.454786393016e-09 5417 KSP Residual norm 2.747961023447e-09 5418 KSP Residual norm 3.340970416372e-09 5419 KSP Residual norm 3.703536366534e-09 5420 KSP Residual norm 3.914892117138e-09 5421 KSP Residual norm 4.345161827262e-09 5422 KSP Residual norm 4.767629749741e-09 5423 KSP Residual norm 5.040497772574e-09 5424 KSP Residual norm 5.164075057149e-09 5425 KSP Residual norm 5.094240251972e-09 5426 KSP Residual norm 5.065068593267e-09 5427 KSP Residual norm 5.623765342033e-09 5428 KSP Residual norm 6.162100631127e-09 5429 KSP Residual norm 5.974722336476e-09 5430 KSP Residual norm 5.238336565323e-09 5431 KSP Residual norm 4.984266880309e-09 5432 KSP Residual norm 5.585447213326e-09 5433 KSP Residual norm 6.334038803696e-09 5434 KSP Residual norm 6.295352014384e-09 5435 KSP Residual norm 6.227006965680e-09 5436 KSP Residual norm 6.803133217504e-09 5437 KSP Residual norm 7.638998936898e-09 5438 KSP Residual norm 8.128597580473e-09 5439 
KSP Residual norm 7.730125116558e-09 5440 KSP Residual norm 7.068133806988e-09 5441 KSP Residual norm 6.858017473477e-09 5442 KSP Residual norm 7.321260914968e-09 5443 KSP Residual norm 7.332733210398e-09 5444 KSP Residual norm 7.435705496443e-09 5445 KSP Residual norm 8.064828369762e-09 5446 KSP Residual norm 9.565893185868e-09 5447 KSP Residual norm 1.034541354452e-08 5448 KSP Residual norm 9.590598462393e-09 5449 KSP Residual norm 8.816340289060e-09 5450 KSP Residual norm 8.499779469801e-09 5451 KSP Residual norm 9.046570380422e-09 5452 KSP Residual norm 9.332070476580e-09 5453 KSP Residual norm 8.729119829103e-09 5454 KSP Residual norm 8.507749371153e-09 5455 KSP Residual norm 9.354382785873e-09 5456 KSP Residual norm 1.028016254189e-08 5457 KSP Residual norm 9.914715904167e-09 5458 KSP Residual norm 8.928021546735e-09 5459 KSP Residual norm 8.871165670153e-09 5460 KSP Residual norm 9.767890287258e-09 5461 KSP Residual norm 9.817284326424e-09 5462 KSP Residual norm 9.826845373987e-09 5463 KSP Residual norm 9.430969583427e-09 5464 KSP Residual norm 9.358237584212e-09 5465 KSP Residual norm 1.017435885681e-08 5466 KSP Residual norm 1.135054152660e-08 5467 KSP Residual norm 1.089315814087e-08 5468 KSP Residual norm 1.018803065093e-08 5469 KSP Residual norm 1.138971066069e-08 5470 KSP Residual norm 1.420589522448e-08 5471 KSP Residual norm 1.458939509948e-08 5472 KSP Residual norm 1.268384318239e-08 5473 KSP Residual norm 1.173988664556e-08 5474 KSP Residual norm 1.069369187998e-08 5475 KSP Residual norm 9.520827743729e-09 5476 KSP Residual norm 8.969588344310e-09 5477 KSP Residual norm 8.930775474906e-09 5478 KSP Residual norm 9.155881630595e-09 5479 KSP Residual norm 9.066232518798e-09 5480 KSP Residual norm 8.493827474869e-09 5481 KSP Residual norm 7.211839855203e-09 5482 KSP Residual norm 6.285667475568e-09 5483 KSP Residual norm 5.882077365447e-09 5484 KSP Residual norm 6.053635525245e-09 5485 KSP Residual norm 5.945522294873e-09 5486 KSP Residual norm 
5.162865259241e-09 5487 KSP Residual norm 5.118586924583e-09 5488 KSP Residual norm 5.757943731693e-09 5489 KSP Residual norm 6.018462252815e-09 5490 KSP Residual norm 5.880298433806e-09 5491 KSP Residual norm 5.917464523076e-09 5492 KSP Residual norm 6.255456280424e-09 5493 KSP Residual norm 7.018813858356e-09 5494 KSP Residual norm 7.417753399832e-09 5495 KSP Residual norm 6.901719067828e-09 5496 KSP Residual norm 6.687732525812e-09 5497 KSP Residual norm 7.450161183973e-09 5498 KSP Residual norm 7.697404740340e-09 5499 KSP Residual norm 7.151949562440e-09 5500 KSP Residual norm 6.445657990398e-09 5501 KSP Residual norm 6.010183819955e-09 5502 KSP Residual norm 5.966745878790e-09 5503 KSP Residual norm 5.881482326544e-09 5504 KSP Residual norm 4.988414938319e-09 5505 KSP Residual norm 4.124520600936e-09 5506 KSP Residual norm 3.877139412178e-09 5507 KSP Residual norm 3.876536582491e-09 5508 KSP Residual norm 3.673786049933e-09 5509 KSP Residual norm 3.336387891478e-09 5510 KSP Residual norm 3.300013871202e-09 5511 KSP Residual norm 3.591894633496e-09 5512 KSP Residual norm 3.771410752346e-09 5513 KSP Residual norm 3.923724529071e-09 5514 KSP Residual norm 4.586441670445e-09 5515 KSP Residual norm 5.016024845974e-09 5516 KSP Residual norm 4.759019832499e-09 5517 KSP Residual norm 5.136684610872e-09 5518 KSP Residual norm 6.235519587485e-09 5519 KSP Residual norm 6.634081107457e-09 5520 KSP Residual norm 6.035546927336e-09 5521 KSP Residual norm 6.271087794583e-09 5522 KSP Residual norm 6.750690160849e-09 5523 KSP Residual norm 6.706695110247e-09 5524 KSP Residual norm 6.638881241763e-09 5525 KSP Residual norm 6.325113175857e-09 5526 KSP Residual norm 5.803471722733e-09 5527 KSP Residual norm 5.771190438348e-09 5528 KSP Residual norm 6.031731534417e-09 5529 KSP Residual norm 6.013012406909e-09 5530 KSP Residual norm 5.938254413672e-09 5531 KSP Residual norm 6.733790720242e-09 5532 KSP Residual norm 8.543132902528e-09 5533 KSP Residual norm 9.266225786333e-09 5534 
KSP Residual norm 8.932333621643e-09 5535 KSP Residual norm 9.342424336230e-09 5536 KSP Residual norm 1.147456780593e-08 5537 KSP Residual norm 1.485847655002e-08 5538 KSP Residual norm 1.610572716282e-08 5539 KSP Residual norm 1.669657128903e-08 5540 KSP Residual norm 1.812196496756e-08 5541 KSP Residual norm 2.099155993717e-08 5542 KSP Residual norm 2.127793447970e-08 5543 KSP Residual norm 2.011781633319e-08 5544 KSP Residual norm 2.109445773440e-08 5545 KSP Residual norm 2.330411711226e-08 5546 KSP Residual norm 2.316723688848e-08 5547 KSP Residual norm 2.136839532724e-08 5548 KSP Residual norm 2.214809050108e-08 5549 KSP Residual norm 2.412774017544e-08 5550 KSP Residual norm 2.420498911693e-08 5551 KSP Residual norm 2.234022518322e-08 5552 KSP Residual norm 2.083387186328e-08 5553 KSP Residual norm 2.080096682943e-08 5554 KSP Residual norm 2.191487112178e-08 5555 KSP Residual norm 2.303801419813e-08 5556 KSP Residual norm 2.334319331371e-08 5557 KSP Residual norm 2.485210730561e-08 5558 KSP Residual norm 2.787249285305e-08 5559 KSP Residual norm 3.185299178363e-08 5560 KSP Residual norm 3.680262634151e-08 5561 KSP Residual norm 3.898718822296e-08 5562 KSP Residual norm 4.261392263539e-08 5563 KSP Residual norm 4.872254768579e-08 5564 KSP Residual norm 5.548824945258e-08 5565 KSP Residual norm 5.872324210471e-08 5566 KSP Residual norm 5.735192575544e-08 5567 KSP Residual norm 5.211156796489e-08 5568 KSP Residual norm 4.617614764550e-08 5569 KSP Residual norm 3.938250407406e-08 5570 KSP Residual norm 3.660565445409e-08 5571 KSP Residual norm 3.597379650148e-08 5572 KSP Residual norm 3.212215598149e-08 5573 KSP Residual norm 3.004110415491e-08 5574 KSP Residual norm 3.135032074160e-08 5575 KSP Residual norm 3.172140669796e-08 5576 KSP Residual norm 2.787048812011e-08 5577 KSP Residual norm 2.612036552597e-08 5578 KSP Residual norm 2.693932852706e-08 5579 KSP Residual norm 2.820402119464e-08 5580 KSP Residual norm 3.012982430485e-08 5581 KSP Residual norm 
3.270713757602e-08 5582 KSP Residual norm 3.337205701952e-08 5583 KSP Residual norm 3.496508884533e-08 5584 KSP Residual norm 3.965545192651e-08 5585 KSP Residual norm 4.176209474025e-08 5586 KSP Residual norm 3.909966171697e-08 5587 KSP Residual norm 3.837815406682e-08 5588 KSP Residual norm 4.039239877197e-08 5589 KSP Residual norm 3.891136143854e-08 5590 KSP Residual norm 3.916441691126e-08 5591 KSP Residual norm 4.153766022088e-08 5592 KSP Residual norm 3.887766035514e-08 5593 KSP Residual norm 3.302755884711e-08 5594 KSP Residual norm 2.971919198533e-08 5595 KSP Residual norm 2.864628234602e-08 5596 KSP Residual norm 2.580760054679e-08 5597 KSP Residual norm 2.404929008707e-08 5598 KSP Residual norm 2.449910513666e-08 5599 KSP Residual norm 2.321436110552e-08 5600 KSP Residual norm 1.987905826288e-08 5601 KSP Residual norm 1.943720364151e-08 5602 KSP Residual norm 2.153932517945e-08 5603 KSP Residual norm 2.146034184697e-08 5604 KSP Residual norm 2.230019136445e-08 5605 KSP Residual norm 2.524621550237e-08 5606 KSP Residual norm 2.760253767765e-08 5607 KSP Residual norm 2.847138359806e-08 5608 KSP Residual norm 2.772981667068e-08 5609 KSP Residual norm 2.521245172404e-08 5610 KSP Residual norm 2.449870354785e-08 5611 KSP Residual norm 2.682469513611e-08 5612 KSP Residual norm 2.740541459740e-08 5613 KSP Residual norm 2.589999032089e-08 5614 KSP Residual norm 2.455675590031e-08 5615 KSP Residual norm 2.329041780451e-08 5616 KSP Residual norm 2.178485700060e-08 5617 KSP Residual norm 2.077064459006e-08 5618 KSP Residual norm 1.890095094995e-08 5619 KSP Residual norm 1.703535942804e-08 5620 KSP Residual norm 1.586457606685e-08 5621 KSP Residual norm 1.507412097753e-08 5622 KSP Residual norm 1.491110964296e-08 5623 KSP Residual norm 1.504736504896e-08 5624 KSP Residual norm 1.597859811295e-08 5625 KSP Residual norm 1.872310216689e-08 5626 KSP Residual norm 2.079994615551e-08 5627 KSP Residual norm 1.970333058801e-08 5628 KSP Residual norm 2.002603063324e-08 5629 
KSP Residual norm 2.285600263010e-08 5630 KSP Residual norm 2.443524888109e-08 5631 KSP Residual norm 2.718995287373e-08 5632 KSP Residual norm 3.539178131941e-08 5633 KSP Residual norm 4.432201291216e-08 5634 KSP Residual norm 4.724631822443e-08 5635 KSP Residual norm 4.844726172059e-08 5636 KSP Residual norm 4.673145779427e-08 5637 KSP Residual norm 4.674193185386e-08 5638 KSP Residual norm 5.070468290648e-08 5639 KSP Residual norm 5.351441477534e-08 5640 KSP Residual norm 5.461767586295e-08 5641 KSP Residual norm 5.451887222688e-08 5642 KSP Residual norm 5.367154533557e-08 5643 KSP Residual norm 5.636261500611e-08 5644 KSP Residual norm 5.350177586329e-08 5645 KSP Residual norm 4.625835696476e-08 5646 KSP Residual norm 4.705719284550e-08 5647 KSP Residual norm 4.886282316022e-08 5648 KSP Residual norm 3.964387836097e-08 5649 KSP Residual norm 3.370975258849e-08 5650 KSP Residual norm 3.686692373523e-08 5651 KSP Residual norm 4.291161181734e-08 5652 KSP Residual norm 4.629409149072e-08 5653 KSP Residual norm 4.751049211227e-08 5654 KSP Residual norm 4.860026514681e-08 5655 KSP Residual norm 5.331791903609e-08 5656 KSP Residual norm 5.601682714104e-08 5657 KSP Residual norm 5.272722404958e-08 5658 KSP Residual norm 5.586688679224e-08 5659 KSP Residual norm 6.723938005517e-08 5660 KSP Residual norm 7.571913222994e-08 5661 KSP Residual norm 7.490541476786e-08 5662 KSP Residual norm 7.878357837021e-08 5663 KSP Residual norm 8.790864309981e-08 5664 KSP Residual norm 9.113778822172e-08 5665 KSP Residual norm 8.583150294521e-08 5666 KSP Residual norm 8.136108855064e-08 5667 KSP Residual norm 8.233120463721e-08 5668 KSP Residual norm 7.528723510048e-08 5669 KSP Residual norm 6.366563210064e-08 5670 KSP Residual norm 5.874837470335e-08 5671 KSP Residual norm 5.431597552163e-08 5672 KSP Residual norm 4.819563201572e-08 5673 KSP Residual norm 4.738315303963e-08 5674 KSP Residual norm 5.257788273270e-08 5675 KSP Residual norm 5.279990245751e-08 5676 KSP Residual norm 
5.294166794154e-08 5677 KSP Residual norm 5.664252658313e-08 5678 KSP Residual norm 5.206444921232e-08 5679 KSP Residual norm 4.792304512614e-08 5680 KSP Residual norm 5.144030289911e-08 5681 KSP Residual norm 5.060775284561e-08 5682 KSP Residual norm 5.207967182338e-08 5683 KSP Residual norm 6.055225854499e-08 5684 KSP Residual norm 7.282313410407e-08 5685 KSP Residual norm 8.413071583340e-08 5686 KSP Residual norm 9.152531765068e-08 5687 KSP Residual norm 9.477879754250e-08 5688 KSP Residual norm 1.006128875632e-07 5689 KSP Residual norm 1.010898096238e-07 5690 KSP Residual norm 9.365303720768e-08 5691 KSP Residual norm 9.309286285570e-08 5692 KSP Residual norm 9.225324700132e-08 5693 KSP Residual norm 8.271514500700e-08 5694 KSP Residual norm 7.358723186206e-08 5695 KSP Residual norm 6.494167359046e-08 5696 KSP Residual norm 5.536674074442e-08 5697 KSP Residual norm 4.950522947500e-08 5698 KSP Residual norm 4.717142548697e-08 5699 KSP Residual norm 4.221458840930e-08 5700 KSP Residual norm 3.922209227223e-08 5701 KSP Residual norm 4.048303049937e-08 5702 KSP Residual norm 4.397172333271e-08 5703 KSP Residual norm 4.359317623468e-08 5704 KSP Residual norm 4.001127863843e-08 5705 KSP Residual norm 3.985597044914e-08 5706 KSP Residual norm 4.233440454446e-08 5707 KSP Residual norm 4.466504937352e-08 5708 KSP Residual norm 4.429060125716e-08 5709 KSP Residual norm 4.613716216899e-08 5710 KSP Residual norm 5.274789454310e-08 5711 KSP Residual norm 5.703353055084e-08 5712 KSP Residual norm 5.593959328317e-08 5713 KSP Residual norm 5.705466655647e-08 5714 KSP Residual norm 5.677661321640e-08 5715 KSP Residual norm 5.404922801283e-08 5716 KSP Residual norm 5.715669770734e-08 5717 KSP Residual norm 6.127921801424e-08 5718 KSP Residual norm 5.967780081763e-08 5719 KSP Residual norm 5.323899053565e-08 5720 KSP Residual norm 4.944186688129e-08 5721 KSP Residual norm 4.642797513728e-08 5722 KSP Residual norm 3.893223276039e-08 5723 KSP Residual norm 3.199174067739e-08 5724 
KSP Residual norm 3.077940599420e-08 5725 KSP Residual norm 3.366839790567e-08 5726 KSP Residual norm 3.300624784741e-08 5727 KSP Residual norm 3.091647825379e-08 5728 KSP Residual norm 2.874090550990e-08 5729 KSP Residual norm 2.489068229393e-08 5730 KSP Residual norm 2.306768069770e-08 5731 KSP Residual norm 2.436294017451e-08 5732 KSP Residual norm 2.442663750324e-08 5733 KSP Residual norm 2.372003424017e-08 5734 KSP Residual norm 2.479974935550e-08 5735 KSP Residual norm 2.676521658349e-08 5736 KSP Residual norm 2.960853741445e-08 5737 KSP Residual norm 3.124437445236e-08 5738 KSP Residual norm 3.357520749142e-08 5739 KSP Residual norm 4.003483429786e-08 5740 KSP Residual norm 4.455042195861e-08 5741 KSP Residual norm 4.157398487451e-08 5742 KSP Residual norm 4.147434696137e-08 5743 KSP Residual norm 4.746776042974e-08 5744 KSP Residual norm 5.456934984640e-08 5745 KSP Residual norm 5.884014158974e-08 5746 KSP Residual norm 5.890808501872e-08 5747 KSP Residual norm 5.449078574259e-08 5748 KSP Residual norm 5.292630562777e-08 5749 KSP Residual norm 4.985709322932e-08 5750 KSP Residual norm 4.706516672267e-08 5751 KSP Residual norm 4.766439519705e-08 5752 KSP Residual norm 4.950733577290e-08 5753 KSP Residual norm 4.678637797217e-08 5754 KSP Residual norm 4.131771022137e-08 5755 KSP Residual norm 3.597295292120e-08 5756 KSP Residual norm 3.388196462898e-08 5757 KSP Residual norm 3.313494414729e-08 5758 KSP Residual norm 3.175716517903e-08 5759 KSP Residual norm 3.052031574310e-08 5760 KSP Residual norm 3.240460600490e-08 5761 KSP Residual norm 3.473274891880e-08 5762 KSP Residual norm 3.483212600354e-08 5763 KSP Residual norm 3.558956505627e-08 5764 KSP Residual norm 3.664029639833e-08 5765 KSP Residual norm 3.844038827804e-08 5766 KSP Residual norm 4.084497040166e-08 5767 KSP Residual norm 3.766836987527e-08 5768 KSP Residual norm 3.519655715984e-08 5769 KSP Residual norm 4.063534875725e-08 5770 KSP Residual norm 4.765191108157e-08 5771 KSP Residual norm 
[KSP monitor output truncated: iterations 5772 through 6769, residual norms oscillating non-monotonically between roughly 1e-10 and 6e-08, beginning near 4.8e-08 and ending near 1.4e-09]
KSP Residual norm 1.266780821131e-09 6770 KSP Residual norm 1.098907765336e-09 6771 KSP Residual norm 9.860073772260e-10 6772 KSP Residual norm 8.665133644121e-10 6773 KSP Residual norm 8.668256471420e-10 6774 KSP Residual norm 1.131686076490e-09 6775 KSP Residual norm 1.427204003729e-09 6776 KSP Residual norm 1.649206224182e-09 6777 KSP Residual norm 1.814665710604e-09 6778 KSP Residual norm 1.865043104375e-09 6779 KSP Residual norm 1.742526063924e-09 6780 KSP Residual norm 1.568604672012e-09 6781 KSP Residual norm 1.471765011932e-09 6782 KSP Residual norm 1.404635264886e-09 6783 KSP Residual norm 1.367507400347e-09 6784 KSP Residual norm 1.373105569943e-09 6785 KSP Residual norm 1.469737044005e-09 6786 KSP Residual norm 1.742190642927e-09 6787 KSP Residual norm 2.024394365648e-09 6788 KSP Residual norm 2.024796258535e-09 6789 KSP Residual norm 2.014655683968e-09 6790 KSP Residual norm 2.216826935791e-09 6791 KSP Residual norm 2.220490313002e-09 6792 KSP Residual norm 2.035552624819e-09 6793 KSP Residual norm 1.901886035064e-09 6794 KSP Residual norm 1.893755837638e-09 6795 KSP Residual norm 1.916784246615e-09 6796 KSP Residual norm 1.833321348125e-09 6797 KSP Residual norm 1.734325229142e-09 6798 KSP Residual norm 1.573313906055e-09 6799 KSP Residual norm 1.251395660282e-09 6800 KSP Residual norm 1.055411691331e-09 6801 KSP Residual norm 1.061411753261e-09 6802 KSP Residual norm 1.244206706469e-09 6803 KSP Residual norm 1.449857439154e-09 6804 KSP Residual norm 1.381414576901e-09 6805 KSP Residual norm 1.074279081792e-09 6806 KSP Residual norm 9.061050596284e-10 6807 KSP Residual norm 8.828639309019e-10 6808 KSP Residual norm 9.727880036403e-10 6809 KSP Residual norm 1.146208460024e-09 6810 KSP Residual norm 1.319858093909e-09 6811 KSP Residual norm 1.223914289464e-09 6812 KSP Residual norm 1.072845061935e-09 6813 KSP Residual norm 9.754616312528e-10 6814 KSP Residual norm 9.274707988584e-10 6815 KSP Residual norm 8.861121877844e-10 6816 KSP Residual norm 
8.012584031958e-10 6817 KSP Residual norm 7.550432189709e-10 6818 KSP Residual norm 7.786795571308e-10 6819 KSP Residual norm 8.206471164611e-10 6820 KSP Residual norm 9.006156956187e-10 6821 KSP Residual norm 1.044718041364e-09 6822 KSP Residual norm 1.201571654795e-09 6823 KSP Residual norm 1.384866599876e-09 6824 KSP Residual norm 1.647284149964e-09 6825 KSP Residual norm 1.819546054565e-09 6826 KSP Residual norm 1.828048596638e-09 6827 KSP Residual norm 1.648231941434e-09 6828 KSP Residual norm 1.436421728125e-09 6829 KSP Residual norm 1.405790622190e-09 6830 KSP Residual norm 1.411681089188e-09 6831 KSP Residual norm 1.270934545971e-09 6832 KSP Residual norm 1.144593812285e-09 6833 KSP Residual norm 1.067610450662e-09 6834 KSP Residual norm 1.036206178449e-09 6835 KSP Residual norm 1.180901241746e-09 6836 KSP Residual norm 1.585244886411e-09 6837 KSP Residual norm 2.067452595606e-09 6838 KSP Residual norm 2.084744214581e-09 6839 KSP Residual norm 1.755935674982e-09 6840 KSP Residual norm 1.551585259230e-09 6841 KSP Residual norm 1.514546892698e-09 6842 KSP Residual norm 1.466585120469e-09 6843 KSP Residual norm 1.240108556191e-09 6844 KSP Residual norm 1.008842909820e-09 6845 KSP Residual norm 9.275476986437e-10 6846 KSP Residual norm 8.762538169827e-10 6847 KSP Residual norm 8.680177699708e-10 6848 KSP Residual norm 9.357353085644e-10 6849 KSP Residual norm 1.015815535650e-09 6850 KSP Residual norm 1.112092024327e-09 6851 KSP Residual norm 1.130782265429e-09 6852 KSP Residual norm 1.139446593935e-09 6853 KSP Residual norm 1.211330433256e-09 6854 KSP Residual norm 1.316172780462e-09 6855 KSP Residual norm 1.293546255501e-09 6856 KSP Residual norm 1.105026293257e-09 6857 KSP Residual norm 8.160889061259e-10 6858 KSP Residual norm 6.118822203858e-10 6859 KSP Residual norm 5.432504930712e-10 6860 KSP Residual norm 5.833435234305e-10 6861 KSP Residual norm 5.951182954519e-10 6862 KSP Residual norm 5.286095427287e-10 6863 KSP Residual norm 4.678595170274e-10 6864 
KSP Residual norm 4.472174990979e-10 6865 KSP Residual norm 4.709901684399e-10 6866 KSP Residual norm 5.133162018208e-10 6867 KSP Residual norm 5.423524709494e-10 6868 KSP Residual norm 4.985922606311e-10 6869 KSP Residual norm 4.370730034082e-10 6870 KSP Residual norm 4.205715761094e-10 6871 KSP Residual norm 4.549735180315e-10 6872 KSP Residual norm 5.032134491512e-10 6873 KSP Residual norm 5.204014811682e-10 6874 KSP Residual norm 4.952175197828e-10 6875 KSP Residual norm 4.109939321273e-10 6876 KSP Residual norm 3.626591508772e-10 6877 KSP Residual norm 3.596369030522e-10 6878 KSP Residual norm 3.834365778097e-10 6879 KSP Residual norm 4.191993777908e-10 6880 KSP Residual norm 4.016112495376e-10 6881 KSP Residual norm 3.596569970850e-10 6882 KSP Residual norm 3.965381195627e-10 6883 KSP Residual norm 4.832771931900e-10 6884 KSP Residual norm 4.822306260822e-10 6885 KSP Residual norm 4.518518438917e-10 6886 KSP Residual norm 4.356738692225e-10 6887 KSP Residual norm 4.275945643416e-10 6888 KSP Residual norm 4.111862107973e-10 6889 KSP Residual norm 3.673217682131e-10 6890 KSP Residual norm 2.898354700426e-10 6891 KSP Residual norm 2.608445939070e-10 6892 KSP Residual norm 2.842607279233e-10 6893 KSP Residual norm 3.384152451048e-10 6894 KSP Residual norm 3.983908610737e-10 6895 KSP Residual norm 4.192706440053e-10 6896 KSP Residual norm 3.978994275492e-10 6897 KSP Residual norm 3.658088806617e-10 6898 KSP Residual norm 3.404654186631e-10 6899 KSP Residual norm 2.906383586793e-10 6900 KSP Residual norm 2.127962673358e-10 6901 KSP Residual norm 1.507976490382e-10 6902 KSP Residual norm 1.307689191667e-10 6903 KSP Residual norm 1.366691962961e-10 6904 KSP Residual norm 1.430541169433e-10 6905 KSP Residual norm 1.531926143403e-10 6906 KSP Residual norm 1.728444998741e-10 6907 KSP Residual norm 2.020999590256e-10 6908 KSP Residual norm 2.422547845998e-10 6909 KSP Residual norm 2.766628999584e-10 6910 KSP Residual norm 2.808395389371e-10 6911 KSP Residual norm 
2.447779370443e-10 6912 KSP Residual norm 1.955617619281e-10 6913 KSP Residual norm 1.621781823430e-10 6914 KSP Residual norm 1.442208719555e-10 6915 KSP Residual norm 1.293529116327e-10 6916 KSP Residual norm 1.195731480900e-10 6917 KSP Residual norm 1.130933240904e-10 6918 KSP Residual norm 1.086892142717e-10 6919 KSP Residual norm 1.088579274117e-10 6920 KSP Residual norm 1.226378516771e-10 6921 KSP Residual norm 1.354537586031e-10 6922 KSP Residual norm 1.280661835705e-10 6923 KSP Residual norm 1.158157621603e-10 6924 KSP Residual norm 1.128689057961e-10 6925 KSP Residual norm 1.259026116125e-10 6926 KSP Residual norm 1.550159500855e-10 6927 KSP Residual norm 1.709124984108e-10 6928 KSP Residual norm 1.571447513851e-10 6929 KSP Residual norm 1.421729230743e-10 6930 KSP Residual norm 1.341765612678e-10 6931 KSP Residual norm 1.391495708193e-10 6932 KSP Residual norm 1.402892865521e-10 6933 KSP Residual norm 1.238103802749e-10 6934 KSP Residual norm 1.120932800918e-10 6935 KSP Residual norm 1.194694328787e-10 6936 KSP Residual norm 1.423886809514e-10 6937 KSP Residual norm 1.697464312651e-10 6938 KSP Residual norm 1.896396103745e-10 6939 KSP Residual norm 1.895676948134e-10 6940 KSP Residual norm 1.978519167125e-10 6941 KSP Residual norm 2.115802038159e-10 6942 KSP Residual norm 1.950253681897e-10 6943 KSP Residual norm 1.832636273447e-10 6944 KSP Residual norm 1.765129899884e-10 6945 KSP Residual norm 1.554251261762e-10 6946 KSP Residual norm 1.387284529801e-10 6947 KSP Residual norm 1.368928643872e-10 6948 KSP Residual norm 1.480856532515e-10 6949 KSP Residual norm 1.495203726019e-10 6950 KSP Residual norm 1.345839527133e-10 6951 KSP Residual norm 1.370265261708e-10 6952 KSP Residual norm 1.673014984887e-10 6953 KSP Residual norm 1.982000514097e-10 6954 KSP Residual norm 1.949181362237e-10 6955 KSP Residual norm 1.787721551309e-10 6956 KSP Residual norm 1.684502043365e-10 6957 KSP Residual norm 1.800829936919e-10 6958 KSP Residual norm 2.068560740665e-10 6959 
KSP Residual norm 1.987218727410e-10 6960 KSP Residual norm 1.683268519909e-10 6961 KSP Residual norm 1.440837774904e-10 6962 KSP Residual norm 1.245413082518e-10 6963 KSP Residual norm 1.116122069027e-10 6964 KSP Residual norm 9.490089834267e-11 2 KSP Residual norm 7.462775074989e-02 Residual norms for fieldsplit_FE_split_ solve. 0 KSP Residual norm 1.370782211717e-10 1 KSP Residual norm 1.024280861285e-10 2 KSP Residual norm 8.809415769179e-11 3 KSP Residual norm 8.952638270118e-11 4 KSP Residual norm 9.170144644054e-11 5 KSP Residual norm 1.015924671212e-10 6 KSP Residual norm 1.090354326476e-10 7 KSP Residual norm 1.083022647467e-10 8 KSP Residual norm 1.115250420304e-10 9 KSP Residual norm 1.146958386652e-10 10 KSP Residual norm 1.213468824702e-10 11 KSP Residual norm 1.248008835517e-10 12 KSP Residual norm 1.279282928110e-10 13 KSP Residual norm 1.262291076395e-10 14 KSP Residual norm 1.267310794394e-10 15 KSP Residual norm 1.345188093647e-10 16 KSP Residual norm 1.326440469462e-10 17 KSP Residual norm 1.205168724598e-10 18 KSP Residual norm 1.109518468129e-10 19 KSP Residual norm 1.082741073597e-10 20 KSP Residual norm 1.022304856179e-10 21 KSP Residual norm 9.268527960155e-11 22 KSP Residual norm 8.675437032246e-11 23 KSP Residual norm 8.607941649840e-11 24 KSP Residual norm 8.495173915288e-11 25 KSP Residual norm 8.731095959945e-11 26 KSP Residual norm 9.320113828702e-11 27 KSP Residual norm 9.595527572231e-11 28 KSP Residual norm 9.844853409854e-11 29 KSP Residual norm 1.058241380014e-10 30 KSP Residual norm 1.144602170426e-10 31 KSP Residual norm 1.144631886591e-10 32 KSP Residual norm 1.173182816165e-10 33 KSP Residual norm 1.210190311017e-10 34 KSP Residual norm 1.233616135891e-10 35 KSP Residual norm 1.276967245169e-10 36 KSP Residual norm 1.333969492316e-10 37 KSP Residual norm 1.336700407046e-10 38 KSP Residual norm 1.352284098615e-10 39 KSP Residual norm 1.345435070010e-10 40 KSP Residual norm 1.375670433396e-10 41 KSP Residual norm 
1.408515514924e-10 42 KSP Residual norm 1.415067527770e-10 43 KSP Residual norm 1.427988684151e-10 44 KSP Residual norm 1.467385600740e-10 45 KSP Residual norm 1.477175980117e-10 46 KSP Residual norm 1.439405824022e-10 47 KSP Residual norm 1.340761526111e-10 48 KSP Residual norm 1.284776859775e-10 49 KSP Residual norm 1.272634642818e-10 50 KSP Residual norm 1.251645854220e-10 51 KSP Residual norm 1.191584847788e-10 52 KSP Residual norm 1.180693543343e-10 53 KSP Residual norm 1.257432345004e-10 54 KSP Residual norm 1.276786548171e-10 55 KSP Residual norm 1.225720485378e-10 56 KSP Residual norm 1.203380069954e-10 57 KSP Residual norm 1.253775090002e-10 58 KSP Residual norm 1.337194055236e-10 59 KSP Residual norm 1.364485139739e-10 60 KSP Residual norm 1.346369955987e-10 61 KSP Residual norm 1.332166723746e-10 62 KSP Residual norm 1.351690212207e-10 63 KSP Residual norm 1.336224791293e-10 64 KSP Residual norm 1.316183334895e-10 65 KSP Residual norm 1.283255871955e-10 66 KSP Residual norm 1.280107983614e-10 67 KSP Residual norm 1.334385052474e-10 68 KSP Residual norm 1.455686495920e-10 69 KSP Residual norm 1.474165854733e-10 70 KSP Residual norm 1.443911531470e-10 71 KSP Residual norm 1.383005315067e-10 72 KSP Residual norm 1.424509589141e-10 73 KSP Residual norm 1.452279436447e-10 74 KSP Residual norm 1.459641824736e-10 75 KSP Residual norm 1.486949358196e-10 76 KSP Residual norm 1.482577552812e-10 77 KSP Residual norm 1.504535383313e-10 78 KSP Residual norm 1.520308256914e-10 79 KSP Residual norm 1.462250230483e-10 80 KSP Residual norm 1.436731628227e-10 81 KSP Residual norm 1.456408506250e-10 82 KSP Residual norm 1.508742863934e-10 83 KSP Residual norm 1.570474392129e-10 84 KSP Residual norm 1.631604167515e-10 85 KSP Residual norm 1.729320268404e-10 86 KSP Residual norm 1.924365245337e-10 87 KSP Residual norm 2.142991567629e-10 88 KSP Residual norm 2.173638870644e-10 89 KSP Residual norm 2.117880718435e-10 90 KSP Residual norm 2.119706929147e-10 91 KSP Residual norm 
2.178888716512e-10 92 KSP Residual norm 2.176340444744e-10 93 KSP Residual norm 2.151516071092e-10 94 KSP Residual norm 2.226646903909e-10 95 KSP Residual norm 2.262785167746e-10 96 KSP Residual norm 2.082083899623e-10 97 KSP Residual norm 1.940865129763e-10 98 KSP Residual norm 1.951383538222e-10 99 KSP Residual norm 2.001799867116e-10 100 KSP Residual norm 1.826998745777e-10 101 KSP Residual norm 1.676725859187e-10 102 KSP Residual norm 1.679116252673e-10 103 KSP Residual norm 1.760268633782e-10 104 KSP Residual norm 1.858781812568e-10 105 KSP Residual norm 1.921082065578e-10 106 KSP Residual norm 1.899607332838e-10 107 KSP Residual norm 1.940632120325e-10 108 KSP Residual norm 2.067570865301e-10 109 KSP Residual norm 2.106726895208e-10 110 KSP Residual norm 2.108004838200e-10 111 KSP Residual norm 2.238135531736e-10 112 KSP Residual norm 2.385934188207e-10 113 KSP Residual norm 2.476389895293e-10 114 KSP Residual norm 2.402959718718e-10 115 KSP Residual norm 2.249316114863e-10 116 KSP Residual norm 2.229416679389e-10 117 KSP Residual norm 2.307625999260e-10 118 KSP Residual norm 2.384735963982e-10 119 KSP Residual norm 2.287964517971e-10 120 KSP Residual norm 2.159229930791e-10 121 KSP Residual norm 2.111487417633e-10 122 KSP Residual norm 2.078675839142e-10 123 KSP Residual norm 2.081804952191e-10 124 KSP Residual norm 2.028084967396e-10 125 KSP Residual norm 2.004710268426e-10 126 KSP Residual norm 1.806870305234e-10 127 KSP Residual norm 1.606079541305e-10 128 KSP Residual norm 1.553349685359e-10 129 KSP Residual norm 1.599583703164e-10 130 KSP Residual norm 1.584549501655e-10 131 KSP Residual norm 1.600493477242e-10 132 KSP Residual norm 1.700704620458e-10 133 KSP Residual norm 1.817942036827e-10 134 KSP Residual norm 1.843583325260e-10 135 KSP Residual norm 1.851963123539e-10 136 KSP Residual norm 1.941311489195e-10 137 KSP Residual norm 2.030720296350e-10 138 KSP Residual norm 2.138447562322e-10 139 KSP Residual norm 2.164137175361e-10 140 KSP Residual 
norm 1.969482555612e-10 141 KSP Residual norm 1.849858350896e-10 142 KSP Residual norm 1.863610726808e-10 143 KSP Residual norm 1.925275912987e-10 144 KSP Residual norm 1.905834593884e-10 145 KSP Residual norm 1.966932617883e-10 146 KSP Residual norm 2.118177213161e-10 147 KSP Residual norm 2.149176862885e-10 148 KSP Residual norm 2.140618436693e-10 149 KSP Residual norm 2.152519872613e-10 150 KSP Residual norm 2.169156497856e-10 151 KSP Residual norm 2.211943265641e-10 152 KSP Residual norm 2.192940993071e-10 153 KSP Residual norm 2.112328373672e-10 154 KSP Residual norm 1.994325011035e-10 155 KSP Residual norm 1.961692410338e-10 156 KSP Residual norm 2.000446677332e-10 157 KSP Residual norm 2.017753121297e-10 158 KSP Residual norm 1.957197200429e-10 159 KSP Residual norm 1.910823383668e-10 160 KSP Residual norm 1.948932434983e-10 161 KSP Residual norm 2.057613665037e-10 162 KSP Residual norm 2.211722042956e-10 163 KSP Residual norm 2.448774920590e-10 164 KSP Residual norm 2.537300022253e-10 165 KSP Residual norm 2.357318306264e-10 166 KSP Residual norm 2.187091172586e-10 167 KSP Residual norm 2.040543037176e-10 168 KSP Residual norm 1.958575670866e-10 169 KSP Residual norm 1.945551941975e-10 170 KSP Residual norm 1.874393057691e-10 171 KSP Residual norm 1.858032197341e-10 172 KSP Residual norm 1.779295598240e-10 173 KSP Residual norm 1.737546245916e-10 174 KSP Residual norm 1.716663169552e-10 175 KSP Residual norm 1.640698049013e-10 176 KSP Residual norm 1.585278673977e-10 177 KSP Residual norm 1.625705522370e-10 178 KSP Residual norm 1.690410274065e-10 179 KSP Residual norm 1.609857630911e-10 180 KSP Residual norm 1.479484183037e-10 181 KSP Residual norm 1.438765276419e-10 182 KSP Residual norm 1.437253696174e-10 183 KSP Residual norm 1.457652916028e-10 184 KSP Residual norm 1.425451663809e-10 185 KSP Residual norm 1.376476620658e-10 186 KSP Residual norm 1.359029127913e-10 187 KSP Residual norm 1.412367520538e-10 188 KSP Residual norm 1.401843940100e-10 189 KSP 
Residual norm 1.299507649264e-10 190 KSP Residual norm 1.169634788028e-10 191 KSP Residual norm 1.124330620856e-10 192 KSP Residual norm 1.087739844612e-10 193 KSP Residual norm 1.037784090813e-10 194 KSP Residual norm 1.042404915809e-10 195 KSP Residual norm 1.055532969214e-10 196 KSP Residual norm 1.050814465517e-10 197 KSP Residual norm 1.100835975196e-10 198 KSP Residual norm 1.129168914742e-10 199 KSP Residual norm 1.096001299019e-10 200 KSP Residual norm 1.081552615807e-10 201 KSP Residual norm 1.076201326032e-10 202 KSP Residual norm 1.050595189629e-10 203 KSP Residual norm 1.086588493934e-10 204 KSP Residual norm 1.151470281834e-10 205 KSP Residual norm 1.203408196405e-10 206 KSP Residual norm 1.195789705997e-10 207 KSP Residual norm 1.160890720372e-10 208 KSP Residual norm 1.098967509948e-10 209 KSP Residual norm 1.056007883989e-10 210 KSP Residual norm 1.016288998892e-10 211 KSP Residual norm 9.981817443041e-11 212 KSP Residual norm 1.008770198492e-10 213 KSP Residual norm 9.953099838669e-11 214 KSP Residual norm 9.278286869131e-11 215 KSP Residual norm 8.543344037330e-11 216 KSP Residual norm 8.118759851636e-11 217 KSP Residual norm 7.644850441865e-11 218 KSP Residual norm 7.119653570590e-11 219 KSP Residual norm 6.578798584845e-11 220 KSP Residual norm 6.033004018435e-11 221 KSP Residual norm 5.857612826322e-11 222 KSP Residual norm 5.885803996835e-11 223 KSP Residual norm 5.788031563802e-11 224 KSP Residual norm 5.747781483825e-11 225 KSP Residual norm 5.800282261246e-11 226 KSP Residual norm 5.729498615643e-11 227 KSP Residual norm 5.703521897442e-11 228 KSP Residual norm 5.911364796619e-11 229 KSP Residual norm 6.218105257491e-11 230 KSP Residual norm 6.329160094182e-11 231 KSP Residual norm 6.502990381022e-11 232 KSP Residual norm 6.617662151158e-11 233 KSP Residual norm 6.553709494720e-11 234 KSP Residual norm 6.549108194265e-11 235 KSP Residual norm 6.736959956683e-11 236 KSP Residual norm 6.778885000950e-11 237 KSP Residual norm 
6.905434914429e-11 238 KSP Residual norm 6.941177257737e-11 239 KSP Residual norm 6.634362951098e-11 240 KSP Residual norm 6.347627937793e-11 241 KSP Residual norm 6.373662289251e-11 242 KSP Residual norm 6.637973752670e-11 243 KSP Residual norm 7.020154873095e-11 244 KSP Residual norm 7.473837464684e-11 245 KSP Residual norm 7.652677504169e-11 246 KSP Residual norm 7.480994700470e-11 247 KSP Residual norm 7.200983676419e-11 248 KSP Residual norm 7.122097552731e-11 249 KSP Residual norm 7.318450188452e-11 250 KSP Residual norm 7.372651584289e-11 251 KSP Residual norm 7.174963896698e-11 252 KSP Residual norm 7.402034403170e-11 253 KSP Residual norm 7.601799018494e-11 254 KSP Residual norm 7.932790069849e-11 255 KSP Residual norm 8.036559369947e-11 256 KSP Residual norm 8.097733044114e-11 257 KSP Residual norm 8.131596928036e-11 258 KSP Residual norm 8.487134141151e-11 259 KSP Residual norm 8.432673414151e-11 260 KSP Residual norm 8.641315410572e-11 261 KSP Residual norm 8.570600189544e-11 262 KSP Residual norm 8.493207923149e-11 263 KSP Residual norm 8.417758395909e-11 264 KSP Residual norm 8.313217157673e-11 265 KSP Residual norm 8.249680541616e-11 266 KSP Residual norm 8.589071431350e-11 267 KSP Residual norm 9.324633631003e-11 268 KSP Residual norm 9.545117079377e-11 269 KSP Residual norm 9.008839903066e-11 270 KSP Residual norm 8.772845860571e-11 271 KSP Residual norm 9.218107181305e-11 272 KSP Residual norm 9.469442869812e-11 273 KSP Residual norm 9.212151286481e-11 274 KSP Residual norm 9.080889926646e-11 275 KSP Residual norm 9.097792158798e-11 276 KSP Residual norm 8.982856804657e-11 277 KSP Residual norm 9.149086072101e-11 278 KSP Residual norm 9.533832662962e-11 279 KSP Residual norm 9.709037522266e-11 280 KSP Residual norm 9.676839933355e-11 281 KSP Residual norm 9.593114967158e-11 282 KSP Residual norm 9.173370213274e-11 283 KSP Residual norm 8.707470315126e-11 284 KSP Residual norm 7.967812173087e-11 285 KSP Residual norm 7.460987057520e-11 286 KSP 
Residual norm 7.493734274119e-11 287 KSP Residual norm 7.861297003712e-11 288 KSP Residual norm 7.886006319948e-11 289 KSP Residual norm 7.408027493244e-11 290 KSP Residual norm 6.989202492740e-11 291 KSP Residual norm 6.962453590656e-11 292 KSP Residual norm 7.203315711140e-11 293 KSP Residual norm 7.147553922817e-11 294 KSP Residual norm 6.930069805271e-11 295 KSP Residual norm 7.219069689586e-11 296 KSP Residual norm 7.904494219056e-11 297 KSP Residual norm 8.633758917157e-11 298 KSP Residual norm 8.826926192710e-11 299 KSP Residual norm 9.359061079782e-11 300 KSP Residual norm 9.676640495858e-11 301 KSP Residual norm 1.003895179952e-10 302 KSP Residual norm 1.052961749340e-10 303 KSP Residual norm 1.098566792687e-10 304 KSP Residual norm 1.131066655583e-10 305 KSP Residual norm 1.126462901128e-10 306 KSP Residual norm 1.164446202685e-10 307 KSP Residual norm 1.237600055850e-10 308 KSP Residual norm 1.262984007612e-10 309 KSP Residual norm 1.209429193689e-10 310 KSP Residual norm 1.129162806094e-10 311 KSP Residual norm 1.081215251179e-10 312 KSP Residual norm 1.037112129996e-10 313 KSP Residual norm 1.038507511790e-10 314 KSP Residual norm 1.022585920240e-10 315 KSP Residual norm 1.037232002363e-10 316 KSP Residual norm 1.096019913574e-10 317 KSP Residual norm 1.117697584112e-10 318 KSP Residual norm 1.085742049833e-10 319 KSP Residual norm 1.116772789910e-10 320 KSP Residual norm 1.215335745413e-10 321 KSP Residual norm 1.299977491068e-10 322 KSP Residual norm 1.320200399599e-10 323 KSP Residual norm 1.306204431776e-10 324 KSP Residual norm 1.341199555983e-10 325 KSP Residual norm 1.430196569389e-10 326 KSP Residual norm 1.420217962038e-10 327 KSP Residual norm 1.374596204533e-10 328 KSP Residual norm 1.360981260376e-10 329 KSP Residual norm 1.319010006072e-10 330 KSP Residual norm 1.257548801076e-10 331 KSP Residual norm 1.232125432291e-10 332 KSP Residual norm 1.193480732321e-10 333 KSP Residual norm 1.169280735538e-10 334 KSP Residual norm 
1.179654396871e-10 335 KSP Residual norm 1.223991621312e-10 336 KSP Residual norm 1.301578037909e-10 337 KSP Residual norm 1.367343341005e-10 338 KSP Residual norm 1.506216893972e-10 339 KSP Residual norm 1.636350956967e-10 340 KSP Residual norm 1.676831597766e-10 341 KSP Residual norm 1.596905837717e-10 342 KSP Residual norm 1.491096052729e-10 343 KSP Residual norm 1.426876571613e-10 344 KSP Residual norm 1.464460307966e-10 345 KSP Residual norm 1.519683452860e-10 346 KSP Residual norm 1.494018942247e-10 347 KSP Residual norm 1.462197335329e-10 348 KSP Residual norm 1.440145077903e-10 349 KSP Residual norm 1.411252862876e-10 350 KSP Residual norm 1.378855642234e-10 351 KSP Residual norm 1.394836623807e-10 352 KSP Residual norm 1.406921589879e-10 353 KSP Residual norm 1.308634002462e-10 354 KSP Residual norm 1.249431696280e-10 355 KSP Residual norm 1.312750480769e-10 356 KSP Residual norm 1.315878162037e-10 357 KSP Residual norm 1.269463171173e-10 358 KSP Residual norm 1.254232516280e-10 359 KSP Residual norm 1.289871622715e-10 360 KSP Residual norm 1.338467961730e-10 361 KSP Residual norm 1.352002233898e-10 362 KSP Residual norm 1.309099062832e-10 363 KSP Residual norm 1.292177460147e-10 364 KSP Residual norm 1.333204976252e-10 365 KSP Residual norm 1.376872353931e-10 366 KSP Residual norm 1.411472303274e-10 367 KSP Residual norm 1.401570531848e-10 368 KSP Residual norm 1.360468635118e-10 369 KSP Residual norm 1.375380424409e-10 370 KSP Residual norm 1.401212969986e-10 371 KSP Residual norm 1.326408541078e-10 372 KSP Residual norm 1.214519658595e-10 373 KSP Residual norm 1.150583955129e-10 374 KSP Residual norm 1.115844470655e-10 375 KSP Residual norm 1.062305503542e-10 376 KSP Residual norm 9.797964208442e-11 377 KSP Residual norm 9.608790823163e-11 378 KSP Residual norm 9.746546776563e-11 379 KSP Residual norm 9.842063732815e-11 380 KSP Residual norm 9.677981058817e-11 381 KSP Residual norm 1.003420358087e-10 382 KSP Residual norm 1.047654583029e-10 383 KSP 
Residual norm 1.040658120500e-10 384 KSP Residual norm 9.819264093864e-11 385 KSP Residual norm 9.044904130672e-11 386 KSP Residual norm 8.769804014454e-11 387 KSP Residual norm 8.782691381774e-11 388 KSP Residual norm 8.889264442449e-11 389 KSP Residual norm 9.011781020588e-11 390 KSP Residual norm 9.056370360392e-11 391 KSP Residual norm 9.013963799066e-11 392 KSP Residual norm 9.108096321946e-11 393 KSP Residual norm 9.387709588279e-11 394 KSP Residual norm 9.455709425885e-11 395 KSP Residual norm 9.143848190643e-11 396 KSP Residual norm 8.904516552009e-11 397 KSP Residual norm 8.791868545759e-11 398 KSP Residual norm 8.861452192126e-11 399 KSP Residual norm 8.936027466983e-11 400 KSP Residual norm 9.025502599390e-11 401 KSP Residual norm 8.884123850125e-11 402 KSP Residual norm 8.278342496209e-11 403 KSP Residual norm 8.072960835880e-11 404 KSP Residual norm 8.343209521330e-11 405 KSP Residual norm 8.494389481873e-11 406 KSP Residual norm 8.122455602851e-11 407 KSP Residual norm 8.251758892002e-11 408 KSP Residual norm 8.861369495284e-11 409 KSP Residual norm 9.095449686780e-11 410 KSP Residual norm 8.907494710767e-11 411 KSP Residual norm 8.670745205835e-11 412 KSP Residual norm 9.024053702827e-11 413 KSP Residual norm 9.405704513442e-11 414 KSP Residual norm 9.128912456732e-11 415 KSP Residual norm 8.985896870899e-11 416 KSP Residual norm 9.378544285377e-11 417 KSP Residual norm 9.867326234355e-11 418 KSP Residual norm 1.009237720336e-10 419 KSP Residual norm 9.705789085782e-11 420 KSP Residual norm 9.464615610631e-11 421 KSP Residual norm 9.726877611052e-11 422 KSP Residual norm 1.010513595090e-10 423 KSP Residual norm 1.006015096923e-10 424 KSP Residual norm 9.899501264832e-11 425 KSP Residual norm 1.013094889359e-10 426 KSP Residual norm 1.051013135303e-10 427 KSP Residual norm 1.073804508653e-10 428 KSP Residual norm 1.088021232731e-10 429 KSP Residual norm 1.114547280784e-10 430 KSP Residual norm 1.135506532441e-10 431 KSP Residual norm 
[-ksp_monitor output, iterations 431-1440: the residual norm stagnates around 1e-10, oscillating between roughly 2.1e-11 and 9.2e-10 with no further convergence; full listing elided]
KSP Residual norm 3.320808663621e-11 1442 KSP Residual norm 2.964328803870e-11 1443 KSP Residual norm 2.794577106536e-11 1444 KSP Residual norm 2.854440239017e-11 1445 KSP Residual norm 2.902941459050e-11 1446 KSP Residual norm 2.744530268338e-11 1447 KSP Residual norm 2.672687223569e-11 1448 KSP Residual norm 2.801371361953e-11 1449 KSP Residual norm 3.080483772655e-11 1450 KSP Residual norm 3.309114705641e-11 1451 KSP Residual norm 3.357368196619e-11 1452 KSP Residual norm 3.303154836495e-11 1453 KSP Residual norm 3.266008867137e-11 1454 KSP Residual norm 3.301972640534e-11 1455 KSP Residual norm 3.125199515467e-11 1456 KSP Residual norm 2.979677344172e-11 1457 KSP Residual norm 2.829202581782e-11 1458 KSP Residual norm 2.773224969249e-11 1459 KSP Residual norm 2.521497493726e-11 1460 KSP Residual norm 2.282729831454e-11 1461 KSP Residual norm 2.299547442838e-11 1462 KSP Residual norm 2.501454377229e-11 1463 KSP Residual norm 2.703885952792e-11 1464 KSP Residual norm 2.797416274857e-11 1465 KSP Residual norm 2.830179981005e-11 1466 KSP Residual norm 3.025627164600e-11 1467 KSP Residual norm 3.187428516745e-11 1468 KSP Residual norm 3.078065480494e-11 1469 KSP Residual norm 2.874915157805e-11 1470 KSP Residual norm 2.762388703414e-11 1471 KSP Residual norm 2.649582627983e-11 1472 KSP Residual norm 2.704240088768e-11 1473 KSP Residual norm 2.864315846463e-11 1474 KSP Residual norm 2.897257884272e-11 1475 KSP Residual norm 2.747701010956e-11 1476 KSP Residual norm 2.666836462217e-11 1477 KSP Residual norm 2.695829800995e-11 1478 KSP Residual norm 3.053434935863e-11 1479 KSP Residual norm 3.598820845699e-11 1480 KSP Residual norm 3.765893267216e-11 1481 KSP Residual norm 3.467521617809e-11 1482 KSP Residual norm 3.052337971711e-11 1483 KSP Residual norm 2.814163879082e-11 1484 KSP Residual norm 2.772575792933e-11 1485 KSP Residual norm 2.812504718796e-11 1486 KSP Residual norm 2.775137737763e-11 1487 KSP Residual norm 2.783011515919e-11 1488 KSP Residual norm 
3.002502109903e-11 1489 KSP Residual norm 3.271777789294e-11 1490 KSP Residual norm 3.664008296406e-11 1491 KSP Residual norm 3.900039017270e-11 1492 KSP Residual norm 3.638728742011e-11 1493 KSP Residual norm 3.384389531936e-11 1494 KSP Residual norm 3.114063530372e-11 1495 KSP Residual norm 3.068808382847e-11 1496 KSP Residual norm 3.444289642021e-11 1497 KSP Residual norm 4.319965794469e-11 1498 KSP Residual norm 5.075521002532e-11 1499 KSP Residual norm 5.027477815867e-11 1500 KSP Residual norm 4.493952993719e-11 1501 KSP Residual norm 4.017880630228e-11 1502 KSP Residual norm 3.710549316153e-11 1503 KSP Residual norm 3.585014407402e-11 1504 KSP Residual norm 3.692811419992e-11 1505 KSP Residual norm 3.698431795929e-11 1506 KSP Residual norm 3.467703478123e-11 1507 KSP Residual norm 3.271708379039e-11 1508 KSP Residual norm 3.133897019229e-11 1509 KSP Residual norm 3.182786408871e-11 1510 KSP Residual norm 3.407660285374e-11 1511 KSP Residual norm 3.853490785694e-11 1512 KSP Residual norm 4.368638379677e-11 1513 KSP Residual norm 4.500511477964e-11 1514 KSP Residual norm 4.392428749016e-11 1515 KSP Residual norm 4.207449629175e-11 1516 KSP Residual norm 3.787075973202e-11 1517 KSP Residual norm 3.340000650597e-11 1518 KSP Residual norm 3.030166034859e-11 1519 KSP Residual norm 2.756948105011e-11 1520 KSP Residual norm 2.496392043860e-11 1521 KSP Residual norm 2.481008126916e-11 1522 KSP Residual norm 2.831355831373e-11 1523 KSP Residual norm 3.621062491559e-11 1524 KSP Residual norm 4.695828492181e-11 1525 KSP Residual norm 5.585391226767e-11 1526 KSP Residual norm 5.988100223420e-11 1527 KSP Residual norm 5.669928580340e-11 1528 KSP Residual norm 5.119353256049e-11 1529 KSP Residual norm 4.315188064721e-11 1530 KSP Residual norm 3.616482128589e-11 1531 KSP Residual norm 3.282678577329e-11 1532 KSP Residual norm 3.161509240991e-11 1533 KSP Residual norm 3.294008051507e-11 1534 KSP Residual norm 3.615593372873e-11 1535 KSP Residual norm 3.933613010667e-11 1536 
KSP Residual norm 3.994605192355e-11 1537 KSP Residual norm 3.935046515920e-11 1538 KSP Residual norm 3.952564541791e-11 1539 KSP Residual norm 4.087214422067e-11 1540 KSP Residual norm 4.284568250052e-11 1541 KSP Residual norm 4.075779793887e-11 1542 KSP Residual norm 3.589558446884e-11 1543 KSP Residual norm 3.182527370981e-11 1544 KSP Residual norm 2.930585407660e-11 1545 KSP Residual norm 2.725557441410e-11 1546 KSP Residual norm 2.488858603277e-11 1547 KSP Residual norm 2.283622177416e-11 1548 KSP Residual norm 2.253496994928e-11 1549 KSP Residual norm 2.277359853803e-11 1550 KSP Residual norm 2.317237489289e-11 1551 KSP Residual norm 2.387534567051e-11 1552 KSP Residual norm 2.460200305402e-11 1553 KSP Residual norm 2.595059609423e-11 1554 KSP Residual norm 2.802790157245e-11 1555 KSP Residual norm 2.957084915308e-11 1556 KSP Residual norm 3.141905718152e-11 1557 KSP Residual norm 3.308939151054e-11 1558 KSP Residual norm 3.265298233048e-11 1559 KSP Residual norm 3.146389757611e-11 1560 KSP Residual norm 3.155831679037e-11 1561 KSP Residual norm 3.491764815514e-11 1562 KSP Residual norm 3.809069285049e-11 1563 KSP Residual norm 3.930824446404e-11 1564 KSP Residual norm 3.747276133215e-11 1565 KSP Residual norm 3.246307697024e-11 1566 KSP Residual norm 2.854751960405e-11 1567 KSP Residual norm 2.691847380379e-11 1568 KSP Residual norm 2.712681159762e-11 1569 KSP Residual norm 2.884262850746e-11 1570 KSP Residual norm 3.279691580764e-11 1571 KSP Residual norm 3.630357691665e-11 1572 KSP Residual norm 3.945290037455e-11 1573 KSP Residual norm 4.282388985784e-11 1574 KSP Residual norm 4.368630280859e-11 1575 KSP Residual norm 4.083999329904e-11 1576 KSP Residual norm 3.753898603556e-11 1577 KSP Residual norm 3.395497515106e-11 1578 KSP Residual norm 2.981920923298e-11 1579 KSP Residual norm 2.955852447007e-11 1580 KSP Residual norm 3.412834800693e-11 1581 KSP Residual norm 4.347404111469e-11 1582 KSP Residual norm 5.184624968067e-11 1583 KSP Residual norm 
4.653887914245e-11 1584 KSP Residual norm 3.499044185267e-11 1585 KSP Residual norm 2.839030300144e-11 1586 KSP Residual norm 2.806700484554e-11 1587 KSP Residual norm 3.270415336715e-11 1588 KSP Residual norm 4.080762992750e-11 1589 KSP Residual norm 4.738497674959e-11 1590 KSP Residual norm 4.643437058747e-11 1591 KSP Residual norm 3.868220609496e-11 1592 KSP Residual norm 3.414665614423e-11 1593 KSP Residual norm 3.533824530226e-11 1594 KSP Residual norm 3.877626134923e-11 1595 KSP Residual norm 3.881883663426e-11 1596 KSP Residual norm 3.369100127839e-11 1597 KSP Residual norm 2.862743379353e-11 1598 KSP Residual norm 2.711072883465e-11 1599 KSP Residual norm 2.991104056452e-11 1600 KSP Residual norm 3.815453531513e-11 1601 KSP Residual norm 4.445839858471e-11 1602 KSP Residual norm 4.220488864059e-11 1603 KSP Residual norm 3.670186384085e-11 1604 KSP Residual norm 3.428597446211e-11 1605 KSP Residual norm 3.438934861919e-11 1606 KSP Residual norm 3.516442529147e-11 1607 KSP Residual norm 3.613454785533e-11 1608 KSP Residual norm 3.837435487112e-11 1609 KSP Residual norm 3.898085368523e-11 1610 KSP Residual norm 3.825744659302e-11 1611 KSP Residual norm 3.659555902035e-11 1612 KSP Residual norm 3.280516930504e-11 1613 KSP Residual norm 2.804445548731e-11 1614 KSP Residual norm 2.522611289025e-11 1615 KSP Residual norm 2.672671577127e-11 1616 KSP Residual norm 3.107085900950e-11 1617 KSP Residual norm 3.536169025791e-11 1618 KSP Residual norm 4.196042641887e-11 1619 KSP Residual norm 5.078724052572e-11 1620 KSP Residual norm 5.340604617379e-11 1621 KSP Residual norm 4.505452305325e-11 1622 KSP Residual norm 3.550621552497e-11 1623 KSP Residual norm 3.045718485390e-11 1624 KSP Residual norm 3.115462016558e-11 1625 KSP Residual norm 3.512956686598e-11 1626 KSP Residual norm 4.034184914264e-11 1627 KSP Residual norm 4.414619457890e-11 1628 KSP Residual norm 4.111279098890e-11 1629 KSP Residual norm 3.258370781084e-11 1630 KSP Residual norm 2.686627278892e-11 1631 
KSP Residual norm 2.658505258504e-11 1632 KSP Residual norm 3.245664219568e-11 1633 KSP Residual norm 4.017130769837e-11 1634 KSP Residual norm 4.432591891277e-11 1635 KSP Residual norm 4.706858895655e-11 1636 KSP Residual norm 4.836020531302e-11 1637 KSP Residual norm 4.540688968950e-11 1638 KSP Residual norm 4.207109431139e-11 1639 KSP Residual norm 4.150313214986e-11 1640 KSP Residual norm 4.131502709810e-11 1641 KSP Residual norm 3.855983356827e-11 1642 KSP Residual norm 2.985734288986e-11 1643 KSP Residual norm 2.145577783988e-11 1644 KSP Residual norm 1.872286564931e-11 1645 KSP Residual norm 2.139014940222e-11 1646 KSP Residual norm 2.835840706730e-11 1647 KSP Residual norm 3.924135910021e-11 1648 KSP Residual norm 5.144276913250e-11 1649 KSP Residual norm 5.536070304802e-11 1650 KSP Residual norm 4.830633315153e-11 1651 KSP Residual norm 3.775126809654e-11 1652 KSP Residual norm 2.791214472150e-11 1653 KSP Residual norm 2.259655560261e-11 1654 KSP Residual norm 2.255736694278e-11 1655 KSP Residual norm 2.803739036615e-11 1656 KSP Residual norm 3.813180897992e-11 1657 KSP Residual norm 4.881271832385e-11 1658 KSP Residual norm 4.722560642665e-11 1659 KSP Residual norm 3.505735235567e-11 1660 KSP Residual norm 2.461051177100e-11 1661 KSP Residual norm 2.007860995089e-11 1662 KSP Residual norm 2.024901373181e-11 1663 KSP Residual norm 2.473991686929e-11 1664 KSP Residual norm 3.364600255089e-11 1665 KSP Residual norm 4.500327345721e-11 1666 KSP Residual norm 4.700782746664e-11 1667 KSP Residual norm 3.746486575550e-11 1668 KSP Residual norm 2.361566781339e-11 1669 KSP Residual norm 1.507023527700e-11 1670 KSP Residual norm 1.274811563609e-11 1671 KSP Residual norm 1.496339940879e-11 1672 KSP Residual norm 2.176045502602e-11 1673 KSP Residual norm 3.214339757779e-11 1674 KSP Residual norm 3.686898219120e-11 1675 KSP Residual norm 3.154820389403e-11 1676 KSP Residual norm 2.415041492653e-11 1677 KSP Residual norm 1.935630756761e-11 1678 KSP Residual norm 
1.753965388418e-11 1679 KSP Residual norm 1.964641852536e-11 1680 KSP Residual norm 2.686321427461e-11 1681 KSP Residual norm 4.266527903526e-11 1682 KSP Residual norm 6.021987310490e-11 1683 KSP Residual norm 5.819873273258e-11 1684 KSP Residual norm 3.920364130220e-11 1685 KSP Residual norm 2.349727326674e-11 1686 KSP Residual norm 1.516657162431e-11 1687 KSP Residual norm 1.339440859578e-11 1688 KSP Residual norm 1.807706764556e-11 1689 KSP Residual norm 3.363536440113e-11 1690 KSP Residual norm 6.162377039182e-11 1691 KSP Residual norm 8.996466611310e-11 1692 KSP Residual norm 8.645644488124e-11 1693 KSP Residual norm 5.594241441521e-11 1694 KSP Residual norm 3.496918386578e-11 1695 KSP Residual norm 2.627472459740e-11 1696 KSP Residual norm 2.599432569200e-11 1697 KSP Residual norm 3.724913399350e-11 1698 KSP Residual norm 6.483144055940e-11 1699 KSP Residual norm 1.118075869119e-10 1700 KSP Residual norm 1.586655331099e-10 1701 KSP Residual norm 1.559009013628e-10 1702 KSP Residual norm 9.684452744993e-11 1703 KSP Residual norm 5.435865088920e-11 1704 KSP Residual norm 3.680532700545e-11 1705 KSP Residual norm 3.599508311057e-11 1706 KSP Residual norm 5.183398549765e-11 1707 KSP Residual norm 8.969043427494e-11 1708 KSP Residual norm 1.295817789085e-10 1709 KSP Residual norm 1.137013855868e-10 1710 KSP Residual norm 6.576301467083e-11 1711 KSP Residual norm 3.723847573454e-11 1712 KSP Residual norm 2.706350953573e-11 1713 KSP Residual norm 2.711020096141e-11 1714 KSP Residual norm 3.377544703905e-11 1715 KSP Residual norm 4.333180008162e-11 1716 KSP Residual norm 4.461110637379e-11 1717 KSP Residual norm 4.307497993672e-11 1718 KSP Residual norm 4.921528950738e-11 1719 KSP Residual norm 6.255500857736e-11 1720 KSP Residual norm 6.585464979415e-11 1721 KSP Residual norm 5.571965193549e-11 1722 KSP Residual norm 4.532119216389e-11 1723 KSP Residual norm 3.767729826496e-11 1724 KSP Residual norm 3.376185846303e-11 1725 KSP Residual norm 3.232789232215e-11 1726 
KSP Residual norm 3.110234690725e-11 1727 KSP Residual norm 2.986589828327e-11 1728 KSP Residual norm 3.037016598609e-11 1729 KSP Residual norm 3.363073738525e-11 1730 KSP Residual norm 4.479992602860e-11 1731 KSP Residual norm 5.948664567830e-11 1732 KSP Residual norm 6.936383586950e-11 1733 KSP Residual norm 6.838459588723e-11 1734 KSP Residual norm 6.211413432684e-11 1735 KSP Residual norm 6.272749308915e-11 1736 KSP Residual norm 6.773951442776e-11 1737 KSP Residual norm 6.082588216308e-11 1738 KSP Residual norm 4.421927208088e-11 1739 KSP Residual norm 2.990315460770e-11 1740 KSP Residual norm 2.365150490993e-11 1741 KSP Residual norm 2.494387564904e-11 1742 KSP Residual norm 3.401479761332e-11 1743 KSP Residual norm 4.655423096847e-11 1744 KSP Residual norm 5.136038391311e-11 1745 KSP Residual norm 5.503272514558e-11 1746 KSP Residual norm 7.101058823487e-11 1747 KSP Residual norm 1.000762885889e-10 1748 KSP Residual norm 1.189732676121e-10 1749 KSP Residual norm 1.166621873421e-10 1750 KSP Residual norm 1.142690385730e-10 1751 KSP Residual norm 1.211710416190e-10 1752 KSP Residual norm 1.179152930317e-10 1753 KSP Residual norm 9.187785552458e-11 1754 KSP Residual norm 6.699641828491e-11 1755 KSP Residual norm 5.631786572253e-11 1756 KSP Residual norm 5.560577816065e-11 1757 KSP Residual norm 5.809043157951e-11 1758 KSP Residual norm 5.213072564740e-11 1759 KSP Residual norm 4.410644361859e-11 1760 KSP Residual norm 4.512867413365e-11 1761 KSP Residual norm 5.839645693174e-11 1762 KSP Residual norm 8.170908547606e-11 1763 KSP Residual norm 9.811170287370e-11 1764 KSP Residual norm 1.032115914331e-10 1765 KSP Residual norm 1.094448628444e-10 1766 KSP Residual norm 1.337184242182e-10 1767 KSP Residual norm 1.681709984990e-10 1768 KSP Residual norm 1.692105553005e-10 1769 KSP Residual norm 1.300398938708e-10 1770 KSP Residual norm 9.983917051717e-11 1771 KSP Residual norm 9.208177844164e-11 1772 KSP Residual norm 9.299642524475e-11 1773 KSP Residual norm 
8.157379402152e-11 1774 KSP Residual norm 5.848566077420e-11 1775 KSP Residual norm 4.503831887317e-11 1776 KSP Residual norm 4.341968992818e-11 1777 KSP Residual norm 5.269820923466e-11 1778 KSP Residual norm 6.454923119906e-11 1779 KSP Residual norm 6.698842002430e-11 1780 KSP Residual norm 6.903774516870e-11 1781 KSP Residual norm 8.238263330312e-11 1782 KSP Residual norm 1.041286837212e-10 1783 KSP Residual norm 1.122779875998e-10 1784 KSP Residual norm 9.630279885568e-11 1785 KSP Residual norm 8.720426091165e-11 1786 KSP Residual norm 9.356262561369e-11 1787 KSP Residual norm 9.742737192669e-11 1788 KSP Residual norm 8.387858903026e-11 1789 KSP Residual norm 6.164097393624e-11 1790 KSP Residual norm 5.063554535933e-11 1791 KSP Residual norm 5.075612877868e-11 1792 KSP Residual norm 5.234026025485e-11 1793 KSP Residual norm 4.962657577037e-11 1794 KSP Residual norm 4.332942543987e-11 1795 KSP Residual norm 4.456266950221e-11 1796 KSP Residual norm 5.433897045262e-11 1797 KSP Residual norm 6.166950860396e-11 1798 KSP Residual norm 5.808670693040e-11 1799 KSP Residual norm 5.674868055482e-11 1800 KSP Residual norm 6.645064426718e-11 1801 KSP Residual norm 9.585295839347e-11 1802 KSP Residual norm 1.391019482444e-10 1803 KSP Residual norm 1.638573687544e-10 1804 KSP Residual norm 1.764041478527e-10 1805 KSP Residual norm 2.082193665996e-10 1806 KSP Residual norm 2.638468322320e-10 1807 KSP Residual norm 2.797077315284e-10 1808 KSP Residual norm 2.205612332058e-10 1809 KSP Residual norm 1.611740214537e-10 1810 KSP Residual norm 1.371653681184e-10 1811 KSP Residual norm 1.342650196247e-10 1812 KSP Residual norm 1.244307943761e-10 1813 KSP Residual norm 9.627299112407e-11 1814 KSP Residual norm 6.836646352054e-11 1815 KSP Residual norm 5.659794118851e-11 1816 KSP Residual norm 5.598833626005e-11 1817 KSP Residual norm 5.450863116080e-11 1818 KSP Residual norm 4.575449672464e-11 1819 KSP Residual norm 3.910121712738e-11 1820 KSP Residual norm 4.201002686819e-11 1821 
KSP Residual norm 5.564772551282e-11 1822 KSP Residual norm 6.967548378483e-11 1823 KSP Residual norm 6.935700004202e-11 1824 KSP Residual norm 7.029884443443e-11 1825 KSP Residual norm 8.586880534868e-11 1826 KSP Residual norm 1.201439185439e-10 1827 KSP Residual norm 1.436555586520e-10 1828 KSP Residual norm 1.334070616778e-10 1829 KSP Residual norm 1.249784775403e-10 1830 KSP Residual norm 1.350943254274e-10 1831 KSP Residual norm 1.554012535051e-10 1832 KSP Residual norm 1.418581070564e-10 1833 KSP Residual norm 1.148504635681e-10 1834 KSP Residual norm 1.013863617211e-10 1835 KSP Residual norm 1.005605962819e-10 1836 KSP Residual norm 9.587188184788e-11 1837 KSP Residual norm 7.403186593625e-11 1838 KSP Residual norm 5.720392939436e-11 1839 KSP Residual norm 5.326893756025e-11 1840 KSP Residual norm 5.683229152558e-11 1841 KSP Residual norm 6.158025175850e-11 1842 KSP Residual norm 5.907703940620e-11 1843 KSP Residual norm 5.708358098481e-11 1844 KSP Residual norm 6.314949806320e-11 1845 KSP Residual norm 7.543793491315e-11 1846 KSP Residual norm 8.138171034217e-11 1847 KSP Residual norm 8.558566858840e-11 1848 KSP Residual norm 1.021396617329e-10 1849 KSP Residual norm 1.399082343679e-10 1850 KSP Residual norm 1.720169796611e-10 1851 KSP Residual norm 1.719575494536e-10 1852 KSP Residual norm 1.819918975082e-10 1853 KSP Residual norm 2.268386698900e-10 1854 KSP Residual norm 2.779076825646e-10 1855 KSP Residual norm 2.680012756453e-10 1856 KSP Residual norm 2.251701725886e-10 1857 KSP Residual norm 2.031711858446e-10 1858 KSP Residual norm 2.069820003424e-10 1859 KSP Residual norm 1.835049019101e-10 1860 KSP Residual norm 1.420745589240e-10 1861 KSP Residual norm 1.241201488597e-10 1862 KSP Residual norm 1.236025394327e-10 1863 KSP Residual norm 1.183578001551e-10 1864 KSP Residual norm 9.231771775263e-11 1865 KSP Residual norm 6.993237048022e-11 1866 KSP Residual norm 6.531940008196e-11 1867 KSP Residual norm 7.424691709638e-11 1868 KSP Residual norm 
8.174044023224e-11 1869 KSP Residual norm 8.306160093690e-11 1870 KSP Residual norm 8.899751858933e-11 1871 KSP Residual norm 1.071921867869e-10 1872 KSP Residual norm 1.296764478521e-10 1873 KSP Residual norm 1.395024963873e-10 1874 KSP Residual norm 1.469610784310e-10 1875 KSP Residual norm 1.691079752839e-10 1876 KSP Residual norm 1.970734259586e-10 1877 KSP Residual norm 2.063061409858e-10 1878 KSP Residual norm 1.959531000573e-10 1879 KSP Residual norm 2.032812940376e-10 1880 KSP Residual norm 2.328222209669e-10 1881 KSP Residual norm 2.414616682665e-10 1882 KSP Residual norm 2.096503821730e-10 1883 KSP Residual norm 1.890254577129e-10 1884 KSP Residual norm 1.824033703721e-10 1885 KSP Residual norm 1.714173536165e-10 1886 KSP Residual norm 1.455324140427e-10 1887 KSP Residual norm 1.294755142389e-10 1888 KSP Residual norm 1.245296730865e-10 1889 KSP Residual norm 1.229687221037e-10 1890 KSP Residual norm 1.054702775926e-10 1891 KSP Residual norm 8.319660808855e-11 1892 KSP Residual norm 7.285733574077e-11 1893 KSP Residual norm 7.533674992509e-11 1894 KSP Residual norm 7.613073266344e-11 1895 KSP Residual norm 7.110135267061e-11 1896 KSP Residual norm 7.201050300051e-11 1897 KSP Residual norm 7.936507819547e-11 1898 KSP Residual norm 8.275048609760e-11 1899 KSP Residual norm 8.290109543377e-11 1900 KSP Residual norm 8.834180743366e-11 1901 KSP Residual norm 1.033493471375e-10 1902 KSP Residual norm 1.188305820930e-10 1903 KSP Residual norm 1.326742091115e-10 1904 KSP Residual norm 1.396674441527e-10 1905 KSP Residual norm 1.566418286199e-10 1906 KSP Residual norm 1.827732608008e-10 1907 KSP Residual norm 1.974669067479e-10 1908 KSP Residual norm 1.938611996790e-10 1909 KSP Residual norm 1.915562247969e-10 1910 KSP Residual norm 1.842053303867e-10 1911 KSP Residual norm 1.633773303297e-10 1912 KSP Residual norm 1.452560785537e-10 1913 KSP Residual norm 1.380318952387e-10 1914 KSP Residual norm 1.363002069024e-10 1915 KSP Residual norm 1.206836584703e-10 1916 
KSP Residual norm 9.974435151093e-11 1917 KSP Residual norm 8.882801331267e-11 1918 KSP Residual norm 8.292350786833e-11 1919 KSP Residual norm 7.780037926348e-11 1920 KSP Residual norm 7.407145515503e-11 1921 KSP Residual norm 7.519067424956e-11 1922 KSP Residual norm 8.107455076648e-11 1923 KSP Residual norm 8.359693262964e-11 1924 KSP Residual norm 7.887408595720e-11 1925 KSP Residual norm 7.837516108684e-11 1926 KSP Residual norm 9.171697252375e-11 1927 KSP Residual norm 1.102688129402e-10 1928 KSP Residual norm 1.176160343324e-10 1929 KSP Residual norm 1.221261929950e-10 1930 KSP Residual norm 1.455915938973e-10 1931 KSP Residual norm 1.792715255549e-10 1932 KSP Residual norm 1.898884877683e-10 1933 KSP Residual norm 1.783373966337e-10 1934 KSP Residual norm 1.821832691090e-10 1935 KSP Residual norm 2.063352135026e-10 1936 KSP Residual norm 2.222825639375e-10 1937 KSP Residual norm 2.308025170114e-10 1938 KSP Residual norm 2.351817038011e-10 1939 KSP Residual norm 2.285004089208e-10 1940 KSP Residual norm 2.048432638639e-10 1941 KSP Residual norm 1.789156584146e-10 1942 KSP Residual norm 1.686654993699e-10 1943 KSP Residual norm 1.618209830643e-10 1944 KSP Residual norm 1.450590685950e-10 1945 KSP Residual norm 1.236932116584e-10 1946 KSP Residual norm 1.109158508351e-10 1947 KSP Residual norm 1.059496835659e-10 1948 KSP Residual norm 1.037999090730e-10 1949 KSP Residual norm 9.308779047236e-11 1950 KSP Residual norm 8.116334665689e-11 1951 KSP Residual norm 7.191585428170e-11 1952 KSP Residual norm 6.588810591171e-11 1953 KSP Residual norm 5.625006219301e-11 1954 KSP Residual norm 4.905248307638e-11 1955 KSP Residual norm 4.828778868116e-11 1956 KSP Residual norm 5.672705750343e-11 1957 KSP Residual norm 6.860083397404e-11 1958 KSP Residual norm 7.308924598252e-11 1959 KSP Residual norm 7.524524160838e-11 1960 KSP Residual norm 8.633966787895e-11 1961 KSP Residual norm 1.047479149424e-10 1962 KSP Residual norm 1.139789025381e-10 1963 KSP Residual norm 
1.157696763288e-10 1964 KSP Residual norm 1.300239102695e-10 1965 KSP Residual norm 1.505530582577e-10 1966 KSP Residual norm 1.669549427103e-10 1967 KSP Residual norm 1.727162723745e-10 1968 KSP Residual norm 1.833793971041e-10 1969 KSP Residual norm 2.018894563158e-10 1970 KSP Residual norm 2.160147223282e-10 1971 KSP Residual norm 2.198345386089e-10 1972 KSP Residual norm 2.189813501640e-10 1973 KSP Residual norm 2.279090897322e-10 1974 KSP Residual norm 2.276584245505e-10 1975 KSP Residual norm 2.171969493039e-10 1976 KSP Residual norm 2.113114730185e-10 1977 KSP Residual norm 2.032912827689e-10 1978 KSP Residual norm 1.756699673372e-10 1979 KSP Residual norm 1.417074875890e-10 1980 KSP Residual norm 1.315657687131e-10 1981 KSP Residual norm 1.295456869918e-10 1982 KSP Residual norm 1.134509726205e-10 1983 KSP Residual norm 9.096452370523e-11 1984 KSP Residual norm 7.695118035297e-11 1985 KSP Residual norm 7.233319017844e-11 1986 KSP Residual norm 6.678849183872e-11 1987 KSP Residual norm 5.895147448799e-11 1988 KSP Residual norm 5.307891669691e-11 1989 KSP Residual norm 5.227623776605e-11 1990 KSP Residual norm 5.551593346879e-11 1991 KSP Residual norm 5.559290659743e-11 1992 KSP Residual norm 5.491150729404e-11 1993 KSP Residual norm 5.931771545135e-11 1994 KSP Residual norm 7.027863320513e-11 1995 KSP Residual norm 7.284349584872e-11 1996 KSP Residual norm 7.255055832227e-11 1997 KSP Residual norm 7.569347573341e-11 1998 KSP Residual norm 8.522599457897e-11 1999 KSP Residual norm 9.794099350455e-11 2000 KSP Residual norm 1.090264518368e-10 2001 KSP Residual norm 1.222210633486e-10 2002 KSP Residual norm 1.336866469837e-10 2003 KSP Residual norm 1.488144480910e-10 2004 KSP Residual norm 1.651613034836e-10 2005 KSP Residual norm 1.848777920508e-10 2006 KSP Residual norm 2.167251724008e-10 2007 KSP Residual norm 2.429278157763e-10 2008 KSP Residual norm 2.543815894735e-10 2009 KSP Residual norm 2.683818593290e-10 2010 KSP Residual norm 2.915078352686e-10 2011 
KSP Residual norm 3.043662932702e-10 2012 KSP Residual norm 3.084880532941e-10 2013 KSP Residual norm 3.147027126994e-10 2014 KSP Residual norm 3.115499246668e-10 2015 KSP Residual norm 2.871054839937e-10 2016 KSP Residual norm 2.650312322881e-10 2017 KSP Residual norm 2.448722693906e-10 2018 KSP Residual norm 2.167797523305e-10 2019 KSP Residual norm 1.887962285832e-10 2020 KSP Residual norm 1.637317380160e-10 2021 KSP Residual norm 1.464218560842e-10 2022 KSP Residual norm 1.314096150149e-10 2023 KSP Residual norm 1.138798682243e-10 2024 KSP Residual norm 1.026866628386e-10 2025 KSP Residual norm 9.894343372465e-11 2026 KSP Residual norm 1.036739297618e-10 2027 KSP Residual norm 1.026777499986e-10 2028 KSP Residual norm 9.579797793044e-11 2029 KSP Residual norm 9.203657598416e-11 2030 KSP Residual norm 1.005490308197e-10 2031 KSP Residual norm 1.195047034194e-10 2032 KSP Residual norm 1.370047833312e-10 2033 KSP Residual norm 1.383078359003e-10 2034 KSP Residual norm 1.455992384589e-10 2035 KSP Residual norm 1.720014107699e-10 2036 KSP Residual norm 1.971189376993e-10 2037 KSP Residual norm 2.066768165713e-10 2038 KSP Residual norm 2.285338454504e-10 2039 KSP Residual norm 2.860106297261e-10 2040 KSP Residual norm 3.502781069065e-10 2041 KSP Residual norm 3.792965527783e-10 2042 KSP Residual norm 4.055190871207e-10 2043 KSP Residual norm 4.529237267325e-10 2044 KSP Residual norm 4.758930272450e-10 2045 KSP Residual norm 4.339312633502e-10 2046 KSP Residual norm 4.113783731372e-10 2047 KSP Residual norm 4.258310034508e-10 2048 KSP Residual norm 4.172245905588e-10 2049 KSP Residual norm 3.819415417058e-10 2050 KSP Residual norm 3.355809192200e-10 2051 KSP Residual norm 3.136052701437e-10 2052 KSP Residual norm 3.103385933848e-10 2053 KSP Residual norm 2.809161155608e-10 2054 KSP Residual norm 2.284477350563e-10 2055 KSP Residual norm 2.038255480305e-10 2056 KSP Residual norm 2.039259222550e-10 2057 KSP Residual norm 1.822560887525e-10 2058 KSP Residual norm 
[-ksp_monitor output elided: iterations 2059 through 3056 show the KSP residual norm oscillating between roughly 3.2e-11 and 6.1e-10 with no further convergence.]
KSP Residual norm 2.296891564012e-10 3057 KSP Residual norm 2.087029977244e-10 3058 KSP Residual norm 1.884882431539e-10 3059 KSP Residual norm 1.849072492994e-10 3060 KSP Residual norm 1.933359378955e-10 3061 KSP Residual norm 2.116624386071e-10 3062 KSP Residual norm 2.338286383802e-10 3063 KSP Residual norm 2.465176354056e-10 3064 KSP Residual norm 2.365417131467e-10 3065 KSP Residual norm 2.249229066633e-10 3066 KSP Residual norm 2.448077727987e-10 3067 KSP Residual norm 2.839151725092e-10 3068 KSP Residual norm 2.766795299586e-10 3069 KSP Residual norm 2.606315092731e-10 3070 KSP Residual norm 2.650270457160e-10 3071 KSP Residual norm 2.691177332525e-10 3072 KSP Residual norm 2.616327225440e-10 3073 KSP Residual norm 2.546012419948e-10 3074 KSP Residual norm 2.268162557471e-10 3075 KSP Residual norm 1.888110336794e-10 3076 KSP Residual norm 1.657596910712e-10 3077 KSP Residual norm 1.486168146967e-10 3078 KSP Residual norm 1.346454796747e-10 3079 KSP Residual norm 1.430818559260e-10 3080 KSP Residual norm 1.647390071032e-10 3081 KSP Residual norm 1.556128203702e-10 3082 KSP Residual norm 1.292853797214e-10 3083 KSP Residual norm 1.169276587706e-10 3084 KSP Residual norm 1.150883138190e-10 3085 KSP Residual norm 1.089736071152e-10 3086 KSP Residual norm 1.077616032030e-10 3087 KSP Residual norm 1.175530702237e-10 3088 KSP Residual norm 1.221428961084e-10 3089 KSP Residual norm 1.171599409728e-10 3090 KSP Residual norm 1.160797278241e-10 3091 KSP Residual norm 1.124576801714e-10 3092 KSP Residual norm 1.056448089233e-10 3093 KSP Residual norm 1.051531935013e-10 3094 KSP Residual norm 1.003290598742e-10 3095 KSP Residual norm 8.784263113800e-11 3096 KSP Residual norm 7.886572751839e-11 3097 KSP Residual norm 8.262692204561e-11 3098 KSP Residual norm 8.885841735828e-11 3099 KSP Residual norm 8.729169313802e-11 3100 KSP Residual norm 7.949082066158e-11 3101 KSP Residual norm 7.243742888685e-11 3102 KSP Residual norm 6.351640386139e-11 3103 KSP Residual norm 
5.949914991173e-11 3104 KSP Residual norm 6.844402487716e-11 3105 KSP Residual norm 8.513589124006e-11 3106 KSP Residual norm 9.174384498793e-11 3107 KSP Residual norm 9.328404417081e-11 3108 KSP Residual norm 8.297274865652e-11 3109 KSP Residual norm 6.282663836471e-11 3110 KSP Residual norm 5.365311875498e-11 3111 KSP Residual norm 5.297977403727e-11 3112 KSP Residual norm 5.063250259290e-11 3113 KSP Residual norm 4.982913442150e-11 3114 KSP Residual norm 5.422497867112e-11 3115 KSP Residual norm 5.510940773171e-11 3116 KSP Residual norm 4.918301566347e-11 3117 KSP Residual norm 4.584589653722e-11 3118 KSP Residual norm 4.470260866104e-11 3119 KSP Residual norm 4.102496818577e-11 3120 KSP Residual norm 3.728466832460e-11 3121 KSP Residual norm 3.666512293034e-11 3122 KSP Residual norm 3.871040493418e-11 3123 KSP Residual norm 3.985738959604e-11 3124 KSP Residual norm 3.971308778454e-11 3125 KSP Residual norm 4.294684988962e-11 3126 KSP Residual norm 4.584097034345e-11 3127 KSP Residual norm 4.571445152909e-11 3128 KSP Residual norm 4.587481956880e-11 3129 KSP Residual norm 4.792398016564e-11 3130 KSP Residual norm 5.037428353665e-11 3131 KSP Residual norm 4.911391653513e-11 3132 KSP Residual norm 4.509056807156e-11 3133 KSP Residual norm 4.513740221507e-11 3134 KSP Residual norm 4.664441273013e-11 3135 KSP Residual norm 4.433926280757e-11 3136 KSP Residual norm 4.182326055261e-11 3137 KSP Residual norm 4.069076238062e-11 3138 KSP Residual norm 4.085761016337e-11 3139 KSP Residual norm 4.144738743686e-11 3140 KSP Residual norm 4.809373764989e-11 3141 KSP Residual norm 5.916625029816e-11 3142 KSP Residual norm 5.965620833269e-11 3143 KSP Residual norm 5.814325154515e-11 3144 KSP Residual norm 6.853989071551e-11 3145 KSP Residual norm 7.601685589142e-11 3146 KSP Residual norm 6.671802165544e-11 3147 KSP Residual norm 5.820327413471e-11 3148 KSP Residual norm 5.594421520670e-11 3149 KSP Residual norm 5.523458813810e-11 3150 KSP Residual norm 6.074661818432e-11 3151 
KSP Residual norm 7.200869545119e-11 3152 KSP Residual norm 7.454675436905e-11 3153 KSP Residual norm 7.077785959259e-11 3154 KSP Residual norm 7.203333736681e-11 3155 KSP Residual norm 7.780795774508e-11 3156 KSP Residual norm 8.635628192525e-11 3157 KSP Residual norm 1.003216037196e-10 3158 KSP Residual norm 1.009740359010e-10 3159 KSP Residual norm 9.300923075687e-11 3160 KSP Residual norm 1.054306029504e-10 3161 KSP Residual norm 1.313965079223e-10 3162 KSP Residual norm 1.274676220169e-10 3163 KSP Residual norm 1.042175159654e-10 3164 KSP Residual norm 9.544189914654e-11 3165 KSP Residual norm 1.017976224232e-10 3166 KSP Residual norm 1.184742228405e-10 3167 KSP Residual norm 1.426652727774e-10 3168 KSP Residual norm 1.519876276756e-10 3169 KSP Residual norm 1.414399705748e-10 3170 KSP Residual norm 1.325633763027e-10 3171 KSP Residual norm 1.355412045541e-10 3172 KSP Residual norm 1.417413615427e-10 3173 KSP Residual norm 1.542832380908e-10 3174 KSP Residual norm 1.669434289383e-10 3175 KSP Residual norm 1.812725525160e-10 3176 KSP Residual norm 2.103151720169e-10 3177 KSP Residual norm 2.482950042305e-10 3178 KSP Residual norm 2.748770554753e-10 3179 KSP Residual norm 2.746097786508e-10 3180 KSP Residual norm 2.397462783142e-10 3181 KSP Residual norm 1.857138078675e-10 3182 KSP Residual norm 1.551553637169e-10 3183 KSP Residual norm 1.704106580391e-10 3184 KSP Residual norm 2.228179662809e-10 3185 KSP Residual norm 2.610008846858e-10 3186 KSP Residual norm 2.569760997459e-10 3187 KSP Residual norm 2.466227297936e-10 3188 KSP Residual norm 2.318632625686e-10 3189 KSP Residual norm 2.223784912783e-10 3190 KSP Residual norm 2.348711244950e-10 3191 KSP Residual norm 2.484086393324e-10 3192 KSP Residual norm 2.250032949247e-10 3193 KSP Residual norm 1.869438066587e-10 3194 KSP Residual norm 1.501871274267e-10 3195 KSP Residual norm 1.103983980661e-10 3196 KSP Residual norm 9.052676319188e-11 3197 KSP Residual norm 1.004510996368e-10 3198 KSP Residual norm 
1.350110613138e-10 3199 KSP Residual norm 1.618554838079e-10 3200 KSP Residual norm 1.794668102950e-10 3201 KSP Residual norm 2.103194188278e-10 3202 KSP Residual norm 2.266991759421e-10 3203 KSP Residual norm 2.145543524745e-10 3204 KSP Residual norm 2.162549564413e-10 3205 KSP Residual norm 2.173455067825e-10 3206 KSP Residual norm 1.825202927983e-10 3207 KSP Residual norm 1.474288372031e-10 3208 KSP Residual norm 1.323706501708e-10 3209 KSP Residual norm 1.176077619079e-10 3210 KSP Residual norm 1.037870761520e-10 3211 KSP Residual norm 1.095151438899e-10 3212 KSP Residual norm 1.315981579815e-10 3213 KSP Residual norm 1.513174891594e-10 3214 KSP Residual norm 1.861142187762e-10 3215 KSP Residual norm 2.498573985392e-10 3216 KSP Residual norm 2.753951528833e-10 3217 KSP Residual norm 2.369600538044e-10 3218 KSP Residual norm 1.812268470452e-10 3219 KSP Residual norm 1.318404218297e-10 3220 KSP Residual norm 1.085512045717e-10 3221 KSP Residual norm 1.053002290212e-10 3222 KSP Residual norm 1.067203619783e-10 3223 KSP Residual norm 1.050724020343e-10 3224 KSP Residual norm 1.167929063646e-10 3225 KSP Residual norm 1.428581208283e-10 3226 KSP Residual norm 1.626078855332e-10 3227 KSP Residual norm 1.796512340171e-10 3228 KSP Residual norm 1.915506940428e-10 3229 KSP Residual norm 1.618555167397e-10 3230 KSP Residual norm 1.289508129166e-10 3231 KSP Residual norm 1.136556641398e-10 3232 KSP Residual norm 1.035903741967e-10 3233 KSP Residual norm 1.022234391560e-10 3234 KSP Residual norm 1.179892718465e-10 3235 KSP Residual norm 1.336711154933e-10 3236 KSP Residual norm 1.217428867216e-10 3237 KSP Residual norm 1.079429250303e-10 3238 KSP Residual norm 9.523923351772e-11 3239 KSP Residual norm 8.970022755045e-11 3240 KSP Residual norm 9.821917820522e-11 3241 KSP Residual norm 1.112150352802e-10 3242 KSP Residual norm 1.112681054195e-10 3243 KSP Residual norm 1.075819176977e-10 3244 KSP Residual norm 1.018141545271e-10 3245 KSP Residual norm 8.438393490804e-11 3246 
KSP Residual norm 6.751111666085e-11 3247 KSP Residual norm 6.393806876166e-11 3248 KSP Residual norm 6.705482700107e-11 3249 KSP Residual norm 6.531062939849e-11 3250 KSP Residual norm 5.832979179078e-11 3251 KSP Residual norm 5.064127546472e-11 3252 KSP Residual norm 4.492423463737e-11 3253 KSP Residual norm 4.755823974602e-11 3254 KSP Residual norm 5.791872890217e-11 3255 KSP Residual norm 7.024294564646e-11 3256 KSP Residual norm 6.550059175618e-11 3257 KSP Residual norm 4.968424916681e-11 3258 KSP Residual norm 3.692327893091e-11 3259 KSP Residual norm 2.883853146583e-11 3260 KSP Residual norm 2.571934840885e-11 3261 KSP Residual norm 2.925462969071e-11 3262 KSP Residual norm 3.980615267854e-11 3263 KSP Residual norm 5.284330461985e-11 3264 KSP Residual norm 5.636975088820e-11 3265 KSP Residual norm 4.879615949775e-11 3266 KSP Residual norm 4.110020642935e-11 3267 KSP Residual norm 3.770806440637e-11 3268 KSP Residual norm 4.039031126019e-11 3269 KSP Residual norm 4.597081952006e-11 3270 KSP Residual norm 5.162572610533e-11 3271 KSP Residual norm 5.766679786297e-11 3272 KSP Residual norm 5.942285598031e-11 3273 KSP Residual norm 5.810007528248e-11 3274 KSP Residual norm 5.803260733973e-11 3275 KSP Residual norm 5.169392320556e-11 3276 KSP Residual norm 4.005138465833e-11 3277 KSP Residual norm 3.431545434839e-11 3278 KSP Residual norm 3.485784406053e-11 3279 KSP Residual norm 3.575100704778e-11 3280 KSP Residual norm 4.113326524296e-11 3281 KSP Residual norm 5.243424479543e-11 3282 KSP Residual norm 6.006026934887e-11 3283 KSP Residual norm 5.846293102654e-11 3284 KSP Residual norm 5.869244033153e-11 3285 KSP Residual norm 6.289406007166e-11 3286 KSP Residual norm 6.798855082956e-11 3287 KSP Residual norm 7.107705034424e-11 3288 KSP Residual norm 7.110617768374e-11 3289 KSP Residual norm 6.870567155612e-11 3290 KSP Residual norm 6.846279164383e-11 3291 KSP Residual norm 7.208975746138e-11 3292 KSP Residual norm 7.682499183672e-11 3293 KSP Residual norm 
8.846440966444e-11 3294 KSP Residual norm 1.122445315827e-10 3295 KSP Residual norm 1.278134456487e-10 3296 KSP Residual norm 1.094942128582e-10 3297 KSP Residual norm 8.467383983528e-11 3298 KSP Residual norm 7.385689774838e-11 3299 KSP Residual norm 8.082369733595e-11 3300 KSP Residual norm 1.065182999386e-10 3301 KSP Residual norm 1.520861591357e-10 3302 KSP Residual norm 2.033825468542e-10 3303 KSP Residual norm 2.192230498169e-10 3304 KSP Residual norm 1.887639668288e-10 3305 KSP Residual norm 1.478064903758e-10 3306 KSP Residual norm 1.191244009599e-10 3307 KSP Residual norm 1.177558502157e-10 3308 KSP Residual norm 1.386444414489e-10 3309 KSP Residual norm 1.669552805175e-10 3310 KSP Residual norm 1.965322007887e-10 3311 KSP Residual norm 2.329018360254e-10 3312 KSP Residual norm 2.289665356512e-10 3313 KSP Residual norm 1.750070922389e-10 3314 KSP Residual norm 1.328350562496e-10 3315 KSP Residual norm 1.282365508702e-10 3316 KSP Residual norm 1.506386302337e-10 3317 KSP Residual norm 1.941071835217e-10 3318 KSP Residual norm 2.541177831587e-10 3319 KSP Residual norm 2.516177377778e-10 3320 KSP Residual norm 1.750182321173e-10 3321 KSP Residual norm 1.281395123657e-10 3322 KSP Residual norm 1.170898267753e-10 3323 KSP Residual norm 1.343839583065e-10 3324 KSP Residual norm 1.884599281521e-10 3325 KSP Residual norm 2.941834838452e-10 3326 KSP Residual norm 4.370036244922e-10 3327 KSP Residual norm 4.314717126356e-10 3328 KSP Residual norm 3.280250204608e-10 3329 KSP Residual norm 2.312443138177e-10 3330 KSP Residual norm 1.740708047297e-10 3331 KSP Residual norm 1.683942727067e-10 3332 KSP Residual norm 2.055224506063e-10 3333 KSP Residual norm 2.805148030802e-10 3334 KSP Residual norm 3.495075744702e-10 3335 KSP Residual norm 3.900290868124e-10 3336 KSP Residual norm 3.661173524256e-10 3337 KSP Residual norm 2.768793247146e-10 3338 KSP Residual norm 2.061689825815e-10 3339 KSP Residual norm 1.829784721707e-10 3340 KSP Residual norm 2.001019759723e-10 3341 
KSP Residual norm 2.588805902988e-10 3342 KSP Residual norm 3.811789485937e-10 3343 KSP Residual norm 5.070653363899e-10 3344 KSP Residual norm 4.839554017803e-10 3345 KSP Residual norm 3.922609280025e-10 3346 KSP Residual norm 2.997198861114e-10 3347 KSP Residual norm 2.311369090448e-10 3348 KSP Residual norm 2.099368073223e-10 3349 KSP Residual norm 2.449462719616e-10 3350 KSP Residual norm 3.219176469895e-10 3351 KSP Residual norm 4.356167806774e-10 3352 KSP Residual norm 5.536404483035e-10 3353 KSP Residual norm 5.205754474086e-10 3354 KSP Residual norm 3.892047643635e-10 3355 KSP Residual norm 3.134337093148e-10 3356 KSP Residual norm 2.966553502900e-10 3357 KSP Residual norm 2.718737268946e-10 3358 KSP Residual norm 2.372393233552e-10 3359 KSP Residual norm 2.244554600371e-10 3360 KSP Residual norm 2.227938132623e-10 3361 KSP Residual norm 2.212822595559e-10 3362 KSP Residual norm 2.388120811816e-10 3363 KSP Residual norm 2.752325609251e-10 3364 KSP Residual norm 3.067999526233e-10 3365 KSP Residual norm 3.372021561440e-10 3366 KSP Residual norm 3.794737876579e-10 3367 KSP Residual norm 4.053119574349e-10 3368 KSP Residual norm 4.052086844763e-10 3369 KSP Residual norm 3.971790277524e-10 3370 KSP Residual norm 3.322730105977e-10 3371 KSP Residual norm 2.333790228663e-10 3372 KSP Residual norm 1.736358891670e-10 3373 KSP Residual norm 1.586793093229e-10 3374 KSP Residual norm 1.822875447850e-10 3375 KSP Residual norm 2.216575749924e-10 3376 KSP Residual norm 2.613486813931e-10 3377 KSP Residual norm 2.771698644023e-10 3378 KSP Residual norm 2.700326689111e-10 3379 KSP Residual norm 2.676674483288e-10 3380 KSP Residual norm 2.586373956920e-10 3381 KSP Residual norm 2.400924771599e-10 3382 KSP Residual norm 2.085725274617e-10 3383 KSP Residual norm 1.620894141114e-10 3384 KSP Residual norm 1.142378501781e-10 3385 KSP Residual norm 8.923093583578e-11 3386 KSP Residual norm 8.320292485485e-11 3387 KSP Residual norm 8.677576722453e-11 3388 KSP Residual norm 
1.050806675273e-10 3389 KSP Residual norm 1.359306723967e-10 3390 KSP Residual norm 1.470482454342e-10 3391 KSP Residual norm 1.309549914668e-10 3392 KSP Residual norm 1.303046911751e-10 3393 KSP Residual norm 1.466899186025e-10 3394 KSP Residual norm 1.518655875922e-10 3395 KSP Residual norm 1.337285188591e-10 3396 KSP Residual norm 1.142356837656e-10 3397 KSP Residual norm 8.902797008675e-11 3398 KSP Residual norm 7.153826763123e-11 3399 KSP Residual norm 7.110021767952e-11 3400 KSP Residual norm 8.219138506379e-11 3401 KSP Residual norm 8.887039786295e-11 3402 KSP Residual norm 9.089271742171e-11 3403 KSP Residual norm 9.861899982067e-11 3404 KSP Residual norm 1.141883680035e-10 3405 KSP Residual norm 1.338574352827e-10 3406 KSP Residual norm 1.446651528281e-10 3407 KSP Residual norm 1.229496299193e-10 3408 KSP Residual norm 9.603575781956e-11 3409 KSP Residual norm 8.755270348099e-11 3410 KSP Residual norm 8.932730219936e-11 3411 KSP Residual norm 8.933226888376e-11 3412 KSP Residual norm 8.891505954572e-11 3413 KSP Residual norm 9.018877655662e-11 3414 KSP Residual norm 8.639878128655e-11 3415 KSP Residual norm 8.623872451571e-11 3416 KSP Residual norm 9.965915903982e-11 3417 KSP Residual norm 1.149592902381e-10 3418 KSP Residual norm 1.100887933390e-10 3419 KSP Residual norm 8.854228242143e-11 3420 KSP Residual norm 7.568275514015e-11 3421 KSP Residual norm 7.244575793320e-11 3422 KSP Residual norm 6.666257885435e-11 3423 KSP Residual norm 5.513569118787e-11 3424 KSP Residual norm 4.725163872814e-11 3425 KSP Residual norm 4.963910969631e-11 3426 KSP Residual norm 5.992282230103e-11 3427 KSP Residual norm 6.958483625407e-11 3428 KSP Residual norm 7.315419910881e-11 3429 KSP Residual norm 7.696004697270e-11 3430 KSP Residual norm 8.068087614643e-11 3431 KSP Residual norm 7.971215699061e-11 3432 KSP Residual norm 7.689458217420e-11 3433 KSP Residual norm 6.601501589094e-11 3434 KSP Residual norm 4.996036275089e-11 3435 KSP Residual norm 4.161655824275e-11 3436 
KSP Residual norm 3.907799578890e-11 3437 KSP Residual norm 3.787426366756e-11 3438 KSP Residual norm 3.559382951637e-11 3439 KSP Residual norm 3.408292140737e-11 3440 KSP Residual norm 3.254813680769e-11 3441 KSP Residual norm 3.183047380127e-11 3442 KSP Residual norm 3.340158978214e-11 3443 KSP Residual norm 3.979972535906e-11 3444 KSP Residual norm 4.814061659969e-11 3445 KSP Residual norm 5.175884059598e-11 3446 KSP Residual norm 4.913638598068e-11 3447 KSP Residual norm 4.959759060046e-11 3448 KSP Residual norm 5.731864048132e-11 3449 KSP Residual norm 6.055034287952e-11 3450 KSP Residual norm 4.438126942495e-11 3451 KSP Residual norm 3.086411670345e-11 3452 KSP Residual norm 2.536467831973e-11 3453 KSP Residual norm 2.317586298013e-11 3454 KSP Residual norm 2.277471923607e-11 3455 KSP Residual norm 2.318526168517e-11 3456 KSP Residual norm 2.210051214277e-11 3457 KSP Residual norm 2.182186315697e-11 3458 KSP Residual norm 2.387628311984e-11 3459 KSP Residual norm 2.652697995126e-11 3460 KSP Residual norm 2.580024211929e-11 3461 KSP Residual norm 2.406008022096e-11 3462 KSP Residual norm 2.327404523175e-11 3463 KSP Residual norm 2.425055013962e-11 3464 KSP Residual norm 2.785113525536e-11 3465 KSP Residual norm 2.984465392576e-11 3466 KSP Residual norm 2.687410867953e-11 3467 KSP Residual norm 2.368456543414e-11 3468 KSP Residual norm 2.368302952833e-11 3469 KSP Residual norm 2.216782277397e-11 3470 KSP Residual norm 1.841448173654e-11 3471 KSP Residual norm 1.621025354172e-11 3472 KSP Residual norm 1.464900252165e-11 3473 KSP Residual norm 1.532834144240e-11 3474 KSP Residual norm 1.829318142791e-11 3475 KSP Residual norm 1.950403032118e-11 3476 KSP Residual norm 1.668723513144e-11 3477 KSP Residual norm 1.574182980627e-11 3478 KSP Residual norm 1.738136698995e-11 3479 KSP Residual norm 2.191177734002e-11 3480 KSP Residual norm 2.478857281006e-11 3481 KSP Residual norm 2.296960716256e-11 3482 KSP Residual norm 1.966331189780e-11 3483 KSP Residual norm 
1.823574333396e-11 3484 KSP Residual norm 2.008792824458e-11 3485 KSP Residual norm 2.058501113133e-11 3486 KSP Residual norm 1.837189447630e-11 3487 KSP Residual norm 1.558241819578e-11 3488 KSP Residual norm 1.482354880788e-11 3489 KSP Residual norm 1.664940014940e-11 3490 KSP Residual norm 2.058308638326e-11 3491 KSP Residual norm 2.460557469264e-11 3492 KSP Residual norm 2.682182440247e-11 3493 KSP Residual norm 3.102360740941e-11 3494 KSP Residual norm 3.914000054643e-11 3495 KSP Residual norm 4.496158730816e-11 3496 KSP Residual norm 4.437367732687e-11 3497 KSP Residual norm 4.000194802926e-11 3498 KSP Residual norm 3.711753076805e-11 3499 KSP Residual norm 3.777618604059e-11 3500 KSP Residual norm 3.713293627090e-11 3501 KSP Residual norm 3.160740829774e-11 3502 KSP Residual norm 2.410690943738e-11 3503 KSP Residual norm 1.966757686369e-11 3504 KSP Residual norm 1.825704381500e-11 3505 KSP Residual norm 1.885678295308e-11 3506 KSP Residual norm 1.901010155588e-11 3507 KSP Residual norm 1.796180488368e-11 3508 KSP Residual norm 1.699277080956e-11 3509 KSP Residual norm 1.759894427066e-11 3510 KSP Residual norm 2.059450482348e-11 3511 KSP Residual norm 2.507050913528e-11 3512 KSP Residual norm 2.972961781508e-11 3513 KSP Residual norm 3.252252183780e-11 3514 KSP Residual norm 3.428957722512e-11 3515 KSP Residual norm 3.463880417866e-11 3516 KSP Residual norm 3.427755883941e-11 3517 KSP Residual norm 3.208725417721e-11 3518 KSP Residual norm 2.677507126355e-11 3519 KSP Residual norm 2.236378967435e-11 3520 KSP Residual norm 2.207950699969e-11 3521 KSP Residual norm 2.385802017190e-11 3522 KSP Residual norm 2.257803780762e-11 3523 KSP Residual norm 1.838352457592e-11 3524 KSP Residual norm 1.626939631450e-11 3525 KSP Residual norm 1.677127213050e-11 3526 KSP Residual norm 2.010745780446e-11 3527 KSP Residual norm 2.690300809964e-11 3528 KSP Residual norm 3.156391401697e-11 3529 KSP Residual norm 2.986449847236e-11 3530 KSP Residual norm 2.995009123434e-11 3531 
KSP Residual norm 3.721551922166e-11 3532 KSP Residual norm 4.578104314475e-11 3533 KSP Residual norm 4.571453044375e-11 3534 KSP Residual norm 3.927460641209e-11 3535 KSP Residual norm 3.725278372518e-11 3536 KSP Residual norm 4.183190722799e-11 3537 KSP Residual norm 4.378715646247e-11 3538 KSP Residual norm 3.551727326698e-11 3539 KSP Residual norm 2.648780261804e-11 3540 KSP Residual norm 2.448543310460e-11 3541 KSP Residual norm 2.648229551953e-11 3542 KSP Residual norm 2.565164787755e-11 3543 KSP Residual norm 2.167317159623e-11 3544 KSP Residual norm 1.868151969719e-11 3545 KSP Residual norm 1.989740836942e-11 3546 KSP Residual norm 2.693818975558e-11 3547 KSP Residual norm 3.719539520925e-11 3548 KSP Residual norm 3.725570063429e-11 3549 KSP Residual norm 3.592235320368e-11 3550 KSP Residual norm 4.194643814029e-11 3551 KSP Residual norm 5.625393945049e-11 3552 KSP Residual norm 6.848193281423e-11 3553 KSP Residual norm 7.476361977013e-11 3554 KSP Residual norm 7.629448804726e-11 3555 KSP Residual norm 7.948966755251e-11 3556 KSP Residual norm 9.469391772273e-11 3557 KSP Residual norm 1.006377896897e-10 3558 KSP Residual norm 8.134742644095e-11 3559 KSP Residual norm 6.906257314974e-11 3560 KSP Residual norm 7.255639163106e-11 3561 KSP Residual norm 7.845615100264e-11 3562 KSP Residual norm 6.983767645912e-11 3563 KSP Residual norm 5.776435366861e-11 3564 KSP Residual norm 5.020312401755e-11 3565 KSP Residual norm 5.308236513274e-11 3566 KSP Residual norm 6.410455100164e-11 3567 KSP Residual norm 6.905257813414e-11 3568 KSP Residual norm 6.425386668960e-11 3569 KSP Residual norm 7.300585487809e-11 3570 KSP Residual norm 1.018744260328e-10 3571 KSP Residual norm 1.273146999105e-10 3572 KSP Residual norm 1.306107061486e-10 3573 KSP Residual norm 1.240554449917e-10 3574 KSP Residual norm 1.203412278679e-10 3575 KSP Residual norm 1.286685256576e-10 3576 KSP Residual norm 1.310605708080e-10 3577 KSP Residual norm 1.076431316412e-10 3578 KSP Residual norm 
9.075971022190e-11 3579 KSP Residual norm 9.828811094089e-11 3580 KSP Residual norm 1.062524793050e-10 3581 KSP Residual norm 1.040610359948e-10 3582 KSP Residual norm 1.073688421809e-10 3583 KSP Residual norm 1.207020446715e-10 3584 KSP Residual norm 1.293740765834e-10 3585 KSP Residual norm 1.311118071459e-10 3586 KSP Residual norm 1.411861875230e-10 3587 KSP Residual norm 1.704348031847e-10 3588 KSP Residual norm 2.119964147848e-10 3589 KSP Residual norm 2.184565637273e-10 3590 KSP Residual norm 2.061434712586e-10 3591 KSP Residual norm 2.208789529637e-10 3592 KSP Residual norm 2.569472271703e-10 3593 KSP Residual norm 2.514591328420e-10 3594 KSP Residual norm 2.041642910364e-10 3595 KSP Residual norm 1.747461507741e-10 3596 KSP Residual norm 1.647823664682e-10 3597 KSP Residual norm 1.609055345481e-10 3598 KSP Residual norm 1.507957560195e-10 3599 KSP Residual norm 1.278141971873e-10 3600 KSP Residual norm 1.068480489382e-10 3601 KSP Residual norm 1.016852326628e-10 3602 KSP Residual norm 1.028613589170e-10 3603 KSP Residual norm 9.518751098156e-11 3604 KSP Residual norm 8.367373416348e-11 3605 KSP Residual norm 8.639865354682e-11 3606 KSP Residual norm 1.010015214290e-10 3607 KSP Residual norm 1.136116769874e-10 3608 KSP Residual norm 1.161673223308e-10 3609 KSP Residual norm 1.151582778761e-10 3610 KSP Residual norm 1.241679983574e-10 3611 KSP Residual norm 1.428304751257e-10 3612 KSP Residual norm 1.510049835125e-10 3613 KSP Residual norm 1.418430233482e-10 3614 KSP Residual norm 1.374868059590e-10 3615 KSP Residual norm 1.441014513217e-10 3616 KSP Residual norm 1.302501347568e-10 3617 KSP Residual norm 9.965452988884e-11 3618 KSP Residual norm 8.319435600947e-11 3619 KSP Residual norm 7.844965724595e-11 3620 KSP Residual norm 8.073184897737e-11 3621 KSP Residual norm 8.419652730861e-11 3622 KSP Residual norm 8.149522832848e-11 3623 KSP Residual norm 7.357652241908e-11 3624 KSP Residual norm 7.425233541078e-11 3625 KSP Residual norm 7.991312515611e-11 3626 
KSP Residual norm 7.477504734031e-11 3627 KSP Residual norm 7.180536945570e-11 3628 KSP Residual norm 8.189481163010e-11 3629 KSP Residual norm 1.005844287078e-10 3630 KSP Residual norm 1.061942429056e-10 3631 KSP Residual norm 9.594179144092e-11 3632 KSP Residual norm 8.372625363004e-11 3633 KSP Residual norm 7.932342454602e-11 3634 KSP Residual norm 8.111378002999e-11 3635 KSP Residual norm 7.957312449991e-11 3636 KSP Residual norm 6.627181192782e-11 3637 KSP Residual norm 5.758185314238e-11 3638 KSP Residual norm 5.812445371886e-11 3639 KSP Residual norm 5.664356915330e-11 3640 KSP Residual norm 4.965946670570e-11 3641 KSP Residual norm 4.618439329347e-11 3642 KSP Residual norm 4.683942558805e-11 3643 KSP Residual norm 4.713795470647e-11 3644 KSP Residual norm 4.714161642911e-11 3645 KSP Residual norm 4.311925609802e-11 3646 KSP Residual norm 3.691664744998e-11 3647 KSP Residual norm 3.482778490986e-11 3648 KSP Residual norm 3.630269369446e-11 3649 KSP Residual norm 3.872588874156e-11 3650 KSP Residual norm 3.995480920393e-11 3651 KSP Residual norm 4.122004301925e-11 3652 KSP Residual norm 3.979820591653e-11 3653 KSP Residual norm 4.112604995905e-11 3654 KSP Residual norm 4.840137256878e-11 3655 KSP Residual norm 5.164598261273e-11 3656 KSP Residual norm 4.511625162187e-11 3657 KSP Residual norm 3.759183533752e-11 3658 KSP Residual norm 3.345117403518e-11 3659 KSP Residual norm 3.244567247183e-11 3660 KSP Residual norm 3.269853992911e-11 3661 KSP Residual norm 2.870709551197e-11 3662 KSP Residual norm 2.368446495162e-11 3663 KSP Residual norm 2.293210723648e-11 3664 KSP Residual norm 2.335040757054e-11 3665 KSP Residual norm 2.033271210734e-11 3666 KSP Residual norm 1.797222792896e-11 3667 KSP Residual norm 1.845145837970e-11 3668 KSP Residual norm 2.019520846332e-11 3669 KSP Residual norm 2.191520621644e-11 3670 KSP Residual norm 2.246270564243e-11 3671 KSP Residual norm 2.130447957007e-11 3672 KSP Residual norm 2.171994588838e-11 3673 KSP Residual norm 
[KSP monitor output elided: iterations 3674–4671 of "KSP Residual norm" lines, with residual norms oscillating between roughly 5e-13 and 4e-11 and showing no further convergence.]
KSP Residual norm 8.982475649870e-12 4672 KSP Residual norm 8.551394026359e-12 4673 KSP Residual norm 8.252729113773e-12 4674 KSP Residual norm 8.609829953285e-12 4675 KSP Residual norm 9.177978711003e-12 4676 KSP Residual norm 8.754104994160e-12 4677 KSP Residual norm 8.120036317947e-12 4678 KSP Residual norm 8.175983026166e-12 4679 KSP Residual norm 8.493377495447e-12 4680 KSP Residual norm 8.866327870857e-12 4681 KSP Residual norm 9.841449534348e-12 4682 KSP Residual norm 1.108810403895e-11 4683 KSP Residual norm 1.191822390686e-11 4684 KSP Residual norm 1.236530620904e-11 4685 KSP Residual norm 1.207590095003e-11 4686 KSP Residual norm 1.101592104037e-11 4687 KSP Residual norm 1.046709840251e-11 4688 KSP Residual norm 1.055802974104e-11 4689 KSP Residual norm 1.043374293668e-11 4690 KSP Residual norm 1.012185489287e-11 4691 KSP Residual norm 1.034885810600e-11 4692 KSP Residual norm 1.178359008383e-11 4693 KSP Residual norm 1.434671172908e-11 4694 KSP Residual norm 1.675628333176e-11 4695 KSP Residual norm 1.797366589557e-11 4696 KSP Residual norm 1.815366847698e-11 4697 KSP Residual norm 1.696625385829e-11 4698 KSP Residual norm 1.632901949394e-11 4699 KSP Residual norm 1.726354832203e-11 4700 KSP Residual norm 1.893609702416e-11 4701 KSP Residual norm 2.051892059760e-11 4702 KSP Residual norm 2.063956469454e-11 4703 KSP Residual norm 1.965986839390e-11 4704 KSP Residual norm 1.816328695650e-11 4705 KSP Residual norm 1.808038363318e-11 4706 KSP Residual norm 1.987092228216e-11 4707 KSP Residual norm 2.145615846570e-11 4708 KSP Residual norm 2.050740162586e-11 4709 KSP Residual norm 1.862007064475e-11 4710 KSP Residual norm 1.803280033885e-11 4711 KSP Residual norm 1.931518465701e-11 4712 KSP Residual norm 2.171205145338e-11 4713 KSP Residual norm 2.192347139066e-11 4714 KSP Residual norm 1.846774002814e-11 4715 KSP Residual norm 1.577206639506e-11 4716 KSP Residual norm 1.431070355431e-11 4717 KSP Residual norm 1.344026404803e-11 4718 KSP Residual norm 
1.326746770335e-11 4719 KSP Residual norm 1.345057128103e-11 4720 KSP Residual norm 1.203069738915e-11 4721 KSP Residual norm 1.048837092959e-11 4722 KSP Residual norm 1.010954959809e-11 4723 KSP Residual norm 1.041634504813e-11 4724 KSP Residual norm 1.082730507291e-11 4725 KSP Residual norm 1.119631407011e-11 4726 KSP Residual norm 1.075834599396e-11 4727 KSP Residual norm 9.408711270743e-12 4728 KSP Residual norm 8.824932398316e-12 4729 KSP Residual norm 9.937830432875e-12 4730 KSP Residual norm 1.238993234507e-11 4731 KSP Residual norm 1.330887788740e-11 4732 KSP Residual norm 1.184394301450e-11 4733 KSP Residual norm 1.067538214588e-11 4734 KSP Residual norm 1.045162151739e-11 4735 KSP Residual norm 1.047899708575e-11 4736 KSP Residual norm 1.041675562522e-11 4737 KSP Residual norm 9.808209416253e-12 4738 KSP Residual norm 9.646002890976e-12 4739 KSP Residual norm 1.053492574712e-11 4740 KSP Residual norm 1.208524407433e-11 4741 KSP Residual norm 1.248932897513e-11 4742 KSP Residual norm 1.267598776419e-11 4743 KSP Residual norm 1.384048604244e-11 4744 KSP Residual norm 1.385604557299e-11 4745 KSP Residual norm 1.223423508217e-11 4746 KSP Residual norm 1.113779128694e-11 4747 KSP Residual norm 1.170414419102e-11 4748 KSP Residual norm 1.451446155662e-11 4749 KSP Residual norm 1.880211790314e-11 4750 KSP Residual norm 2.014362685443e-11 4751 KSP Residual norm 1.749123985572e-11 4752 KSP Residual norm 1.422543361112e-11 4753 KSP Residual norm 1.283401510119e-11 4754 KSP Residual norm 1.403997928605e-11 4755 KSP Residual norm 1.780921236354e-11 4756 KSP Residual norm 2.047589937737e-11 4757 KSP Residual norm 2.093911196844e-11 4758 KSP Residual norm 2.129996071565e-11 4759 KSP Residual norm 2.056259474967e-11 4760 KSP Residual norm 1.939168286107e-11 4761 KSP Residual norm 2.128032667866e-11 4762 KSP Residual norm 2.609543115402e-11 4763 KSP Residual norm 3.034335335761e-11 4764 KSP Residual norm 3.173697591519e-11 4765 KSP Residual norm 3.396616047614e-11 4766 
KSP Residual norm 3.662948112713e-11 4767 KSP Residual norm 3.508695526962e-11 4768 KSP Residual norm 3.102215343197e-11 4769 KSP Residual norm 2.852274169396e-11 4770 KSP Residual norm 2.731246915888e-11 4771 KSP Residual norm 2.678817645367e-11 4772 KSP Residual norm 2.588747266304e-11 4773 KSP Residual norm 2.658482869513e-11 4774 KSP Residual norm 2.855639113681e-11 4775 KSP Residual norm 2.748590962730e-11 4776 KSP Residual norm 2.382491959739e-11 4777 KSP Residual norm 2.157501807508e-11 4778 KSP Residual norm 2.231418214196e-11 4779 KSP Residual norm 2.651336732086e-11 4780 KSP Residual norm 2.900178937325e-11 4781 KSP Residual norm 2.671059355386e-11 4782 KSP Residual norm 2.335905118080e-11 4783 KSP Residual norm 1.986987246025e-11 4784 KSP Residual norm 1.772992616766e-11 4785 KSP Residual norm 1.787960670018e-11 4786 KSP Residual norm 1.789708557742e-11 4787 KSP Residual norm 1.629146275550e-11 4788 KSP Residual norm 1.650083599452e-11 4789 KSP Residual norm 1.786004243269e-11 4790 KSP Residual norm 1.735122595563e-11 4791 KSP Residual norm 1.429370823342e-11 4792 KSP Residual norm 1.168515198892e-11 4793 KSP Residual norm 9.643375728093e-12 4794 KSP Residual norm 8.549683983834e-12 4795 KSP Residual norm 7.931987211215e-12 4796 KSP Residual norm 8.066207692987e-12 4797 KSP Residual norm 9.322869535306e-12 4798 KSP Residual norm 1.067089680501e-11 4799 KSP Residual norm 1.092855740926e-11 4800 KSP Residual norm 1.025146496110e-11 4801 KSP Residual norm 9.641315875605e-12 4802 KSP Residual norm 8.990564325214e-12 4803 KSP Residual norm 8.138053133236e-12 4804 KSP Residual norm 7.997260465006e-12 4805 KSP Residual norm 8.401812293564e-12 4806 KSP Residual norm 8.125459308851e-12 4807 KSP Residual norm 8.144550381611e-12 4808 KSP Residual norm 8.436205872983e-12 4809 KSP Residual norm 8.119208898815e-12 4810 KSP Residual norm 7.067654731680e-12 4811 KSP Residual norm 6.609441204809e-12 4812 KSP Residual norm 7.258040878607e-12 4813 KSP Residual norm 
8.293291269205e-12 4814 KSP Residual norm 8.762005561265e-12 4815 KSP Residual norm 9.433016072475e-12 4816 KSP Residual norm 1.101268541371e-11 4817 KSP Residual norm 1.217209630734e-11 4818 KSP Residual norm 1.160717861650e-11 4819 KSP Residual norm 1.096073106840e-11 4820 KSP Residual norm 1.102387325331e-11 4821 KSP Residual norm 1.053735710274e-11 4822 KSP Residual norm 1.041638896212e-11 4823 KSP Residual norm 1.071789523708e-11 4824 KSP Residual norm 1.085219228887e-11 4825 KSP Residual norm 1.152477825326e-11 4826 KSP Residual norm 1.328012397873e-11 4827 KSP Residual norm 1.541174475805e-11 4828 KSP Residual norm 1.779282498447e-11 4829 KSP Residual norm 1.814441514865e-11 4830 KSP Residual norm 1.711401675545e-11 4831 KSP Residual norm 1.628403317988e-11 4832 KSP Residual norm 1.425464517473e-11 4833 KSP Residual norm 1.197787529189e-11 4834 KSP Residual norm 1.110182904295e-11 4835 KSP Residual norm 1.150630294373e-11 4836 KSP Residual norm 1.311602337110e-11 4837 KSP Residual norm 1.545426293092e-11 4838 KSP Residual norm 1.713988193771e-11 4839 KSP Residual norm 1.616728053326e-11 4840 KSP Residual norm 1.445825300591e-11 4841 KSP Residual norm 1.285385461068e-11 4842 KSP Residual norm 1.185860497581e-11 4843 KSP Residual norm 1.159281833783e-11 4844 KSP Residual norm 1.152160894835e-11 4845 KSP Residual norm 1.035927823783e-11 4846 KSP Residual norm 8.156528465170e-12 4847 KSP Residual norm 6.172196056574e-12 4848 KSP Residual norm 5.049120374520e-12 4849 KSP Residual norm 4.832068510014e-12 4850 KSP Residual norm 5.497529062725e-12 4851 KSP Residual norm 6.697801240399e-12 4852 KSP Residual norm 7.903818499990e-12 4853 KSP Residual norm 8.586650159792e-12 4854 KSP Residual norm 8.520725490526e-12 4855 KSP Residual norm 7.811696674260e-12 4856 KSP Residual norm 6.822526186585e-12 4857 KSP Residual norm 6.122277287536e-12 4858 KSP Residual norm 6.006852570121e-12 4859 KSP Residual norm 5.857339572564e-12 4860 KSP Residual norm 5.197608518742e-12 4861 
KSP Residual norm 4.658913874410e-12 4862 KSP Residual norm 4.766920372507e-12 4863 KSP Residual norm 5.778108687216e-12 4864 KSP Residual norm 7.557462234361e-12 4865 KSP Residual norm 9.294102964046e-12 4866 KSP Residual norm 9.316060800038e-12 4867 KSP Residual norm 8.694025512986e-12 4868 KSP Residual norm 7.627550541547e-12 4869 KSP Residual norm 6.316597904152e-12 4870 KSP Residual norm 5.349593547429e-12 4871 KSP Residual norm 4.641893850650e-12 4872 KSP Residual norm 4.223047359762e-12 4873 KSP Residual norm 4.154842883891e-12 4874 KSP Residual norm 4.667446739021e-12 4875 KSP Residual norm 5.772284541539e-12 4876 KSP Residual norm 6.522041314963e-12 4877 KSP Residual norm 6.204002131811e-12 4878 KSP Residual norm 5.380369149348e-12 4879 KSP Residual norm 4.965168010305e-12 4880 KSP Residual norm 5.143349421213e-12 4881 KSP Residual norm 5.411730884215e-12 4882 KSP Residual norm 5.675485775830e-12 4883 KSP Residual norm 5.861871474799e-12 4884 KSP Residual norm 6.451873270654e-12 4885 KSP Residual norm 8.098552899208e-12 4886 KSP Residual norm 9.955618586465e-12 4887 KSP Residual norm 8.729172022580e-12 4888 KSP Residual norm 5.977968854738e-12 4889 KSP Residual norm 4.573038471728e-12 4890 KSP Residual norm 4.693209980273e-12 4891 KSP Residual norm 6.348635565815e-12 4892 KSP Residual norm 9.128365229343e-12 4893 KSP Residual norm 1.215066616872e-11 4894 KSP Residual norm 1.243263951878e-11 4895 KSP Residual norm 1.007098265013e-11 4896 KSP Residual norm 6.990375987309e-12 4897 KSP Residual norm 4.834316721470e-12 4898 KSP Residual norm 3.946205997274e-12 4899 KSP Residual norm 4.030122530025e-12 4900 KSP Residual norm 4.933942965639e-12 4901 KSP Residual norm 6.736305693442e-12 4902 KSP Residual norm 8.224787480131e-12 4903 KSP Residual norm 8.014787379513e-12 4904 KSP Residual norm 6.924073351369e-12 4905 KSP Residual norm 6.460348276073e-12 4906 KSP Residual norm 6.493708996848e-12 4907 KSP Residual norm 6.238736535374e-12 4908 KSP Residual norm 
5.553332164133e-12 4909 KSP Residual norm 4.622901120056e-12 4910 KSP Residual norm 3.893794998099e-12 4911 KSP Residual norm 3.178213852006e-12 4912 KSP Residual norm 2.579938691494e-12 4913 KSP Residual norm 2.546170415180e-12 4914 KSP Residual norm 3.322580961887e-12 4915 KSP Residual norm 4.372048756355e-12 4916 KSP Residual norm 4.498495081330e-12 4917 KSP Residual norm 3.790374124959e-12 4918 KSP Residual norm 3.296095716772e-12 4919 KSP Residual norm 3.025986327543e-12 4920 KSP Residual norm 2.942230175436e-12 4921 KSP Residual norm 3.196296861437e-12 4922 KSP Residual norm 3.450864950673e-12 4923 KSP Residual norm 3.395918075624e-12 4924 KSP Residual norm 2.853865164270e-12 4925 KSP Residual norm 2.213003858805e-12 4926 KSP Residual norm 1.815116721434e-12 4927 KSP Residual norm 1.636228826521e-12 4928 KSP Residual norm 1.645166062020e-12 4929 KSP Residual norm 1.982567849495e-12 4930 KSP Residual norm 2.642089608231e-12 4931 KSP Residual norm 3.522236826372e-12 4932 KSP Residual norm 4.370638517286e-12 4933 KSP Residual norm 5.223417570235e-12 4934 KSP Residual norm 6.252704758452e-12 4935 KSP Residual norm 6.304957869600e-12 4936 KSP Residual norm 5.329861141447e-12 4937 KSP Residual norm 4.361544928313e-12 4938 KSP Residual norm 3.800178014254e-12 4939 KSP Residual norm 3.674235219780e-12 4940 KSP Residual norm 3.829251650727e-12 4941 KSP Residual norm 4.671949406728e-12 4942 KSP Residual norm 6.307943697523e-12 4943 KSP Residual norm 7.622661537012e-12 4944 KSP Residual norm 7.907141547634e-12 4945 KSP Residual norm 7.538648761784e-12 4946 KSP Residual norm 7.100075847875e-12 4947 KSP Residual norm 6.826641923909e-12 4948 KSP Residual norm 7.456926476580e-12 4949 KSP Residual norm 8.599291594599e-12 4950 KSP Residual norm 8.900782807233e-12 4951 KSP Residual norm 8.285562129459e-12 4952 KSP Residual norm 7.712665283050e-12 4953 KSP Residual norm 7.782455247724e-12 4954 KSP Residual norm 8.616432928979e-12 4955 KSP Residual norm 9.735884958254e-12 4956 
KSP Residual norm 9.078676508734e-12 4957 KSP Residual norm 7.195086193457e-12 4958 KSP Residual norm 5.514948214717e-12 4959 KSP Residual norm 4.449628412853e-12 4960 KSP Residual norm 3.903214010542e-12 4961 KSP Residual norm 4.016351601839e-12 4962 KSP Residual norm 4.641492860681e-12 4963 KSP Residual norm 5.218454247470e-12 4964 KSP Residual norm 4.992857141790e-12 4965 KSP Residual norm 3.891308331353e-12 4966 KSP Residual norm 2.936668024722e-12 4967 KSP Residual norm 2.476250403264e-12 4968 KSP Residual norm 2.561830854608e-12 4969 KSP Residual norm 3.089239519447e-12 4970 KSP Residual norm 3.982806307766e-12 4971 KSP Residual norm 4.540829390456e-12 4972 KSP Residual norm 4.179522960858e-12 4973 KSP Residual norm 3.333512335861e-12 4974 KSP Residual norm 2.522186352353e-12 4975 KSP Residual norm 2.011090607646e-12 4976 KSP Residual norm 1.835163587449e-12 4977 KSP Residual norm 1.784056634117e-12 4978 KSP Residual norm 1.661268075403e-12 4979 KSP Residual norm 1.562729928729e-12 4980 KSP Residual norm 1.551809712420e-12 4981 KSP Residual norm 1.508478650320e-12 4982 KSP Residual norm 1.535724842786e-12 4983 KSP Residual norm 1.685158636575e-12 4984 KSP Residual norm 1.927773842777e-12 4985 KSP Residual norm 2.014045562930e-12 4986 KSP Residual norm 1.988897435646e-12 4987 KSP Residual norm 1.951269018622e-12 4988 KSP Residual norm 1.746060833142e-12 4989 KSP Residual norm 1.475543475880e-12 4990 KSP Residual norm 1.149316236535e-12 4991 KSP Residual norm 8.808269593741e-13 4992 KSP Residual norm 7.762654757119e-13 4993 KSP Residual norm 9.174295734665e-13 4994 KSP Residual norm 1.318276463519e-12 4995 KSP Residual norm 1.915399899229e-12 4996 KSP Residual norm 2.515203610354e-12 4997 KSP Residual norm 2.950900736452e-12 4998 KSP Residual norm 3.012171764100e-12 4999 KSP Residual norm 2.615647856943e-12 5000 KSP Residual norm 2.376741547762e-12 5001 KSP Residual norm 2.670017484234e-12 5002 KSP Residual norm 3.443029543080e-12 5003 KSP Residual norm 
4.535505524500e-12 5004 KSP Residual norm 5.435864963087e-12 5005 KSP Residual norm 5.764025235757e-12 5006 KSP Residual norm 5.001153657512e-12 5007 KSP Residual norm 3.958426230740e-12 5008 KSP Residual norm 3.299215260665e-12 5009 KSP Residual norm 2.933458116486e-12 5010 KSP Residual norm 2.945627316068e-12 5011 KSP Residual norm 3.620554433854e-12 5012 KSP Residual norm 5.265587507768e-12 5013 KSP Residual norm 7.250353854742e-12 5014 KSP Residual norm 7.706677328485e-12 5015 KSP Residual norm 7.153179331880e-12 5016 KSP Residual norm 6.779919264271e-12 5017 KSP Residual norm 6.477956663018e-12 5018 KSP Residual norm 6.246895597169e-12 5019 KSP Residual norm 6.052285127208e-12 5020 KSP Residual norm 5.747497768001e-12 5021 KSP Residual norm 5.295060208837e-12 5022 KSP Residual norm 5.679471168966e-12 5023 KSP Residual norm 6.628464485607e-12 5024 KSP Residual norm 7.595499717625e-12 5025 KSP Residual norm 7.796270589664e-12 5026 KSP Residual norm 7.809864583857e-12 5027 KSP Residual norm 7.785637803330e-12 5028 KSP Residual norm 8.085559474345e-12 5029 KSP Residual norm 8.463470315050e-12 5030 KSP Residual norm 7.495866059666e-12 5031 KSP Residual norm 6.092813625487e-12 5032 KSP Residual norm 5.215489882567e-12 5033 KSP Residual norm 4.718919808163e-12 5034 KSP Residual norm 4.652605574060e-12 5035 KSP Residual norm 5.430866269390e-12 5036 KSP Residual norm 6.269024385182e-12 5037 KSP Residual norm 5.712582670954e-12 5038 KSP Residual norm 5.866304317659e-12 5039 KSP Residual norm 7.171010672447e-12 5040 KSP Residual norm 8.630717835717e-12 5041 KSP Residual norm 8.667415293904e-12 5042 KSP Residual norm 7.103851835432e-12 5043 KSP Residual norm 5.022597389599e-12 5044 KSP Residual norm 3.626150716996e-12 5045 KSP Residual norm 2.979784701214e-12 5046 KSP Residual norm 3.226948875951e-12 5047 KSP Residual norm 4.143202172347e-12 5048 KSP Residual norm 5.042760056685e-12 5049 KSP Residual norm 4.673684040662e-12 5050 KSP Residual norm 3.719695552805e-12 5051 
KSP Residual norm 3.168777626937e-12 5052 KSP Residual norm 2.875148509221e-12 5053 KSP Residual norm 2.623294387630e-12 5054 KSP Residual norm 2.520309633608e-12 5055 KSP Residual norm 2.551248588685e-12 5056 KSP Residual norm 2.770807722568e-12 5057 KSP Residual norm 3.039161133311e-12 5058 KSP Residual norm 3.202677963583e-12 5059 KSP Residual norm 3.258107047385e-12 5060 KSP Residual norm 3.475023018085e-12 5061 KSP Residual norm 3.405977414800e-12 5062 KSP Residual norm 3.160561072807e-12 5063 KSP Residual norm 2.833595966846e-12 5064 KSP Residual norm 2.451655611483e-12 5065 KSP Residual norm 2.294236324466e-12 5066 KSP Residual norm 2.526655323649e-12 5067 KSP Residual norm 2.779099456475e-12 5068 KSP Residual norm 3.030528006444e-12 5069 KSP Residual norm 3.518839310326e-12 5070 KSP Residual norm 4.068042799603e-12 5071 KSP Residual norm 4.583437422417e-12 5072 KSP Residual norm 5.239352145788e-12 5073 KSP Residual norm 5.865542149170e-12 5074 KSP Residual norm 6.073640108176e-12 5075 KSP Residual norm 5.767436584434e-12 5076 KSP Residual norm 5.144636252572e-12 5077 KSP Residual norm 4.815701821238e-12 5078 KSP Residual norm 5.095070531482e-12 5079 KSP Residual norm 5.910167109382e-12 5080 KSP Residual norm 7.210559843956e-12 5081 KSP Residual norm 9.049114018761e-12 5082 KSP Residual norm 1.092927403720e-11 5083 KSP Residual norm 1.305352716560e-11 5084 KSP Residual norm 1.442727643376e-11 5085 KSP Residual norm 1.333539735506e-11 5086 KSP Residual norm 1.273493823764e-11 5087 KSP Residual norm 1.416915110236e-11 5088 KSP Residual norm 1.600242967846e-11 5089 KSP Residual norm 1.542236696747e-11 5090 KSP Residual norm 1.359087259281e-11 5091 KSP Residual norm 1.139818014743e-11 5092 KSP Residual norm 9.811029958586e-12 5093 KSP Residual norm 9.470171379481e-12 5094 KSP Residual norm 9.900283064174e-12 5095 KSP Residual norm 1.076393210753e-11 5096 KSP Residual norm 1.205806947966e-11 5097 KSP Residual norm 1.315929682606e-11 5098 KSP Residual norm 
1.377163497629e-11 5099 KSP Residual norm 1.534368936316e-11 5100 KSP Residual norm 1.730881577674e-11 5101 KSP Residual norm 1.748350020642e-11 5102 KSP Residual norm 1.508554358809e-11 5103 KSP Residual norm 1.215944801548e-11 5104 KSP Residual norm 1.066975450451e-11 5105 KSP Residual norm 1.119357989442e-11 5106 KSP Residual norm 1.261621605425e-11 5107 KSP Residual norm 1.253485467066e-11 5108 KSP Residual norm 1.271922608567e-11 5109 KSP Residual norm 1.429569511996e-11 5110 KSP Residual norm 1.618165005719e-11 5111 KSP Residual norm 1.782808141838e-11 5112 KSP Residual norm 1.898079650754e-11 5113 KSP Residual norm 1.836721379825e-11 5114 KSP Residual norm 1.779756376426e-11 5115 KSP Residual norm 1.733821780578e-11 5116 KSP Residual norm 1.746151087346e-11 5117 KSP Residual norm 1.658327870763e-11 5118 KSP Residual norm 1.524857515496e-11 5119 KSP Residual norm 1.375499040244e-11 5120 KSP Residual norm 1.326293918046e-11 5121 KSP Residual norm 1.263184614019e-11 5122 KSP Residual norm 1.111223998225e-11 5123 KSP Residual norm 9.690137231123e-12 5124 KSP Residual norm 9.785355596885e-12 5125 KSP Residual norm 1.107099382209e-11 5126 KSP Residual norm 1.255391932541e-11 5127 KSP Residual norm 1.350991051568e-11 5128 KSP Residual norm 1.463854094704e-11 5129 KSP Residual norm 1.520438294021e-11 5130 KSP Residual norm 1.457125873395e-11 5131 KSP Residual norm 1.425386410197e-11 5132 KSP Residual norm 1.584312965656e-11 5133 KSP Residual norm 1.727907211058e-11 5134 KSP Residual norm 1.472178077635e-11 5135 KSP Residual norm 1.144584801987e-11 5136 KSP Residual norm 1.015990981366e-11 5137 KSP Residual norm 1.014184826995e-11 5138 KSP Residual norm 1.004911333289e-11 5139 KSP Residual norm 1.016032364438e-11 5140 KSP Residual norm 9.236852770611e-12 5141 KSP Residual norm 7.496401878520e-12 5142 KSP Residual norm 6.916648420479e-12 5143 KSP Residual norm 7.795077683598e-12 5144 KSP Residual norm 9.253707862067e-12 5145 KSP Residual norm 9.991879301588e-12 5146 
KSP Residual norm 1.041607901118e-11 5147 KSP Residual norm 1.190286698193e-11 5148 KSP Residual norm 1.374906642805e-11 5149 KSP Residual norm 1.413298797969e-11 5150 KSP Residual norm 1.309817235225e-11 5151 KSP Residual norm 1.258896424291e-11 5152 KSP Residual norm 1.277939733565e-11 5153 KSP Residual norm 1.220465350040e-11 5154 KSP Residual norm 1.178205863279e-11 5155 KSP Residual norm 1.131144867847e-11 5156 KSP Residual norm 1.097622220791e-11 5157 KSP Residual norm 1.076970992467e-11 5158 KSP Residual norm 1.116045919828e-11 5159 KSP Residual norm 1.231332959387e-11 5160 KSP Residual norm 1.425931051189e-11 5161 KSP Residual norm 1.650665889069e-11 5162 KSP Residual norm 1.799158075353e-11 5163 KSP Residual norm 1.721470579305e-11 5164 KSP Residual norm 1.517444285977e-11 5165 KSP Residual norm 1.376778343574e-11 5166 KSP Residual norm 1.406880009387e-11 5167 KSP Residual norm 1.438758291399e-11 5168 KSP Residual norm 1.369415871871e-11 5169 KSP Residual norm 1.195606915145e-11 5170 KSP Residual norm 1.056856896198e-11 5171 KSP Residual norm 1.101122774990e-11 5172 KSP Residual norm 1.314944848074e-11 5173 KSP Residual norm 1.537357824382e-11 5174 KSP Residual norm 1.668224620809e-11 5175 KSP Residual norm 1.773877453808e-11 5176 KSP Residual norm 1.846276305107e-11 5177 KSP Residual norm 2.117767971511e-11 5178 KSP Residual norm 2.649136526719e-11 5179 KSP Residual norm 3.448892290166e-11 5180 KSP Residual norm 3.963620102471e-11 5181 KSP Residual norm 3.719600204827e-11 5182 KSP Residual norm 3.499516929507e-11 5183 KSP Residual norm 3.516639836872e-11 5184 KSP Residual norm 3.479613149857e-11 5185 KSP Residual norm 3.034967475577e-11 5186 KSP Residual norm 2.395760066371e-11 5187 KSP Residual norm 2.240435006689e-11 5188 KSP Residual norm 2.562733056918e-11 5189 KSP Residual norm 3.233203819317e-11 5190 KSP Residual norm 3.607091389438e-11 5191 KSP Residual norm 3.244693762497e-11 5192 KSP Residual norm 3.106205064792e-11 5193 KSP Residual norm 
3.627014944643e-11 5194 KSP Residual norm 4.482936931769e-11 5195 KSP Residual norm 5.101622015491e-11 5196 KSP Residual norm 5.159081383164e-11 5197 KSP Residual norm 5.163585229209e-11 5198 KSP Residual norm 5.734559264366e-11 5199 KSP Residual norm 6.897281250318e-11 5200 KSP Residual norm 7.151734304883e-11 5201 KSP Residual norm 6.078510394485e-11 5202 KSP Residual norm 5.333184351410e-11 5203 KSP Residual norm 5.207964491519e-11 5204 KSP Residual norm 5.152547627309e-11 5205 KSP Residual norm 4.577269191808e-11 5206 KSP Residual norm 4.073371927370e-11 5207 KSP Residual norm 4.203594106114e-11 5208 KSP Residual norm 4.601618707764e-11 5209 KSP Residual norm 4.965284343808e-11 5210 KSP Residual norm 5.359619357967e-11 5211 KSP Residual norm 5.550918904485e-11 5212 KSP Residual norm 5.172605836243e-11 5213 KSP Residual norm 5.069626436267e-11 5214 KSP Residual norm 5.812329376742e-11 5215 KSP Residual norm 6.562198478921e-11 5216 KSP Residual norm 6.194388712776e-11 5217 KSP Residual norm 4.775219635047e-11 5218 KSP Residual norm 3.534376936954e-11 5219 KSP Residual norm 2.864278997591e-11 5220 KSP Residual norm 2.695595190942e-11 5221 KSP Residual norm 2.751794742143e-11 5222 KSP Residual norm 2.622317547439e-11 5223 KSP Residual norm 2.189054781817e-11 5224 KSP Residual norm 1.803998027451e-11 5225 KSP Residual norm 1.689983125034e-11 5226 KSP Residual norm 1.657596909125e-11 5227 KSP Residual norm 1.510217802600e-11 5228 KSP Residual norm 1.361667544790e-11 5229 KSP Residual norm 1.396466890197e-11 5230 KSP Residual norm 1.606469929500e-11 5231 KSP Residual norm 2.059168569761e-11 5232 KSP Residual norm 2.424678840713e-11 5233 KSP Residual norm 2.293308189030e-11 5234 KSP Residual norm 2.140307643676e-11 5235 KSP Residual norm 2.242980862915e-11 5236 KSP Residual norm 2.500816697822e-11 5237 KSP Residual norm 2.624572454986e-11 5238 KSP Residual norm 2.406129680332e-11 5239 KSP Residual norm 1.974460118952e-11 5240 KSP Residual norm 1.881406534135e-11 5241 
KSP Residual norm 2.226203647963e-11 5242 KSP Residual norm 2.371210779290e-11 5243 KSP Residual norm 2.156812268998e-11 5244 KSP Residual norm 2.321143350690e-11 5245 KSP Residual norm 2.917661678263e-11 5246 KSP Residual norm 3.346465501570e-11 5247 KSP Residual norm 3.122259372121e-11 5248 KSP Residual norm 2.688778353639e-11 5249 KSP Residual norm 2.607531039019e-11 5250 KSP Residual norm 2.706686313062e-11 5251 KSP Residual norm 2.672688140460e-11 5252 KSP Residual norm 2.568861325810e-11 5253 KSP Residual norm 2.470453992797e-11 5254 KSP Residual norm 2.321415245677e-11 5255 KSP Residual norm 2.311544680159e-11 5256 KSP Residual norm 2.607595913927e-11 5257 KSP Residual norm 2.955130386061e-11 5258 KSP Residual norm 2.995384714402e-11 5259 KSP Residual norm 3.016381854237e-11 5260 KSP Residual norm 3.103367599536e-11 5261 KSP Residual norm 3.307388464309e-11 5262 KSP Residual norm 3.532435461821e-11 5263 KSP Residual norm 3.313312317926e-11 5264 KSP Residual norm 3.057945733555e-11 5265 KSP Residual norm 2.895140201395e-11 5266 KSP Residual norm 2.785932000944e-11 5267 KSP Residual norm 2.877102398968e-11 5268 KSP Residual norm 3.281171426540e-11 5269 KSP Residual norm 3.295290718762e-11 5270 KSP Residual norm 3.099250282615e-11 5271 KSP Residual norm 3.150868230177e-11 5272 KSP Residual norm 3.540763188018e-11 5273 KSP Residual norm 4.138148450052e-11 5274 KSP Residual norm 4.671281401968e-11 5275 KSP Residual norm 4.707694483194e-11 5276 KSP Residual norm 4.632466160814e-11 5277 KSP Residual norm 4.980757493483e-11 5278 KSP Residual norm 5.213965586496e-11 5279 KSP Residual norm 4.910795000236e-11 5280 KSP Residual norm 4.328401340251e-11 5281 KSP Residual norm 3.643260717946e-11 5282 KSP Residual norm 3.328844964617e-11 5283 KSP Residual norm 3.670242246005e-11 5284 KSP Residual norm 4.194514402295e-11 5285 KSP Residual norm 3.820112135113e-11 5286 KSP Residual norm 3.357648126872e-11 5287 KSP Residual norm 3.978966889095e-11 5288 KSP Residual norm 
[KSP residual-norm monitor output elided: iterations ~5288 through ~6285, with residual norms fluctuating between roughly 1.5e-12 and 2.6e-10 and showing no sustained decrease.]
KSP Residual norm 2.107918059937e-12 6287 KSP Residual norm 2.114714991963e-12 6288 KSP Residual norm 2.008220413993e-12 6289 KSP Residual norm 1.748020696670e-12 6290 KSP Residual norm 1.529987445798e-12 6291 KSP Residual norm 1.441564751622e-12 6292 KSP Residual norm 1.284438217384e-12 6293 KSP Residual norm 1.117144442717e-12 6294 KSP Residual norm 9.916070704323e-13 6295 KSP Residual norm 9.056205542244e-13 6296 KSP Residual norm 7.809150105465e-13 6297 KSP Residual norm 7.168072899049e-13 6298 KSP Residual norm 7.762929198993e-13 6299 KSP Residual norm 8.195180335768e-13 6300 KSP Residual norm 7.834255161755e-13 6301 KSP Residual norm 7.807438983814e-13 6302 KSP Residual norm 7.917123178801e-13 6303 KSP Residual norm 7.277486451241e-13 6304 KSP Residual norm 6.151838727387e-13 6305 KSP Residual norm 5.577869965085e-13 6306 KSP Residual norm 5.790906428624e-13 6307 KSP Residual norm 6.223010835772e-13 6308 KSP Residual norm 6.309347591521e-13 6309 KSP Residual norm 6.568806863555e-13 6310 KSP Residual norm 7.515999655250e-13 6311 KSP Residual norm 8.523532161536e-13 6312 KSP Residual norm 9.053249424343e-13 6313 KSP Residual norm 9.817148629637e-13 6314 KSP Residual norm 1.121473061125e-12 6315 KSP Residual norm 1.093016469358e-12 6316 KSP Residual norm 9.097535270373e-13 6317 KSP Residual norm 8.383120318295e-13 6318 KSP Residual norm 9.402598006788e-13 6319 KSP Residual norm 1.068250438640e-12 6320 KSP Residual norm 1.159795250130e-12 6321 KSP Residual norm 1.271584748554e-12 6322 KSP Residual norm 1.292059809790e-12 6323 KSP Residual norm 1.081292932737e-12 6324 KSP Residual norm 9.762291184488e-13 6325 KSP Residual norm 1.013449263832e-12 6326 KSP Residual norm 1.042015380517e-12 6327 KSP Residual norm 9.586958853510e-13 6328 KSP Residual norm 7.594666934158e-13 6329 KSP Residual norm 6.020557194235e-13 6330 KSP Residual norm 5.653239520633e-13 6331 KSP Residual norm 5.602332284611e-13 6332 KSP Residual norm 5.234046249674e-13 6333 KSP Residual norm 
4.673555480568e-13 6334 KSP Residual norm 4.427462804127e-13 6335 KSP Residual norm 4.758602440865e-13 6336 KSP Residual norm 5.428242174262e-13 6337 KSP Residual norm 5.689572956687e-13 6338 KSP Residual norm 5.511013968087e-13 6339 KSP Residual norm 5.196579997918e-13 6340 KSP Residual norm 4.744319745921e-13 6341 KSP Residual norm 4.559328463109e-13 6342 KSP Residual norm 4.756682233377e-13 6343 KSP Residual norm 5.084565971691e-13 6344 KSP Residual norm 5.548744339238e-13 6345 KSP Residual norm 5.618392039749e-13 6346 KSP Residual norm 4.897157572681e-13 6347 KSP Residual norm 4.733599228276e-13 6348 KSP Residual norm 5.358004479611e-13 6349 KSP Residual norm 5.807672245831e-13 6350 KSP Residual norm 6.267283463161e-13 6351 KSP Residual norm 7.068616831703e-13 6352 KSP Residual norm 7.125565536086e-13 6353 KSP Residual norm 6.542396668676e-13 6354 KSP Residual norm 6.639017377710e-13 6355 KSP Residual norm 7.279799680717e-13 6356 KSP Residual norm 7.955914472743e-13 6357 KSP Residual norm 9.463652106966e-13 6358 KSP Residual norm 1.124980304968e-12 6359 KSP Residual norm 1.140173637744e-12 6360 KSP Residual norm 1.192523160707e-12 6361 KSP Residual norm 1.406926275097e-12 6362 KSP Residual norm 1.528851459775e-12 6363 KSP Residual norm 1.393997842856e-12 6364 KSP Residual norm 1.096311988706e-12 6365 KSP Residual norm 8.667405174648e-13 6366 KSP Residual norm 7.225605491059e-13 6367 KSP Residual norm 7.068699425585e-13 6368 KSP Residual norm 7.622841624712e-13 6369 KSP Residual norm 7.947744749370e-13 6370 KSP Residual norm 6.709019339944e-13 6371 KSP Residual norm 5.665480800689e-13 6372 KSP Residual norm 5.789669305618e-13 6373 KSP Residual norm 6.212472187691e-13 6374 KSP Residual norm 6.767974725556e-13 6375 KSP Residual norm 7.807467334895e-13 6376 KSP Residual norm 8.100080679159e-13 6377 KSP Residual norm 7.155650767157e-13 6378 KSP Residual norm 6.243222327082e-13 6379 KSP Residual norm 5.977812769521e-13 6380 KSP Residual norm 6.700597387918e-13 6381 
KSP Residual norm 7.753564296866e-13 6382 KSP Residual norm 8.341892750895e-13 6383 KSP Residual norm 8.589126668977e-13 6384 KSP Residual norm 8.882707937719e-13 6385 KSP Residual norm 8.769363290395e-13 6386 KSP Residual norm 8.347345346401e-13 6387 KSP Residual norm 8.250014613191e-13 6388 KSP Residual norm 8.588647714335e-13 6389 KSP Residual norm 8.545911149212e-13 6390 KSP Residual norm 8.763448697152e-13 6391 KSP Residual norm 1.021634323004e-12 6392 KSP Residual norm 1.394587682409e-12 6393 KSP Residual norm 1.634491798537e-12 6394 KSP Residual norm 1.425597165333e-12 6395 KSP Residual norm 1.185665994278e-12 6396 KSP Residual norm 1.046999623761e-12 6397 KSP Residual norm 9.600698652348e-13 6398 KSP Residual norm 9.548127101086e-13 6399 KSP Residual norm 9.899760258211e-13 6400 KSP Residual norm 1.086659755166e-12 6401 KSP Residual norm 1.246289542969e-12 6402 KSP Residual norm 1.186252539340e-12 6403 KSP Residual norm 9.780182297002e-13 6404 KSP Residual norm 9.463228380148e-13 6405 KSP Residual norm 1.062570697470e-12 6406 KSP Residual norm 9.443430454097e-13 6407 KSP Residual norm 8.282469251560e-13 6408 KSP Residual norm 9.860488851893e-13 6409 KSP Residual norm 1.363816838120e-12 6410 KSP Residual norm 1.362773207068e-12 6411 KSP Residual norm 1.132067821407e-12 6412 KSP Residual norm 9.989886970451e-13 6413 KSP Residual norm 9.085876052131e-13 6414 KSP Residual norm 8.227808018468e-13 6415 KSP Residual norm 8.665184525798e-13 6416 KSP Residual norm 1.033077750628e-12 6417 KSP Residual norm 1.076161629339e-12 6418 KSP Residual norm 9.367531933098e-13 6419 KSP Residual norm 8.410380502892e-13 6420 KSP Residual norm 7.738282504364e-13 6421 KSP Residual norm 7.525560277724e-13 6422 KSP Residual norm 8.059057394671e-13 6423 KSP Residual norm 8.963457205953e-13 6424 KSP Residual norm 1.008987375659e-12 6425 KSP Residual norm 1.005752960340e-12 6426 KSP Residual norm 9.981161831966e-13 6427 KSP Residual norm 1.125960719128e-12 6428 KSP Residual norm 
1.363044308332e-12 6429 KSP Residual norm 1.547849827979e-12 6430 KSP Residual norm 1.590056441708e-12 6431 KSP Residual norm 1.443414917110e-12 6432 KSP Residual norm 1.206663743892e-12 6433 KSP Residual norm 1.016125485204e-12 6434 KSP Residual norm 9.521505304236e-13 6435 KSP Residual norm 9.685272411580e-13 6436 KSP Residual norm 1.047476802415e-12 6437 KSP Residual norm 1.101205733541e-12 6438 KSP Residual norm 1.140173464985e-12 6439 KSP Residual norm 1.175108214282e-12 6440 KSP Residual norm 1.175200665289e-12 6441 KSP Residual norm 1.144406494104e-12 6442 KSP Residual norm 1.053480276899e-12 6443 KSP Residual norm 8.372945929723e-13 6444 KSP Residual norm 6.877957005424e-13 6445 KSP Residual norm 7.328422707942e-13 6446 KSP Residual norm 9.191290004538e-13 6447 KSP Residual norm 1.056028352808e-12 6448 KSP Residual norm 1.113884450382e-12 6449 KSP Residual norm 1.099985992084e-12 6450 KSP Residual norm 1.026390137518e-12 6451 KSP Residual norm 9.383772309035e-13 6452 KSP Residual norm 8.254587136243e-13 6453 KSP Residual norm 7.684109253375e-13 6454 KSP Residual norm 7.840043811932e-13 6455 KSP Residual norm 7.605067860639e-13 6456 KSP Residual norm 6.375053299062e-13 6457 KSP Residual norm 5.720584688426e-13 6458 KSP Residual norm 5.819583740939e-13 6459 KSP Residual norm 6.970077599502e-13 6460 KSP Residual norm 8.146164807450e-13 6461 KSP Residual norm 7.840968520698e-13 6462 KSP Residual norm 7.313909076841e-13 6463 KSP Residual norm 6.884428796987e-13 6464 KSP Residual norm 6.133082750850e-13 6465 KSP Residual norm 5.872898587243e-13 6466 KSP Residual norm 7.039452047563e-13 6467 KSP Residual norm 9.472517208189e-13 6468 KSP Residual norm 1.103920049717e-12 6469 KSP Residual norm 1.002253279924e-12 6470 KSP Residual norm 9.144381228149e-13 6471 KSP Residual norm 9.710865303757e-13 6472 KSP Residual norm 1.038156679645e-12 6473 KSP Residual norm 9.183425749076e-13 6474 KSP Residual norm 7.983579890790e-13 6475 KSP Residual norm 7.448583054290e-13 6476 
KSP Residual norm 7.280638001547e-13 6477 KSP Residual norm 7.713638889727e-13 6478 KSP Residual norm 8.949475625926e-13 6479 KSP Residual norm 1.030367246282e-12 6480 KSP Residual norm 9.641833973230e-13 6481 KSP Residual norm 8.238596558642e-13 6482 KSP Residual norm 8.513162407286e-13 6483 KSP Residual norm 9.318635602426e-13 6484 KSP Residual norm 9.027987666300e-13 6485 KSP Residual norm 8.766379021197e-13 6486 KSP Residual norm 8.428842142648e-13 6487 KSP Residual norm 7.130591730843e-13 6488 KSP Residual norm 5.600039060888e-13 6489 KSP Residual norm 4.878621484431e-13 6490 KSP Residual norm 4.851391550004e-13 6491 KSP Residual norm 5.153845663974e-13 6492 KSP Residual norm 5.505370420567e-13 6493 KSP Residual norm 5.375826294780e-13 6494 KSP Residual norm 5.147598298374e-13 6495 KSP Residual norm 4.722226249848e-13 6496 KSP Residual norm 4.182172213001e-13 6497 KSP Residual norm 4.011326144195e-13 6498 KSP Residual norm 4.105582673283e-13 6499 KSP Residual norm 4.024558565460e-13 6500 KSP Residual norm 3.674000146833e-13 6501 KSP Residual norm 3.247280410545e-13 6502 KSP Residual norm 3.008861435751e-13 6503 KSP Residual norm 3.224842510274e-13 6504 KSP Residual norm 3.622190072354e-13 6505 KSP Residual norm 3.678724219604e-13 6506 KSP Residual norm 3.744597313217e-13 6507 KSP Residual norm 3.856537043005e-13 6508 KSP Residual norm 3.356823288465e-13 6509 KSP Residual norm 2.855075831533e-13 6510 KSP Residual norm 2.828046257568e-13 6511 KSP Residual norm 2.877164120038e-13 6512 KSP Residual norm 3.003698469051e-13 6513 KSP Residual norm 3.198786493560e-13 6514 KSP Residual norm 3.287902163109e-13 6515 KSP Residual norm 2.977349497977e-13 6516 KSP Residual norm 2.526671606065e-13 6517 KSP Residual norm 2.354020238555e-13 6518 KSP Residual norm 2.411502884992e-13 6519 KSP Residual norm 2.685515266370e-13 6520 KSP Residual norm 3.261003366865e-13 6521 KSP Residual norm 4.177763629349e-13 6522 KSP Residual norm 4.879002525861e-13 6523 KSP Residual norm 
5.044327011168e-13 6524 KSP Residual norm 4.927734830947e-13 6525 KSP Residual norm 4.361576637241e-13 6526 KSP Residual norm 4.036562642627e-13 6527 KSP Residual norm 4.104964986397e-13 6528 KSP Residual norm 4.360189662553e-13 6529 KSP Residual norm 4.610660841185e-13 6530 KSP Residual norm 5.314630782206e-13 6531 KSP Residual norm 6.877436578038e-13 6532 KSP Residual norm 9.142930047915e-13 6533 KSP Residual norm 1.080826068695e-12 6534 KSP Residual norm 1.042457200628e-12 6535 KSP Residual norm 8.827117573815e-13 6536 KSP Residual norm 6.595201047066e-13 6537 KSP Residual norm 4.600406163755e-13 6538 KSP Residual norm 3.386670368995e-13 6539 KSP Residual norm 3.182479067427e-13 6540 KSP Residual norm 3.807679142454e-13 6541 KSP Residual norm 4.934242815615e-13 6542 KSP Residual norm 6.028675161283e-13 6543 KSP Residual norm 6.974997781850e-13 6544 KSP Residual norm 7.359624357539e-13 6545 KSP Residual norm 6.475720922805e-13 6546 KSP Residual norm 5.102620020504e-13 6547 KSP Residual norm 4.266719256571e-13 6548 KSP Residual norm 3.422166375003e-13 6549 KSP Residual norm 2.797064119836e-13 6550 KSP Residual norm 2.766789598794e-13 6551 KSP Residual norm 3.286814382996e-13 6552 KSP Residual norm 4.162568204089e-13 6553 KSP Residual norm 4.561556347996e-13 6554 KSP Residual norm 4.619614689035e-13 6555 KSP Residual norm 4.896745900075e-13 6556 KSP Residual norm 5.495681564647e-13 6557 KSP Residual norm 6.271703801311e-13 6558 KSP Residual norm 7.378873395045e-13 6559 KSP Residual norm 8.296429573313e-13 6560 KSP Residual norm 7.696862835047e-13 6561 KSP Residual norm 5.792630249001e-13 6562 KSP Residual norm 4.613083715098e-13 6563 KSP Residual norm 4.390007103729e-13 6564 KSP Residual norm 4.482232070880e-13 6565 KSP Residual norm 4.798351367413e-13 6566 KSP Residual norm 5.403470179746e-13 6567 KSP Residual norm 6.697664665234e-13 6568 KSP Residual norm 8.835575898900e-13 6569 KSP Residual norm 1.063937635791e-12 6570 KSP Residual norm 1.078527041804e-12 6571 
KSP Residual norm 9.524496535539e-13 6572 KSP Residual norm 8.842097498448e-13 6573 KSP Residual norm 9.356392066022e-13 6574 KSP Residual norm 9.743335169204e-13 6575 KSP Residual norm 1.097171237261e-12 6576 KSP Residual norm 1.452136456759e-12 6577 KSP Residual norm 1.836411562309e-12 6578 KSP Residual norm 1.578050690762e-12 6579 KSP Residual norm 1.137252402985e-12 6580 KSP Residual norm 8.379221837655e-13 6581 KSP Residual norm 7.017385201683e-13 6582 KSP Residual norm 7.302033682482e-13 6583 KSP Residual norm 9.213382353717e-13 6584 KSP Residual norm 1.164263146552e-12 6585 KSP Residual norm 1.126290050475e-12 6586 KSP Residual norm 1.000200867909e-12 6587 KSP Residual norm 8.927582963779e-13 6588 KSP Residual norm 8.174199473761e-13 6589 KSP Residual norm 7.732828262029e-13 6590 KSP Residual norm 7.310345391212e-13 6591 KSP Residual norm 6.222410657803e-13 6592 KSP Residual norm 4.737492520057e-13 6593 KSP Residual norm 3.651477628465e-13 6594 KSP Residual norm 3.113533210874e-13 6595 KSP Residual norm 2.990720485877e-13 6596 KSP Residual norm 3.378256928243e-13 6597 KSP Residual norm 4.088597380339e-13 6598 KSP Residual norm 4.510668270378e-13 6599 KSP Residual norm 4.373613862822e-13 6600 KSP Residual norm 4.093082920924e-13 6601 KSP Residual norm 3.617110443379e-13 6602 KSP Residual norm 2.844255958998e-13 6603 KSP Residual norm 2.290989820966e-13 6604 KSP Residual norm 2.076026123199e-13 6605 KSP Residual norm 2.161045101119e-13 6606 KSP Residual norm 2.361637693074e-13 6607 KSP Residual norm 2.579896401181e-13 6608 KSP Residual norm 2.743682103811e-13 6609 KSP Residual norm 3.040949407432e-13 6610 KSP Residual norm 3.288917527683e-13 6611 KSP Residual norm 3.256842494228e-13 6612 KSP Residual norm 3.037357121850e-13 6613 KSP Residual norm 3.121659584915e-13 6614 KSP Residual norm 4.065255411212e-13 6615 KSP Residual norm 5.654336378386e-13 6616 KSP Residual norm 6.900607696494e-13 6617 KSP Residual norm 6.783973834697e-13 6618 KSP Residual norm 
5.172974731742e-13 6619 KSP Residual norm 3.878880423231e-13 6620 KSP Residual norm 3.586488652329e-13 6621 KSP Residual norm 3.855970560752e-13 6622 KSP Residual norm 4.262320183105e-13 6623 KSP Residual norm 5.202675717976e-13 6624 KSP Residual norm 6.295011119428e-13 6625 KSP Residual norm 6.145666766096e-13 6626 KSP Residual norm 6.203803498048e-13 6627 KSP Residual norm 6.862556191082e-13 6628 KSP Residual norm 7.651167830911e-13 6629 KSP Residual norm 7.827133005217e-13 6630 KSP Residual norm 7.704958444141e-13 6631 KSP Residual norm 7.313174273393e-13 6632 KSP Residual norm 7.040307346723e-13 6633 KSP Residual norm 6.828907476161e-13 6634 KSP Residual norm 6.744319019889e-13 6635 KSP Residual norm 7.346602627366e-13 6636 KSP Residual norm 9.187603629288e-13 6637 KSP Residual norm 1.140706457226e-12 6638 KSP Residual norm 1.198058829936e-12 6639 KSP Residual norm 1.102100695344e-12 6640 KSP Residual norm 9.365664260138e-13 6641 KSP Residual norm 7.798411895426e-13 6642 KSP Residual norm 7.316860534007e-13 6643 KSP Residual norm 7.876594834112e-13 6644 KSP Residual norm 8.822100743832e-13 6645 KSP Residual norm 9.569760896950e-13 6646 KSP Residual norm 9.972754753137e-13 6647 KSP Residual norm 1.109915104315e-12 6648 KSP Residual norm 1.267391027131e-12 6649 KSP Residual norm 1.309623324107e-12 6650 KSP Residual norm 1.294344866017e-12 6651 KSP Residual norm 1.209052715194e-12 6652 KSP Residual norm 1.108208370955e-12 6653 KSP Residual norm 1.072686546176e-12 6654 KSP Residual norm 1.107329690212e-12 6655 KSP Residual norm 1.126894356092e-12 6656 KSP Residual norm 1.042029176145e-12 6657 KSP Residual norm 8.803391422863e-13 6658 KSP Residual norm 8.248986017820e-13 6659 KSP Residual norm 9.018498159244e-13 6660 KSP Residual norm 1.072713612134e-12 6661 KSP Residual norm 1.229907217576e-12 6662 KSP Residual norm 1.284712127034e-12 6663 KSP Residual norm 1.270342512862e-12 6664 KSP Residual norm 1.229253152846e-12 6665 KSP Residual norm 1.302788269181e-12 6666 
KSP Residual norm 1.627834245443e-12 6667 KSP Residual norm 2.007610420521e-12 6668 KSP Residual norm 2.059275376221e-12 6669 KSP Residual norm 2.042349254666e-12 6670 KSP Residual norm 2.143667995538e-12 6671 KSP Residual norm 2.253022942359e-12 6672 KSP Residual norm 2.346643143085e-12 6673 KSP Residual norm 2.250493252518e-12 6674 KSP Residual norm 1.934854033751e-12 6675 KSP Residual norm 1.669760067203e-12 6676 KSP Residual norm 1.586511107984e-12 6677 KSP Residual norm 1.596071130563e-12 6678 KSP Residual norm 1.896431639487e-12 6679 KSP Residual norm 2.557094673539e-12 6680 KSP Residual norm 3.635820534181e-12 6681 KSP Residual norm 4.792115395804e-12 6682 KSP Residual norm 5.463865751447e-12 6683 KSP Residual norm 5.010915219993e-12 6684 KSP Residual norm 4.000703001443e-12 6685 KSP Residual norm 3.191506823819e-12 6686 KSP Residual norm 2.805711906450e-12 6687 KSP Residual norm 2.475234517921e-12 6688 KSP Residual norm 2.425721455161e-12 6689 KSP Residual norm 2.642491983402e-12 6690 KSP Residual norm 2.850932201153e-12 6691 KSP Residual norm 3.081503832868e-12 6692 KSP Residual norm 3.397458039473e-12 6693 KSP Residual norm 3.544049307671e-12 6694 KSP Residual norm 3.776607961328e-12 6695 KSP Residual norm 3.907305702869e-12 6696 KSP Residual norm 3.483379184542e-12 6697 KSP Residual norm 2.909427783970e-12 6698 KSP Residual norm 2.535335182250e-12 6699 KSP Residual norm 2.297012048417e-12 6700 KSP Residual norm 2.184957556587e-12 6701 KSP Residual norm 2.250290216470e-12 6702 KSP Residual norm 2.500972610035e-12 6703 KSP Residual norm 2.712785646125e-12 6704 KSP Residual norm 2.530919006036e-12 6705 KSP Residual norm 2.081793467089e-12 6706 KSP Residual norm 1.959873501014e-12 6707 KSP Residual norm 2.220967803355e-12 6708 KSP Residual norm 2.641333945361e-12 6709 KSP Residual norm 2.988694603992e-12 6710 KSP Residual norm 3.295762703372e-12 6711 KSP Residual norm 3.442642759726e-12 6712 KSP Residual norm 3.597448738030e-12 6713 KSP Residual norm 
3.726038032633e-12 6714 KSP Residual norm 3.829975672105e-12 6715 KSP Residual norm 3.893047435795e-12 6716 KSP Residual norm 3.697809959528e-12 6717 KSP Residual norm 3.465977770416e-12 6718 KSP Residual norm 3.179275571634e-12 6719 KSP Residual norm 2.650045772893e-12 6720 KSP Residual norm 2.433091102376e-12 6721 KSP Residual norm 2.519663096190e-12 6722 KSP Residual norm 3.013835119829e-12 6723 KSP Residual norm 3.683083118870e-12 6724 KSP Residual norm 4.087305151230e-12 6725 KSP Residual norm 4.425327567919e-12 6726 KSP Residual norm 5.157898365103e-12 6727 KSP Residual norm 6.007224667260e-12 6728 KSP Residual norm 6.614956713829e-12 6729 KSP Residual norm 7.302102837926e-12 6730 KSP Residual norm 7.789665323726e-12 6731 KSP Residual norm 6.948871605606e-12 6732 KSP Residual norm 6.563157782671e-12 6733 KSP Residual norm 6.886027118364e-12 6734 KSP Residual norm 6.816973511304e-12 6735 KSP Residual norm 6.572845014578e-12 6736 KSP Residual norm 6.366154263601e-12 6737 KSP Residual norm 6.429508620284e-12 6738 KSP Residual norm 7.097459241675e-12 6739 KSP Residual norm 7.346701919450e-12 6740 KSP Residual norm 7.490457040818e-12 6741 KSP Residual norm 8.244431245747e-12 6742 KSP Residual norm 8.538900652641e-12 6743 KSP Residual norm 8.311030252970e-12 6744 KSP Residual norm 8.315296227534e-12 6745 KSP Residual norm 8.532891435907e-12 6746 KSP Residual norm 8.070577559116e-12 6747 KSP Residual norm 6.783000664127e-12 6748 KSP Residual norm 6.120856571810e-12 6749 KSP Residual norm 6.019677791326e-12 6750 KSP Residual norm 5.765586191732e-12 6751 KSP Residual norm 5.279442498847e-12 6752 KSP Residual norm 5.092327783711e-12 6753 KSP Residual norm 4.768228630125e-12 6754 KSP Residual norm 4.227146082819e-12 6755 KSP Residual norm 3.888790806712e-12 6756 KSP Residual norm 4.192605826988e-12 6757 KSP Residual norm 4.956659514932e-12 6758 KSP Residual norm 5.262628057482e-12 6759 KSP Residual norm 5.274008196949e-12 6760 KSP Residual norm 4.995090032872e-12 6761 
KSP Residual norm 4.966339973344e-12 6762 KSP Residual norm 5.028283076912e-12 6763 KSP Residual norm 4.666113309143e-12 6764 KSP Residual norm 4.000312692917e-12 6765 KSP Residual norm 3.397105470345e-12 6766 KSP Residual norm 3.166445759287e-12 6767 KSP Residual norm 2.984707421220e-12 6768 KSP Residual norm 2.618189360368e-12 6769 KSP Residual norm 2.286663038463e-12 6770 KSP Residual norm 2.135727644816e-12 6771 KSP Residual norm 2.066660961143e-12 6772 KSP Residual norm 2.252697087523e-12 6773 KSP Residual norm 2.595520796802e-12 6774 KSP Residual norm 2.693006405583e-12 6775 KSP Residual norm 2.875455087755e-12 6776 KSP Residual norm 3.476980351304e-12 6777 KSP Residual norm 4.141638145123e-12 6778 KSP Residual norm 4.471320664747e-12 6779 KSP Residual norm 4.675833549447e-12 6780 KSP Residual norm 4.512568113425e-12 6781 KSP Residual norm 4.381923896052e-12 6782 KSP Residual norm 4.461488673271e-12 6783 KSP Residual norm 4.033594006620e-12 6784 KSP Residual norm 3.507444418065e-12 6785 KSP Residual norm 3.247448173539e-12 6786 KSP Residual norm 2.987499785184e-12 6787 KSP Residual norm 2.870263239097e-12 6788 KSP Residual norm 3.110754727187e-12 6789 KSP Residual norm 3.492882145730e-12 6790 KSP Residual norm 3.770858558975e-12 6791 KSP Residual norm 3.644470641028e-12 6792 KSP Residual norm 3.474281816039e-12 6793 KSP Residual norm 3.846291561483e-12 6794 KSP Residual norm 4.349675279076e-12 6795 KSP Residual norm 4.466635975265e-12 6796 KSP Residual norm 4.288879935822e-12 6797 KSP Residual norm 4.301271163204e-12 6798 KSP Residual norm 4.584571307192e-12 6799 KSP Residual norm 5.000584616832e-12 6800 KSP Residual norm 4.882610574317e-12 6801 KSP Residual norm 4.183410376837e-12 6802 KSP Residual norm 3.582406754238e-12 6803 KSP Residual norm 3.138842145634e-12 6804 KSP Residual norm 2.926785294218e-12 6805 KSP Residual norm 2.975677303620e-12 6806 KSP Residual norm 2.969522916598e-12 6807 KSP Residual norm 2.648506588978e-12 6808 KSP Residual norm 
2.279908159189e-12 6809 KSP Residual norm 2.262444484701e-12 6810 KSP Residual norm 2.492023204960e-12 6811 KSP Residual norm 2.720059522904e-12 6812 KSP Residual norm 2.842688573383e-12 6813 KSP Residual norm 2.885646562178e-12 6814 KSP Residual norm 2.870494081064e-12 6815 KSP Residual norm 2.744809366888e-12 6816 KSP Residual norm 2.638894493964e-12 6817 KSP Residual norm 2.514559211643e-12 6818 KSP Residual norm 2.278216729505e-12 6819 KSP Residual norm 2.000517422179e-12 6820 KSP Residual norm 1.918608446299e-12 6821 KSP Residual norm 2.038083474699e-12 6822 KSP Residual norm 1.946074090701e-12 6823 KSP Residual norm 1.767419452285e-12 6824 KSP Residual norm 1.763753483081e-12 6825 KSP Residual norm 1.842595387719e-12 6826 KSP Residual norm 1.794343878318e-12 6827 KSP Residual norm 1.707171858242e-12 6828 KSP Residual norm 1.705001040245e-12 6829 KSP Residual norm 1.756531509474e-12 6830 KSP Residual norm 1.919331131540e-12 6831 KSP Residual norm 2.067180228684e-12 6832 KSP Residual norm 2.033086811392e-12 6833 KSP Residual norm 1.952445094046e-12 6834 KSP Residual norm 1.915437061235e-12 6835 KSP Residual norm 2.009394860665e-12 6836 KSP Residual norm 2.127948288575e-12 6837 KSP Residual norm 2.297454328637e-12 6838 KSP Residual norm 2.358333456749e-12 6839 KSP Residual norm 2.284418551198e-12 6840 KSP Residual norm 2.416958042910e-12 6841 KSP Residual norm 2.873774672260e-12 6842 KSP Residual norm 3.166301284399e-12 6843 KSP Residual norm 3.142599645332e-12 6844 KSP Residual norm 2.906341804050e-12 6845 KSP Residual norm 2.523073353938e-12 6846 KSP Residual norm 2.332514882419e-12 6847 KSP Residual norm 2.682155856643e-12 6848 KSP Residual norm 3.387958894812e-12 6849 KSP Residual norm 3.835468017579e-12 6850 KSP Residual norm 4.072656400841e-12 6851 KSP Residual norm 4.320328816913e-12 6852 KSP Residual norm 4.572184719910e-12 6853 KSP Residual norm 4.599859216593e-12 6854 KSP Residual norm 4.550733493374e-12 6855 KSP Residual norm 4.569125362196e-12 6856 
KSP Residual norm 5.205900298423e-12 6857 KSP Residual norm 5.933544778426e-12 6858 KSP Residual norm 6.084600128220e-12 6859 KSP Residual norm 6.344742035839e-12 6860 KSP Residual norm 6.726064421953e-12 6861 KSP Residual norm 6.570584548932e-12 6862 KSP Residual norm 6.527306885992e-12 6863 KSP Residual norm 7.534145025894e-12 6864 KSP Residual norm 8.278765439859e-12 6865 KSP Residual norm 7.701415783246e-12 6866 KSP Residual norm 6.513292061436e-12 6867 KSP Residual norm 5.712122651076e-12 6868 KSP Residual norm 5.254426175940e-12 6869 KSP Residual norm 4.913615475932e-12 6870 KSP Residual norm 4.655217368739e-12 6871 KSP Residual norm 4.241178432908e-12 6872 KSP Residual norm 3.843185652965e-12 6873 KSP Residual norm 3.600928984111e-12 6874 KSP Residual norm 3.641255064764e-12 6875 KSP Residual norm 4.239415493699e-12 6876 KSP Residual norm 5.286075784059e-12 6877 KSP Residual norm 5.245108995457e-12 6878 KSP Residual norm 5.182739174215e-12 6879 KSP Residual norm 6.040336184419e-12 6880 KSP Residual norm 6.407493887773e-12 6881 KSP Residual norm 5.537446088928e-12 6882 KSP Residual norm 4.655263947201e-12 6883 KSP Residual norm 4.243937162935e-12 6884 KSP Residual norm 4.193807422692e-12 6885 KSP Residual norm 4.471323951692e-12 6886 KSP Residual norm 4.248105257139e-12 6887 KSP Residual norm 4.106483179768e-12 6888 KSP Residual norm 4.174373199344e-12 6889 KSP Residual norm 4.393914527713e-12 6890 KSP Residual norm 4.887328023707e-12 6891 KSP Residual norm 5.709495798651e-12 6892 KSP Residual norm 6.038726266706e-12 6893 KSP Residual norm 6.205798811600e-12 6894 KSP Residual norm 6.346446397738e-12 6895 KSP Residual norm 6.636311024786e-12 6896 KSP Residual norm 7.422108065695e-12 6897 KSP Residual norm 7.507655532004e-12 6898 KSP Residual norm 7.082322399911e-12 6899 KSP Residual norm 6.907587608139e-12 6900 KSP Residual norm 7.097662150478e-12 6901 KSP Residual norm 7.275953409506e-12 6902 KSP Residual norm 8.153618258884e-12 6903 KSP Residual norm 
[KSP monitor output elided: iterations 6904-7901, residual norms fluctuating roughly between 3.6e-14 and 2.0e-11 without settling.]
KSP Residual norm 6.566265192993e-12 7902 KSP Residual norm 5.586076066953e-12 7903 KSP Residual norm 5.276727400867e-12 7904 KSP Residual norm 5.117531607468e-12 7905 KSP Residual norm 5.220524597794e-12 7906 KSP Residual norm 5.342721926399e-12 7907 KSP Residual norm 5.076932876238e-12 7908 KSP Residual norm 5.304519670976e-12 7909 KSP Residual norm 6.739179724009e-12 7910 KSP Residual norm 7.325666906258e-12 7911 KSP Residual norm 6.733763235157e-12 7912 KSP Residual norm 6.657494039131e-12 7913 KSP Residual norm 6.803225766968e-12 7914 KSP Residual norm 6.643826895528e-12 7915 KSP Residual norm 6.564552798759e-12 7916 KSP Residual norm 7.341693900262e-12 7917 KSP Residual norm 9.713852310020e-12 7918 KSP Residual norm 1.186757678574e-11 7919 KSP Residual norm 1.163176560036e-11 7920 KSP Residual norm 1.135847783916e-11 7921 KSP Residual norm 1.182153154053e-11 7922 KSP Residual norm 1.126861489410e-11 7923 KSP Residual norm 9.543896413281e-12 7924 KSP Residual norm 7.843549213881e-12 7925 KSP Residual norm 7.040592964766e-12 7926 KSP Residual norm 7.677508166157e-12 7927 KSP Residual norm 9.262357755254e-12 7928 KSP Residual norm 1.195801258592e-11 7929 KSP Residual norm 1.416725878977e-11 7930 KSP Residual norm 1.306563770972e-11 7931 KSP Residual norm 1.027706381659e-11 7932 KSP Residual norm 8.040850887065e-12 7933 KSP Residual norm 7.124505083435e-12 7934 KSP Residual norm 7.410006059268e-12 7935 KSP Residual norm 7.620910127014e-12 7936 KSP Residual norm 7.849336006518e-12 7937 KSP Residual norm 7.884924764024e-12 7938 KSP Residual norm 6.495987930448e-12 7939 KSP Residual norm 5.254171522024e-12 7940 KSP Residual norm 4.916030986793e-12 7941 KSP Residual norm 4.674730383977e-12 7942 KSP Residual norm 4.177317722290e-12 7943 KSP Residual norm 4.229636276450e-12 7944 KSP Residual norm 4.447064106992e-12 7945 KSP Residual norm 4.016284558496e-12 7946 KSP Residual norm 3.399004555068e-12 7947 KSP Residual norm 2.925716132915e-12 7948 KSP Residual norm 
2.681352377550e-12 7949 KSP Residual norm 2.683251070188e-12 7950 KSP Residual norm 2.896518606237e-12 7951 KSP Residual norm 3.272749598375e-12 7952 KSP Residual norm 3.626322637737e-12 7953 KSP Residual norm 3.705552281445e-12 7954 KSP Residual norm 4.009978920900e-12 7955 KSP Residual norm 4.275209277911e-12 7956 KSP Residual norm 3.940153698469e-12 7957 KSP Residual norm 3.512922288189e-12 7958 KSP Residual norm 3.310515827727e-12 7959 KSP Residual norm 3.190213328370e-12 7960 KSP Residual norm 3.064971593588e-12 7961 KSP Residual norm 2.670529527490e-12 7962 KSP Residual norm 2.374143926570e-12 7963 KSP Residual norm 2.009678946199e-12 7964 KSP Residual norm 1.622270523389e-12 7965 KSP Residual norm 1.584229899985e-12 7966 KSP Residual norm 1.716394211949e-12 7967 KSP Residual norm 1.741954487196e-12 7968 KSP Residual norm 1.755621534254e-12 7969 KSP Residual norm 1.641362180726e-12 7970 KSP Residual norm 1.509475214078e-12 7971 KSP Residual norm 1.530487471655e-12 7972 KSP Residual norm 1.674880207701e-12 7973 KSP Residual norm 1.605266966311e-12 7974 KSP Residual norm 1.345697942925e-12 7975 KSP Residual norm 1.202589103502e-12 7976 KSP Residual norm 1.099934917259e-12 7977 KSP Residual norm 1.013188675224e-12 7978 KSP Residual norm 9.343601955511e-13 7979 KSP Residual norm 8.782463162755e-13 7980 KSP Residual norm 8.605804962555e-13 7981 KSP Residual norm 9.138108728857e-13 7982 KSP Residual norm 9.777551890931e-13 7983 KSP Residual norm 9.913794283619e-13 7984 KSP Residual norm 1.014333466342e-12 7985 KSP Residual norm 1.175022705198e-12 7986 KSP Residual norm 1.401401319099e-12 7987 KSP Residual norm 1.372645097225e-12 7988 KSP Residual norm 1.323898065715e-12 7989 KSP Residual norm 1.327322381726e-12 7990 KSP Residual norm 1.253296314880e-12 7991 KSP Residual norm 1.201725219056e-12 7992 KSP Residual norm 1.122152981521e-12 7993 KSP Residual norm 9.710536694927e-13 7994 KSP Residual norm 9.215264443346e-13 7995 KSP Residual norm 1.017896246923e-12 7996 
KSP Residual norm 1.257096323213e-12 7997 KSP Residual norm 1.614134870187e-12 7998 KSP Residual norm 1.855627493639e-12 7999 KSP Residual norm 1.885458576993e-12 8000 KSP Residual norm 2.139884322483e-12 8001 KSP Residual norm 2.673016233258e-12 8002 KSP Residual norm 2.983559220282e-12 8003 KSP Residual norm 2.677070281132e-12 8004 KSP Residual norm 1.822829859337e-12 8005 KSP Residual norm 1.197124215835e-12 8006 KSP Residual norm 9.752456159362e-13 8007 KSP Residual norm 9.785369839400e-13 8008 KSP Residual norm 1.113543967478e-12 8009 KSP Residual norm 1.379898427374e-12 8010 KSP Residual norm 1.634414421448e-12 8011 KSP Residual norm 1.623444412048e-12 8012 KSP Residual norm 1.268321271938e-12 8013 KSP Residual norm 9.884295495442e-13 8014 KSP Residual norm 9.309741894614e-13 8015 KSP Residual norm 1.021804451680e-12 8016 KSP Residual norm 1.262822718401e-12 8017 KSP Residual norm 1.574048205884e-12 8018 KSP Residual norm 1.759808816820e-12 8019 KSP Residual norm 1.728531582622e-12 8020 KSP Residual norm 1.780519534887e-12 8021 KSP Residual norm 1.898972481400e-12 8022 KSP Residual norm 1.889596667160e-12 8023 KSP Residual norm 1.797026856460e-12 8024 KSP Residual norm 1.608584819567e-12 8025 KSP Residual norm 1.392816312184e-12 8026 KSP Residual norm 1.318097428555e-12 8027 KSP Residual norm 1.493552271679e-12 8028 KSP Residual norm 1.803144730093e-12 8029 KSP Residual norm 1.719955954569e-12 8030 KSP Residual norm 1.475874723916e-12 8031 KSP Residual norm 1.485049834374e-12 8032 KSP Residual norm 1.612817844913e-12 8033 KSP Residual norm 1.714087320936e-12 8034 KSP Residual norm 1.568841981556e-12 8035 KSP Residual norm 1.222833893556e-12 8036 KSP Residual norm 1.020196364219e-12 8037 KSP Residual norm 9.090304340421e-13 8038 KSP Residual norm 8.194728285695e-13 8039 KSP Residual norm 8.584096839161e-13 8040 KSP Residual norm 1.002057423967e-12 8041 KSP Residual norm 1.151958784094e-12 8042 KSP Residual norm 1.053552251337e-12 8043 KSP Residual norm 
8.212272939747e-13 8044 KSP Residual norm 6.923011369716e-13 8045 KSP Residual norm 6.623208436709e-13 8046 KSP Residual norm 5.569838144050e-13 8047 KSP Residual norm 4.309526519913e-13 8048 KSP Residual norm 4.038468888546e-13 8049 KSP Residual norm 5.070270089244e-13 8050 KSP Residual norm 6.723204128354e-13 8051 KSP Residual norm 7.788304937448e-13 8052 KSP Residual norm 8.111631043041e-13 8053 KSP Residual norm 8.051158087207e-13 8054 KSP Residual norm 7.340388919981e-13 8055 KSP Residual norm 7.064015725695e-13 8056 KSP Residual norm 7.933919697917e-13 8057 KSP Residual norm 9.269044646518e-13 8058 KSP Residual norm 1.036611383084e-12 8059 KSP Residual norm 1.054537972585e-12 8060 KSP Residual norm 9.248785347597e-13 8061 KSP Residual norm 7.923800206360e-13 8062 KSP Residual norm 7.723190159987e-13 8063 KSP Residual norm 8.107652482237e-13 8064 KSP Residual norm 7.993830323043e-13 8065 KSP Residual norm 7.778097324870e-13 8066 KSP Residual norm 7.787876941700e-13 8067 KSP Residual norm 7.562126097712e-13 8068 KSP Residual norm 7.381813070490e-13 8069 KSP Residual norm 7.817379515344e-13 8070 KSP Residual norm 7.574031303734e-13 8071 KSP Residual norm 6.876131628293e-13 8072 KSP Residual norm 6.484457046057e-13 8073 KSP Residual norm 5.696931276987e-13 8074 KSP Residual norm 5.104650707221e-13 8075 KSP Residual norm 5.127984757632e-13 8076 KSP Residual norm 5.588403353175e-13 8077 KSP Residual norm 6.575025731760e-13 8078 KSP Residual norm 7.453324460147e-13 8079 KSP Residual norm 7.272748818496e-13 8080 KSP Residual norm 6.902171712185e-13 8081 KSP Residual norm 7.066495940395e-13 8082 KSP Residual norm 7.341095052674e-13 8083 KSP Residual norm 7.163685016603e-13 8084 KSP Residual norm 6.883194278414e-13 8085 KSP Residual norm 6.539457141140e-13 8086 KSP Residual norm 6.639254231574e-13 8087 KSP Residual norm 8.065304741115e-13 8088 KSP Residual norm 1.044826950451e-12 8089 KSP Residual norm 1.115620344863e-12 8090 KSP Residual norm 1.038200614357e-12 8091 
KSP Residual norm 9.315098829633e-13 8092 KSP Residual norm 8.754865886870e-13 8093 KSP Residual norm 9.174340859188e-13 8094 KSP Residual norm 1.073574194195e-12 8095 KSP Residual norm 1.149977755839e-12 8096 KSP Residual norm 1.151902659205e-12 8097 KSP Residual norm 1.172319197797e-12 8098 KSP Residual norm 1.225078270794e-12 8099 KSP Residual norm 1.214177679110e-12 8100 KSP Residual norm 1.150445829929e-12 8101 KSP Residual norm 1.019631922200e-12 8102 KSP Residual norm 8.312827010001e-13 8103 KSP Residual norm 7.323322680565e-13 8104 KSP Residual norm 7.572581345187e-13 8105 KSP Residual norm 8.420066653727e-13 8106 KSP Residual norm 9.313352164136e-13 8107 KSP Residual norm 9.948774464440e-13 8108 KSP Residual norm 9.790715846550e-13 8109 KSP Residual norm 9.650043337563e-13 8110 KSP Residual norm 1.070139922112e-12 8111 KSP Residual norm 1.322517107858e-12 8112 KSP Residual norm 1.480877874165e-12 8113 KSP Residual norm 1.351890309275e-12 8114 KSP Residual norm 1.220509523321e-12 8115 KSP Residual norm 1.278567255544e-12 8116 KSP Residual norm 1.447414903135e-12 8117 KSP Residual norm 1.512467927993e-12 8118 KSP Residual norm 1.642682460113e-12 8119 KSP Residual norm 1.983707443032e-12 8120 KSP Residual norm 2.326995850640e-12 8121 KSP Residual norm 2.367967143096e-12 8122 KSP Residual norm 2.339888470975e-12 8123 KSP Residual norm 2.184205706366e-12 8124 KSP Residual norm 1.917557257164e-12 8125 KSP Residual norm 1.761664849400e-12 8126 KSP Residual norm 1.572285837606e-12 8127 KSP Residual norm 1.344709360689e-12 8128 KSP Residual norm 1.315576153885e-12 8129 KSP Residual norm 1.473564994016e-12 8130 KSP Residual norm 1.559920455129e-12 8131 KSP Residual norm 1.487102004701e-12 8132 KSP Residual norm 1.609397047292e-12 8133 KSP Residual norm 1.725551521641e-12 8134 KSP Residual norm 1.680261315686e-12 8135 KSP Residual norm 1.538745486822e-12 8136 KSP Residual norm 1.434978433902e-12 8137 KSP Residual norm 1.308049131431e-12 8138 KSP Residual norm 
1.217837377282e-12 8139 KSP Residual norm 1.156082119591e-12 8140 KSP Residual norm 1.151107178413e-12 8141 KSP Residual norm 1.399939768660e-12 8142 KSP Residual norm 1.839031600816e-12 8143 KSP Residual norm 2.200809871839e-12 8144 KSP Residual norm 2.387934265025e-12 8145 KSP Residual norm 2.500847179119e-12 8146 KSP Residual norm 2.442127961902e-12 8147 KSP Residual norm 2.043933442923e-12 8148 KSP Residual norm 1.695152254268e-12 8149 KSP Residual norm 1.819506845898e-12 8150 KSP Residual norm 2.518866712064e-12 8151 KSP Residual norm 3.137372190680e-12 8152 KSP Residual norm 3.101096862650e-12 8153 KSP Residual norm 2.750105725947e-12 8154 KSP Residual norm 2.618890149176e-12 8155 KSP Residual norm 2.508969770101e-12 8156 KSP Residual norm 2.467436884668e-12 8157 KSP Residual norm 2.398156110246e-12 8158 KSP Residual norm 2.243317638736e-12 8159 KSP Residual norm 1.830892938751e-12 8160 KSP Residual norm 1.398579258754e-12 8161 KSP Residual norm 1.100084786758e-12 8162 KSP Residual norm 9.134058518838e-13 8163 KSP Residual norm 8.336424683903e-13 8164 KSP Residual norm 8.190970957541e-13 8165 KSP Residual norm 7.277540400607e-13 8166 KSP Residual norm 6.999769390226e-13 8167 KSP Residual norm 8.347416688425e-13 8168 KSP Residual norm 1.201759870785e-12 8169 KSP Residual norm 1.561329304463e-12 8170 KSP Residual norm 1.614479453641e-12 8171 KSP Residual norm 1.417896489359e-12 8172 KSP Residual norm 1.152615456896e-12 8173 KSP Residual norm 1.002346137129e-12 8174 KSP Residual norm 9.861980167485e-13 8175 KSP Residual norm 1.057163774224e-12 8176 KSP Residual norm 1.258556480418e-12 8177 KSP Residual norm 1.485661878687e-12 8178 KSP Residual norm 1.590746004774e-12 8179 KSP Residual norm 1.600979395383e-12 8180 KSP Residual norm 1.479894841916e-12 8181 KSP Residual norm 1.388200792382e-12 8182 KSP Residual norm 1.394344749611e-12 8183 KSP Residual norm 1.498108566389e-12 8184 KSP Residual norm 1.757769752881e-12 8185 KSP Residual norm 1.965337433307e-12 8186 
KSP Residual norm 2.247915825659e-12 8187 KSP Residual norm 2.573939345220e-12 8188 KSP Residual norm 2.603330186089e-12 8189 KSP Residual norm 2.382818646396e-12 8190 KSP Residual norm 2.140081728047e-12 8191 KSP Residual norm 1.952611657219e-12 8192 KSP Residual norm 1.894423805840e-12 8193 KSP Residual norm 2.032502087180e-12 8194 KSP Residual norm 2.164484238183e-12 8195 KSP Residual norm 2.265511172813e-12 8196 KSP Residual norm 2.386590667632e-12 8197 KSP Residual norm 2.329994687168e-12 8198 KSP Residual norm 2.309505448967e-12 8199 KSP Residual norm 2.628059422235e-12 8200 KSP Residual norm 2.988191469977e-12 8201 KSP Residual norm 3.132221078510e-12 8202 KSP Residual norm 2.756194794646e-12 8203 KSP Residual norm 2.389600266342e-12 8204 KSP Residual norm 2.137667035895e-12 8205 KSP Residual norm 1.982397326790e-12 8206 KSP Residual norm 1.982099737187e-12 8207 KSP Residual norm 2.069831111011e-12 8208 KSP Residual norm 1.959566378868e-12 8209 KSP Residual norm 1.772085785483e-12 8210 KSP Residual norm 1.542249079172e-12 8211 KSP Residual norm 1.195562738267e-12 8212 KSP Residual norm 1.034971216429e-12 8213 KSP Residual norm 1.041684689984e-12 8214 KSP Residual norm 1.117117581337e-12 8215 KSP Residual norm 1.393121856906e-12 8216 KSP Residual norm 1.641205915398e-12 8217 KSP Residual norm 1.636923621079e-12 8218 KSP Residual norm 1.533612857387e-12 8219 KSP Residual norm 1.351348190075e-12 8220 KSP Residual norm 1.195533391113e-12 8221 KSP Residual norm 1.187885063644e-12 8222 KSP Residual norm 1.266844419467e-12 8223 KSP Residual norm 1.291137797956e-12 8224 KSP Residual norm 1.377255263845e-12 8225 KSP Residual norm 1.504056531048e-12 8226 KSP Residual norm 1.592985999570e-12 8227 KSP Residual norm 1.689024060472e-12 8228 KSP Residual norm 1.721068522516e-12 8229 KSP Residual norm 1.685105249767e-12 8230 KSP Residual norm 1.653551274955e-12 8231 KSP Residual norm 1.511328225197e-12 8232 KSP Residual norm 1.236504033510e-12 8233 KSP Residual norm 
1.132945627086e-12 8234 KSP Residual norm 1.293862582760e-12 8235 KSP Residual norm 1.686000462282e-12 8236 KSP Residual norm 2.224628072001e-12 8237 KSP Residual norm 2.834533806054e-12 8238 KSP Residual norm 3.199721239496e-12 8239 KSP Residual norm 2.836800906916e-12 8240 KSP Residual norm 2.389252694137e-12 8241 KSP Residual norm 2.134628174016e-12 8242 KSP Residual norm 1.834326576456e-12 8243 KSP Residual norm 1.446339945710e-12 8244 KSP Residual norm 1.140321990337e-12 8245 KSP Residual norm 9.120866784871e-13 8246 KSP Residual norm 7.942115879252e-13 8247 KSP Residual norm 6.854572604145e-13 8248 KSP Residual norm 5.749483812424e-13 8249 KSP Residual norm 5.171656441685e-13 8250 KSP Residual norm 5.027694121846e-13 8251 KSP Residual norm 5.060452668351e-13 8252 KSP Residual norm 5.337368357915e-13 8253 KSP Residual norm 5.330211583895e-13 8254 KSP Residual norm 4.518090526819e-13 8255 KSP Residual norm 4.044333979336e-13 8256 KSP Residual norm 4.393737964531e-13 8257 KSP Residual norm 4.759895104176e-13 8258 KSP Residual norm 4.587174501314e-13 8259 KSP Residual norm 4.622509815619e-13 8260 KSP Residual norm 5.422119805480e-13 8261 KSP Residual norm 5.832620941009e-13 8262 KSP Residual norm 5.077923890157e-13 8263 KSP Residual norm 4.541075776876e-13 8264 KSP Residual norm 4.630699529990e-13 8265 KSP Residual norm 4.655272609549e-13 8266 KSP Residual norm 4.607606252990e-13 8267 KSP Residual norm 4.510032310869e-13 8268 KSP Residual norm 4.475861572721e-13 8269 KSP Residual norm 4.940501311665e-13 8270 KSP Residual norm 5.711529598786e-13 8271 KSP Residual norm 6.354844739706e-13 8272 KSP Residual norm 6.648693609686e-13 8273 KSP Residual norm 6.481003966630e-13 8274 KSP Residual norm 5.555032756193e-13 8275 KSP Residual norm 4.236819845889e-13 8276 KSP Residual norm 3.394759670186e-13 8277 KSP Residual norm 3.223995704607e-13 8278 KSP Residual norm 3.126136972214e-13 8279 KSP Residual norm 2.975270147178e-13 8280 KSP Residual norm 2.891212621576e-13 8281 
KSP Residual norm 2.659339566703e-13 8282 KSP Residual norm 2.393509607355e-13 8283 KSP Residual norm 2.452091069469e-13 8284 KSP Residual norm 2.812993023144e-13 8285 KSP Residual norm 3.019938662643e-13 8286 KSP Residual norm 3.151242685805e-13 8287 KSP Residual norm 3.242607137484e-13 8288 KSP Residual norm 3.056077913966e-13 8289 KSP Residual norm 2.702738585250e-13 8290 KSP Residual norm 2.316167684755e-13 8291 KSP Residual norm 2.138971634700e-13 8292 KSP Residual norm 2.311150074740e-13 8293 KSP Residual norm 2.645523559141e-13 8294 KSP Residual norm 2.997874712429e-13 8295 KSP Residual norm 3.355843954387e-13 8296 KSP Residual norm 3.740673010708e-13 8297 KSP Residual norm 4.506391313262e-13 8298 KSP Residual norm 5.264852459291e-13 8299 KSP Residual norm 5.715946486913e-13 8300 KSP Residual norm 6.512522269081e-13 8301 KSP Residual norm 7.735164021159e-13 8302 KSP Residual norm 7.804423445484e-13 8303 KSP Residual norm 7.813830319672e-13 8304 KSP Residual norm 8.873941741712e-13 8305 KSP Residual norm 9.974409390370e-13 8306 KSP Residual norm 1.055539553251e-12 8307 KSP Residual norm 1.054537752517e-12 8308 KSP Residual norm 1.032723865680e-12 8309 KSP Residual norm 9.648069998665e-13 8310 KSP Residual norm 8.559696918552e-13 8311 KSP Residual norm 7.290193909546e-13 8312 KSP Residual norm 5.950304431214e-13 8313 KSP Residual norm 4.903383952342e-13 8314 KSP Residual norm 4.654276559851e-13 8315 KSP Residual norm 4.465587261480e-13 8316 KSP Residual norm 4.415151525440e-13 8317 KSP Residual norm 5.009685069177e-13 8318 KSP Residual norm 6.155175422901e-13 8319 KSP Residual norm 6.952799845996e-13 8320 KSP Residual norm 6.236963999911e-13 8321 KSP Residual norm 5.122203505722e-13 8322 KSP Residual norm 4.341113310152e-13 8323 KSP Residual norm 3.926896861018e-13 8324 KSP Residual norm 3.596875313506e-13 8325 KSP Residual norm 3.255875715250e-13 8326 KSP Residual norm 3.101995463737e-13 8327 KSP Residual norm 2.998340879713e-13 8328 KSP Residual norm 
2.671456691312e-13 8329 KSP Residual norm 2.664925839220e-13 8330 KSP Residual norm 3.040851663283e-13 8331 KSP Residual norm 3.308877194804e-13 8332 KSP Residual norm 3.424110377370e-13 8333 KSP Residual norm 3.665467574051e-13 8334 KSP Residual norm 3.955552087098e-13 8335 KSP Residual norm 4.057969558244e-13 8336 KSP Residual norm 3.840657174897e-13 8337 KSP Residual norm 3.499084013917e-13 8338 KSP Residual norm 3.309909719138e-13 8339 KSP Residual norm 3.210015703765e-13 8340 KSP Residual norm 3.157687381608e-13 8341 KSP Residual norm 3.454224369747e-13 8342 KSP Residual norm 3.738856455304e-13 8343 KSP Residual norm 3.450309675304e-13 8344 KSP Residual norm 3.223687680617e-13 8345 KSP Residual norm 3.300095890291e-13 8346 KSP Residual norm 3.465294930966e-13 8347 KSP Residual norm 2.895022577537e-13 8348 KSP Residual norm 2.037331316814e-13 8349 KSP Residual norm 1.818053916577e-13 8350 KSP Residual norm 1.960614682504e-13 8351 KSP Residual norm 2.011334243858e-13 8352 KSP Residual norm 1.948658713393e-13 8353 KSP Residual norm 2.033774074032e-13 8354 KSP Residual norm 2.192417301265e-13 8355 KSP Residual norm 2.270960009964e-13 8356 KSP Residual norm 2.302527608768e-13 8357 KSP Residual norm 2.270616321346e-13 8358 KSP Residual norm 2.131533812293e-13 8359 KSP Residual norm 1.900315563741e-13 8360 KSP Residual norm 1.942349552406e-13 8361 KSP Residual norm 2.000119683424e-13 8362 KSP Residual norm 1.907666035574e-13 8363 KSP Residual norm 1.903994718762e-13 8364 KSP Residual norm 1.772726314509e-13 8365 KSP Residual norm 1.556371211117e-13 8366 KSP Residual norm 1.528082863521e-13 8367 KSP Residual norm 1.556075031018e-13 8368 KSP Residual norm 1.384660601584e-13 8369 KSP Residual norm 1.282052275832e-13 8370 KSP Residual norm 1.322827468811e-13 8371 KSP Residual norm 1.501475063106e-13 8372 KSP Residual norm 1.671171953725e-13 8373 KSP Residual norm 1.836399696109e-13 8374 KSP Residual norm 1.922056817122e-13 8375 KSP Residual norm 1.951382318169e-13 8376 
KSP Residual norm 1.867108164064e-13 8377 KSP Residual norm 1.788188218203e-13 8378 KSP Residual norm 1.792835454917e-13 8379 KSP Residual norm 1.735013773520e-13 8380 KSP Residual norm 1.775927004240e-13 8381 KSP Residual norm 1.873438347067e-13 8382 KSP Residual norm 1.950715563534e-13 8383 KSP Residual norm 1.985548936050e-13 8384 KSP Residual norm 1.932490177351e-13 8385 KSP Residual norm 1.813686508648e-13 8386 KSP Residual norm 1.764057626472e-13 8387 KSP Residual norm 1.796394189888e-13 8388 KSP Residual norm 1.877601612329e-13 8389 KSP Residual norm 2.004648172069e-13 8390 KSP Residual norm 2.132535888838e-13 8391 KSP Residual norm 2.496329459044e-13 8392 KSP Residual norm 2.946350506472e-13 8393 KSP Residual norm 2.784442299417e-13 8394 KSP Residual norm 2.573511040097e-13 8395 KSP Residual norm 2.679713572541e-13 8396 KSP Residual norm 2.657276068198e-13 8397 KSP Residual norm 2.472969012217e-13 8398 KSP Residual norm 2.146554861508e-13 8399 KSP Residual norm 2.146956058580e-13 8400 KSP Residual norm 2.445528075885e-13 8401 KSP Residual norm 2.807214692594e-13 8402 KSP Residual norm 2.902610227885e-13 8403 KSP Residual norm 2.813612478459e-13 8404 KSP Residual norm 2.860432277957e-13 8405 KSP Residual norm 2.988213638997e-13 8406 KSP Residual norm 3.129331460619e-13 8407 KSP Residual norm 3.277691586118e-13 8408 KSP Residual norm 3.407127580826e-13 8409 KSP Residual norm 3.306655322212e-13 8410 KSP Residual norm 2.728874127963e-13 8411 KSP Residual norm 2.296420947113e-13 8412 KSP Residual norm 2.173358805628e-13 8413 KSP Residual norm 2.219782143548e-13 8414 KSP Residual norm 2.218250373334e-13 8415 KSP Residual norm 2.343565510491e-13 8416 KSP Residual norm 2.700921579876e-13 8417 KSP Residual norm 2.773238802448e-13 8418 KSP Residual norm 2.691752501575e-13 8419 KSP Residual norm 3.006501993058e-13 8420 KSP Residual norm 3.556388342331e-13 8421 KSP Residual norm 3.383152843170e-13 8422 KSP Residual norm 3.076790615748e-13 8423 KSP Residual norm 
3.044366661059e-13 8424 KSP Residual norm 2.756361778237e-13 8425 KSP Residual norm 2.389033839092e-13 8426 KSP Residual norm 2.218955391264e-13 8427 KSP Residual norm 2.009783871767e-13 8428 KSP Residual norm 1.672730551314e-13 8429 KSP Residual norm 1.346088958959e-13 8430 KSP Residual norm 1.183423281115e-13 8431 KSP Residual norm 1.180449843232e-13 8432 KSP Residual norm 1.267195177292e-13 8433 KSP Residual norm 1.357801668150e-13 8434 KSP Residual norm 1.431477689929e-13 8435 KSP Residual norm 1.488773470594e-13 8436 KSP Residual norm 1.557667436247e-13 8437 KSP Residual norm 1.508720303563e-13 8438 KSP Residual norm 1.349190712024e-13 8439 KSP Residual norm 1.248977885361e-13 8440 KSP Residual norm 1.273876372666e-13 8441 KSP Residual norm 1.417274852468e-13 8442 KSP Residual norm 1.469841945165e-13 8443 KSP Residual norm 1.388222326761e-13 8444 KSP Residual norm 1.202070080158e-13 8445 KSP Residual norm 1.013114538943e-13 8446 KSP Residual norm 8.950530423180e-14 8447 KSP Residual norm 8.211625544193e-14 8448 KSP Residual norm 7.934484802376e-14 8449 KSP Residual norm 8.131190810159e-14 8450 KSP Residual norm 8.059038010516e-14 8451 KSP Residual norm 7.615315399224e-14 8452 KSP Residual norm 8.161655005164e-14 8453 KSP Residual norm 1.019702647576e-13 8454 KSP Residual norm 1.196330905040e-13 8455 KSP Residual norm 1.268762961525e-13 8456 KSP Residual norm 1.246590080257e-13 8457 KSP Residual norm 1.184480985419e-13 8458 KSP Residual norm 1.090749305961e-13 8459 KSP Residual norm 1.017654249623e-13 8460 KSP Residual norm 1.076838500501e-13 8461 KSP Residual norm 1.140911949995e-13 8462 KSP Residual norm 1.042025556842e-13 8463 KSP Residual norm 9.596354759382e-14 8464 KSP Residual norm 9.963015233215e-14 8465 KSP Residual norm 1.143839773312e-13 8466 KSP Residual norm 1.231466483912e-13 8467 KSP Residual norm 1.326512745683e-13 8468 KSP Residual norm 1.560871848672e-13 8469 KSP Residual norm 1.840497317850e-13 8470 KSP Residual norm 2.061674137667e-13 8471 
KSP Residual norm 2.270657261907e-13 8472 KSP Residual norm 2.762312561425e-13 8473 KSP Residual norm 3.170217065494e-13 8474 KSP Residual norm 3.419797080772e-13 8475 KSP Residual norm 3.563274532794e-13 8476 KSP Residual norm 3.680405460044e-13 8477 KSP Residual norm 3.866743161952e-13 8478 KSP Residual norm 3.735269525353e-13 8479 KSP Residual norm 3.420680168814e-13 8480 KSP Residual norm 3.332142457882e-13 8481 KSP Residual norm 3.363593024330e-13 8482 KSP Residual norm 3.498926112925e-13 8483 KSP Residual norm 3.443397408087e-13 8484 KSP Residual norm 3.346421092902e-13 8485 KSP Residual norm 3.561770539033e-13 8486 KSP Residual norm 3.819308611528e-13 8487 KSP Residual norm 3.883194669158e-13 8488 KSP Residual norm 4.065492681318e-13 8489 KSP Residual norm 4.398594286480e-13 8490 KSP Residual norm 4.518278562267e-13 8491 KSP Residual norm 4.478425079308e-13 8492 KSP Residual norm 4.695681719070e-13 8493 KSP Residual norm 4.730865254946e-13 8494 KSP Residual norm 4.250238104376e-13 8495 KSP Residual norm 3.726722327838e-13 8496 KSP Residual norm 3.546559878957e-13 8497 KSP Residual norm 3.644870053439e-13 8498 KSP Residual norm 3.482513708472e-13 8499 KSP Residual norm 3.185485627429e-13 8500 KSP Residual norm 3.125198312261e-13 8501 KSP Residual norm 3.271221464307e-13 8502 KSP Residual norm 3.290250078092e-13 8503 KSP Residual norm 3.401866309664e-13 8504 KSP Residual norm 3.838159984717e-13 8505 KSP Residual norm 4.625047512261e-13 8506 KSP Residual norm 4.565284830056e-13 8507 KSP Residual norm 4.012131675105e-13 8508 KSP Residual norm 4.128404480680e-13 8509 KSP Residual norm 4.489041098766e-13 8510 KSP Residual norm 4.231769952800e-13 8511 KSP Residual norm 3.571035931853e-13 8512 KSP Residual norm 3.205834838831e-13 8513 KSP Residual norm 3.337867866494e-13 8514 KSP Residual norm 3.548556536463e-13 8515 KSP Residual norm 3.241773910396e-13 8516 KSP Residual norm 2.950700574042e-13 8517 KSP Residual norm 2.992010017558e-13 8518 KSP Residual norm 
[Attached solver log: PETSc `-ksp_monitor` output for KSP iterations ~8518 through 9516. Over this entire stretch the residual norms only fluctuate between roughly 1e-13 and 1e-15 without any sustained decrease, i.e. the iteration has stagnated near machine precision. The full per-iteration listing is elided.]
KSP Residual norm 6.899057770678e-15 9517 KSP Residual norm 6.314944196615e-15 9518 KSP Residual norm 5.597123312520e-15 9519 KSP Residual norm 5.088391788599e-15 9520 KSP Residual norm 5.459350955952e-15 9521 KSP Residual norm 5.851061056375e-15 9522 KSP Residual norm 6.314638008870e-15 9523 KSP Residual norm 7.087487190274e-15 9524 KSP Residual norm 7.453901122470e-15 9525 KSP Residual norm 7.564568608633e-15 9526 KSP Residual norm 7.588635222121e-15 9527 KSP Residual norm 8.101367014585e-15 9528 KSP Residual norm 8.379831153973e-15 9529 KSP Residual norm 8.436373553752e-15 9530 KSP Residual norm 8.410827741007e-15 9531 KSP Residual norm 8.254941567261e-15 9532 KSP Residual norm 7.869633293819e-15 9533 KSP Residual norm 7.357025716922e-15 9534 KSP Residual norm 6.625838223723e-15 9535 KSP Residual norm 5.190383406704e-15 9536 KSP Residual norm 4.471977567772e-15 9537 KSP Residual norm 4.513505169152e-15 9538 KSP Residual norm 4.528567693562e-15 9539 KSP Residual norm 4.556358245389e-15 9540 KSP Residual norm 4.706018153538e-15 9541 KSP Residual norm 4.685527140507e-15 9542 KSP Residual norm 4.519803093931e-15 9543 KSP Residual norm 3.978093751583e-15 9544 KSP Residual norm 3.598207305460e-15 9545 KSP Residual norm 3.570426886893e-15 9546 KSP Residual norm 3.699987757989e-15 9547 KSP Residual norm 3.848249594558e-15 9548 KSP Residual norm 4.031728150645e-15 9549 KSP Residual norm 3.814091787717e-15 9550 KSP Residual norm 3.176014359424e-15 9551 KSP Residual norm 2.646392634344e-15 9552 KSP Residual norm 2.555185228114e-15 9553 KSP Residual norm 2.680290311956e-15 9554 KSP Residual norm 2.790000732818e-15 9555 KSP Residual norm 2.864679007364e-15 9556 KSP Residual norm 3.082030513089e-15 9557 KSP Residual norm 3.359513114761e-15 9558 KSP Residual norm 3.345548371493e-15 9559 KSP Residual norm 2.932733533164e-15 9560 KSP Residual norm 2.439387077592e-15 9561 KSP Residual norm 2.125121401864e-15 9562 KSP Residual norm 2.091257379569e-15 9563 KSP Residual norm 
2.273408167429e-15 9564 KSP Residual norm 2.443968727645e-15 9565 KSP Residual norm 2.412180047832e-15 9566 KSP Residual norm 2.434514540146e-15 9567 KSP Residual norm 2.539242314509e-15 9568 KSP Residual norm 2.535329264580e-15 9569 KSP Residual norm 2.479050492618e-15 9570 KSP Residual norm 2.419681336614e-15 9571 KSP Residual norm 2.302608339316e-15 9572 KSP Residual norm 1.938508221947e-15 9573 KSP Residual norm 1.541476556155e-15 9574 KSP Residual norm 1.176917049061e-15 3 KSP Residual norm 2.680497175260e-04 KSP Object: 1 MPI processes type: cg maximum iterations=1000 tolerances: relative=1e-06, absolute=1e-50, divergence=10000. left preconditioning using nonzero initial guess using PRECONDITIONED norm type for convergence test PC Object: 1 MPI processes type: fieldsplit FieldSplit with Schur preconditioner, factorization FULL Preconditioner for the Schur complement formed from A11 Split info: Split number 0 Defined by IS Split number 1 Defined by IS KSP solver for A00 block KSP Object: (fieldsplit_RB_split_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_RB_split_) 1 MPI processes type: cholesky Cholesky: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 0., needed 0. 
Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=324, cols=324 package used to perform factorization: mumps total: nonzeros=3042, allocated nonzeros=3042 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 2 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 0 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 0 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 0 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -24 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. RINFO(1) (local estimated flops for the elimination after analysis): [0] 29394. RINFO(2) (local estimated flops for the assembly after factorization): [0] 1092. RINFO(3) (local estimated flops for the elimination after factorization): [0] 29394. 
INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 1 INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 1 INFO(23) (num of pivots eliminated on this processor after factorization): [0] 324 RINFOG(1) (global estimated flops for the elimination after analysis): 29394. RINFOG(2) (global estimated flops for the assembly after factorization): 1092. RINFOG(3) (global estimated flops for the elimination after factorization): 29394. (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 INFOG(5) (estimated maximum front size in the complete tree): 12 INFOG(6) (number of nodes in the complete tree): 53 INFOG(7) (ordering option effectively use after analysis): 2 INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 INFOG(10) (total integer space store the matrix factors after factorization): 2067 INFOG(11) (order of largest frontal matrix after factorization): 12 INFOG(12) (number of off-diagonal pivots): 0 INFOG(13) (number of delayed pivots after factorization): 0 INFOG(14) (number of memory compress after factorization): 0 INFOG(15) (number of steps of iterative refinement after solution): 0 INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 INFOG(20) 
(estimated number of entries in the factors): 3042 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): -2 INFOG(34) (exponent of the determinant if determinant is requested): 0 linear system matrix = precond matrix: Mat Object: (fieldsplit_RB_split_) 1 MPI processes type: seqaij rows=324, cols=324 total: nonzeros=5760, allocated nonzeros=5760 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 108 nodes, limit used is 5 KSP solver for S = A11 - A10 inv(A00) A01 KSP Object: (fieldsplit_FE_split_) 1 MPI processes type: cg maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using PRECONDITIONED norm type for convergence test PC Object: (fieldsplit_FE_split_) 1 MPI processes type: bjacobi block Jacobi: number of blocks = 1 Local solve is same for all blocks, in the following KSP and PC objects: KSP Object: (fieldsplit_FE_split_sub_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_FE_split_sub_) 1 MPI processes type: ilu ILU: out-of-place factorization 0 levels of fill tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 1., needed 1. Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=28476, cols=28476 package used to perform factorization: petsc total: nonzeros=1017054, allocated nonzeros=1017054 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9492 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: (fieldsplit_FE_split_) 1 MPI processes type: seqaij rows=28476, cols=28476 total: nonzeros=1017054, allocated nonzeros=1017054 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9492 nodes, limit used is 5 linear system matrix followed by preconditioner matrix: Mat Object: (fieldsplit_FE_split_) 1 MPI processes type: schurcomplement rows=28476, cols=28476 Schur complement A11 - A10 inv(A00) A01 A11 Mat Object: (fieldsplit_FE_split_) 1 MPI processes type: seqaij rows=28476, cols=28476 total: nonzeros=1017054, allocated nonzeros=1017054 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9492 nodes, limit used is 5 A10 Mat Object: 1 MPI processes type: seqaij rows=28476, cols=324 total: nonzeros=936, allocated nonzeros=936 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 5717 nodes, limit used is 5 KSP of A00 KSP Object: (fieldsplit_RB_split_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_RB_split_) 1 MPI processes type: cholesky Cholesky: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 0., needed 0. 
Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=324, cols=324 package used to perform factorization: mumps total: nonzeros=3042, allocated nonzeros=3042 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 2 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 0 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 0 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 0 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -24 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. RINFO(1) (local estimated flops for the elimination after analysis): [0] 29394. RINFO(2) (local estimated flops for the assembly after factorization): [0] 1092. RINFO(3) (local estimated flops for the elimination after factorization): [0] 29394. 
INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 1 INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 1 INFO(23) (num of pivots eliminated on this processor after factorization): [0] 324 RINFOG(1) (global estimated flops for the elimination after analysis): 29394. RINFOG(2) (global estimated flops for the assembly after factorization): 1092. RINFOG(3) (global estimated flops for the elimination after factorization): 29394. (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 INFOG(5) (estimated maximum front size in the complete tree): 12 INFOG(6) (number of nodes in the complete tree): 53 INFOG(7) (ordering option effectively use after analysis): 2 INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 INFOG(10) (total integer space store the matrix factors after factorization): 2067 INFOG(11) (order of largest frontal matrix after factorization): 12 INFOG(12) (number of off-diagonal pivots): 0 INFOG(13) (number of delayed pivots after factorization): 0 INFOG(14) (number of memory compress after factorization): 0 INFOG(15) (number of steps of iterative refinement after solution): 0 INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 INFOG(20) 
(estimated number of entries in the factors): 3042 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): -2 INFOG(34) (exponent of the determinant if determinant is requested): 0 linear system matrix = precond matrix: Mat Object: (fieldsplit_RB_split_) 1 MPI processes type: seqaij rows=324, cols=324 total: nonzeros=5760, allocated nonzeros=5760 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 108 nodes, limit used is 5 A01 Mat Object: 1 MPI processes type: seqaij rows=324, cols=28476 total: nonzeros=936, allocated nonzeros=936 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 67 nodes, limit used is 5 Mat Object: (fieldsplit_FE_split_) 1 MPI processes type: seqaij rows=28476, cols=28476 total: nonzeros=1017054, allocated nonzeros=1017054 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9492 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: () 1 MPI processes type: seqaij rows=28800, cols=28800 total: nonzeros=1024686, allocated nonzeros=1024794 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9600 nodes, limit used is 5 ---------------------------------------------- 
PETSc Performance Summary: ---------------------------------------------- /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 20:39:53 2017 Using Petsc Release Version 3.7.3, unknown Max Max/Min Avg Total Time (sec): 1.040e+02 1.00000 1.040e+02 Objects: 1.990e+02 1.00000 1.990e+02 Flops: 1.634e+11 1.00000 1.634e+11 1.634e+11 Flops/sec: 1.571e+09 1.00000 1.571e+09 1.571e+09 MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Reductions: 0.000e+00 0.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flops and VecAXPY() for complex vectors of length N --> 8N flops Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 1.0396e+02 100.0% 1.6336e+11 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flops: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %F - percent flops in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flops --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage VecDot 42 1.0 2.1935e-05 1.0 8.53e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 389 VecTDot 74012 1.0 1.7883e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2357 VecNorm 37020 1.0 9.6073e-01 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2195 VecScale 37008 1.0 3.7900e-01 1.0 1.05e+09 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2781 VecCopy 37034 1.0 3.5509e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 74137 1.0 3.1407e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 74029 1.0 1.8788e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2244 VecAYPX 37001 1.0 1.3694e+00 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 1539 VecAssemblyBegin 68 1.0 2.1172e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAssemblyEnd 68 1.0 3.8862e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecScatterBegin 48 1.0 6.0940e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMult 37017 1.0 4.6933e+01 1.0 7.65e+10 1.0 0.0e+00 0.0e+00 0.0e+00 45 47 0 0 0 45 47 0 0 0 1629 MatMultAdd 37015 1.0 3.8572e+01 1.0 7.53e+10 1.0 0.0e+00 0.0e+00 0.0e+00 37 46 0 0 0 37 46 0 0 0 1951 MatSolve 74021 1.0 5.0074e+01 1.0 7.42e+10 1.0 0.0e+00 0.0e+00 
0.0e+00 48 45 0 0 0 48 45 0 0 0 1482 MatLUFactorNum 1 1.0 1.7191e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1422 MatCholFctrSym 1 1.0 8.8000e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatCholFctrNum 1 1.0 3.5787e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatILUFactorSym 1 1.0 3.7310e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyBegin 29 1.0 1.9073e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyEnd 29 1.0 9.9218e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRow 58026 1.0 2.8136e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRowIJ 2 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetSubMatrice 6 1.0 1.5302e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetOrdering 2 1.0 2.9898e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatZeroEntries 6 1.0 2.9335e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatView 7 1.0 2.6632e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSetUp 4 1.0 9.7990e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSolve 1 1.0 1.0089e+02 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1619 PCSetUp 4 1.0 3.8233e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 639 PCSetUpOnBlocks 5 1.0 2.1231e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1151 PCApply 5 1.0 1.0088e+02 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1619 KSPSolve_FS_0 5 1.0 8.0919e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSolve_FS_Schu 5 1.0 1.0088e+02 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1619 KSPSolve_FS_Low 5 1.0 2.1861e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 
------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Vector 91 91 9693912 0. Vector Scatter 24 24 15936 0. Index Set 51 51 537888 0. IS L to G Mapping 3 3 240408 0. Matrix 13 13 64097868 0. Krylov Solver 6 6 7888 0. Preconditioner 6 6 6288 0. Viewer 1 0 0 0. Distributed Mesh 1 1 4624 0. Star Forest Bipartite Graph 2 2 1616 0. Discrete System 1 1 872 0. ======================================================================================================================== Average time to get PetscTime(): 0. #PETSc Option Table entries: -fieldsplit_FE_split_ksp_monitor -JSON_INIT /home/dknez/akselos-dev/data/instance/workers/fe-0d134a805c21419dacaad5fc4256078d/json_init.json -JSON_INPUT /home/dknez/akselos-dev/data/instance/workers/fe-0d134a805c21419dacaad5fc4256078d/json_input.json -ksp_monitor -ksp_view -log_view #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 Configure options: --with-shared-libraries=1 --with-debugging=0 --download-suitesparse --download-blacs --download-ptscotch=yes --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc --download-hypre --download-ml ----------------------------------------- Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo Machine characteristics: Linux-4.4.0-38-generic-x86_64-with-Ubuntu-16.04-xenial Using PETSc directory: /home/dknez/software/petsc-src Using PETSc arch: arch-linux2-c-opt ----------------------------------------- Using C compiler: mpicc -fPIC -Wall 
-Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O ${COPTFLAGS} ${CFLAGS} Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} ----------------------------------------- Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/libmesh_install/opt_real/petsc/include -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi ----------------------------------------- Using C linker: mpicc Using Fortran linker: mpif90 Using libraries: -Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz 
-Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl ----------------------------------------- -------------- next part -------------- Residual norms for fieldsplit_FE_split_ solve. 0 KSP Residual norm 3.488000613428e+03 1 KSP Residual norm 1.913340289348e+01 2 KSP Residual norm 3.047725579058e+00 3 KSP Residual norm 3.529039571593e+01 4 KSP Residual norm 8.262625546196e+00 5 KSP Residual norm 7.990055719857e-02 6 KSP Residual norm 2.526767904945e+02 7 KSP Residual norm 2.056036001808e+02 8 KSP Residual norm 1.435294157910e-03 0 KSP Residual norm 5.406453059409e+04 Residual norms for fieldsplit_FE_split_ solve. 0 KSP Residual norm 3.488000613428e+03 1 KSP Residual norm 1.913340289348e+01 2 KSP Residual norm 3.047725579058e+00 3 KSP Residual norm 3.529039571593e+01 4 KSP Residual norm 8.262625546196e+00 5 KSP Residual norm 7.990055719857e-02 6 KSP Residual norm 2.526767904945e+02 7 KSP Residual norm 2.056036001808e+02 8 KSP Residual norm 1.435294157910e-03 Residual norms for fieldsplit_FE_split_ solve. 0 KSP Residual norm 1.425561332687e-03 1 KSP Residual norm 1.126199428914e-03 2 KSP Residual norm 3.541272181089e-04 3 KSP Residual norm 7.057045636795e-03 4 KSP Residual norm 4.941909117571e+00 5 KSP Residual norm 1.892567169990e-03 6 KSP Residual norm 1.767149982341e-04 7 KSP Residual norm 2.376234803480e-05 8 KSP Residual norm 1.478680549262e-08 9 KSP Residual norm 2.286631816570e-03 10 KSP Residual norm 3.133892392939e-05 11 KSP Residual norm 1.386605538133e-05 12 KSP Residual norm 3.440599766764e-05 13 KSP Residual norm 3.707164789996e-10 1 KSP Residual norm 7.914512561765e+01 Residual norms for fieldsplit_FE_split_ solve. 
0 KSP Residual norm 4.504468374410e-05 1 KSP Residual norm 2.471894787785e-07 2 KSP Residual norm 4.826148705457e-08 3 KSP Residual norm 4.642653751093e-07 4 KSP Residual norm 9.602100964786e-08 5 KSP Residual norm 1.082771472305e-09 6 KSP Residual norm 3.367681326448e-06 7 KSP Residual norm 2.574877773283e-06 8 KSP Residual norm 1.714795084862e-11 2 KSP Residual norm 6.982879146138e-04 KSP Object: 1 MPI processes type: cg maximum iterations=1000 tolerances: relative=1e-06, absolute=1e-50, divergence=10000. left preconditioning using nonzero initial guess using PRECONDITIONED norm type for convergence test PC Object: 1 MPI processes type: fieldsplit FieldSplit with Schur preconditioner, factorization FULL Preconditioner for the Schur complement formed from A11 Split info: Split number 0 Defined by IS Split number 1 Defined by IS KSP solver for A00 block KSP Object: (fieldsplit_RB_split_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_RB_split_) 1 MPI processes type: cholesky Cholesky: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 0., needed 0. 
Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=324, cols=324 package used to perform factorization: mumps total: nonzeros=3042, allocated nonzeros=3042 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 2 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 0 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 0 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 0 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -24 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. RINFO(1) (local estimated flops for the elimination after analysis): [0] 29394. RINFO(2) (local estimated flops for the assembly after factorization): [0] 1092. RINFO(3) (local estimated flops for the elimination after factorization): [0] 29394. 
INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 1 INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 1 INFO(23) (num of pivots eliminated on this processor after factorization): [0] 324 RINFOG(1) (global estimated flops for the elimination after analysis): 29394. RINFOG(2) (global estimated flops for the assembly after factorization): 1092. RINFOG(3) (global estimated flops for the elimination after factorization): 29394. (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 INFOG(5) (estimated maximum front size in the complete tree): 12 INFOG(6) (number of nodes in the complete tree): 53 INFOG(7) (ordering option effectively use after analysis): 2 INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 INFOG(10) (total integer space store the matrix factors after factorization): 2067 INFOG(11) (order of largest frontal matrix after factorization): 12 INFOG(12) (number of off-diagonal pivots): 0 INFOG(13) (number of delayed pivots after factorization): 0 INFOG(14) (number of memory compress after factorization): 0 INFOG(15) (number of steps of iterative refinement after solution): 0 INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 INFOG(20) 
(estimated number of entries in the factors): 3042 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): -2 INFOG(34) (exponent of the determinant if determinant is requested): 0 linear system matrix = precond matrix: Mat Object: (fieldsplit_RB_split_) 1 MPI processes type: seqaij rows=324, cols=324 total: nonzeros=5760, allocated nonzeros=5760 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 108 nodes, limit used is 5 KSP solver for S = A11 - A10 inv(A00) A01 KSP Object: (fieldsplit_FE_split_) 1 MPI processes type: cg maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
left preconditioning using PRECONDITIONED norm type for convergence test PC Object: (fieldsplit_FE_split_) 1 MPI processes type: cholesky Cholesky: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 5., needed 23.5462 Factored matrix follows: Mat Object: 1 MPI processes type: seqsbaij rows=28476, cols=28476 package used to perform factorization: petsc total: nonzeros=12309111, allocated nonzeros=12309111 total number of mallocs used during MatSetValues calls =0 block size is 1 linear system matrix followed by preconditioner matrix: Mat Object: (fieldsplit_FE_split_) 1 MPI processes type: schurcomplement rows=28476, cols=28476 Schur complement A11 - A10 inv(A00) A01 A11 Mat Object: (fieldsplit_FE_split_) 1 MPI processes type: seqaij rows=28476, cols=28476 total: nonzeros=1017054, allocated nonzeros=1017054 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9492 nodes, limit used is 5 A10 Mat Object: 1 MPI processes type: seqaij rows=28476, cols=324 total: nonzeros=936, allocated nonzeros=936 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 5717 nodes, limit used is 5 KSP of A00 KSP Object: (fieldsplit_RB_split_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_RB_split_) 1 MPI processes type: cholesky Cholesky: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 0., needed 0. 
Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=324, cols=324 package used to perform factorization: mumps total: nonzeros=3042, allocated nonzeros=3042 total number of mallocs used during MatSetValues calls =0 MUMPS run parameters: SYM (matrix type): 2 PAR (host participation): 1 ICNTL(1) (output for error): 6 ICNTL(2) (output of diagnostic msg): 0 ICNTL(3) (output for global info): 0 ICNTL(4) (level of printing): 0 ICNTL(5) (input mat struct): 0 ICNTL(6) (matrix prescaling): 7 ICNTL(7) (sequentia matrix ordering):7 ICNTL(8) (scalling strategy): 77 ICNTL(10) (max num of refinements): 0 ICNTL(11) (error analysis): 0 ICNTL(12) (efficiency control): 0 ICNTL(13) (efficiency control): 0 ICNTL(14) (percentage of estimated workspace increase): 20 ICNTL(18) (input mat struct): 0 ICNTL(19) (Shur complement info): 0 ICNTL(20) (rhs sparse pattern): 0 ICNTL(21) (solution struct): 0 ICNTL(22) (in-core/out-of-core facility): 0 ICNTL(23) (max size of memory can be allocated locally):0 ICNTL(24) (detection of null pivot rows): 0 ICNTL(25) (computation of a null space basis): 0 ICNTL(26) (Schur options for rhs or solution): 0 ICNTL(27) (experimental parameter): -24 ICNTL(28) (use parallel or sequential ordering): 1 ICNTL(29) (parallel ordering): 0 ICNTL(30) (user-specified set of entries in inv(A)): 0 ICNTL(31) (factors is discarded in the solve phase): 0 ICNTL(33) (compute determinant): 0 CNTL(1) (relative pivoting threshold): 0.01 CNTL(2) (stopping criterion of refinement): 1.49012e-08 CNTL(3) (absolute pivoting threshold): 0. CNTL(4) (value of static pivoting): -1. CNTL(5) (fixation for null pivots): 0. RINFO(1) (local estimated flops for the elimination after analysis): [0] 29394. RINFO(2) (local estimated flops for the assembly after factorization): [0] 1092. RINFO(3) (local estimated flops for the elimination after factorization): [0] 29394. 
INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): [0] 1 INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): [0] 1 INFO(23) (num of pivots eliminated on this processor after factorization): [0] 324 RINFOG(1) (global estimated flops for the elimination after analysis): 29394. RINFOG(2) (global estimated flops for the assembly after factorization): 1092. RINFOG(3) (global estimated flops for the elimination after factorization): 29394. (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 INFOG(5) (estimated maximum front size in the complete tree): 12 INFOG(6) (number of nodes in the complete tree): 53 INFOG(7) (ordering option effectively use after analysis): 2 INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 INFOG(10) (total integer space store the matrix factors after factorization): 2067 INFOG(11) (order of largest frontal matrix after factorization): 12 INFOG(12) (number of off-diagonal pivots): 0 INFOG(13) (number of delayed pivots after factorization): 0 INFOG(14) (number of memory compress after factorization): 0 INFOG(15) (number of steps of iterative refinement after solution): 0 INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 INFOG(20) 
(estimated number of entries in the factors): 3042 INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 INFOG(28) (after factorization: number of null pivots encountered): 0 INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 INFOG(32) (after analysis: type of analysis done): 1 INFOG(33) (value used for ICNTL(8)): -2 INFOG(34) (exponent of the determinant if determinant is requested): 0 linear system matrix = precond matrix: Mat Object: (fieldsplit_RB_split_) 1 MPI processes type: seqaij rows=324, cols=324 total: nonzeros=5760, allocated nonzeros=5760 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 108 nodes, limit used is 5 A01 Mat Object: 1 MPI processes type: seqaij rows=324, cols=28476 total: nonzeros=936, allocated nonzeros=936 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 67 nodes, limit used is 5 Mat Object: (fieldsplit_FE_split_) 1 MPI processes type: seqaij rows=28476, cols=28476 total: nonzeros=1017054, allocated nonzeros=1017054 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9492 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: () 1 MPI processes type: seqaij rows=28800, cols=28800 total: nonzeros=1024686, allocated nonzeros=1024794 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9600 nodes, limit used is 5 ---------------------------------------------- 
PETSc Performance Summary: ---------------------------------------------- /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 20:42:58 2017 Using Petsc Release Version 3.7.3, unknown Max Max/Min Avg Total Time (sec): 1.861e+01 1.00000 1.861e+01 Objects: 1.950e+02 1.00000 1.950e+02 Flops: 2.112e+09 1.00000 2.112e+09 2.112e+09 Flops/sec: 1.134e+08 1.00000 1.134e+08 1.134e+08 MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 MPI Reductions: 0.000e+00 0.00000 Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) e.g., VecAXPY() for real vectors of length N --> 2N flops and VecAXPY() for complex vectors of length N --> 8N flops Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- Avg %Total Avg %Total counts %Total Avg %Total counts %Total 0: Main Stage: 1.8614e+01 100.0% 2.1118e+09 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% ------------------------------------------------------------------------------------------------------------------------ See the 'Profiling' chapter of the users' manual for details on interpreting output. Phase summary info: Count: number of times phase was executed Time and Flops: Max - maximum over all processors Ratio - ratio of maximum to minimum over all processors Mess: number of messages sent Avg. len: average message length (bytes) Reduct: number of global reductions Global: entire computation Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
%T - percent time in this phase %F - percent flops in this phase %M - percent messages in this phase %L - percent message lengths in this phase %R - percent reductions in this phase Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) ------------------------------------------------------------------------------------------------------------------------ Event Count Time (sec) Flops --- Global --- --- Stage --- Total Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s ------------------------------------------------------------------------------------------------------------------------ --- Event Stage 0: Main Stage VecDot 42 1.0 2.1935e-05 1.0 8.53e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 389 VecTDot 78 1.0 1.4541e-03 1.0 4.44e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 3057 VecNorm 52 1.0 1.1883e-03 1.0 2.96e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 2494 VecScale 41 1.0 4.0245e-04 1.0 1.17e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 2901 VecCopy 64 1.0 2.5177e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecSet 160 1.0 4.2248e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAXPY 94 1.0 1.8404e-03 1.0 4.48e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 2435 VecAYPX 35 1.0 1.1680e-03 1.0 1.97e+06 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1683 VecAssemblyBegin 68 1.0 1.9956e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecAssemblyEnd 68 1.0 2.7418e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 VecScatterBegin 44 1.0 3.8147e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatMult 48 1.0 4.4571e-02 1.0 8.25e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 4 0 0 0 0 4 0 0 0 1852 MatMultAdd 49 1.0 3.3489e-02 1.0 7.53e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 4 0 0 0 0 4 0 0 0 2249 MatSolve 86 1.0 9.5013e-01 1.0 2.02e+09 1.0 0.0e+00 0.0e+00 0.0e+00 5 95 0 0 0 5 95 0 0 0 2121 
MatCholFctrSym 2 1.0 1.0396e+01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 56 0 0 0 0 56 0 0 0 0 0 MatCholFctrNum 2 1.0 4.0893e+00 1.0 2.85e+04 1.0 0.0e+00 0.0e+00 0.0e+00 22 0 0 0 0 22 0 0 0 0 0 MatAssemblyBegin 29 1.0 4.0531e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatAssemblyEnd 29 1.0 1.0732e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRow 58026 1.0 3.2039e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetRowIJ 2 1.0 9.5367e-07 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetSubMatrice 6 1.0 1.7126e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatGetOrdering 2 1.0 3.0589e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatZeroEntries 6 1.0 3.0849e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 MatView 6 1.0 2.4055e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSetUp 3 1.0 1.0300e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSolve 1 1.0 1.5489e+01 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 0.0e+00 83100 0 0 0 83100 0 0 0 136 PCSetUp 3 1.0 1.4508e+01 1.0 2.85e+04 1.0 0.0e+00 0.0e+00 0.0e+00 78 0 0 0 0 78 0 0 0 0 0 PCApply 4 1.0 1.5486e+01 1.0 2.10e+09 1.0 0.0e+00 0.0e+00 0.0e+00 83100 0 0 0 83100 0 0 0 136 KSPSolve_FS_0 4 1.0 6.0940e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 KSPSolve_FS_Schu 4 1.0 1.5483e+01 1.0 2.10e+09 1.0 0.0e+00 0.0e+00 0.0e+00 83100 0 0 0 83100 0 0 0 136 KSPSolve_FS_Low 4 1.0 2.0130e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 ------------------------------------------------------------------------------------------------------------------------ Memory usage is given in bytes: Object Type Creations Destructions Memory Descendants' Mem. Reports information only for process 0. --- Event Stage 0: Main Stage Vector 89 89 9690840 0. Vector Scatter 24 24 15936 0. Index Set 51 51 537888 0. 
IS L to G Mapping 3 3 240408 0. Matrix 13 13 199146960 0. Krylov Solver 5 5 6720 0. Preconditioner 5 5 5360 0. Viewer 1 0 0 0. Distributed Mesh 1 1 4624 0. Star Forest Bipartite Graph 2 2 1616 0. Discrete System 1 1 872 0. ======================================================================================================================== Average time to get PetscTime(): 0. #PETSc Option Table entries: -fieldsplit_FE_split_ksp_monitor -JSON_INIT /home/dknez/akselos-dev/data/instance/workers/fe-0d134a805c21419dacaad5fc4256078d/json_init.json -JSON_INPUT /home/dknez/akselos-dev/data/instance/workers/fe-0d134a805c21419dacaad5fc4256078d/json_input.json -ksp_monitor -ksp_view -log_view #End of PETSc Option Table entries Compiled without FORTRAN kernels Compiled with full precision matrices (default) sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 Configure options: --with-shared-libraries=1 --with-debugging=0 --download-suitesparse --download-blacs --download-ptscotch=yes --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc --download-hypre --download-ml ----------------------------------------- Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo Machine characteristics: Linux-4.4.0-38-generic-x86_64-with-Ubuntu-16.04-xenial Using PETSc directory: /home/dknez/software/petsc-src Using PETSc arch: arch-linux2-c-opt ----------------------------------------- Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O ${COPTFLAGS} ${CFLAGS} Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} ----------------------------------------- Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include 
-I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/libmesh_install/opt_real/petsc/include -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi ----------------------------------------- Using C linker: mpicc Using Fortran linker: mpif90 Using libraries: -Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl -Wl,-rpath,/usr/lib/openmpi/lib -lmpi 
-lgcc_s -lpthread -ldl ----------------------------------------- From bsmith at mcs.anl.gov Wed Jan 11 20:31:24 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 11 Jan 2017 20:31:24 -0600 Subject: [petsc-users] Using PCFIELDSPLIT with -pc_fieldsplit_type schur In-Reply-To: References: <9641EA68-FC6B-48DC-8A5E-A8B75E54EA64@mcs.anl.gov> Message-ID: <8DBB9010-9A29-4028-A9C1-98DB695602C1@mcs.anl.gov> Thanks, this is very useful information. It means that 1) the approximate Sp is actually a very good approximation to the true Schur complement S, since using Sp^-1 to precondition S gives iteration counts from 8 to 13. 2) using ilu(0) as a preconditioner for Sp is not good, since replacing Sp^-1 with ilu(0) of Sp gives absurd iteration counts. This is actually not super surprising since ilu(0) is generally "not so good" for elasticity. So the next step is to try using -fieldsplit_FE_split_ksp_monitor -fieldsplit_FE_split_pc_type gamg. The one open question is whether any options should be passed to the gamg to tell it that the underlying problem comes from "elasticity"; that is, something about the null space. Mark Adams, since the GAMG is coming from inside another preconditioner it may not be easy for the user to attach the near null space to that inner matrix. Would it make sense for there to be a GAMG command line option to indicate that it is a 3d elasticity problem so GAMG could set up the near null space for itself? Or does that not make sense? Barry > On Jan 11, 2017, at 7:47 PM, David Knezevic wrote: > > I've attached the two log files. Using cholesky for "FE_split" seems to have helped a lot! > > David > > > -- > David J. Knezevic | CTO > Akselos | 210 Broadway, #201 | Cambridge, MA | 02139 > > Phone: +1-617-599-4755 > > This e-mail and any attachments may contain confidential material for the sole use of the intended recipient(s). Any review or distribution by others is strictly prohibited.
If you are not the intended recipient, please contact the sender and delete all copies. > > On Wed, Jan 11, 2017 at 8:32 PM, Barry Smith wrote: > > Can you please run with all the monitoring on? So we can see the convergence of all the inner solvers > -fieldsplit_FE_split_ksp_monitor > > Then run again with > > -fieldsplit_FE_split_ksp_monitor -fieldsplit_FE_split_pc_type cholesky > > > and send both sets of results > > Barry > > > > On Jan 11, 2017, at 6:32 PM, David Knezevic wrote: > > > > On Wed, Jan 11, 2017 at 5:52 PM, Dave May wrote: > > so I gather that I'll have to look into a user-defined approximation to S. > > > > Where does the 2x2 block system come from? > > Maybe someone on the list knows the right approximation to use for S. > > > > The model is 3D linear elasticity using a finite element discretization. I applied substructuring to part of the system to "condense" it, and that results in the small A00 block. The A11 block is just standard 3D elasticity; no substructuring was applied there. There are constraints to connect the degrees of freedom on the interface of the substructured and non-substructured regions. > > > > If anyone has suggestions for a good way to precondition this type of system, I'd be most appreciative! > > > > Thanks, > > David > > > > > > > > ----------------------------------------- > > > > 0 KSP Residual norm 5.405528187695e+04 > > 1 KSP Residual norm 2.187814910803e+02 > > 2 KSP Residual norm 1.019051577515e-01 > > 3 KSP Residual norm 4.370464012859e-04 > > KSP Object: 1 MPI processes > > type: cg > > maximum iterations=1000 > > tolerances: relative=1e-06, absolute=1e-50, divergence=10000. 
> > left preconditioning > > using nonzero initial guess > > using PRECONDITIONED norm type for convergence test > > PC Object: 1 MPI processes > > type: fieldsplit > > FieldSplit with Schur preconditioner, factorization FULL > > Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse > > Split info: > > Split number 0 Defined by IS > > Split number 1 Defined by IS > > KSP solver for A00 block > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > type: preonly > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > left preconditioning > > using NONE norm type for convergence test > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > type: cholesky > > Cholesky: out-of-place factorization > > tolerance for zero pivot 2.22045e-14 > > matrix ordering: natural > > factor fill ratio given 0., needed 0. > > Factored matrix follows: > > Mat Object: 1 MPI processes > > type: seqaij > > rows=324, cols=324 > > package used to perform factorization: mumps > > total: nonzeros=3042, allocated nonzeros=3042 > > total number of mallocs used during MatSetValues calls =0 > > MUMPS run parameters: > > SYM (matrix type): 2 > > PAR (host participation): 1 > > ICNTL(1) (output for error): 6 > > ICNTL(2) (output of diagnostic msg): 0 > > ICNTL(3) (output for global info): 0 > > ICNTL(4) (level of printing): 0 > > ICNTL(5) (input mat struct): 0 > > ICNTL(6) (matrix prescaling): 7 > > ICNTL(7) (sequentia matrix ordering):7 > > ICNTL(8) (scalling strategy): 77 > > ICNTL(10) (max num of refinements): 0 > > ICNTL(11) (error analysis): 0 > > ICNTL(12) (efficiency control): 0 > > ICNTL(13) (efficiency control): 0 > > ICNTL(14) (percentage of estimated workspace increase): 20 > > ICNTL(18) (input mat struct): 0 > > ICNTL(19) (Shur complement info): 0 > > ICNTL(20) (rhs sparse pattern): 0 > > ICNTL(21) (solution struct): 0 > > 
ICNTL(22) (in-core/out-of-core facility): 0 > > ICNTL(23) (max size of memory can be allocated locally):0 > > ICNTL(24) (detection of null pivot rows): 0 > > ICNTL(25) (computation of a null space basis): 0 > > ICNTL(26) (Schur options for rhs or solution): 0 > > ICNTL(27) (experimental parameter): -24 > > ICNTL(28) (use parallel or sequential ordering): 1 > > ICNTL(29) (parallel ordering): 0 > > ICNTL(30) (user-specified set of entries in inv(A)): 0 > > ICNTL(31) (factors is discarded in the solve phase): 0 > > ICNTL(33) (compute determinant): 0 > > CNTL(1) (relative pivoting threshold): 0.01 > > CNTL(2) (stopping criterion of refinement): 1.49012e-08 > > CNTL(3) (absolute pivoting threshold): 0. > > CNTL(4) (value of static pivoting): -1. > > CNTL(5) (fixation for null pivots): 0. > > RINFO(1) (local estimated flops for the elimination after analysis): > > [0] 29394. > > RINFO(2) (local estimated flops for the assembly after factorization): > > [0] 1092. > > RINFO(3) (local estimated flops for the elimination after factorization): > > [0] 29394. > > INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): > > [0] 1 > > INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): > > [0] 1 > > INFO(23) (num of pivots eliminated on this processor after factorization): > > [0] 324 > > RINFOG(1) (global estimated flops for the elimination after analysis): 29394. > > RINFOG(2) (global estimated flops for the assembly after factorization): 1092. > > RINFOG(3) (global estimated flops for the elimination after factorization): 29394. 
> > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) > > INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 > > INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 > > INFOG(5) (estimated maximum front size in the complete tree): 12 > > INFOG(6) (number of nodes in the complete tree): 53 > > INFOG(7) (ordering option effectively use after analysis): 2 > > INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 > > INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 > > INFOG(10) (total integer space store the matrix factors after factorization): 2067 > > INFOG(11) (order of largest frontal matrix after factorization): 12 > > INFOG(12) (number of off-diagonal pivots): 0 > > INFOG(13) (number of delayed pivots after factorization): 0 > > INFOG(14) (number of memory compress after factorization): 0 > > INFOG(15) (number of steps of iterative refinement after solution): 0 > > INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 > > INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 > > INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 > > INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 > > INFOG(20) (estimated number of entries in the factors): 3042 > > INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 > > INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 > > INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 > > INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 > > INFOG(25) (after 
factorization: number of pivots modified by static pivoting): 0 > > INFOG(28) (after factorization: number of null pivots encountered): 0 > > INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 > > INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 > > INFOG(32) (after analysis: type of analysis done): 1 > > INFOG(33) (value used for ICNTL(8)): -2 > > INFOG(34) (exponent of the determinant if determinant is requested): 0 > > linear system matrix = precond matrix: > > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > > type: seqaij > > rows=324, cols=324 > > total: nonzeros=5760, allocated nonzeros=5760 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 108 nodes, limit used is 5 > > KSP solver for S = A11 - A10 inv(A00) A01 > > KSP Object: (fieldsplit_FE_split_) 1 MPI processes > > type: cg > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > left preconditioning > > using PRECONDITIONED norm type for convergence test > > PC Object: (fieldsplit_FE_split_) 1 MPI processes > > type: bjacobi > > block Jacobi: number of blocks = 1 > > Local solve is same for all blocks, in the following KSP and PC objects: > > KSP Object: (fieldsplit_FE_split_sub_) 1 MPI processes > > type: preonly > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > left preconditioning > > using NONE norm type for convergence test > > PC Object: (fieldsplit_FE_split_sub_) 1 MPI processes > > type: ilu > > ILU: out-of-place factorization > > 0 levels of fill > > tolerance for zero pivot 2.22045e-14 > > matrix ordering: natural > > factor fill ratio given 1., needed 1. 
> > Factored matrix follows: > > Mat Object: 1 MPI processes > > type: seqaij > > rows=28476, cols=28476 > > package used to perform factorization: petsc > > total: nonzeros=1037052, allocated nonzeros=1037052 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 9489 nodes, limit used is 5 > > linear system matrix = precond matrix: > > Mat Object: 1 MPI processes > > type: seqaij > > rows=28476, cols=28476 > > total: nonzeros=1037052, allocated nonzeros=1037052 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 9489 nodes, limit used is 5 > > linear system matrix followed by preconditioner matrix: > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > type: schurcomplement > > rows=28476, cols=28476 > > Schur complement A11 - A10 inv(A00) A01 > > A11 > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > type: seqaij > > rows=28476, cols=28476 > > total: nonzeros=1017054, allocated nonzeros=1017054 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 9492 nodes, limit used is 5 > > A10 > > Mat Object: 1 MPI processes > > type: seqaij > > rows=28476, cols=324 > > total: nonzeros=936, allocated nonzeros=936 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 5717 nodes, limit used is 5 > > KSP of A00 > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > type: preonly > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > left preconditioning > > using NONE norm type for convergence test > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > type: cholesky > > Cholesky: out-of-place factorization > > tolerance for zero pivot 2.22045e-14 > > matrix ordering: natural > > factor fill ratio given 0., needed 0. 
> > Factored matrix follows: > > Mat Object: 1 MPI processes > > type: seqaij > > rows=324, cols=324 > > package used to perform factorization: mumps > > total: nonzeros=3042, allocated nonzeros=3042 > > total number of mallocs used during MatSetValues calls =0 > > MUMPS run parameters: > > SYM (matrix type): 2 > > PAR (host participation): 1 > > ICNTL(1) (output for error): 6 > > ICNTL(2) (output of diagnostic msg): 0 > > ICNTL(3) (output for global info): 0 > > ICNTL(4) (level of printing): 0 > > ICNTL(5) (input mat struct): 0 > > ICNTL(6) (matrix prescaling): 7 > > ICNTL(7) (sequentia matrix ordering):7 > > ICNTL(8) (scalling strategy): 77 > > ICNTL(10) (max num of refinements): 0 > > ICNTL(11) (error analysis): 0 > > ICNTL(12) (efficiency control): 0 > > ICNTL(13) (efficiency control): 0 > > ICNTL(14) (percentage of estimated workspace increase): 20 > > ICNTL(18) (input mat struct): 0 > > ICNTL(19) (Shur complement info): 0 > > ICNTL(20) (rhs sparse pattern): 0 > > ICNTL(21) (solution struct): 0 > > ICNTL(22) (in-core/out-of-core facility): 0 > > ICNTL(23) (max size of memory can be allocated locally):0 > > ICNTL(24) (detection of null pivot rows): 0 > > ICNTL(25) (computation of a null space basis): 0 > > ICNTL(26) (Schur options for rhs or solution): 0 > > ICNTL(27) (experimental parameter): -24 > > ICNTL(28) (use parallel or sequential ordering): 1 > > ICNTL(29) (parallel ordering): 0 > > ICNTL(30) (user-specified set of entries in inv(A)): 0 > > ICNTL(31) (factors is discarded in the solve phase): 0 > > ICNTL(33) (compute determinant): 0 > > CNTL(1) (relative pivoting threshold): 0.01 > > CNTL(2) (stopping criterion of refinement): 1.49012e-08 > > CNTL(3) (absolute pivoting threshold): 0. > > CNTL(4) (value of static pivoting): -1. > > CNTL(5) (fixation for null pivots): 0. > > RINFO(1) (local estimated flops for the elimination after analysis): > > [0] 29394. > > RINFO(2) (local estimated flops for the assembly after factorization): > > [0] 1092. 
> > RINFO(3) (local estimated flops for the elimination after factorization): > > [0] 29394. > > INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): > > [0] 1 > > INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): > > [0] 1 > > INFO(23) (num of pivots eliminated on this processor after factorization): > > [0] 324 > > RINFOG(1) (global estimated flops for the elimination after analysis): 29394. > > RINFOG(2) (global estimated flops for the assembly after factorization): 1092. > > RINFOG(3) (global estimated flops for the elimination after factorization): 29394. > > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) > > INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 > > INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 > > INFOG(5) (estimated maximum front size in the complete tree): 12 > > INFOG(6) (number of nodes in the complete tree): 53 > > INFOG(7) (ordering option effectively use after analysis): 2 > > INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 > > INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 > > INFOG(10) (total integer space store the matrix factors after factorization): 2067 > > INFOG(11) (order of largest frontal matrix after factorization): 12 > > INFOG(12) (number of off-diagonal pivots): 0 > > INFOG(13) (number of delayed pivots after factorization): 0 > > INFOG(14) (number of memory compress after factorization): 0 > > INFOG(15) (number of steps of iterative refinement after solution): 0 > > INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 > > INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 > > INFOG(18) (size of all MUMPS internal data 
allocated during factorization: value on the most memory consuming processor): 1 > > INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 > > INFOG(20) (estimated number of entries in the factors): 3042 > > INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 > > INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 > > INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 > > INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 > > INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 > > INFOG(28) (after factorization: number of null pivots encountered): 0 > > INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 > > INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 > > INFOG(32) (after analysis: type of analysis done): 1 > > INFOG(33) (value used for ICNTL(8)): -2 > > INFOG(34) (exponent of the determinant if determinant is requested): 0 > > linear system matrix = precond matrix: > > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > > type: seqaij > > rows=324, cols=324 > > total: nonzeros=5760, allocated nonzeros=5760 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 108 nodes, limit used is 5 > > A01 > > Mat Object: 1 MPI processes > > type: seqaij > > rows=324, cols=28476 > > total: nonzeros=936, allocated nonzeros=936 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 67 nodes, limit used is 5 > > Mat Object: 1 MPI processes > > type: seqaij > > rows=28476, cols=28476 > > total: nonzeros=1037052, allocated nonzeros=1037052 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 9489 nodes, limit used is 5 > > linear system 
matrix = precond matrix: > > Mat Object: () 1 MPI processes > > type: seqaij > > rows=28800, cols=28800 > > total: nonzeros=1024686, allocated nonzeros=1024794 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 9600 nodes, limit used is 5 > > > > ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- > > > > /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 17:22:10 2017 > > Using Petsc Release Version 3.7.3, unknown > > > > Max Max/Min Avg Total > > Time (sec): 9.638e+01 1.00000 9.638e+01 > > Objects: 2.030e+02 1.00000 2.030e+02 > > Flops: 1.732e+11 1.00000 1.732e+11 1.732e+11 > > Flops/sec: 1.797e+09 1.00000 1.797e+09 1.797e+09 > > MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > MPI Reductions: 0.000e+00 0.00000 > > > > Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) > > e.g., VecAXPY() for real vectors of length N --> 2N flops > > and VecAXPY() for complex vectors of length N --> 8N flops > > > > Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- > > Avg %Total Avg %Total counts %Total Avg %Total counts %Total > > 0: Main Stage: 9.6379e+01 100.0% 1.7318e+11 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% > > > > ------------------------------------------------------------------------------------------------------------------------ > > See the 'Profiling' chapter of the users' manual for details on interpreting output. > > Phase summary info: > > Count: number of times phase was executed > > Time and Flops: Max - maximum over all processors > > Ratio - ratio of maximum to minimum over all processors > > Mess: number of messages sent > > Avg. 
len: average message length (bytes) > > Reduct: number of global reductions > > Global: entire computation > > Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). > > %T - percent time in this phase %F - percent flops in this phase > > %M - percent messages in this phase %L - percent message lengths in this phase > > %R - percent reductions in this phase > > Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) > > ------------------------------------------------------------------------------------------------------------------------ > > Event Count Time (sec) Flops --- Global --- --- Stage --- Total > > Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s > > ------------------------------------------------------------------------------------------------------------------------ > > > > --- Event Stage 0: Main Stage > > > > VecDot 42 1.0 2.2411e-05 1.0 8.53e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 380 > > VecTDot 77761 1.0 1.4294e+00 1.0 4.43e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3098 > > VecNorm 38894 1.0 9.1002e-01 1.0 2.22e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2434 > > VecScale 38882 1.0 3.7314e-01 1.0 1.11e+09 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2967 > > VecCopy 38908 1.0 2.1655e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > VecSet 77887 1.0 3.2034e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > VecAXPY 77777 1.0 1.8382e+00 1.0 4.43e+09 1.0 0.0e+00 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2409 > > VecAYPX 38875 1.0 1.2884e+00 1.0 2.21e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 1718 > > VecAssemblyBegin 68 1.0 1.9407e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > VecAssemblyEnd 68 1.0 2.6941e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > VecScatterBegin 48 1.0 4.6349e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 
0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatMult 38891 1.0 4.3045e+01 1.0 8.03e+10 1.0 0.0e+00 0.0e+00 0.0e+00 45 46 0 0 0 45 46 0 0 0 1866 > > MatMultAdd 38889 1.0 3.5360e+01 1.0 7.91e+10 1.0 0.0e+00 0.0e+00 0.0e+00 37 46 0 0 0 37 46 0 0 0 2236 > > MatSolve 77769 1.0 4.8780e+01 1.0 7.95e+10 1.0 0.0e+00 0.0e+00 0.0e+00 51 46 0 0 0 51 46 0 0 0 1631 > > MatLUFactorNum 1 1.0 1.9575e-02 1.0 2.49e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1274 > > MatCholFctrSym 1 1.0 9.4891e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatCholFctrNum 1 1.0 3.7885e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatILUFactorSym 1 1.0 4.1780e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatConvert 1 1.0 3.0041e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatScale 2 1.0 2.7180e-05 1.0 2.53e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 930 > > MatAssemblyBegin 32 1.0 4.0531e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatAssemblyEnd 32 1.0 1.2032e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatGetRow 114978 1.0 5.9254e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatGetRowIJ 2 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatGetSubMatrice 6 1.0 1.5707e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatGetOrdering 2 1.0 3.2425e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatZeroEntries 6 1.0 3.0580e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatView 7 1.0 3.5119e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatAXPY 1 1.0 1.9384e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatMatMult 1 1.0 2.7120e-03 1.0 3.16e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 117 > > MatMatMultSym 1 1.0 1.8010e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 
0 0 0 0 > > MatMatMultNum 1 1.0 6.1703e-04 1.0 3.16e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 513 > > KSPSetUp 4 1.0 9.8944e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > KSPSolve 1 1.0 9.3380e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > PCSetUp 4 1.0 6.6326e-02 1.0 2.53e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 381 > > PCSetUpOnBlocks 5 1.0 2.4082e-02 1.0 2.49e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1036 > > PCApply 5 1.0 9.3376e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > KSPSolve_FS_0 5 1.0 7.0214e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > KSPSolve_FS_Schu 5 1.0 9.3372e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > KSPSolve_FS_Low 5 1.0 2.1377e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > ------------------------------------------------------------------------------------------------------------------------ > > > > Memory usage is given in bytes: > > > > Object Type Creations Destructions Memory Descendants' Mem. > > Reports information only for process 0. > > > > --- Event Stage 0: Main Stage > > > > Vector 92 92 9698040 0. > > Vector Scatter 24 24 15936 0. > > Index Set 51 51 537876 0. > > IS L to G Mapping 3 3 240408 0. > > Matrix 16 16 77377776 0. > > Krylov Solver 6 6 7888 0. > > Preconditioner 6 6 6288 0. > > Viewer 1 0 0 0. > > Distributed Mesh 1 1 4624 0. > > Star Forest Bipartite Graph 2 2 1616 0. > > Discrete System 1 1 872 0. > > ======================================================================================================================== > > Average time to get PetscTime(): 0. 
> > #PETSc Option Table entries: > > -ksp_monitor > > -ksp_view > > -log_view > > #End of PETSc Option Table entries > > Compiled without FORTRAN kernels > > Compiled with full precision matrices (default) > > sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 > > Configure options: --with-shared-libraries=1 --with-debugging=0 --download-suitesparse --download-blacs --download-ptscotch=yes --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc --download-hypre --download-ml > > ----------------------------------------- > > Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo > > Machine characteristics: Linux-4.4.0-38-generic-x86_64-with-Ubuntu-16.04-xenial > > Using PETSc directory: /home/dknez/software/petsc-src > > Using PETSc arch: arch-linux2-c-opt > > ----------------------------------------- > > > > Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O ${COPTFLAGS} ${CFLAGS} > > Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} > > ----------------------------------------- > > > > Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/libmesh_install/opt_real/petsc/include -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi > > ----------------------------------------- > > > > Using C linker: mpicc > > Using Fortran linker: mpif90 > > Using libraries: 
-Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl > > ----------------------------------------- > > > > > > > > > > On Wed, Jan 11, 2017 at 4:49 PM, Dave May wrote: > > It looks like the Schur solve is requiring a huge number of iterates to converge (based on the instances of MatMult). > > This is killing the performance. > > > > Are you sure that A11 is a good approximation to S? 
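> > One concrete way to probe that question is to change which matrix PETSc uses to build the Schur-complement preconditioner at runtime. A possible options fragment (option names taken from the PETSc PCFIELDSPLIT documentation; these are illustrative knobs to experiment with, not the poster's actual settings, and whether they help is problem dependent):

```shell
# PETSc runtime options (command line or options file).
# Build the Schur preconditioner from
#   Sp = A11 - A10 inv(diag(A00)) A01  (formed explicitly)
# instead of from the A11 block alone (the default, "a11"):
-pc_fieldsplit_schur_precondition selfp

# Keep the full Schur factorization, as in the run shown below:
-pc_fieldsplit_schur_fact_type full
```

> > The same choice can be made in code with PCFieldSplitSetSchurPre() and PC_FIELDSPLIT_SCHUR_PRE_SELFP; if selfp is also slow, a user-supplied approximation to S can be passed via the same routine.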
You might consider trying the selfp option > > > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCFieldSplitSetSchurPre.html#PCFieldSplitSetSchurPre > > > > Note that the best approx to S is likely both problem and discretisation dependent so if selfp is also terrible, you might want to consider coding up your own approx to S for your specific system. > > > > > > Thanks, > > Dave > > > > > > On Wed, 11 Jan 2017 at 22:34, David Knezevic wrote: > > I have a definite block 2x2 system and I figured it'd be good to apply the PCFIELDSPLIT functionality with Schur complement, as described in Section 4.5 of the manual. > > > > The A00 block of my matrix is very small so I figured I'd specify a direct solver (i.e. MUMPS) for that block. > > > > So I did the following: > > - PCFieldSplitSetIS to specify the indices of the two splits > > - PCFieldSplitGetSubKSP to get the two KSP objects, and to set the solver and PC types for each (MUMPS for A00, ILU+CG for A11) > > - I set -pc_fieldsplit_schur_fact_type full > > > > Below I have pasted the output of "-ksp_view -ksp_monitor -log_view" for a test case. It seems to converge well, but I'm concerned about the speed (about 90 seconds, vs. about 1 second if I use a direct solver for the entire system). I just wanted to check if I'm setting this up in a good way? > > > > Many thanks, > > David > > > > ----------------------------------------------------------------------------------- > > > > 0 KSP Residual norm 5.405774214400e+04 > > 1 KSP Residual norm 1.849649014371e+02 > > 2 KSP Residual norm 7.462775074989e-02 > > 3 KSP Residual norm 2.680497175260e-04 > > KSP Object: 1 MPI processes > > type: cg > > maximum iterations=1000 > > tolerances: relative=1e-06, absolute=1e-50, divergence=10000. 
> > left preconditioning > > using nonzero initial guess > > using PRECONDITIONED norm type for convergence test > > PC Object: 1 MPI processes > > type: fieldsplit > > FieldSplit with Schur preconditioner, factorization FULL > > Preconditioner for the Schur complement formed from A11 > > Split info: > > Split number 0 Defined by IS > > Split number 1 Defined by IS > > KSP solver for A00 block > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > type: preonly > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > left preconditioning > > using NONE norm type for convergence test > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > type: cholesky > > Cholesky: out-of-place factorization > > tolerance for zero pivot 2.22045e-14 > > matrix ordering: natural > > factor fill ratio given 0., needed 0. > > Factored matrix follows: > > Mat Object: 1 MPI processes > > type: seqaij > > rows=324, cols=324 > > package used to perform factorization: mumps > > total: nonzeros=3042, allocated nonzeros=3042 > > total number of mallocs used during MatSetValues calls =0 > > MUMPS run parameters: > > SYM (matrix type): 2 > > PAR (host participation): 1 > > ICNTL(1) (output for error): 6 > > ICNTL(2) (output of diagnostic msg): 0 > > ICNTL(3) (output for global info): 0 > > ICNTL(4) (level of printing): 0 > > ICNTL(5) (input mat struct): 0 > > ICNTL(6) (matrix prescaling): 7 > > ICNTL(7) (sequentia matrix ordering):7 > > ICNTL(8) (scalling strategy): 77 > > ICNTL(10) (max num of refinements): 0 > > ICNTL(11) (error analysis): 0 > > ICNTL(12) (efficiency control): 0 > > ICNTL(13) (efficiency control): 0 > > ICNTL(14) (percentage of estimated workspace increase): 20 > > ICNTL(18) (input mat struct): 0 > > ICNTL(19) (Shur complement info): 0 > > ICNTL(20) (rhs sparse pattern): 0 > > ICNTL(21) (solution struct): 0 > > ICNTL(22) (in-core/out-of-core facility): 0 > > ICNTL(23) (max size of memory can be allocated 
locally):0 > > ICNTL(24) (detection of null pivot rows): 0 > > ICNTL(25) (computation of a null space basis): 0 > > ICNTL(26) (Schur options for rhs or solution): 0 > > ICNTL(27) (experimental parameter): -24 > > ICNTL(28) (use parallel or sequential ordering): 1 > > ICNTL(29) (parallel ordering): 0 > > ICNTL(30) (user-specified set of entries in inv(A)): 0 > > ICNTL(31) (factors is discarded in the solve phase): 0 > > ICNTL(33) (compute determinant): 0 > > CNTL(1) (relative pivoting threshold): 0.01 > > CNTL(2) (stopping criterion of refinement): 1.49012e-08 > > CNTL(3) (absolute pivoting threshold): 0. > > CNTL(4) (value of static pivoting): -1. > > CNTL(5) (fixation for null pivots): 0. > > RINFO(1) (local estimated flops for the elimination after analysis): > > [0] 29394. > > RINFO(2) (local estimated flops for the assembly after factorization): > > [0] 1092. > > RINFO(3) (local estimated flops for the elimination after factorization): > > [0] 29394. > > INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): > > [0] 1 > > INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): > > [0] 1 > > INFO(23) (num of pivots eliminated on this processor after factorization): > > [0] 324 > > RINFOG(1) (global estimated flops for the elimination after analysis): 29394. > > RINFOG(2) (global estimated flops for the assembly after factorization): 1092. > > RINFOG(3) (global estimated flops for the elimination after factorization): 29394. 
> > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) > > INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 > > INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 > > INFOG(5) (estimated maximum front size in the complete tree): 12 > > INFOG(6) (number of nodes in the complete tree): 53 > > INFOG(7) (ordering option effectively use after analysis): 2 > > INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 > > INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 > > INFOG(10) (total integer space store the matrix factors after factorization): 2067 > > INFOG(11) (order of largest frontal matrix after factorization): 12 > > INFOG(12) (number of off-diagonal pivots): 0 > > INFOG(13) (number of delayed pivots after factorization): 0 > > INFOG(14) (number of memory compress after factorization): 0 > > INFOG(15) (number of steps of iterative refinement after solution): 0 > > INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 > > INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 > > INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 > > INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 > > INFOG(20) (estimated number of entries in the factors): 3042 > > INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 > > INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 > > INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 > > INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 > > INFOG(25) (after 
factorization: number of pivots modified by static pivoting): 0 > > INFOG(28) (after factorization: number of null pivots encountered): 0 > > INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 > > INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 > > INFOG(32) (after analysis: type of analysis done): 1 > > INFOG(33) (value used for ICNTL(8)): -2 > > INFOG(34) (exponent of the determinant if determinant is requested): 0 > > linear system matrix = precond matrix: > > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > > type: seqaij > > rows=324, cols=324 > > total: nonzeros=5760, allocated nonzeros=5760 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 108 nodes, limit used is 5 > > KSP solver for S = A11 - A10 inv(A00) A01 > > KSP Object: (fieldsplit_FE_split_) 1 MPI processes > > type: cg > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > left preconditioning > > using PRECONDITIONED norm type for convergence test > > PC Object: (fieldsplit_FE_split_) 1 MPI processes > > type: bjacobi > > block Jacobi: number of blocks = 1 > > Local solve is same for all blocks, in the following KSP and PC objects: > > KSP Object: (fieldsplit_FE_split_sub_) 1 MPI processes > > type: preonly > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > left preconditioning > > using NONE norm type for convergence test > > PC Object: (fieldsplit_FE_split_sub_) 1 MPI processes > > type: ilu > > ILU: out-of-place factorization > > 0 levels of fill > > tolerance for zero pivot 2.22045e-14 > > matrix ordering: natural > > factor fill ratio given 1., needed 1. 
> > Factored matrix follows: > > Mat Object: 1 MPI processes > > type: seqaij > > rows=28476, cols=28476 > > package used to perform factorization: petsc > > total: nonzeros=1017054, allocated nonzeros=1017054 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 9492 nodes, limit used is 5 > > linear system matrix = precond matrix: > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > type: seqaij > > rows=28476, cols=28476 > > total: nonzeros=1017054, allocated nonzeros=1017054 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 9492 nodes, limit used is 5 > > linear system matrix followed by preconditioner matrix: > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > type: schurcomplement > > rows=28476, cols=28476 > > Schur complement A11 - A10 inv(A00) A01 > > A11 > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > type: seqaij > > rows=28476, cols=28476 > > total: nonzeros=1017054, allocated nonzeros=1017054 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 9492 nodes, limit used is 5 > > A10 > > Mat Object: 1 MPI processes > > type: seqaij > > rows=28476, cols=324 > > total: nonzeros=936, allocated nonzeros=936 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 5717 nodes, limit used is 5 > > KSP of A00 > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > type: preonly > > maximum iterations=10000, initial guess is zero > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > left preconditioning > > using NONE norm type for convergence test > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > type: cholesky > > Cholesky: out-of-place factorization > > tolerance for zero pivot 2.22045e-14 > > matrix ordering: natural > > factor fill ratio given 0., needed 0. 
> > Factored matrix follows: > > Mat Object: 1 MPI processes > > type: seqaij > > rows=324, cols=324 > > package used to perform factorization: mumps > > total: nonzeros=3042, allocated nonzeros=3042 > > total number of mallocs used during MatSetValues calls =0 > > MUMPS run parameters: > > SYM (matrix type): 2 > > PAR (host participation): 1 > > ICNTL(1) (output for error): 6 > > ICNTL(2) (output of diagnostic msg): 0 > > ICNTL(3) (output for global info): 0 > > ICNTL(4) (level of printing): 0 > > ICNTL(5) (input mat struct): 0 > > ICNTL(6) (matrix prescaling): 7 > > ICNTL(7) (sequentia matrix ordering):7 > > ICNTL(8) (scalling strategy): 77 > > ICNTL(10) (max num of refinements): 0 > > ICNTL(11) (error analysis): 0 > > ICNTL(12) (efficiency control): 0 > > ICNTL(13) (efficiency control): 0 > > ICNTL(14) (percentage of estimated workspace increase): 20 > > ICNTL(18) (input mat struct): 0 > > ICNTL(19) (Shur complement info): 0 > > ICNTL(20) (rhs sparse pattern): 0 > > ICNTL(21) (solution struct): 0 > > ICNTL(22) (in-core/out-of-core facility): 0 > > ICNTL(23) (max size of memory can be allocated locally):0 > > ICNTL(24) (detection of null pivot rows): 0 > > ICNTL(25) (computation of a null space basis): 0 > > ICNTL(26) (Schur options for rhs or solution): 0 > > ICNTL(27) (experimental parameter): -24 > > ICNTL(28) (use parallel or sequential ordering): 1 > > ICNTL(29) (parallel ordering): 0 > > ICNTL(30) (user-specified set of entries in inv(A)): 0 > > ICNTL(31) (factors is discarded in the solve phase): 0 > > ICNTL(33) (compute determinant): 0 > > CNTL(1) (relative pivoting threshold): 0.01 > > CNTL(2) (stopping criterion of refinement): 1.49012e-08 > > CNTL(3) (absolute pivoting threshold): 0. > > CNTL(4) (value of static pivoting): -1. > > CNTL(5) (fixation for null pivots): 0. > > RINFO(1) (local estimated flops for the elimination after analysis): > > [0] 29394. > > RINFO(2) (local estimated flops for the assembly after factorization): > > [0] 1092. 
> > RINFO(3) (local estimated flops for the elimination after factorization): > > [0] 29394. > > INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): > > [0] 1 > > INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): > > [0] 1 > > INFO(23) (num of pivots eliminated on this processor after factorization): > > [0] 324 > > RINFOG(1) (global estimated flops for the elimination after analysis): 29394. > > RINFOG(2) (global estimated flops for the assembly after factorization): 1092. > > RINFOG(3) (global estimated flops for the elimination after factorization): 29394. > > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) > > INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 > > INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 > > INFOG(5) (estimated maximum front size in the complete tree): 12 > > INFOG(6) (number of nodes in the complete tree): 53 > > INFOG(7) (ordering option effectively use after analysis): 2 > > INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 > > INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 > > INFOG(10) (total integer space store the matrix factors after factorization): 2067 > > INFOG(11) (order of largest frontal matrix after factorization): 12 > > INFOG(12) (number of off-diagonal pivots): 0 > > INFOG(13) (number of delayed pivots after factorization): 0 > > INFOG(14) (number of memory compress after factorization): 0 > > INFOG(15) (number of steps of iterative refinement after solution): 0 > > INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 > > INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 > > INFOG(18) (size of all MUMPS internal data 
allocated during factorization: value on the most memory consuming processor): 1 > > INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 > > INFOG(20) (estimated number of entries in the factors): 3042 > > INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 > > INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 > > INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 > > INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 > > INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 > > INFOG(28) (after factorization: number of null pivots encountered): 0 > > INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 > > INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 > > INFOG(32) (after analysis: type of analysis done): 1 > > INFOG(33) (value used for ICNTL(8)): -2 > > INFOG(34) (exponent of the determinant if determinant is requested): 0 > > linear system matrix = precond matrix: > > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > > type: seqaij > > rows=324, cols=324 > > total: nonzeros=5760, allocated nonzeros=5760 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 108 nodes, limit used is 5 > > A01 > > Mat Object: 1 MPI processes > > type: seqaij > > rows=324, cols=28476 > > total: nonzeros=936, allocated nonzeros=936 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 67 nodes, limit used is 5 > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > type: seqaij > > rows=28476, cols=28476 > > total: nonzeros=1017054, allocated nonzeros=1017054 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 9492 nodes, limit used 
is 5 > > linear system matrix = precond matrix: > > Mat Object: () 1 MPI processes > > type: seqaij > > rows=28800, cols=28800 > > total: nonzeros=1024686, allocated nonzeros=1024794 > > total number of mallocs used during MatSetValues calls =0 > > using I-node routines: found 9600 nodes, limit used is 5 > > > > > > ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- > > > > /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 16:16:47 2017 > > Using Petsc Release Version 3.7.3, unknown > > > > Max Max/Min Avg Total > > Time (sec): 9.179e+01 1.00000 9.179e+01 > > Objects: 1.990e+02 1.00000 1.990e+02 > > Flops: 1.634e+11 1.00000 1.634e+11 1.634e+11 > > Flops/sec: 1.780e+09 1.00000 1.780e+09 1.780e+09 > > MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > MPI Reductions: 0.000e+00 0.00000 > > > > Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) > > e.g., VecAXPY() for real vectors of length N --> 2N flops > > and VecAXPY() for complex vectors of length N --> 8N flops > > > > Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- > > Avg %Total Avg %Total counts %Total Avg %Total counts %Total > > 0: Main Stage: 9.1787e+01 100.0% 1.6336e+11 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% > > > > ------------------------------------------------------------------------------------------------------------------------ > > See the 'Profiling' chapter of the users' manual for details on interpreting output. > > Phase summary info: > > Count: number of times phase was executed > > Time and Flops: Max - maximum over all processors > > Ratio - ratio of maximum to minimum over all processors > > Mess: number of messages sent > > Avg. 
len: average message length (bytes) > > Reduct: number of global reductions > > Global: entire computation > > Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). > > %T - percent time in this phase %F - percent flops in this phase > > %M - percent messages in this phase %L - percent message lengths in this phase > > %R - percent reductions in this phase > > Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) > > ------------------------------------------------------------------------------------------------------------------------ > > Event Count Time (sec) Flops --- Global --- --- Stage --- Total > > Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s > > ------------------------------------------------------------------------------------------------------------------------ > > > > --- Event Stage 0: Main Stage > > > > VecDot 42 1.0 2.4080e-05 1.0 8.53e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 354 > > VecTDot 74012 1.0 1.2440e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3388 > > VecNorm 37020 1.0 8.3580e-01 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2523 > > VecScale 37008 1.0 3.5800e-01 1.0 1.05e+09 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2944 > > VecCopy 37034 1.0 2.5754e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > VecSet 74137 1.0 3.0537e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > VecAXPY 74029 1.0 1.7233e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2446 > > VecAYPX 37001 1.0 1.2214e+00 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 1725 > > VecAssemblyBegin 68 1.0 2.0432e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > VecAssemblyEnd 68 1.0 2.5988e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > VecScatterBegin 48 1.0 4.6921e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 
0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatMult 37017 1.0 4.1269e+01 1.0 7.65e+10 1.0 0.0e+00 0.0e+00 0.0e+00 45 47 0 0 0 45 47 0 0 0 1853 > > MatMultAdd 37015 1.0 3.3638e+01 1.0 7.53e+10 1.0 0.0e+00 0.0e+00 0.0e+00 37 46 0 0 0 37 46 0 0 0 2238 > > MatSolve 74021 1.0 4.6602e+01 1.0 7.42e+10 1.0 0.0e+00 0.0e+00 0.0e+00 51 45 0 0 0 51 45 0 0 0 1593 > > MatLUFactorNum 1 1.0 1.7209e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1420 > > MatCholFctrSym 1 1.0 8.8310e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatCholFctrNum 1 1.0 3.6907e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatILUFactorSym 1 1.0 3.7372e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatAssemblyBegin 29 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatAssemblyEnd 29 1.0 9.9473e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatGetRow 58026 1.0 2.8155e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatGetRowIJ 2 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatGetSubMatrice 6 1.0 1.5399e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatGetOrdering 2 1.0 3.0112e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatZeroEntries 6 1.0 2.9490e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > MatView 7 1.0 3.4356e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > KSPSetUp 4 1.0 9.4891e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > KSPSolve 1 1.0 8.8793e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > > PCSetUp 4 1.0 3.8375e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 637 > > PCSetUpOnBlocks 5 1.0 2.1250e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1150 > > PCApply 5 1.0 8.8789e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 
0 97100 0 0 0 1840 > > KSPSolve_FS_0 5 1.0 7.5364e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > KSPSolve_FS_Schu 5 1.0 8.8785e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > > KSPSolve_FS_Low 5 1.0 2.1019e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > ------------------------------------------------------------------------------------------------------------------------ > > > > Memory usage is given in bytes: > > > > Object Type Creations Destructions Memory Descendants' Mem. > > Reports information only for process 0. > > > > --- Event Stage 0: Main Stage > > > > Vector 91 91 9693912 0. > > Vector Scatter 24 24 15936 0. > > Index Set 51 51 537888 0. > > IS L to G Mapping 3 3 240408 0. > > Matrix 13 13 64097868 0. > > Krylov Solver 6 6 7888 0. > > Preconditioner 6 6 6288 0. > > Viewer 1 0 0 0. > > Distributed Mesh 1 1 4624 0. > > Star Forest Bipartite Graph 2 2 1616 0. > > Discrete System 1 1 872 0. > > ======================================================================================================================== > > Average time to get PetscTime(): 0. 
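[Editorial note on the flop counts above: the summary's convention — 1 flop per real multiply/divide/add/subtract, so VecAXPY on a real vector of length N counts 2N flops and on a complex vector 8N — can be sketched as a small helper. This is an illustration of the counting rule only, not PETSc code; the function name is made up.]

```python
def vec_axpy_flops(n, complex_scalars=False):
    """Flops counted for y <- alpha*x + y under the convention in the log.

    Real scalars: one multiply + one add per entry -> 2N flops.
    Complex scalars: a complex multiply costs 6 real flops and a complex
    add costs 2, so 8N flops per call, matching the 8N figure quoted.
    """
    return (8 if complex_scalars else 2) * n


# Rough cross-check against the VecAXPY event above: 74029 calls on
# vectors of length up to 28800 give at most 74029 * 2 * 28800 ~ 4.26e9,
# consistent with the 4.22e9 flops reported (not all calls use the full size).
print(vec_axpy_flops(28800))  # -> 57600
```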
> > #PETSc Option Table entries: > > -ksp_monitor > > -ksp_view > > -log_view > > #End of PETSc Option Table entries > > Compiled without FORTRAN kernels > > Compiled with full precision matrices (default) > > sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 > > Configure options: --with-shared-libraries=1 --with-debugging=0 --download-suitesparse --download-blacs --download-ptscotch=yes --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc --download-hypre --download-ml > > ----------------------------------------- > > Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo > > Machine characteristics: Linux-4.4.0-38-generic-x86_64-with-Ubuntu-16.04-xenial > > Using PETSc directory: /home/dknez/software/petsc-src > > Using PETSc arch: arch-linux2-c-opt > > ----------------------------------------- > > > > Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O ${COPTFLAGS} ${CFLAGS} > > Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} > > ----------------------------------------- > > > > Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/libmesh_install/opt_real/petsc/include -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi > > ----------------------------------------- > > > > Using C linker: mpicc > > Using Fortran linker: mpif90 > > Using libraries: 
-Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl > > ----------------------------------------- > > > > > > > > > > > > > > > > > From david.knezevic at akselos.com Wed Jan 11 20:49:10 2017 From: david.knezevic at akselos.com (David Knezevic) Date: Wed, 11 Jan 2017 21:49:10 -0500 Subject: [petsc-users] Using PCFIELDSPLIT with -pc_fieldsplit_type schur In-Reply-To: <8DBB9010-9A29-4028-A9C1-98DB695602C1@mcs.anl.gov> References: <9641EA68-FC6B-48DC-8A5E-A8B75E54EA64@mcs.anl.gov> <8DBB9010-9A29-4028-A9C1-98DB695602C1@mcs.anl.gov> Message-ID: OK, that's 
encouraging. However, regarding this: So the next step is to try using -fieldsplit_FE_split_ksp_monitor > -fieldsplit_FE_split_pc_type gamg I tried this and it didn't converge at all (it hit the 10000 iteration max in the output from -fieldsplit_FE_split_ksp_monitor). So I guess I'd need to attach the near nullspace to make this work reasonably, as you said. Sounds like that may not be easy to do in this case though? I'll try some other preconditioners in the meantime. Thanks, David On Wed, Jan 11, 2017 at 9:31 PM, Barry Smith wrote: > > Thanks, this is very useful information. It means that > > 1) the approximate Sp is actually a very good approximation to the true > Schur complement S, since using Sp^-1 to precondition S gives iteration > counts from 8 to 13. > > 2) using ilu(0) as a preconditioner for Sp is not good, since replacing > Sp^-1 with ilu(0) of Sp gives absurd iteration counts. This is actually not > super surprising since ilu(0) is generally "not so good" for elasticity. > > So the next step is to try using -fieldsplit_FE_split_ksp_monitor > -fieldsplit_FE_split_pc_type gamg > > the one open question is if any options should be passed to the gamg to > tell it that the underlying problem comes from "elasticity"; that is something > about the null space. > > Mark Adams, since the GAMG is coming from inside another preconditioner > it may not be easy for the user to attach the near null space > to that inner matrix. Would it make sense for there to be a GAMG command > line option to indicate that it is a 3d elasticity problem so GAMG could > set up the near null space for itself? or does that not make sense? > > Barry > > > > > On Jan 11, 2017, at 7:47 PM, David Knezevic > wrote: > > > > I've attached the two log files. Using cholesky for "FE_split" seems to > have helped a lot! > > > > David > > > > > > -- > > David J.
Knezevic | CTO > > Akselos | 210 Broadway, #201 | Cambridge, MA | 02139 > > > > Phone: +1-617-599-4755 > > > > This e-mail and any attachments may contain confidential material for > the sole use of the intended recipient(s). Any review or distribution by > others is strictly prohibited. If you are not the intended recipient, > please contact the sender and delete all copies. > > > > On Wed, Jan 11, 2017 at 8:32 PM, Barry Smith wrote: > > > > Can you please run with all the monitoring on? So we can see the > convergence of all the inner solvers > > -fieldsplit_FE_split_ksp_monitor > > > > Then run again with > > > > -fieldsplit_FE_split_ksp_monitor -fieldsplit_FE_split_pc_type cholesky > > > > > > and send both sets of results > > > > Barry > > > > > > > On Jan 11, 2017, at 6:32 PM, David Knezevic < > david.knezevic at akselos.com> wrote: > > > > > > On Wed, Jan 11, 2017 at 5:52 PM, Dave May > wrote: > > > so I gather that I'll have to look into a user-defined approximation > to S. > > > > > > Where does the 2x2 block system come from? > > > Maybe someone on the list knows the right approximation to use for S. > > > > > > The model is 3D linear elasticity using a finite element > discretization. I applied substructuring to part of the system to > "condense" it, and that results in the small A00 block. The A11 block is > just standard 3D elasticity; no substructuring was applied there. There are > constraints to connect the degrees of freedom on the interface of the > substructured and non-substructured regions. > > > > > > If anyone has suggestions for a good way to precondition this type of > system, I'd be most appreciative! 
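[Editorial sketch of the quantities being discussed: the true Schur complement S = A11 - A10 inv(A00) A01 versus the assembled approximation Sp that replaces inv(A00) by the inverse of A00's diagonal, as described in the PC output ("an assembled approximation to S, which uses A00's diagonal's inverse"). All matrix sizes and values below are illustrative toys, not the actual 324/28476 system from the thread.]

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy symmetric 2x2 block system: a small A00 block and a larger A11 block.
n0, n1 = 4, 10
A00 = rng.standard_normal((n0, n0))
A00 = A00 @ A00.T + n0 * np.eye(n0)   # SPD with a dominant diagonal
A01 = rng.standard_normal((n0, n1))
A10 = A01.T                           # symmetric overall system
A11 = rng.standard_normal((n1, n1))
A11 = A11 @ A11.T + n1 * np.eye(n1)

# True Schur complement: S = A11 - A10 inv(A00) A01
S = A11 - A10 @ np.linalg.solve(A00, A01)

# Sp: the same formula with inv(A00) replaced by inv(diag(A00)).
Sp = A11 - A10 @ (A01 / np.diag(A00)[:, None])

# If Sp is a good preconditioner for S, the eigenvalues of Sp^{-1} S
# cluster near 1 -- the behavior behind the low (8-13) inner iteration
# counts Barry points out in the thread.
evals = np.linalg.eigvals(np.linalg.solve(Sp, S)).real
print(f"eigenvalue range of Sp^-1 S: [{evals.min():.3f}, {evals.max():.3f}]")
```

When A00 is exactly diagonal the two formulas coincide, so how well Sp tracks S is governed by how far A00 is from its diagonal.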
> > > > > > Thanks, > > > David > > > > > > > > > > > > ----------------------------------------- > > > > > > 0 KSP Residual norm 5.405528187695e+04 > > > 1 KSP Residual norm 2.187814910803e+02 > > > 2 KSP Residual norm 1.019051577515e-01 > > > 3 KSP Residual norm 4.370464012859e-04 > > > KSP Object: 1 MPI processes > > > type: cg > > > maximum iterations=1000 > > > tolerances: relative=1e-06, absolute=1e-50, divergence=10000. > > > left preconditioning > > > using nonzero initial guess > > > using PRECONDITIONED norm type for convergence test > > > PC Object: 1 MPI processes > > > type: fieldsplit > > > FieldSplit with Schur preconditioner, factorization FULL > > > Preconditioner for the Schur complement formed from Sp, an > assembled approximation to S, which uses (lumped, if requested) A00's > diagonal's inverse > > > Split info: > > > Split number 0 Defined by IS > > > Split number 1 Defined by IS > > > KSP solver for A00 block > > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: cholesky > > > Cholesky: out-of-place factorization > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 0., needed 0. 
> > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > package used to perform factorization: mumps > > > total: nonzeros=3042, allocated nonzeros=3042 > > > total number of mallocs used during MatSetValues calls > =0 > > > MUMPS run parameters: > > > SYM (matrix type): 2 > > > PAR (host participation): 1 > > > ICNTL(1) (output for error): 6 > > > ICNTL(2) (output of diagnostic msg): 0 > > > ICNTL(3) (output for global info): 0 > > > ICNTL(4) (level of printing): 0 > > > ICNTL(5) (input mat struct): 0 > > > ICNTL(6) (matrix prescaling): 7 > > > ICNTL(7) (sequentia matrix ordering):7 > > > ICNTL(8) (scalling strategy): 77 > > > ICNTL(10) (max num of refinements): 0 > > > ICNTL(11) (error analysis): 0 > > > ICNTL(12) (efficiency control): > 0 > > > ICNTL(13) (efficiency control): > 0 > > > ICNTL(14) (percentage of estimated workspace > increase): 20 > > > ICNTL(18) (input mat struct): > 0 > > > ICNTL(19) (Shur complement info): > 0 > > > ICNTL(20) (rhs sparse pattern): > 0 > > > ICNTL(21) (solution struct): > 0 > > > ICNTL(22) (in-core/out-of-core facility): > 0 > > > ICNTL(23) (max size of memory can be allocated > locally):0 > > > ICNTL(24) (detection of null pivot rows): > 0 > > > ICNTL(25) (computation of a null space basis): > 0 > > > ICNTL(26) (Schur options for rhs or solution): > 0 > > > ICNTL(27) (experimental parameter): > -24 > > > ICNTL(28) (use parallel or sequential ordering): > 1 > > > ICNTL(29) (parallel ordering): > 0 > > > ICNTL(30) (user-specified set of entries in > inv(A)): 0 > > > ICNTL(31) (factors is discarded in the solve > phase): 0 > > > ICNTL(33) (compute determinant): > 0 > > > CNTL(1) (relative pivoting threshold): 0.01 > > > CNTL(2) (stopping criterion of refinement): > 1.49012e-08 > > > CNTL(3) (absolute pivoting threshold): 0. > > > CNTL(4) (value of static pivoting): -1. > > > CNTL(5) (fixation for null pivots): 0. 
> > > RINFO(1) (local estimated flops for the > elimination after analysis): > > > [0] 29394. > > > RINFO(2) (local estimated flops for the assembly > after factorization): > > > [0] 1092. > > > RINFO(3) (local estimated flops for the > elimination after factorization): > > > [0] 29394. > > > INFO(15) (estimated size of (in MB) MUMPS internal > data for running numerical factorization): > > > [0] 1 > > > INFO(16) (size of (in MB) MUMPS internal data used > during numerical factorization): > > > [0] 1 > > > INFO(23) (num of pivots eliminated on this > processor after factorization): > > > [0] 324 > > > RINFOG(1) (global estimated flops for the > elimination after analysis): 29394. > > > RINFOG(2) (global estimated flops for the assembly > after factorization): 1092. > > > RINFOG(3) (global estimated flops for the > elimination after factorization): 29394. > > > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): > (0.,0.)*(2^0) > > > INFOG(3) (estimated real workspace for factors on > all processors after analysis): 3888 > > > INFOG(4) (estimated integer workspace for factors > on all processors after analysis): 2067 > > > INFOG(5) (estimated maximum front size in the > complete tree): 12 > > > INFOG(6) (number of nodes in the complete tree): 53 > > > INFOG(7) (ordering option effectively use after > analysis): 2 > > > INFOG(8) (structural symmetry in percent of the > permuted matrix after analysis): 100 > > > INFOG(9) (total real/complex workspace to store > the matrix factors after factorization): 3888 > > > INFOG(10) (total integer space store the matrix > factors after factorization): 2067 > > > INFOG(11) (order of largest frontal matrix after > factorization): 12 > > > INFOG(12) (number of off-diagonal pivots): 0 > > > INFOG(13) (number of delayed pivots after > factorization): 0 > > > INFOG(14) (number of memory compress after > factorization): 0 > > > INFOG(15) (number of steps of iterative refinement > after solution): 0 > > > INFOG(16) (estimated size (in 
MB) of all MUMPS > internal data for factorization after analysis: value on the most memory > consuming processor): 1 > > > INFOG(17) (estimated size of all MUMPS internal > data for factorization after analysis: sum over all processors): 1 > > > INFOG(18) (size of all MUMPS internal data > allocated during factorization: value on the most memory consuming > processor): 1 > > > INFOG(19) (size of all MUMPS internal data > allocated during factorization: sum over all processors): 1 > > > INFOG(20) (estimated number of entries in the > factors): 3042 > > > INFOG(21) (size in MB of memory effectively used > during factorization - value on the most memory consuming processor): 1 > > > INFOG(22) (size in MB of memory effectively used > during factorization - sum over all processors): 1 > > > INFOG(23) (after analysis: value of ICNTL(6) > effectively used): 5 > > > INFOG(24) (after analysis: value of ICNTL(12) > effectively used): 1 > > > INFOG(25) (after factorization: number of pivots > modified by static pivoting): 0 > > > INFOG(28) (after factorization: number of null > pivots encountered): 0 > > > INFOG(29) (after factorization: effective number > of entries in the factors (sum over all processors)): 3042 > > > INFOG(30, 31) (after solution: size in Mbytes of > memory used during solution phase): 0, 0 > > > INFOG(32) (after analysis: type of analysis done): > 1 > > > INFOG(33) (value used for ICNTL(8)): -2 > > > INFOG(34) (exponent of the determinant if > determinant is requested): 0 > > > linear system matrix = precond matrix: > > > Mat Object: (fieldsplit_RB_split_) 1 MPI > processes > > > type: seqaij > > > rows=324, cols=324 > > > total: nonzeros=5760, allocated nonzeros=5760 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 108 nodes, limit used is 5 > > > KSP solver for S = A11 - A10 inv(A00) A01 > > > KSP Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: cg > > > maximum iterations=10000, initial guess 
is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > left preconditioning > > > using PRECONDITIONED norm type for convergence test > > > PC Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: bjacobi > > > block Jacobi: number of blocks = 1 > > > Local solve is same for all blocks, in the following KSP and > PC objects: > > > KSP Object: (fieldsplit_FE_split_sub_) 1 > MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000. > > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_FE_split_sub_) 1 > MPI processes > > > type: ilu > > > ILU: out-of-place factorization > > > 0 levels of fill > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 1., needed 1. > > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > package used to perform factorization: petsc > > > total: nonzeros=1037052, allocated nonzeros=1037052 > > > total number of mallocs used during MatSetValues > calls =0 > > > using I-node routines: found 9489 nodes, limit > used is 5 > > > linear system matrix = precond matrix: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1037052, allocated nonzeros=1037052 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9489 nodes, limit used is > 5 > > > linear system matrix followed by preconditioner matrix: > > > Mat Object: (fieldsplit_FE_split_) 1 MPI > processes > > > type: schurcomplement > > > rows=28476, cols=28476 > > > Schur complement A11 - A10 inv(A00) A01 > > > A11 > > > Mat Object: (fieldsplit_FE_split_) > 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > total number of mallocs used 
during MatSetValues calls > =0 > > > using I-node routines: found 9492 nodes, limit used > is 5 > > > A10 > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=324 > > > total: nonzeros=936, allocated nonzeros=936 > > > total number of mallocs used during MatSetValues calls > =0 > > > using I-node routines: found 5717 nodes, limit used > is 5 > > > KSP of A00 > > > KSP Object: (fieldsplit_RB_split_) > 1 MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000. > > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_RB_split_) > 1 MPI processes > > > type: cholesky > > > Cholesky: out-of-place factorization > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 0., needed 0. > > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > package used to perform factorization: mumps > > > total: nonzeros=3042, allocated nonzeros=3042 > > > total number of mallocs used during > MatSetValues calls =0 > > > MUMPS run parameters: > > > SYM (matrix type): 2 > > > PAR (host participation): 1 > > > ICNTL(1) (output for error): 6 > > > ICNTL(2) (output of diagnostic msg): 0 > > > ICNTL(3) (output for global info): 0 > > > ICNTL(4) (level of printing): 0 > > > ICNTL(5) (input mat struct): 0 > > > ICNTL(6) (matrix prescaling): 7 > > > ICNTL(7) (sequentia matrix ordering):7 > > > ICNTL(8) (scalling strategy): 77 > > > ICNTL(10) (max num of refinements): 0 > > > ICNTL(11) (error analysis): 0 > > > ICNTL(12) (efficiency control): > 0 > > > ICNTL(13) (efficiency control): > 0 > > > ICNTL(14) (percentage of estimated > workspace increase): 20 > > > ICNTL(18) (input mat struct): > 0 > > > ICNTL(19) (Shur complement info): > 0 > > > ICNTL(20) (rhs sparse pattern): > 0 > > > ICNTL(21) (solution struct): > 0 > > > 
ICNTL(22) (in-core/out-of-core facility): > 0 > > > ICNTL(23) (max size of memory can be > allocated locally):0 > > > ICNTL(24) (detection of null pivot rows): > 0 > > > ICNTL(25) (computation of a null space > basis): 0 > > > ICNTL(26) (Schur options for rhs or > solution): 0 > > > ICNTL(27) (experimental parameter): > -24 > > > ICNTL(28) (use parallel or sequential > ordering): 1 > > > ICNTL(29) (parallel ordering): > 0 > > > ICNTL(30) (user-specified set of entries > in inv(A)): 0 > > > ICNTL(31) (factors is discarded in the > solve phase): 0 > > > ICNTL(33) (compute determinant): > 0 > > > CNTL(1) (relative pivoting threshold): > 0.01 > > > CNTL(2) (stopping criterion of > refinement): 1.49012e-08 > > > CNTL(3) (absolute pivoting threshold): > 0. > > > CNTL(4) (value of static pivoting): > -1. > > > CNTL(5) (fixation for null pivots): > 0. > > > RINFO(1) (local estimated flops for the > elimination after analysis): > > > [0] 29394. > > > RINFO(2) (local estimated flops for the > assembly after factorization): > > > [0] 1092. > > > RINFO(3) (local estimated flops for the > elimination after factorization): > > > [0] 29394. > > > INFO(15) (estimated size of (in MB) MUMPS > internal data for running numerical factorization): > > > [0] 1 > > > INFO(16) (size of (in MB) MUMPS internal > data used during numerical factorization): > > > [0] 1 > > > INFO(23) (num of pivots eliminated on this > processor after factorization): > > > [0] 324 > > > RINFOG(1) (global estimated flops for the > elimination after analysis): 29394. > > > RINFOG(2) (global estimated flops for the > assembly after factorization): 1092. > > > RINFOG(3) (global estimated flops for the > elimination after factorization): 29394. 
> > > (RINFOG(12) RINFOG(13))*2^INFOG(34) > (determinant): (0.,0.)*(2^0) > > > INFOG(3) (estimated real workspace for > factors on all processors after analysis): 3888 > > > INFOG(4) (estimated integer workspace for > factors on all processors after analysis): 2067 > > > INFOG(5) (estimated maximum front size in > the complete tree): 12 > > > INFOG(6) (number of nodes in the complete > tree): 53 > > > INFOG(7) (ordering option effectively use > after analysis): 2 > > > INFOG(8) (structural symmetry in percent > of the permuted matrix after analysis): 100 > > > INFOG(9) (total real/complex workspace to > store the matrix factors after factorization): 3888 > > > INFOG(10) (total integer space store the > matrix factors after factorization): 2067 > > > INFOG(11) (order of largest frontal matrix > after factorization): 12 > > > INFOG(12) (number of off-diagonal pivots): > 0 > > > INFOG(13) (number of delayed pivots after > factorization): 0 > > > INFOG(14) (number of memory compress after > factorization): 0 > > > INFOG(15) (number of steps of iterative > refinement after solution): 0 > > > INFOG(16) (estimated size (in MB) of all > MUMPS internal data for factorization after analysis: value on the most > memory consuming processor): 1 > > > INFOG(17) (estimated size of all MUMPS > internal data for factorization after analysis: sum over all processors): 1 > > > INFOG(18) (size of all MUMPS internal data > allocated during factorization: value on the most memory consuming > processor): 1 > > > INFOG(19) (size of all MUMPS internal data > allocated during factorization: sum over all processors): 1 > > > INFOG(20) (estimated number of entries in > the factors): 3042 > > > INFOG(21) (size in MB of memory > effectively used during factorization - value on the most memory consuming > processor): 1 > > > INFOG(22) (size in MB of memory > effectively used during factorization - sum over all processors): 1 > > > INFOG(23) (after analysis: value of > ICNTL(6) effectively used): 
5 > > > INFOG(24) (after analysis: value of > ICNTL(12) effectively used): 1 > > > INFOG(25) (after factorization: number of > pivots modified by static pivoting): 0 > > > INFOG(28) (after factorization: number of > null pivots encountered): 0 > > > INFOG(29) (after factorization: effective > number of entries in the factors (sum over all processors)): 3042 > > > INFOG(30, 31) (after solution: size in > Mbytes of memory used during solution phase): 0, 0 > > > INFOG(32) (after analysis: type of > analysis done): 1 > > > INFOG(33) (value used for ICNTL(8)): -2 > > > INFOG(34) (exponent of the determinant if > determinant is requested): 0 > > > linear system matrix = precond matrix: > > > Mat Object: (fieldsplit_RB_split_) > 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > total: nonzeros=5760, allocated nonzeros=5760 > > > total number of mallocs used during MatSetValues > calls =0 > > > using I-node routines: found 108 nodes, limit used > is 5 > > > A01 > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=28476 > > > total: nonzeros=936, allocated nonzeros=936 > > > total number of mallocs used during MatSetValues calls > =0 > > > using I-node routines: found 67 nodes, limit used is > 5 > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1037052, allocated nonzeros=1037052 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9489 nodes, limit used is 5 > > > linear system matrix = precond matrix: > > > Mat Object: () 1 MPI processes > > > type: seqaij > > > rows=28800, cols=28800 > > > total: nonzeros=1024686, allocated nonzeros=1024794 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9600 nodes, limit used is 5 > > > > > > ---------------------------------------------- PETSc Performance > Summary: ---------------------------------------------- > > > > > > 
/home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a > arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 > 17:22:10 2017 > > > Using Petsc Release Version 3.7.3, unknown > > > > > > Max Max/Min Avg Total > > > Time (sec): 9.638e+01 1.00000 9.638e+01 > > > Objects: 2.030e+02 1.00000 2.030e+02 > > > Flops: 1.732e+11 1.00000 1.732e+11 1.732e+11 > > > Flops/sec: 1.797e+09 1.00000 1.797e+09 1.797e+09 > > > MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > MPI Reductions: 0.000e+00 0.00000 > > > > > > Flop counting convention: 1 flop = 1 real number operation of type > (multiply/divide/add/subtract) > > > e.g., VecAXPY() for real vectors of length > N --> 2N flops > > > and VecAXPY() for complex vectors of > length N --> 8N flops > > > > > > Summary of Stages: ----- Time ------ ----- Flops ----- --- > Messages --- -- Message Lengths -- -- Reductions -- > > > Avg %Total Avg %Total counts > %Total Avg %Total counts %Total > > > 0: Main Stage: 9.6379e+01 100.0% 1.7318e+11 100.0% 0.000e+00 > 0.0% 0.000e+00 0.0% 0.000e+00 0.0% > > > > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > See the 'Profiling' chapter of the users' manual for details on > interpreting output. > > > Phase summary info: > > > Count: number of times phase was executed > > > Time and Flops: Max - maximum over all processors > > > Ratio - ratio of maximum to minimum over all > processors > > > Mess: number of messages sent > > > Avg. len: average message length (bytes) > > > Reduct: number of global reductions > > > Global: entire computation > > > Stage: stages of a computation. Set stages with PetscLogStagePush() > and PetscLogStagePop(). 
> > > %T - percent time in this phase %F - percent flops in > this phase > > > %M - percent messages in this phase %L - percent message > lengths in this phase > > > %R - percent reductions in this phase > > > Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time > over all processors) > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > Event Count Time (sec) Flops > --- Global --- --- Stage --- Total > > > Max Ratio Max Ratio Max Ratio Mess Avg > len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > > > > --- Event Stage 0: Main Stage > > > > > > VecDot 42 1.0 2.2411e-05 1.0 8.53e+03 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 380 > > > VecTDot 77761 1.0 1.4294e+00 1.0 4.43e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3098 > > > VecNorm 38894 1.0 9.1002e-01 1.0 2.22e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2434 > > > VecScale 38882 1.0 3.7314e-01 1.0 1.11e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2967 > > > VecCopy 38908 1.0 2.1655e-02 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecSet 77887 1.0 3.2034e-01 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecAXPY 77777 1.0 1.8382e+00 1.0 4.43e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2409 > > > VecAYPX 38875 1.0 1.2884e+00 1.0 2.21e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 1718 > > > VecAssemblyBegin 68 1.0 1.9407e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecAssemblyEnd 68 1.0 2.6941e-05 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecScatterBegin 48 1.0 4.6349e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatMult 38891 1.0 4.3045e+01 1.0 8.03e+10 1.0 0.0e+00 > 0.0e+00 0.0e+00 45 46 0 0 0 45 
46 0 0 0 1866 > > > MatMultAdd 38889 1.0 3.5360e+01 1.0 7.91e+10 1.0 0.0e+00 > 0.0e+00 0.0e+00 37 46 0 0 0 37 46 0 0 0 2236 > > > MatSolve 77769 1.0 4.8780e+01 1.0 7.95e+10 1.0 0.0e+00 > 0.0e+00 0.0e+00 51 46 0 0 0 51 46 0 0 0 1631 > > > MatLUFactorNum 1 1.0 1.9575e-02 1.0 2.49e+07 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1274 > > > MatCholFctrSym 1 1.0 9.4891e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatCholFctrNum 1 1.0 3.7885e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatILUFactorSym 1 1.0 4.1780e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatConvert 1 1.0 3.0041e-05 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatScale 2 1.0 2.7180e-05 1.0 2.53e+04 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 930 > > > MatAssemblyBegin 32 1.0 4.0531e-06 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatAssemblyEnd 32 1.0 1.2032e-02 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetRow 114978 1.0 5.9254e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetRowIJ 2 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetSubMatrice 6 1.0 1.5707e-02 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetOrdering 2 1.0 3.2425e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatZeroEntries 6 1.0 3.0580e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatView 7 1.0 3.5119e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatAXPY 1 1.0 1.9384e-02 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatMatMult 1 1.0 2.7120e-03 1.0 3.16e+05 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 117 > > > MatMatMultSym 1 1.0 1.8010e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatMatMultNum 1 1.0 
6.1703e-04 1.0 3.16e+05 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 513 > > > KSPSetUp 4 1.0 9.8944e-05 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > KSPSolve 1 1.0 9.3380e+01 1.0 1.73e+11 1.0 0.0e+00 > 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > > PCSetUp 4 1.0 6.6326e-02 1.0 2.53e+07 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 381 > > > PCSetUpOnBlocks 5 1.0 2.4082e-02 1.0 2.49e+07 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1036 > > > PCApply 5 1.0 9.3376e+01 1.0 1.73e+11 1.0 0.0e+00 > 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > > KSPSolve_FS_0 5 1.0 7.0214e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > KSPSolve_FS_Schu 5 1.0 9.3372e+01 1.0 1.73e+11 1.0 0.0e+00 > 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > > KSPSolve_FS_Low 5 1.0 2.1377e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > > > > Memory usage is given in bytes: > > > > > > Object Type Creations Destructions Memory Descendants' > Mem. > > > Reports information only for process 0. > > > > > > --- Event Stage 0: Main Stage > > > > > > Vector 92 92 9698040 0. > > > Vector Scatter 24 24 15936 0. > > > Index Set 51 51 537876 0. > > > IS L to G Mapping 3 3 240408 0. > > > Matrix 16 16 77377776 0. > > > Krylov Solver 6 6 7888 0. > > > Preconditioner 6 6 6288 0. > > > Viewer 1 0 0 0. > > > Distributed Mesh 1 1 4624 0. > > > Star Forest Bipartite Graph 2 2 1616 0. > > > Discrete System 1 1 872 0. > > > ============================================================ > ============================================================ > > > Average time to get PetscTime(): 0. 
> > > #PETSc Option Table entries: > > > -ksp_monitor > > > -ksp_view > > > -log_view > > > #End of PETSc Option Table entries > > > Compiled without FORTRAN kernels > > > Compiled with full precision matrices (default) > > > sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 > sizeof(PetscScalar) 8 sizeof(PetscInt) 4 > > > Configure options: --with-shared-libraries=1 --with-debugging=0 > --download-suitesparse --download-blacs --download-ptscotch=yes > --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl > --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps > --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc > --download-hypre --download-ml > > > ----------------------------------------- > > > Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo > > > Machine characteristics: Linux-4.4.0-38-generic-x86_64- > with-Ubuntu-16.04-xenial > > > Using PETSc directory: /home/dknez/software/petsc-src > > > Using PETSc arch: arch-linux2-c-opt > > > ----------------------------------------- > > > > > > Using C compiler: mpicc -fPIC -Wall -Wwrite-strings > -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O > ${COPTFLAGS} ${CFLAGS} > > > Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 > -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} > > > ----------------------------------------- > > > > > > Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include > -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include > -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include > -I/home/dknez/software/libmesh_install/opt_real/petsc/include > -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent > -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include > -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi > > > ----------------------------------------- > > > > > > Using C 
linker: mpicc > > > Using Fortran linker: mpif90 > > > Using libraries: -Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib > -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc > -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib > -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps > -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE > -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib > -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu > -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx > -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod > -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig > -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 > -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 > -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch > -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 > -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm > -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz > -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib > -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu > -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl > -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl > > > ----------------------------------------- > > > > > > > > > > > > > > > On Wed, Jan 11, 2017 at 4:49 PM, Dave May > wrote: > > > It looks like the Schur solve is requiring a huge number of iterates > to converge (based on the instances of MatMult). > > > This is killing the performance. > > > > > > Are you sure that A11 is a good approximation to S? 
You might consider > trying the selfp option > > > > > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCFieldSplitSetSchurPre.html#PCFieldSplitSetSchurPre > > > > > > Note that the best approximation to S is likely both problem and > discretisation dependent, so if selfp is also terrible, you might want to > consider coding up your own approximation to S for your specific system. > > > > > > > > > Thanks, > > > Dave > > > > > > > > > On Wed, 11 Jan 2017 at 22:34, David Knezevic < > david.knezevic at akselos.com> wrote: > > > I have a definite block 2x2 system and I figured it'd be good to apply > the PCFIELDSPLIT functionality with a Schur complement, as described in > Section 4.5 of the manual. > > > > > > The A00 block of my matrix is very small, so I figured I'd specify a > direct solver (i.e. MUMPS) for that block. > > > > > > So I did the following: > > > - PCFieldSplitSetIS to specify the indices of the two splits > > > - PCFieldSplitGetSubKSP to get the two KSP objects, and to set the > solver and PC types for each (MUMPS for A00, ILU+CG for A11) > > > - I set -pc_fieldsplit_schur_fact_type full > > > > > > Below I have pasted the output of "-ksp_view -ksp_monitor -log_view" > for a test case. It seems to converge well, but I'm concerned about the > speed (about 90 seconds, vs. about 1 second if I use a direct solver for > the entire system). I just wanted to check whether I'm setting this up in a good > way. > > > > > > Many thanks, > > > David > > > > > > ------------------------------------------------------------ > ----------------------- > > > > > > 0 KSP Residual norm 5.405774214400e+04 > > > 1 KSP Residual norm 1.849649014371e+02 > > > 2 KSP Residual norm 7.462775074989e-02 > > > 3 KSP Residual norm 2.680497175260e-04 > > > KSP Object: 1 MPI processes > > > type: cg > > > maximum iterations=1000 > > > tolerances: relative=1e-06, absolute=1e-50, divergence=10000.
> > > left preconditioning > > > using nonzero initial guess > > > using PRECONDITIONED norm type for convergence test > > > PC Object: 1 MPI processes > > > type: fieldsplit > > > FieldSplit with Schur preconditioner, factorization FULL > > > Preconditioner for the Schur complement formed from A11 > > > Split info: > > > Split number 0 Defined by IS > > > Split number 1 Defined by IS > > > KSP solver for A00 block > > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: cholesky > > > Cholesky: out-of-place factorization > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 0., needed 0. > > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > package used to perform factorization: mumps > > > total: nonzeros=3042, allocated nonzeros=3042 > > > total number of mallocs used during MatSetValues calls > =0 > > > MUMPS run parameters: > > > SYM (matrix type): 2 > > > PAR (host participation): 1 > > > ICNTL(1) (output for error): 6 > > > ICNTL(2) (output of diagnostic msg): 0 > > > ICNTL(3) (output for global info): 0 > > > ICNTL(4) (level of printing): 0 > > > ICNTL(5) (input mat struct): 0 > > > ICNTL(6) (matrix prescaling): 7 > > > ICNTL(7) (sequentia matrix ordering):7 > > > ICNTL(8) (scalling strategy): 77 > > > ICNTL(10) (max num of refinements): 0 > > > ICNTL(11) (error analysis): 0 > > > ICNTL(12) (efficiency control): > 0 > > > ICNTL(13) (efficiency control): > 0 > > > ICNTL(14) (percentage of estimated workspace > increase): 20 > > > ICNTL(18) (input mat struct): > 0 > > > ICNTL(19) (Shur complement info): > 0 > > > ICNTL(20) (rhs sparse pattern): > 0 > > > ICNTL(21) 
(solution struct): > 0 > > > ICNTL(22) (in-core/out-of-core facility): > 0 > > > ICNTL(23) (max size of memory can be allocated > locally):0 > > > ICNTL(24) (detection of null pivot rows): > 0 > > > ICNTL(25) (computation of a null space basis): > 0 > > > ICNTL(26) (Schur options for rhs or solution): > 0 > > > ICNTL(27) (experimental parameter): > -24 > > > ICNTL(28) (use parallel or sequential ordering): > 1 > > > ICNTL(29) (parallel ordering): > 0 > > > ICNTL(30) (user-specified set of entries in > inv(A)): 0 > > > ICNTL(31) (factors is discarded in the solve > phase): 0 > > > ICNTL(33) (compute determinant): > 0 > > > CNTL(1) (relative pivoting threshold): 0.01 > > > CNTL(2) (stopping criterion of refinement): > 1.49012e-08 > > > CNTL(3) (absolute pivoting threshold): 0. > > > CNTL(4) (value of static pivoting): -1. > > > CNTL(5) (fixation for null pivots): 0. > > > RINFO(1) (local estimated flops for the > elimination after analysis): > > > [0] 29394. > > > RINFO(2) (local estimated flops for the assembly > after factorization): > > > [0] 1092. > > > RINFO(3) (local estimated flops for the > elimination after factorization): > > > [0] 29394. > > > INFO(15) (estimated size of (in MB) MUMPS internal > data for running numerical factorization): > > > [0] 1 > > > INFO(16) (size of (in MB) MUMPS internal data used > during numerical factorization): > > > [0] 1 > > > INFO(23) (num of pivots eliminated on this > processor after factorization): > > > [0] 324 > > > RINFOG(1) (global estimated flops for the > elimination after analysis): 29394. > > > RINFOG(2) (global estimated flops for the assembly > after factorization): 1092. > > > RINFOG(3) (global estimated flops for the > elimination after factorization): 29394. 
> > > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): > (0.,0.)*(2^0) > > > INFOG(3) (estimated real workspace for factors on > all processors after analysis): 3888 > > > INFOG(4) (estimated integer workspace for factors > on all processors after analysis): 2067 > > > INFOG(5) (estimated maximum front size in the > complete tree): 12 > > > INFOG(6) (number of nodes in the complete tree): 53 > > > INFOG(7) (ordering option effectively use after > analysis): 2 > > > INFOG(8) (structural symmetry in percent of the > permuted matrix after analysis): 100 > > > INFOG(9) (total real/complex workspace to store > the matrix factors after factorization): 3888 > > > INFOG(10) (total integer space store the matrix > factors after factorization): 2067 > > > INFOG(11) (order of largest frontal matrix after > factorization): 12 > > > INFOG(12) (number of off-diagonal pivots): 0 > > > INFOG(13) (number of delayed pivots after > factorization): 0 > > > INFOG(14) (number of memory compress after > factorization): 0 > > > INFOG(15) (number of steps of iterative refinement > after solution): 0 > > > INFOG(16) (estimated size (in MB) of all MUMPS > internal data for factorization after analysis: value on the most memory > consuming processor): 1 > > > INFOG(17) (estimated size of all MUMPS internal > data for factorization after analysis: sum over all processors): 1 > > > INFOG(18) (size of all MUMPS internal data > allocated during factorization: value on the most memory consuming > processor): 1 > > > INFOG(19) (size of all MUMPS internal data > allocated during factorization: sum over all processors): 1 > > > INFOG(20) (estimated number of entries in the > factors): 3042 > > > INFOG(21) (size in MB of memory effectively used > during factorization - value on the most memory consuming processor): 1 > > > INFOG(22) (size in MB of memory effectively used > during factorization - sum over all processors): 1 > > > INFOG(23) (after analysis: value of ICNTL(6) > effectively used): 5 > > 
> INFOG(24) (after analysis: value of ICNTL(12) > effectively used): 1 > > > INFOG(25) (after factorization: number of pivots > modified by static pivoting): 0 > > > INFOG(28) (after factorization: number of null > pivots encountered): 0 > > > INFOG(29) (after factorization: effective number > of entries in the factors (sum over all processors)): 3042 > > > INFOG(30, 31) (after solution: size in Mbytes of > memory used during solution phase): 0, 0 > > > INFOG(32) (after analysis: type of analysis done): > 1 > > > INFOG(33) (value used for ICNTL(8)): -2 > > > INFOG(34) (exponent of the determinant if > determinant is requested): 0 > > > linear system matrix = precond matrix: > > > Mat Object: (fieldsplit_RB_split_) 1 MPI > processes > > > type: seqaij > > > rows=324, cols=324 > > > total: nonzeros=5760, allocated nonzeros=5760 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 108 nodes, limit used is 5 > > > KSP solver for S = A11 - A10 inv(A00) A01 > > > KSP Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: cg > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > left preconditioning > > > using PRECONDITIONED norm type for convergence test > > > PC Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: bjacobi > > > block Jacobi: number of blocks = 1 > > > Local solve is same for all blocks, in the following KSP and > PC objects: > > > KSP Object: (fieldsplit_FE_split_sub_) 1 > MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000. 
> > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_FE_split_sub_) 1 > MPI processes > > > type: ilu > > > ILU: out-of-place factorization > > > 0 levels of fill > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 1., needed 1. > > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > package used to perform factorization: petsc > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > total number of mallocs used during MatSetValues > calls =0 > > > using I-node routines: found 9492 nodes, limit > used is 5 > > > linear system matrix = precond matrix: > > > Mat Object: (fieldsplit_FE_split_) > 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9492 nodes, limit used is > 5 > > > linear system matrix followed by preconditioner matrix: > > > Mat Object: (fieldsplit_FE_split_) 1 MPI > processes > > > type: schurcomplement > > > rows=28476, cols=28476 > > > Schur complement A11 - A10 inv(A00) A01 > > > A11 > > > Mat Object: (fieldsplit_FE_split_) > 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > total number of mallocs used during MatSetValues calls > =0 > > > using I-node routines: found 9492 nodes, limit used > is 5 > > > A10 > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=324 > > > total: nonzeros=936, allocated nonzeros=936 > > > total number of mallocs used during MatSetValues calls > =0 > > > using I-node routines: found 5717 nodes, limit used > is 5 > > > KSP of A00 > > > KSP Object: (fieldsplit_RB_split_) > 1 MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: 
relative=1e-05, absolute=1e-50, > divergence=10000. > > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_RB_split_) > 1 MPI processes > > > type: cholesky > > > Cholesky: out-of-place factorization > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 0., needed 0. > > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > package used to perform factorization: mumps > > > total: nonzeros=3042, allocated nonzeros=3042 > > > total number of mallocs used during > MatSetValues calls =0 > > > MUMPS run parameters: > > > SYM (matrix type): 2 > > > PAR (host participation): 1 > > > ICNTL(1) (output for error): 6 > > > ICNTL(2) (output of diagnostic msg): 0 > > > ICNTL(3) (output for global info): 0 > > > ICNTL(4) (level of printing): 0 > > > ICNTL(5) (input mat struct): 0 > > > ICNTL(6) (matrix prescaling): 7 > > > ICNTL(7) (sequentia matrix ordering):7 > > > ICNTL(8) (scalling strategy): 77 > > > ICNTL(10) (max num of refinements): 0 > > > ICNTL(11) (error analysis): 0 > > > ICNTL(12) (efficiency control): > 0 > > > ICNTL(13) (efficiency control): > 0 > > > ICNTL(14) (percentage of estimated > workspace increase): 20 > > > ICNTL(18) (input mat struct): > 0 > > > ICNTL(19) (Shur complement info): > 0 > > > ICNTL(20) (rhs sparse pattern): > 0 > > > ICNTL(21) (solution struct): > 0 > > > ICNTL(22) (in-core/out-of-core facility): > 0 > > > ICNTL(23) (max size of memory can be > allocated locally):0 > > > ICNTL(24) (detection of null pivot rows): > 0 > > > ICNTL(25) (computation of a null space > basis): 0 > > > ICNTL(26) (Schur options for rhs or > solution): 0 > > > ICNTL(27) (experimental parameter): > -24 > > > ICNTL(28) (use parallel or sequential > ordering): 1 > > > ICNTL(29) (parallel ordering): > 0 > > > ICNTL(30) (user-specified set of entries > in inv(A)): 0 > > > ICNTL(31) (factors is discarded in the > solve 
phase): 0 > > > ICNTL(33) (compute determinant): > 0 > > > CNTL(1) (relative pivoting threshold): > 0.01 > > > CNTL(2) (stopping criterion of > refinement): 1.49012e-08 > > > CNTL(3) (absolute pivoting threshold): > 0. > > > CNTL(4) (value of static pivoting): > -1. > > > CNTL(5) (fixation for null pivots): > 0. > > > RINFO(1) (local estimated flops for the > elimination after analysis): > > > [0] 29394. > > > RINFO(2) (local estimated flops for the > assembly after factorization): > > > [0] 1092. > > > RINFO(3) (local estimated flops for the > elimination after factorization): > > > [0] 29394. > > > INFO(15) (estimated size of (in MB) MUMPS > internal data for running numerical factorization): > > > [0] 1 > > > INFO(16) (size of (in MB) MUMPS internal > data used during numerical factorization): > > > [0] 1 > > > INFO(23) (num of pivots eliminated on this > processor after factorization): > > > [0] 324 > > > RINFOG(1) (global estimated flops for the > elimination after analysis): 29394. > > > RINFOG(2) (global estimated flops for the > assembly after factorization): 1092. > > > RINFOG(3) (global estimated flops for the > elimination after factorization): 29394. 
> > > (RINFOG(12) RINFOG(13))*2^INFOG(34) > (determinant): (0.,0.)*(2^0) > > > INFOG(3) (estimated real workspace for > factors on all processors after analysis): 3888 > > > INFOG(4) (estimated integer workspace for > factors on all processors after analysis): 2067 > > > INFOG(5) (estimated maximum front size in > the complete tree): 12 > > > INFOG(6) (number of nodes in the complete > tree): 53 > > > INFOG(7) (ordering option effectively use > after analysis): 2 > > > INFOG(8) (structural symmetry in percent > of the permuted matrix after analysis): 100 > > > INFOG(9) (total real/complex workspace to > store the matrix factors after factorization): 3888 > > > INFOG(10) (total integer space store the > matrix factors after factorization): 2067 > > > INFOG(11) (order of largest frontal matrix > after factorization): 12 > > > INFOG(12) (number of off-diagonal pivots): > 0 > > > INFOG(13) (number of delayed pivots after > factorization): 0 > > > INFOG(14) (number of memory compress after > factorization): 0 > > > INFOG(15) (number of steps of iterative > refinement after solution): 0 > > > INFOG(16) (estimated size (in MB) of all > MUMPS internal data for factorization after analysis: value on the most > memory consuming processor): 1 > > > INFOG(17) (estimated size of all MUMPS > internal data for factorization after analysis: sum over all processors): 1 > > > INFOG(18) (size of all MUMPS internal data > allocated during factorization: value on the most memory consuming > processor): 1 > > > INFOG(19) (size of all MUMPS internal data > allocated during factorization: sum over all processors): 1 > > > INFOG(20) (estimated number of entries in > the factors): 3042 > > > INFOG(21) (size in MB of memory > effectively used during factorization - value on the most memory consuming > processor): 1 > > > INFOG(22) (size in MB of memory > effectively used during factorization - sum over all processors): 1 > > > INFOG(23) (after analysis: value of > ICNTL(6) effectively used): 
5 > > > INFOG(24) (after analysis: value of > ICNTL(12) effectively used): 1 > > > INFOG(25) (after factorization: number of > pivots modified by static pivoting): 0 > > > INFOG(28) (after factorization: number of > null pivots encountered): 0 > > > INFOG(29) (after factorization: effective > number of entries in the factors (sum over all processors)): 3042 > > > INFOG(30, 31) (after solution: size in > Mbytes of memory used during solution phase): 0, 0 > > > INFOG(32) (after analysis: type of > analysis done): 1 > > > INFOG(33) (value used for ICNTL(8)): -2 > > > INFOG(34) (exponent of the determinant if > determinant is requested): 0 > > > linear system matrix = precond matrix: > > > Mat Object: (fieldsplit_RB_split_) > 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > total: nonzeros=5760, allocated nonzeros=5760 > > > total number of mallocs used during MatSetValues > calls =0 > > > using I-node routines: found 108 nodes, limit used > is 5 > > > A01 > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=28476 > > > total: nonzeros=936, allocated nonzeros=936 > > > total number of mallocs used during MatSetValues calls > =0 > > > using I-node routines: found 67 nodes, limit used is > 5 > > > Mat Object: (fieldsplit_FE_split_) 1 MPI > processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9492 nodes, limit used is 5 > > > linear system matrix = precond matrix: > > > Mat Object: () 1 MPI processes > > > type: seqaij > > > rows=28800, cols=28800 > > > total: nonzeros=1024686, allocated nonzeros=1024794 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9600 nodes, limit used is 5 > > > > > > > > > ---------------------------------------------- PETSc Performance > Summary: ---------------------------------------------- > > > > 
> > /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a > arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 > 16:16:47 2017 > > > Using Petsc Release Version 3.7.3, unknown > > > > > > Max Max/Min Avg Total > > > Time (sec): 9.179e+01 1.00000 9.179e+01 > > > Objects: 1.990e+02 1.00000 1.990e+02 > > > Flops: 1.634e+11 1.00000 1.634e+11 1.634e+11 > > > Flops/sec: 1.780e+09 1.00000 1.780e+09 1.780e+09 > > > MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > MPI Reductions: 0.000e+00 0.00000 > > > > > > Flop counting convention: 1 flop = 1 real number operation of type > (multiply/divide/add/subtract) > > > e.g., VecAXPY() for real vectors of length > N --> 2N flops > > > and VecAXPY() for complex vectors of > length N --> 8N flops > > > > > > Summary of Stages: ----- Time ------ ----- Flops ----- --- > Messages --- -- Message Lengths -- -- Reductions -- > > > Avg %Total Avg %Total counts > %Total Avg %Total counts %Total > > > 0: Main Stage: 9.1787e+01 100.0% 1.6336e+11 100.0% 0.000e+00 > 0.0% 0.000e+00 0.0% 0.000e+00 0.0% > > > > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > See the 'Profiling' chapter of the users' manual for details on > interpreting output. > > > Phase summary info: > > > Count: number of times phase was executed > > > Time and Flops: Max - maximum over all processors > > > Ratio - ratio of maximum to minimum over all > processors > > > Mess: number of messages sent > > > Avg. len: average message length (bytes) > > > Reduct: number of global reductions > > > Global: entire computation > > > Stage: stages of a computation. Set stages with PetscLogStagePush() > and PetscLogStagePop(). 
> > > %T - percent time in this phase %F - percent flops in > this phase > > > %M - percent messages in this phase %L - percent message > lengths in this phase > > > %R - percent reductions in this phase > > > Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time > over all processors) > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > Event Count Time (sec) Flops > --- Global --- --- Stage --- Total > > > Max Ratio Max Ratio Max Ratio Mess Avg > len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > > > > --- Event Stage 0: Main Stage > > > > > > VecDot 42 1.0 2.4080e-05 1.0 8.53e+03 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 354 > > > VecTDot 74012 1.0 1.2440e+00 1.0 4.22e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3388 > > > VecNorm 37020 1.0 8.3580e-01 1.0 2.11e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2523 > > > VecScale 37008 1.0 3.5800e-01 1.0 1.05e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2944 > > > VecCopy 37034 1.0 2.5754e-02 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecSet 74137 1.0 3.0537e-01 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecAXPY 74029 1.0 1.7233e+00 1.0 4.22e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2446 > > > VecAYPX 37001 1.0 1.2214e+00 1.0 2.11e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 1725 > > > VecAssemblyBegin 68 1.0 2.0432e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecAssemblyEnd 68 1.0 2.5988e-05 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecScatterBegin 48 1.0 4.6921e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatMult 37017 1.0 4.1269e+01 1.0 7.65e+10 1.0 0.0e+00 > 0.0e+00 0.0e+00 45 47 0 0 0 45 
47 0 0 0 1853 > > > MatMultAdd 37015 1.0 3.3638e+01 1.0 7.53e+10 1.0 0.0e+00 > 0.0e+00 0.0e+00 37 46 0 0 0 37 46 0 0 0 2238 > > > MatSolve 74021 1.0 4.6602e+01 1.0 7.42e+10 1.0 0.0e+00 > 0.0e+00 0.0e+00 51 45 0 0 0 51 45 0 0 0 1593 > > > MatLUFactorNum 1 1.0 1.7209e-02 1.0 2.44e+07 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1420 > > > MatCholFctrSym 1 1.0 8.8310e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatCholFctrNum 1 1.0 3.6907e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatILUFactorSym 1 1.0 3.7372e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatAssemblyBegin 29 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatAssemblyEnd 29 1.0 9.9473e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetRow 58026 1.0 2.8155e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetRowIJ 2 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetSubMatrice 6 1.0 1.5399e-02 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetOrdering 2 1.0 3.0112e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatZeroEntries 6 1.0 2.9490e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatView 7 1.0 3.4356e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > KSPSetUp 4 1.0 9.4891e-05 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > KSPSolve 1 1.0 8.8793e+01 1.0 1.63e+11 1.0 0.0e+00 > 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > > > PCSetUp 4 1.0 3.8375e-02 1.0 2.44e+07 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 637 > > > PCSetUpOnBlocks 5 1.0 2.1250e-02 1.0 2.44e+07 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1150 > > > PCApply 5 1.0 8.8789e+01 1.0 1.63e+11 1.0 0.0e+00 > 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > > > KSPSolve_FS_0 5 
1.0 7.5364e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > KSPSolve_FS_Schu 5 1.0 8.8785e+01 1.0 1.63e+11 1.0 0.0e+00 > 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > > > KSPSolve_FS_Low 5 1.0 2.1019e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > > > > Memory usage is given in bytes: > > > > > > Object Type Creations Destructions Memory Descendants' > Mem. > > > Reports information only for process 0. > > > > > > --- Event Stage 0: Main Stage > > > > > > Vector 91 91 9693912 0. > > > Vector Scatter 24 24 15936 0. > > > Index Set 51 51 537888 0. > > > IS L to G Mapping 3 3 240408 0. > > > Matrix 13 13 64097868 0. > > > Krylov Solver 6 6 7888 0. > > > Preconditioner 6 6 6288 0. > > > Viewer 1 0 0 0. > > > Distributed Mesh 1 1 4624 0. > > > Star Forest Bipartite Graph 2 2 1616 0. > > > Discrete System 1 1 872 0. > > > ============================================================ > ============================================================ > > > Average time to get PetscTime(): 0. 
> > > #PETSc Option Table entries: > > > -ksp_monitor > > > -ksp_view > > > -log_view > > > #End of PETSc Option Table entries > > > Compiled without FORTRAN kernels > > > Compiled with full precision matrices (default) > > > sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 > sizeof(PetscScalar) 8 sizeof(PetscInt) 4 > > > Configure options: --with-shared-libraries=1 --with-debugging=0 > --download-suitesparse --download-blacs --download-ptscotch=yes > --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl > --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps > --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc > --download-hypre --download-ml > > > ----------------------------------------- > > > Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo > > > Machine characteristics: Linux-4.4.0-38-generic-x86_64- > with-Ubuntu-16.04-xenial > > > Using PETSc directory: /home/dknez/software/petsc-src > > > Using PETSc arch: arch-linux2-c-opt > > > ----------------------------------------- > > > > > > Using C compiler: mpicc -fPIC -Wall -Wwrite-strings > -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O > ${COPTFLAGS} ${CFLAGS} > > > Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 > -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} > > > ----------------------------------------- > > > > > > Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include > -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include > -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include > -I/home/dknez/software/libmesh_install/opt_real/petsc/include > -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent > -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include > -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi > > > ----------------------------------------- > > > > > > Using C 
linker: mpicc > > > Using Fortran linker: mpif90 > > > Using libraries: -Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib > -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc > -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib > -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps > -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE > -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib > -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu > -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx > -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod > -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig > -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 > -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 > -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch > -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 > -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm > -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz > -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib > -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu > -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl > -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl > > > ----------------------------------------- > > > > > > > > > > > > > > > > > > > > > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From bsmith at mcs.anl.gov Wed Jan 11 20:55:05 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 11 Jan 2017 20:55:05 -0600 Subject: [petsc-users] Using PCFIELDSPLIT with -pc_fieldsplit_type schur In-Reply-To: References: <9641EA68-FC6B-48DC-8A5E-A8B75E54EA64@mcs.anl.gov> <8DBB9010-9A29-4028-A9C1-98DB695602C1@mcs.anl.gov> Message-ID: That is disappointing. Please try using -pc_fieldsplit_schur_precondition full with the two cases of -fieldsplit_FE_split_pc_type gamg and -fieldsplit_FE_split_pc_type cholesky Barry > On Jan 11, 2017, at 8:49 PM, David Knezevic wrote: > > OK, that's encouraging. However, regarding this: > > So the next step is to try using -fieldsplit_FE_split_ksp_monitor -fieldsplit_FE_split_pc_type gamg > > I tried this and it didn't converge at all (it hit the 10000 iteration max in the output from -fieldsplit_FE_split_ksp_monitor). So I guess I'd need to attach the near nullspace to make this work reasonably, as you said. Sounds like that may not be easy to do in this case though? I'll try some other preconditioners in the meantime. > > Thanks, > David > > > On Wed, Jan 11, 2017 at 9:31 PM, Barry Smith wrote: > > Thanks, this is very useful information. It means that > > 1) the approximate Sp is actually a very good approximation to the true Schur complement S, since using Sp^-1 to precondition S gives iteration counts from 8 to 13. > > 2) using ilu(0) as a preconditioner for Sp is not good, since replacing Sp^-1 with ilu(0) of Sp gives absurd iteration counts. This is actually not super surprising since ilu(0) is generally "not so good" for elasticity. > > So the next step is to try using -fieldsplit_FE_split_ksp_monitor -fieldsplit_FE_split_pc_type gamg > > the one open question is if any options should be passed to the gamg to tell it that the underlying problem comes from "elasticity"; that is something about the null space. 
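[Editor's note: for readers following along, the two experiments Barry suggests could be invoked roughly as below. This is a sketch only: the executable name is taken from the -log_view output earlier in the thread, the fieldsplit prefixes (RB_split, FE_split) from the KSP views, and the exact splits are set up in David's application code, not on the command line. Attaching the near null space for elasticity would additionally require an API call such as MatNullSpaceCreateRigidBody()/MatSetNearNullSpace() on the inner matrix, which is the difficulty Barry raises.]

```shell
# Case 1: full Schur preconditioner with GAMG on the FE (Schur complement) split
./fe_solver-opt_real \
    -pc_type fieldsplit -pc_fieldsplit_type schur \
    -pc_fieldsplit_schur_fact_type full \
    -pc_fieldsplit_schur_precondition full \
    -fieldsplit_FE_split_ksp_monitor \
    -fieldsplit_FE_split_pc_type gamg \
    -ksp_monitor -ksp_view -log_view

# Case 2: same, but with Cholesky on the FE split
#   replace: -fieldsplit_FE_split_pc_type cholesky
```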
> > Mark Adams, since the GAMG is coming from inside another preconditioner it may not be easy for the user to attach the near null space to that inner matrix. Would it make sense for there to be a GAMG command line option to indicate that it is a 3d elasticity problem so GAMG could set up the near null space for itself? or does that not make sense? > > Barry > > > > > On Jan 11, 2017, at 7:47 PM, David Knezevic wrote: > > > > I've attached the two log files. Using cholesky for "FE_split" seems to have helped a lot! > > > > David > > > > > > -- > > David J. Knezevic | CTO > > Akselos | 210 Broadway, #201 | Cambridge, MA | 02139 > > > > Phone: +1-617-599-4755 > > > > This e-mail and any attachments may contain confidential material for the sole use of the intended recipient(s). Any review or distribution by others is strictly prohibited. If you are not the intended recipient, please contact the sender and delete all copies. > > > > On Wed, Jan 11, 2017 at 8:32 PM, Barry Smith wrote: > > > > Can you please run with all the monitoring on? So we can see the convergence of all the inner solvers > > -fieldsplit_FE_split_ksp_monitor > > > > Then run again with > > > > -fieldsplit_FE_split_ksp_monitor -fieldsplit_FE_split_pc_type cholesky > > > > > > and send both sets of results > > > > Barry > > > > > > > On Jan 11, 2017, at 6:32 PM, David Knezevic wrote: > > > > > > On Wed, Jan 11, 2017 at 5:52 PM, Dave May wrote: > > > so I gather that I'll have to look into a user-defined approximation to S. > > > > > > Where does the 2x2 block system come from? > > > Maybe someone on the list knows the right approximation to use for S. > > > > > > The model is 3D linear elasticity using a finite element discretization. I applied substructuring to part of the system to "condense" it, and that results in the small A00 block. The A11 block is just standard 3D elasticity; no substructuring was applied there. 
There are constraints to connect the degrees of freedom on the interface of the substructured and non-substructured regions. > > > > > > If anyone has suggestions for a good way to precondition this type of system, I'd be most appreciative! > > > > > > Thanks, > > > David > > > > > > > > > > > > ----------------------------------------- > > > > > > 0 KSP Residual norm 5.405528187695e+04 > > > 1 KSP Residual norm 2.187814910803e+02 > > > 2 KSP Residual norm 1.019051577515e-01 > > > 3 KSP Residual norm 4.370464012859e-04 > > > KSP Object: 1 MPI processes > > > type: cg > > > maximum iterations=1000 > > > tolerances: relative=1e-06, absolute=1e-50, divergence=10000. > > > left preconditioning > > > using nonzero initial guess > > > using PRECONDITIONED norm type for convergence test > > > PC Object: 1 MPI processes > > > type: fieldsplit > > > FieldSplit with Schur preconditioner, factorization FULL > > > Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse > > > Split info: > > > Split number 0 Defined by IS > > > Split number 1 Defined by IS > > > KSP solver for A00 block > > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: cholesky > > > Cholesky: out-of-place factorization > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 0., needed 0. 
> > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > package used to perform factorization: mumps > > > total: nonzeros=3042, allocated nonzeros=3042 > > > total number of mallocs used during MatSetValues calls =0 > > > MUMPS run parameters: > > > SYM (matrix type): 2 > > > PAR (host participation): 1 > > > ICNTL(1) (output for error): 6 > > > ICNTL(2) (output of diagnostic msg): 0 > > > ICNTL(3) (output for global info): 0 > > > ICNTL(4) (level of printing): 0 > > > ICNTL(5) (input mat struct): 0 > > > ICNTL(6) (matrix prescaling): 7 > > > ICNTL(7) (sequentia matrix ordering):7 > > > ICNTL(8) (scalling strategy): 77 > > > ICNTL(10) (max num of refinements): 0 > > > ICNTL(11) (error analysis): 0 > > > ICNTL(12) (efficiency control): 0 > > > ICNTL(13) (efficiency control): 0 > > > ICNTL(14) (percentage of estimated workspace increase): 20 > > > ICNTL(18) (input mat struct): 0 > > > ICNTL(19) (Shur complement info): 0 > > > ICNTL(20) (rhs sparse pattern): 0 > > > ICNTL(21) (solution struct): 0 > > > ICNTL(22) (in-core/out-of-core facility): 0 > > > ICNTL(23) (max size of memory can be allocated locally):0 > > > ICNTL(24) (detection of null pivot rows): 0 > > > ICNTL(25) (computation of a null space basis): 0 > > > ICNTL(26) (Schur options for rhs or solution): 0 > > > ICNTL(27) (experimental parameter): -24 > > > ICNTL(28) (use parallel or sequential ordering): 1 > > > ICNTL(29) (parallel ordering): 0 > > > ICNTL(30) (user-specified set of entries in inv(A)): 0 > > > ICNTL(31) (factors is discarded in the solve phase): 0 > > > ICNTL(33) (compute determinant): 0 > > > CNTL(1) (relative pivoting threshold): 0.01 > > > CNTL(2) (stopping criterion of refinement): 1.49012e-08 > > > CNTL(3) (absolute pivoting threshold): 0. > > > CNTL(4) (value of static pivoting): -1. > > > CNTL(5) (fixation for null pivots): 0. > > > RINFO(1) (local estimated flops for the elimination after analysis): > > > [0] 29394. 
> > > RINFO(2) (local estimated flops for the assembly after factorization): > > > [0] 1092. > > > RINFO(3) (local estimated flops for the elimination after factorization): > > > [0] 29394. > > > INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): > > > [0] 1 > > > INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): > > > [0] 1 > > > INFO(23) (num of pivots eliminated on this processor after factorization): > > > [0] 324 > > > RINFOG(1) (global estimated flops for the elimination after analysis): 29394. > > > RINFOG(2) (global estimated flops for the assembly after factorization): 1092. > > > RINFOG(3) (global estimated flops for the elimination after factorization): 29394. > > > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) > > > INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 > > > INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 > > > INFOG(5) (estimated maximum front size in the complete tree): 12 > > > INFOG(6) (number of nodes in the complete tree): 53 > > > INFOG(7) (ordering option effectively use after analysis): 2 > > > INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 > > > INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 > > > INFOG(10) (total integer space store the matrix factors after factorization): 2067 > > > INFOG(11) (order of largest frontal matrix after factorization): 12 > > > INFOG(12) (number of off-diagonal pivots): 0 > > > INFOG(13) (number of delayed pivots after factorization): 0 > > > INFOG(14) (number of memory compress after factorization): 0 > > > INFOG(15) (number of steps of iterative refinement after solution): 0 > > > INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 > > > INFOG(17) 
(estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 > > > INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 > > > INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 > > > INFOG(20) (estimated number of entries in the factors): 3042 > > > INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 > > > INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 > > > INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 > > > INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 > > > INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 > > > INFOG(28) (after factorization: number of null pivots encountered): 0 > > > INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 > > > INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 > > > INFOG(32) (after analysis: type of analysis done): 1 > > > INFOG(33) (value used for ICNTL(8)): -2 > > > INFOG(34) (exponent of the determinant if determinant is requested): 0 > > > linear system matrix = precond matrix: > > > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > total: nonzeros=5760, allocated nonzeros=5760 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 108 nodes, limit used is 5 > > > KSP solver for S = A11 - A10 inv(A00) A01 > > > KSP Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: cg > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
> > > left preconditioning > > > using PRECONDITIONED norm type for convergence test > > > PC Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: bjacobi > > > block Jacobi: number of blocks = 1 > > > Local solve is same for all blocks, in the following KSP and PC objects: > > > KSP Object: (fieldsplit_FE_split_sub_) 1 MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_FE_split_sub_) 1 MPI processes > > > type: ilu > > > ILU: out-of-place factorization > > > 0 levels of fill > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 1., needed 1. > > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > package used to perform factorization: petsc > > > total: nonzeros=1037052, allocated nonzeros=1037052 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9489 nodes, limit used is 5 > > > linear system matrix = precond matrix: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1037052, allocated nonzeros=1037052 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9489 nodes, limit used is 5 > > > linear system matrix followed by preconditioner matrix: > > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: schurcomplement > > > rows=28476, cols=28476 > > > Schur complement A11 - A10 inv(A00) A01 > > > A11 > > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9492 nodes, limit used is 5 > 
> > A10 > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=324 > > > total: nonzeros=936, allocated nonzeros=936 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 5717 nodes, limit used is 5 > > > KSP of A00 > > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: cholesky > > > Cholesky: out-of-place factorization > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 0., needed 0. > > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > package used to perform factorization: mumps > > > total: nonzeros=3042, allocated nonzeros=3042 > > > total number of mallocs used during MatSetValues calls =0 > > > MUMPS run parameters: > > > SYM (matrix type): 2 > > > PAR (host participation): 1 > > > ICNTL(1) (output for error): 6 > > > ICNTL(2) (output of diagnostic msg): 0 > > > ICNTL(3) (output for global info): 0 > > > ICNTL(4) (level of printing): 0 > > > ICNTL(5) (input mat struct): 0 > > > ICNTL(6) (matrix prescaling): 7 > > > ICNTL(7) (sequentia matrix ordering):7 > > > ICNTL(8) (scalling strategy): 77 > > > ICNTL(10) (max num of refinements): 0 > > > ICNTL(11) (error analysis): 0 > > > ICNTL(12) (efficiency control): 0 > > > ICNTL(13) (efficiency control): 0 > > > ICNTL(14) (percentage of estimated workspace increase): 20 > > > ICNTL(18) (input mat struct): 0 > > > ICNTL(19) (Shur complement info): 0 > > > ICNTL(20) (rhs sparse pattern): 0 > > > ICNTL(21) (solution struct): 0 > > > ICNTL(22) (in-core/out-of-core facility): 0 > > > ICNTL(23) (max size of memory can be allocated locally):0 > > > ICNTL(24) 
(detection of null pivot rows): 0 > > > ICNTL(25) (computation of a null space basis): 0 > > > ICNTL(26) (Schur options for rhs or solution): 0 > > > ICNTL(27) (experimental parameter): -24 > > > ICNTL(28) (use parallel or sequential ordering): 1 > > > ICNTL(29) (parallel ordering): 0 > > > ICNTL(30) (user-specified set of entries in inv(A)): 0 > > > ICNTL(31) (factors is discarded in the solve phase): 0 > > > ICNTL(33) (compute determinant): 0 > > > CNTL(1) (relative pivoting threshold): 0.01 > > > CNTL(2) (stopping criterion of refinement): 1.49012e-08 > > > CNTL(3) (absolute pivoting threshold): 0. > > > CNTL(4) (value of static pivoting): -1. > > > CNTL(5) (fixation for null pivots): 0. > > > RINFO(1) (local estimated flops for the elimination after analysis): > > > [0] 29394. > > > RINFO(2) (local estimated flops for the assembly after factorization): > > > [0] 1092. > > > RINFO(3) (local estimated flops for the elimination after factorization): > > > [0] 29394. > > > INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): > > > [0] 1 > > > INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): > > > [0] 1 > > > INFO(23) (num of pivots eliminated on this processor after factorization): > > > [0] 324 > > > RINFOG(1) (global estimated flops for the elimination after analysis): 29394. > > > RINFOG(2) (global estimated flops for the assembly after factorization): 1092. > > > RINFOG(3) (global estimated flops for the elimination after factorization): 29394. 
> > > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) > > > INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 > > > INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 > > > INFOG(5) (estimated maximum front size in the complete tree): 12 > > > INFOG(6) (number of nodes in the complete tree): 53 > > > INFOG(7) (ordering option effectively use after analysis): 2 > > > INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 > > > INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 > > > INFOG(10) (total integer space store the matrix factors after factorization): 2067 > > > INFOG(11) (order of largest frontal matrix after factorization): 12 > > > INFOG(12) (number of off-diagonal pivots): 0 > > > INFOG(13) (number of delayed pivots after factorization): 0 > > > INFOG(14) (number of memory compress after factorization): 0 > > > INFOG(15) (number of steps of iterative refinement after solution): 0 > > > INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 > > > INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 > > > INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 > > > INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 > > > INFOG(20) (estimated number of entries in the factors): 3042 > > > INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 > > > INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 > > > INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 > > > INFOG(24) (after analysis: value of 
ICNTL(12) effectively used): 1 > > > INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 > > > INFOG(28) (after factorization: number of null pivots encountered): 0 > > > INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 > > > INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 > > > INFOG(32) (after analysis: type of analysis done): 1 > > > INFOG(33) (value used for ICNTL(8)): -2 > > > INFOG(34) (exponent of the determinant if determinant is requested): 0 > > > linear system matrix = precond matrix: > > > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > total: nonzeros=5760, allocated nonzeros=5760 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 108 nodes, limit used is 5 > > > A01 > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=28476 > > > total: nonzeros=936, allocated nonzeros=936 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 67 nodes, limit used is 5 > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1037052, allocated nonzeros=1037052 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9489 nodes, limit used is 5 > > > linear system matrix = precond matrix: > > > Mat Object: () 1 MPI processes > > > type: seqaij > > > rows=28800, cols=28800 > > > total: nonzeros=1024686, allocated nonzeros=1024794 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9600 nodes, limit used is 5 > > > > > > ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- > > > > > > /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a arch-linux2-c-opt named 
david-Lenovo with 1 processor, by dknez Wed Jan 11 17:22:10 2017 > > > Using Petsc Release Version 3.7.3, unknown > > > > > > Max Max/Min Avg Total > > > Time (sec): 9.638e+01 1.00000 9.638e+01 > > > Objects: 2.030e+02 1.00000 2.030e+02 > > > Flops: 1.732e+11 1.00000 1.732e+11 1.732e+11 > > > Flops/sec: 1.797e+09 1.00000 1.797e+09 1.797e+09 > > > MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > MPI Reductions: 0.000e+00 0.00000 > > > > > > Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) > > > e.g., VecAXPY() for real vectors of length N --> 2N flops > > > and VecAXPY() for complex vectors of length N --> 8N flops > > > > > > Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- > > > Avg %Total Avg %Total counts %Total Avg %Total counts %Total > > > 0: Main Stage: 9.6379e+01 100.0% 1.7318e+11 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% > > > > > > ------------------------------------------------------------------------------------------------------------------------ > > > See the 'Profiling' chapter of the users' manual for details on interpreting output. > > > Phase summary info: > > > Count: number of times phase was executed > > > Time and Flops: Max - maximum over all processors > > > Ratio - ratio of maximum to minimum over all processors > > > Mess: number of messages sent > > > Avg. len: average message length (bytes) > > > Reduct: number of global reductions > > > Global: entire computation > > > Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
> > > %T - percent time in this phase %F - percent flops in this phase > > > %M - percent messages in this phase %L - percent message lengths in this phase > > > %R - percent reductions in this phase > > > Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) > > > ------------------------------------------------------------------------------------------------------------------------ > > > Event Count Time (sec) Flops --- Global --- --- Stage --- Total > > > Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s > > > ------------------------------------------------------------------------------------------------------------------------ > > > > > > --- Event Stage 0: Main Stage > > > > > > VecDot 42 1.0 2.2411e-05 1.0 8.53e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 380 > > > VecTDot 77761 1.0 1.4294e+00 1.0 4.43e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3098 > > > VecNorm 38894 1.0 9.1002e-01 1.0 2.22e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2434 > > > VecScale 38882 1.0 3.7314e-01 1.0 1.11e+09 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2967 > > > VecCopy 38908 1.0 2.1655e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecSet 77887 1.0 3.2034e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecAXPY 77777 1.0 1.8382e+00 1.0 4.43e+09 1.0 0.0e+00 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2409 > > > VecAYPX 38875 1.0 1.2884e+00 1.0 2.21e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 1718 > > > VecAssemblyBegin 68 1.0 1.9407e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecAssemblyEnd 68 1.0 2.6941e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecScatterBegin 48 1.0 4.6349e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatMult 38891 1.0 4.3045e+01 1.0 8.03e+10 1.0 0.0e+00 0.0e+00 0.0e+00 45 46 0 0 0 45 46 0 0 0 1866 > > > MatMultAdd 38889 1.0 
3.5360e+01 1.0 7.91e+10 1.0 0.0e+00 0.0e+00 0.0e+00 37 46 0 0 0 37 46 0 0 0 2236 > > > MatSolve 77769 1.0 4.8780e+01 1.0 7.95e+10 1.0 0.0e+00 0.0e+00 0.0e+00 51 46 0 0 0 51 46 0 0 0 1631 > > > MatLUFactorNum 1 1.0 1.9575e-02 1.0 2.49e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1274 > > > MatCholFctrSym 1 1.0 9.4891e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatCholFctrNum 1 1.0 3.7885e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatILUFactorSym 1 1.0 4.1780e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatConvert 1 1.0 3.0041e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatScale 2 1.0 2.7180e-05 1.0 2.53e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 930 > > > MatAssemblyBegin 32 1.0 4.0531e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatAssemblyEnd 32 1.0 1.2032e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetRow 114978 1.0 5.9254e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetRowIJ 2 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetSubMatrice 6 1.0 1.5707e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetOrdering 2 1.0 3.2425e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatZeroEntries 6 1.0 3.0580e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatView 7 1.0 3.5119e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatAXPY 1 1.0 1.9384e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatMatMult 1 1.0 2.7120e-03 1.0 3.16e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 117 > > > MatMatMultSym 1 1.0 1.8010e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatMatMultNum 1 1.0 6.1703e-04 1.0 3.16e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 513 > > > 
KSPSetUp 4 1.0 9.8944e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > KSPSolve 1 1.0 9.3380e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > > PCSetUp 4 1.0 6.6326e-02 1.0 2.53e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 381 > > > PCSetUpOnBlocks 5 1.0 2.4082e-02 1.0 2.49e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1036 > > > PCApply 5 1.0 9.3376e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > > KSPSolve_FS_0 5 1.0 7.0214e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > KSPSolve_FS_Schu 5 1.0 9.3372e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > > KSPSolve_FS_Low 5 1.0 2.1377e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > ------------------------------------------------------------------------------------------------------------------------ > > > > > > Memory usage is given in bytes: > > > > > > Object Type Creations Destructions Memory Descendants' Mem. > > > Reports information only for process 0. > > > > > > --- Event Stage 0: Main Stage > > > > > > Vector 92 92 9698040 0. > > > Vector Scatter 24 24 15936 0. > > > Index Set 51 51 537876 0. > > > IS L to G Mapping 3 3 240408 0. > > > Matrix 16 16 77377776 0. > > > Krylov Solver 6 6 7888 0. > > > Preconditioner 6 6 6288 0. > > > Viewer 1 0 0 0. > > > Distributed Mesh 1 1 4624 0. > > > Star Forest Bipartite Graph 2 2 1616 0. > > > Discrete System 1 1 872 0. > > > ======================================================================================================================== > > > Average time to get PetscTime(): 0. 
> > > #PETSc Option Table entries: > > > -ksp_monitor > > > -ksp_view > > > -log_view > > > #End of PETSc Option Table entries > > > Compiled without FORTRAN kernels > > > Compiled with full precision matrices (default) > > > sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 > > > Configure options: --with-shared-libraries=1 --with-debugging=0 --download-suitesparse --download-blacs --download-ptscotch=yes --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc --download-hypre --download-ml > > > ----------------------------------------- > > > Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo > > > Machine characteristics: Linux-4.4.0-38-generic-x86_64-with-Ubuntu-16.04-xenial > > > Using PETSc directory: /home/dknez/software/petsc-src > > > Using PETSc arch: arch-linux2-c-opt > > > ----------------------------------------- > > > > > > Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O ${COPTFLAGS} ${CFLAGS} > > > Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} > > > ----------------------------------------- > > > > > > Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/libmesh_install/opt_real/petsc/include -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi > > > ----------------------------------------- > > > > > > Using C linker: mpicc > > > Using Fortran 
linker: mpif90 > > > Using libraries: -Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl > > > ----------------------------------------- > > > > > > > > > > > > > > > On Wed, Jan 11, 2017 at 4:49 PM, Dave May wrote: > > > It looks like the Schur solve is requiring a huge number of iterates to converge (based on the instances of MatMult). > > > This is killing the performance. > > > > > > Are you sure that A11 is a good approximation to S? 
You might consider trying the selfp option > > > > > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCFieldSplitSetSchurPre.html#PCFieldSplitSetSchurPre > > > > > > Note that the best approx to S is likely both problem and discretisation dependent, so if selfp is also terrible, you might want to consider coding up your own approx to S for your specific system. > > > > > > > > > Thanks, > > > Dave > > > > > > > > > On Wed, 11 Jan 2017 at 22:34, David Knezevic wrote: > > > I have a definite block 2x2 system and I figured it'd be good to apply the PCFIELDSPLIT functionality with Schur complement, as described in Section 4.5 of the manual. > > > > > > The A00 block of my matrix is very small, so I figured I'd specify a direct solver (i.e. MUMPS) for that block. > > > > > > So I did the following: > > > - PCFieldSplitSetIS to specify the indices of the two splits > > > - PCFieldSplitGetSubKSP to get the two KSP objects, and to set the solver and PC types for each (MUMPS for A00, ILU+CG for A11) > > > - I set -pc_fieldsplit_schur_fact_type full > > > > > > Below I have pasted the output of "-ksp_view -ksp_monitor -log_view" for a test case. It seems to converge well, but I'm concerned about the speed (about 90 seconds, vs. about 1 second if I use a direct solver for the entire system). I just wanted to check whether I'm setting this up in a good way. > > > > > > Many thanks, > > > David > > > > > > ----------------------------------------------------------------------------------- > > > > > > 0 KSP Residual norm 5.405774214400e+04 > > > 1 KSP Residual norm 1.849649014371e+02 > > > 2 KSP Residual norm 7.462775074989e-02 > > > 3 KSP Residual norm 2.680497175260e-04 > > > KSP Object: 1 MPI processes > > > type: cg > > > maximum iterations=1000 > > > tolerances: relative=1e-06, absolute=1e-50, divergence=10000.
> > > left preconditioning > > > using nonzero initial guess > > > using PRECONDITIONED norm type for convergence test > > > PC Object: 1 MPI processes > > > type: fieldsplit > > > FieldSplit with Schur preconditioner, factorization FULL > > > Preconditioner for the Schur complement formed from A11 > > > Split info: > > > Split number 0 Defined by IS > > > Split number 1 Defined by IS > > > KSP solver for A00 block > > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: cholesky > > > Cholesky: out-of-place factorization > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 0., needed 0. > > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > package used to perform factorization: mumps > > > total: nonzeros=3042, allocated nonzeros=3042 > > > total number of mallocs used during MatSetValues calls =0 > > > MUMPS run parameters: > > > SYM (matrix type): 2 > > > PAR (host participation): 1 > > > ICNTL(1) (output for error): 6 > > > ICNTL(2) (output of diagnostic msg): 0 > > > ICNTL(3) (output for global info): 0 > > > ICNTL(4) (level of printing): 0 > > > ICNTL(5) (input mat struct): 0 > > > ICNTL(6) (matrix prescaling): 7 > > > ICNTL(7) (sequentia matrix ordering):7 > > > ICNTL(8) (scalling strategy): 77 > > > ICNTL(10) (max num of refinements): 0 > > > ICNTL(11) (error analysis): 0 > > > ICNTL(12) (efficiency control): 0 > > > ICNTL(13) (efficiency control): 0 > > > ICNTL(14) (percentage of estimated workspace increase): 20 > > > ICNTL(18) (input mat struct): 0 > > > ICNTL(19) (Shur complement info): 0 > > > ICNTL(20) (rhs sparse pattern): 0 > > > ICNTL(21) (solution struct): 0 
> > > ICNTL(22) (in-core/out-of-core facility): 0 > > > ICNTL(23) (max size of memory can be allocated locally):0 > > > ICNTL(24) (detection of null pivot rows): 0 > > > ICNTL(25) (computation of a null space basis): 0 > > > ICNTL(26) (Schur options for rhs or solution): 0 > > > ICNTL(27) (experimental parameter): -24 > > > ICNTL(28) (use parallel or sequential ordering): 1 > > > ICNTL(29) (parallel ordering): 0 > > > ICNTL(30) (user-specified set of entries in inv(A)): 0 > > > ICNTL(31) (factors is discarded in the solve phase): 0 > > > ICNTL(33) (compute determinant): 0 > > > CNTL(1) (relative pivoting threshold): 0.01 > > > CNTL(2) (stopping criterion of refinement): 1.49012e-08 > > > CNTL(3) (absolute pivoting threshold): 0. > > > CNTL(4) (value of static pivoting): -1. > > > CNTL(5) (fixation for null pivots): 0. > > > RINFO(1) (local estimated flops for the elimination after analysis): > > > [0] 29394. > > > RINFO(2) (local estimated flops for the assembly after factorization): > > > [0] 1092. > > > RINFO(3) (local estimated flops for the elimination after factorization): > > > [0] 29394. > > > INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): > > > [0] 1 > > > INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): > > > [0] 1 > > > INFO(23) (num of pivots eliminated on this processor after factorization): > > > [0] 324 > > > RINFOG(1) (global estimated flops for the elimination after analysis): 29394. > > > RINFOG(2) (global estimated flops for the assembly after factorization): 1092. > > > RINFOG(3) (global estimated flops for the elimination after factorization): 29394. 
> > > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) > > > INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 > > > INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 > > > INFOG(5) (estimated maximum front size in the complete tree): 12 > > > INFOG(6) (number of nodes in the complete tree): 53 > > > INFOG(7) (ordering option effectively use after analysis): 2 > > > INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 > > > INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 > > > INFOG(10) (total integer space store the matrix factors after factorization): 2067 > > > INFOG(11) (order of largest frontal matrix after factorization): 12 > > > INFOG(12) (number of off-diagonal pivots): 0 > > > INFOG(13) (number of delayed pivots after factorization): 0 > > > INFOG(14) (number of memory compress after factorization): 0 > > > INFOG(15) (number of steps of iterative refinement after solution): 0 > > > INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 > > > INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 > > > INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 > > > INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 > > > INFOG(20) (estimated number of entries in the factors): 3042 > > > INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 > > > INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 > > > INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 > > > INFOG(24) (after analysis: value of 
ICNTL(12) effectively used): 1 > > > INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 > > > INFOG(28) (after factorization: number of null pivots encountered): 0 > > > INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 > > > INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 > > > INFOG(32) (after analysis: type of analysis done): 1 > > > INFOG(33) (value used for ICNTL(8)): -2 > > > INFOG(34) (exponent of the determinant if determinant is requested): 0 > > > linear system matrix = precond matrix: > > > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > total: nonzeros=5760, allocated nonzeros=5760 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 108 nodes, limit used is 5 > > > KSP solver for S = A11 - A10 inv(A00) A01 > > > KSP Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: cg > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > left preconditioning > > > using PRECONDITIONED norm type for convergence test > > > PC Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: bjacobi > > > block Jacobi: number of blocks = 1 > > > Local solve is same for all blocks, in the following KSP and PC objects: > > > KSP Object: (fieldsplit_FE_split_sub_) 1 MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_FE_split_sub_) 1 MPI processes > > > type: ilu > > > ILU: out-of-place factorization > > > 0 levels of fill > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 1., needed 1. 
> > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > package used to perform factorization: petsc > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9492 nodes, limit used is 5 > > > linear system matrix = precond matrix: > > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9492 nodes, limit used is 5 > > > linear system matrix followed by preconditioner matrix: > > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: schurcomplement > > > rows=28476, cols=28476 > > > Schur complement A11 - A10 inv(A00) A01 > > > A11 > > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9492 nodes, limit used is 5 > > > A10 > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=324 > > > total: nonzeros=936, allocated nonzeros=936 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 5717 nodes, limit used is 5 > > > KSP of A00 > > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
> > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: cholesky > > > Cholesky: out-of-place factorization > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 0., needed 0. > > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > package used to perform factorization: mumps > > > total: nonzeros=3042, allocated nonzeros=3042 > > > total number of mallocs used during MatSetValues calls =0 > > > MUMPS run parameters: > > > SYM (matrix type): 2 > > > PAR (host participation): 1 > > > ICNTL(1) (output for error): 6 > > > ICNTL(2) (output of diagnostic msg): 0 > > > ICNTL(3) (output for global info): 0 > > > ICNTL(4) (level of printing): 0 > > > ICNTL(5) (input mat struct): 0 > > > ICNTL(6) (matrix prescaling): 7 > > > ICNTL(7) (sequentia matrix ordering):7 > > > ICNTL(8) (scalling strategy): 77 > > > ICNTL(10) (max num of refinements): 0 > > > ICNTL(11) (error analysis): 0 > > > ICNTL(12) (efficiency control): 0 > > > ICNTL(13) (efficiency control): 0 > > > ICNTL(14) (percentage of estimated workspace increase): 20 > > > ICNTL(18) (input mat struct): 0 > > > ICNTL(19) (Shur complement info): 0 > > > ICNTL(20) (rhs sparse pattern): 0 > > > ICNTL(21) (solution struct): 0 > > > ICNTL(22) (in-core/out-of-core facility): 0 > > > ICNTL(23) (max size of memory can be allocated locally):0 > > > ICNTL(24) (detection of null pivot rows): 0 > > > ICNTL(25) (computation of a null space basis): 0 > > > ICNTL(26) (Schur options for rhs or solution): 0 > > > ICNTL(27) (experimental parameter): -24 > > > ICNTL(28) (use parallel or sequential ordering): 1 > > > ICNTL(29) (parallel ordering): 0 > > > ICNTL(30) (user-specified set of entries in inv(A)): 0 > > > ICNTL(31) (factors is discarded in the solve phase): 0 > > > ICNTL(33) (compute determinant): 0 > > > CNTL(1) (relative pivoting 
threshold): 0.01 > > > CNTL(2) (stopping criterion of refinement): 1.49012e-08 > > > CNTL(3) (absolute pivoting threshold): 0. > > > CNTL(4) (value of static pivoting): -1. > > > CNTL(5) (fixation for null pivots): 0. > > > RINFO(1) (local estimated flops for the elimination after analysis): > > > [0] 29394. > > > RINFO(2) (local estimated flops for the assembly after factorization): > > > [0] 1092. > > > RINFO(3) (local estimated flops for the elimination after factorization): > > > [0] 29394. > > > INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): > > > [0] 1 > > > INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): > > > [0] 1 > > > INFO(23) (num of pivots eliminated on this processor after factorization): > > > [0] 324 > > > RINFOG(1) (global estimated flops for the elimination after analysis): 29394. > > > RINFOG(2) (global estimated flops for the assembly after factorization): 1092. > > > RINFOG(3) (global estimated flops for the elimination after factorization): 29394. 
> > > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) > > > INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 > > > INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 > > > INFOG(5) (estimated maximum front size in the complete tree): 12 > > > INFOG(6) (number of nodes in the complete tree): 53 > > > INFOG(7) (ordering option effectively use after analysis): 2 > > > INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 > > > INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 > > > INFOG(10) (total integer space store the matrix factors after factorization): 2067 > > > INFOG(11) (order of largest frontal matrix after factorization): 12 > > > INFOG(12) (number of off-diagonal pivots): 0 > > > INFOG(13) (number of delayed pivots after factorization): 0 > > > INFOG(14) (number of memory compress after factorization): 0 > > > INFOG(15) (number of steps of iterative refinement after solution): 0 > > > INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 > > > INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 > > > INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 > > > INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 > > > INFOG(20) (estimated number of entries in the factors): 3042 > > > INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 > > > INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 > > > INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 > > > INFOG(24) (after analysis: value of 
ICNTL(12) effectively used): 1 > > > INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 > > > INFOG(28) (after factorization: number of null pivots encountered): 0 > > > INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 > > > INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 > > > INFOG(32) (after analysis: type of analysis done): 1 > > > INFOG(33) (value used for ICNTL(8)): -2 > > > INFOG(34) (exponent of the determinant if determinant is requested): 0 > > > linear system matrix = precond matrix: > > > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > total: nonzeros=5760, allocated nonzeros=5760 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 108 nodes, limit used is 5 > > > A01 > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=28476 > > > total: nonzeros=936, allocated nonzeros=936 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 67 nodes, limit used is 5 > > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9492 nodes, limit used is 5 > > > linear system matrix = precond matrix: > > > Mat Object: () 1 MPI processes > > > type: seqaij > > > rows=28800, cols=28800 > > > total: nonzeros=1024686, allocated nonzeros=1024794 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9600 nodes, limit used is 5 > > > > > > > > > ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- > > > > > > /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a 
arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 16:16:47 2017 > > > Using Petsc Release Version 3.7.3, unknown > > > > > > Max Max/Min Avg Total > > > Time (sec): 9.179e+01 1.00000 9.179e+01 > > > Objects: 1.990e+02 1.00000 1.990e+02 > > > Flops: 1.634e+11 1.00000 1.634e+11 1.634e+11 > > > Flops/sec: 1.780e+09 1.00000 1.780e+09 1.780e+09 > > > MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > MPI Reductions: 0.000e+00 0.00000 > > > > > > Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) > > > e.g., VecAXPY() for real vectors of length N --> 2N flops > > > and VecAXPY() for complex vectors of length N --> 8N flops > > > > > > Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- > > > Avg %Total Avg %Total counts %Total Avg %Total counts %Total > > > 0: Main Stage: 9.1787e+01 100.0% 1.6336e+11 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% > > > > > > ------------------------------------------------------------------------------------------------------------------------ > > > See the 'Profiling' chapter of the users' manual for details on interpreting output. > > > Phase summary info: > > > Count: number of times phase was executed > > > Time and Flops: Max - maximum over all processors > > > Ratio - ratio of maximum to minimum over all processors > > > Mess: number of messages sent > > > Avg. len: average message length (bytes) > > > Reduct: number of global reductions > > > Global: entire computation > > > Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
> > > %T - percent time in this phase %F - percent flops in this phase > > > %M - percent messages in this phase %L - percent message lengths in this phase > > > %R - percent reductions in this phase > > > Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) > > > ------------------------------------------------------------------------------------------------------------------------ > > > Event Count Time (sec) Flops --- Global --- --- Stage --- Total > > > Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s > > > ------------------------------------------------------------------------------------------------------------------------ > > > > > > --- Event Stage 0: Main Stage > > > > > > VecDot 42 1.0 2.4080e-05 1.0 8.53e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 354 > > > VecTDot 74012 1.0 1.2440e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3388 > > > VecNorm 37020 1.0 8.3580e-01 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2523 > > > VecScale 37008 1.0 3.5800e-01 1.0 1.05e+09 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2944 > > > VecCopy 37034 1.0 2.5754e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecSet 74137 1.0 3.0537e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecAXPY 74029 1.0 1.7233e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2446 > > > VecAYPX 37001 1.0 1.2214e+00 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 1725 > > > VecAssemblyBegin 68 1.0 2.0432e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecAssemblyEnd 68 1.0 2.5988e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecScatterBegin 48 1.0 4.6921e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatMult 37017 1.0 4.1269e+01 1.0 7.65e+10 1.0 0.0e+00 0.0e+00 0.0e+00 45 47 0 0 0 45 47 0 0 0 1853 > > > MatMultAdd 37015 1.0 
3.3638e+01 1.0 7.53e+10 1.0 0.0e+00 0.0e+00 0.0e+00 37 46 0 0 0 37 46 0 0 0 2238 > > > MatSolve 74021 1.0 4.6602e+01 1.0 7.42e+10 1.0 0.0e+00 0.0e+00 0.0e+00 51 45 0 0 0 51 45 0 0 0 1593 > > > MatLUFactorNum 1 1.0 1.7209e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1420 > > > MatCholFctrSym 1 1.0 8.8310e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatCholFctrNum 1 1.0 3.6907e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatILUFactorSym 1 1.0 3.7372e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatAssemblyBegin 29 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatAssemblyEnd 29 1.0 9.9473e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetRow 58026 1.0 2.8155e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetRowIJ 2 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetSubMatrice 6 1.0 1.5399e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetOrdering 2 1.0 3.0112e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatZeroEntries 6 1.0 2.9490e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatView 7 1.0 3.4356e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > KSPSetUp 4 1.0 9.4891e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > KSPSolve 1 1.0 8.8793e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > > > PCSetUp 4 1.0 3.8375e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 637 > > > PCSetUpOnBlocks 5 1.0 2.1250e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1150 > > > PCApply 5 1.0 8.8789e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > > > KSPSolve_FS_0 5 1.0 7.5364e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > 
> > KSPSolve_FS_Schu 5 1.0 8.8785e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > > > KSPSolve_FS_Low 5 1.0 2.1019e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > ------------------------------------------------------------------------------------------------------------------------ > > > > > > Memory usage is given in bytes: > > > > > > Object Type Creations Destructions Memory Descendants' Mem. > > > Reports information only for process 0. > > > > > > --- Event Stage 0: Main Stage > > > > > > Vector 91 91 9693912 0. > > > Vector Scatter 24 24 15936 0. > > > Index Set 51 51 537888 0. > > > IS L to G Mapping 3 3 240408 0. > > > Matrix 13 13 64097868 0. > > > Krylov Solver 6 6 7888 0. > > > Preconditioner 6 6 6288 0. > > > Viewer 1 0 0 0. > > > Distributed Mesh 1 1 4624 0. > > > Star Forest Bipartite Graph 2 2 1616 0. > > > Discrete System 1 1 872 0. > > > ======================================================================================================================== > > > Average time to get PetscTime(): 0. 
> > > #PETSc Option Table entries: > > > -ksp_monitor > > > -ksp_view > > > -log_view > > > #End of PETSc Option Table entries > > > Compiled without FORTRAN kernels > > > Compiled with full precision matrices (default) > > > sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 > > > Configure options: --with-shared-libraries=1 --with-debugging=0 --download-suitesparse --download-blacs --download-ptscotch=yes --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc --download-hypre --download-ml > > > ----------------------------------------- > > > Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo > > > Machine characteristics: Linux-4.4.0-38-generic-x86_64-with-Ubuntu-16.04-xenial > > > Using PETSc directory: /home/dknez/software/petsc-src > > > Using PETSc arch: arch-linux2-c-opt > > > ----------------------------------------- > > > > > > Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O ${COPTFLAGS} ${CFLAGS} > > > Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} > > > ----------------------------------------- > > > > > > Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/libmesh_install/opt_real/petsc/include -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi > > > ----------------------------------------- > > > > > > Using C linker: mpicc > > > Using Fortran 
linker: mpif90 > > > Using libraries: -Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl > > > ----------------------------------------- > > > > > > > > > > > > > > > > > > > > > > > > > > > > > From knepley at gmail.com Wed Jan 11 21:21:05 2017 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 11 Jan 2017 21:21:05 -0600 Subject: [petsc-users] Using PCFIELDSPLIT with -pc_fieldsplit_type schur In-Reply-To: <8DBB9010-9A29-4028-A9C1-98DB695602C1@mcs.anl.gov> References: <9641EA68-FC6B-48DC-8A5E-A8B75E54EA64@mcs.anl.gov> 
<8DBB9010-9A29-4028-A9C1-98DB695602C1@mcs.anl.gov> Message-ID: On Wed, Jan 11, 2017 at 8:31 PM, Barry Smith wrote: > > Thanks, this is very useful information. It means that > > 1) the approximate Sp is actually a very good approximation to the true > Schur complement S, since using Sp^-1 to precondition S gives iteration > counts from 8 to 13. > > 2) using ilu(0) as a preconditioner for Sp is not good, since replacing > Sp^-1 with ilu(0) of Sp gives absurd iteration counts. This is actually not > super surprising since ilu(0) is generally "not so good" for elasticity. > > So the next step is to try using -fieldsplit_FE_split_ksp_monitor > -fieldsplit_FE_split_pc_type gamg > > The one open question is whether any options should be passed to the gamg to > tell it that the underlying problem comes from "elasticity"; that is something > about the null space. > > Mark Adams, since the GAMG is coming from inside another preconditioner > it may not be easy for the user to attach the near null space > to that inner matrix. Would it make sense for there to be a GAMG command > line option to indicate that it is a 3d elasticity problem so GAMG could > set up the near null space for itself? Or does that not make sense? > We could do that if somehow we knew the problem geometry, which is the origin of Mark's PCSetCoordinates() interface. Matt > Barry > > > On Jan 11, 2017, at 7:47 PM, David Knezevic > wrote: > > > > I've attached the two log files. Using cholesky for "FE_split" seems to > have helped a lot! > > > > David > > > > > > -- > > David J. Knezevic | CTO > > Akselos | 210 Broadway, #201 | Cambridge, MA | 02139 > > > > Phone: +1-617-599-4755 > > > > This e-mail and any attachments may contain confidential material for > the sole use of the intended recipient(s). Any review or distribution by > others is strictly prohibited. If you are not the intended recipient, > please contact the sender and delete all copies.
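[For readers following the archive: the GAMG setup Barry suggests above is driven from the PETSc options database. A minimal sketch of the option set, where `./my_app` is a stand-in for the actual application binary and the `fieldsplit_FE_split_` prefix is taken from the solver configuration shown in this thread (adjust to your own split name):

```shell
# Precondition the Schur-complement ("FE_split") block with smoothed-
# aggregation algebraic multigrid instead of the default bjacobi/ilu(0),
# and monitor its inner iterations:
./my_app -fieldsplit_FE_split_ksp_type cg \
         -fieldsplit_FE_split_ksp_monitor \
         -fieldsplit_FE_split_pc_type gamg \
         -fieldsplit_FE_split_pc_gamg_type agg
```

GAMG generally performs much better on elasticity when the operator carries its near null space (the six rigid-body modes in 3D). As the discussion notes, no command-line flag can supply the geometry, so this is done in application code, typically via MatNullSpaceCreateRigidBody() on a coordinate vector followed by MatSetNearNullSpace() on the matrix, or via PCSetCoordinates() on the preconditioner.]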
> > > > On Wed, Jan 11, 2017 at 8:32 PM, Barry Smith wrote: > > > > Can you please run with all the monitoring on? So we can see the > convergence of all the inner solvers > > -fieldsplit_FE_split_ksp_monitor > > > > Then run again with > > > > -fieldsplit_FE_split_ksp_monitor -fieldsplit_FE_split_pc_type cholesky > > > > > > and send both sets of results > > > > Barry > > > > > > > On Jan 11, 2017, at 6:32 PM, David Knezevic < > david.knezevic at akselos.com> wrote: > > > > > > On Wed, Jan 11, 2017 at 5:52 PM, Dave May > wrote: > > > so I gather that I'll have to look into a user-defined approximation > to S. > > > > > > Where does the 2x2 block system come from? > > > Maybe someone on the list knows the right approximation to use for S. > > > > > > The model is 3D linear elasticity using a finite element > discretization. I applied substructuring to part of the system to > "condense" it, and that results in the small A00 block. The A11 block is > just standard 3D elasticity; no substructuring was applied there. There are > constraints to connect the degrees of freedom on the interface of the > substructured and non-substructured regions. > > > > > > If anyone has suggestions for a good way to precondition this type of > system, I'd be most appreciative! > > > > > > Thanks, > > > David > > > > > > > > > > > > ----------------------------------------- > > > > > > 0 KSP Residual norm 5.405528187695e+04 > > > 1 KSP Residual norm 2.187814910803e+02 > > > 2 KSP Residual norm 1.019051577515e-01 > > > 3 KSP Residual norm 4.370464012859e-04 > > > KSP Object: 1 MPI processes > > > type: cg > > > maximum iterations=1000 > > > tolerances: relative=1e-06, absolute=1e-50, divergence=10000. 
> > > left preconditioning > > > using nonzero initial guess > > > using PRECONDITIONED norm type for convergence test > > > PC Object: 1 MPI processes > > > type: fieldsplit > > > FieldSplit with Schur preconditioner, factorization FULL > > > Preconditioner for the Schur complement formed from Sp, an > assembled approximation to S, which uses (lumped, if requested) A00's > diagonal's inverse > > > Split info: > > > Split number 0 Defined by IS > > > Split number 1 Defined by IS > > > KSP solver for A00 block > > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: cholesky > > > Cholesky: out-of-place factorization > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 0., needed 0. 
> > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > package used to perform factorization: mumps > > > total: nonzeros=3042, allocated nonzeros=3042 > > > total number of mallocs used during MatSetValues calls > =0 > > > MUMPS run parameters: > > > SYM (matrix type): 2 > > > PAR (host participation): 1 > > > ICNTL(1) (output for error): 6 > > > ICNTL(2) (output of diagnostic msg): 0 > > > ICNTL(3) (output for global info): 0 > > > ICNTL(4) (level of printing): 0 > > > ICNTL(5) (input mat struct): 0 > > > ICNTL(6) (matrix prescaling): 7 > > > ICNTL(7) (sequentia matrix ordering):7 > > > ICNTL(8) (scalling strategy): 77 > > > ICNTL(10) (max num of refinements): 0 > > > ICNTL(11) (error analysis): 0 > > > ICNTL(12) (efficiency control): > 0 > > > ICNTL(13) (efficiency control): > 0 > > > ICNTL(14) (percentage of estimated workspace > increase): 20 > > > ICNTL(18) (input mat struct): > 0 > > > ICNTL(19) (Shur complement info): > 0 > > > ICNTL(20) (rhs sparse pattern): > 0 > > > ICNTL(21) (solution struct): > 0 > > > ICNTL(22) (in-core/out-of-core facility): > 0 > > > ICNTL(23) (max size of memory can be allocated > locally):0 > > > ICNTL(24) (detection of null pivot rows): > 0 > > > ICNTL(25) (computation of a null space basis): > 0 > > > ICNTL(26) (Schur options for rhs or solution): > 0 > > > ICNTL(27) (experimental parameter): > -24 > > > ICNTL(28) (use parallel or sequential ordering): > 1 > > > ICNTL(29) (parallel ordering): > 0 > > > ICNTL(30) (user-specified set of entries in > inv(A)): 0 > > > ICNTL(31) (factors is discarded in the solve > phase): 0 > > > ICNTL(33) (compute determinant): > 0 > > > CNTL(1) (relative pivoting threshold): 0.01 > > > CNTL(2) (stopping criterion of refinement): > 1.49012e-08 > > > CNTL(3) (absolute pivoting threshold): 0. > > > CNTL(4) (value of static pivoting): -1. > > > CNTL(5) (fixation for null pivots): 0. 
> > > RINFO(1) (local estimated flops for the > elimination after analysis): > > > [0] 29394. > > > RINFO(2) (local estimated flops for the assembly > after factorization): > > > [0] 1092. > > > RINFO(3) (local estimated flops for the > elimination after factorization): > > > [0] 29394. > > > INFO(15) (estimated size of (in MB) MUMPS internal > data for running numerical factorization): > > > [0] 1 > > > INFO(16) (size of (in MB) MUMPS internal data used > during numerical factorization): > > > [0] 1 > > > INFO(23) (num of pivots eliminated on this > processor after factorization): > > > [0] 324 > > > RINFOG(1) (global estimated flops for the > elimination after analysis): 29394. > > > RINFOG(2) (global estimated flops for the assembly > after factorization): 1092. > > > RINFOG(3) (global estimated flops for the > elimination after factorization): 29394. > > > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): > (0.,0.)*(2^0) > > > INFOG(3) (estimated real workspace for factors on > all processors after analysis): 3888 > > > INFOG(4) (estimated integer workspace for factors > on all processors after analysis): 2067 > > > INFOG(5) (estimated maximum front size in the > complete tree): 12 > > > INFOG(6) (number of nodes in the complete tree): 53 > > > INFOG(7) (ordering option effectively use after > analysis): 2 > > > INFOG(8) (structural symmetry in percent of the > permuted matrix after analysis): 100 > > > INFOG(9) (total real/complex workspace to store > the matrix factors after factorization): 3888 > > > INFOG(10) (total integer space store the matrix > factors after factorization): 2067 > > > INFOG(11) (order of largest frontal matrix after > factorization): 12 > > > INFOG(12) (number of off-diagonal pivots): 0 > > > INFOG(13) (number of delayed pivots after > factorization): 0 > > > INFOG(14) (number of memory compress after > factorization): 0 > > > INFOG(15) (number of steps of iterative refinement > after solution): 0 > > > INFOG(16) (estimated size (in 
MB) of all MUMPS > internal data for factorization after analysis: value on the most memory > consuming processor): 1 > > > INFOG(17) (estimated size of all MUMPS internal > data for factorization after analysis: sum over all processors): 1 > > > INFOG(18) (size of all MUMPS internal data > allocated during factorization: value on the most memory consuming > processor): 1 > > > INFOG(19) (size of all MUMPS internal data > allocated during factorization: sum over all processors): 1 > > > INFOG(20) (estimated number of entries in the > factors): 3042 > > > INFOG(21) (size in MB of memory effectively used > during factorization - value on the most memory consuming processor): 1 > > > INFOG(22) (size in MB of memory effectively used > during factorization - sum over all processors): 1 > > > INFOG(23) (after analysis: value of ICNTL(6) > effectively used): 5 > > > INFOG(24) (after analysis: value of ICNTL(12) > effectively used): 1 > > > INFOG(25) (after factorization: number of pivots > modified by static pivoting): 0 > > > INFOG(28) (after factorization: number of null > pivots encountered): 0 > > > INFOG(29) (after factorization: effective number > of entries in the factors (sum over all processors)): 3042 > > > INFOG(30, 31) (after solution: size in Mbytes of > memory used during solution phase): 0, 0 > > > INFOG(32) (after analysis: type of analysis done): > 1 > > > INFOG(33) (value used for ICNTL(8)): -2 > > > INFOG(34) (exponent of the determinant if > determinant is requested): 0 > > > linear system matrix = precond matrix: > > > Mat Object: (fieldsplit_RB_split_) 1 MPI > processes > > > type: seqaij > > > rows=324, cols=324 > > > total: nonzeros=5760, allocated nonzeros=5760 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 108 nodes, limit used is 5 > > > KSP solver for S = A11 - A10 inv(A00) A01 > > > KSP Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: cg > > > maximum iterations=10000, initial guess 
is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > left preconditioning > > > using PRECONDITIONED norm type for convergence test > > > PC Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: bjacobi > > > block Jacobi: number of blocks = 1 > > > Local solve is same for all blocks, in the following KSP and > PC objects: > > > KSP Object: (fieldsplit_FE_split_sub_) 1 > MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000. > > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_FE_split_sub_) 1 > MPI processes > > > type: ilu > > > ILU: out-of-place factorization > > > 0 levels of fill > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 1., needed 1. > > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > package used to perform factorization: petsc > > > total: nonzeros=1037052, allocated nonzeros=1037052 > > > total number of mallocs used during MatSetValues > calls =0 > > > using I-node routines: found 9489 nodes, limit > used is 5 > > > linear system matrix = precond matrix: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1037052, allocated nonzeros=1037052 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9489 nodes, limit used is > 5 > > > linear system matrix followed by preconditioner matrix: > > > Mat Object: (fieldsplit_FE_split_) 1 MPI > processes > > > type: schurcomplement > > > rows=28476, cols=28476 > > > Schur complement A11 - A10 inv(A00) A01 > > > A11 > > > Mat Object: (fieldsplit_FE_split_) > 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > total number of mallocs used 
during MatSetValues calls > =0 > > > using I-node routines: found 9492 nodes, limit used > is 5 > > > A10 > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=324 > > > total: nonzeros=936, allocated nonzeros=936 > > > total number of mallocs used during MatSetValues calls > =0 > > > using I-node routines: found 5717 nodes, limit used > is 5 > > > KSP of A00 > > > KSP Object: (fieldsplit_RB_split_) > 1 MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000. > > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_RB_split_) > 1 MPI processes > > > type: cholesky > > > Cholesky: out-of-place factorization > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 0., needed 0. > > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > package used to perform factorization: mumps > > > total: nonzeros=3042, allocated nonzeros=3042 > > > total number of mallocs used during > MatSetValues calls =0 > > > MUMPS run parameters: > > > SYM (matrix type): 2 > > > PAR (host participation): 1 > > > ICNTL(1) (output for error): 6 > > > ICNTL(2) (output of diagnostic msg): 0 > > > ICNTL(3) (output for global info): 0 > > > ICNTL(4) (level of printing): 0 > > > ICNTL(5) (input mat struct): 0 > > > ICNTL(6) (matrix prescaling): 7 > > > ICNTL(7) (sequentia matrix ordering):7 > > > ICNTL(8) (scalling strategy): 77 > > > ICNTL(10) (max num of refinements): 0 > > > ICNTL(11) (error analysis): 0 > > > ICNTL(12) (efficiency control): > 0 > > > ICNTL(13) (efficiency control): > 0 > > > ICNTL(14) (percentage of estimated > workspace increase): 20 > > > ICNTL(18) (input mat struct): > 0 > > > ICNTL(19) (Shur complement info): > 0 > > > ICNTL(20) (rhs sparse pattern): > 0 > > > ICNTL(21) (solution struct): > 0 > > > 
ICNTL(22) (in-core/out-of-core facility): > 0 > > > ICNTL(23) (max size of memory can be > allocated locally):0 > > > ICNTL(24) (detection of null pivot rows): > 0 > > > ICNTL(25) (computation of a null space > basis): 0 > > > ICNTL(26) (Schur options for rhs or > solution): 0 > > > ICNTL(27) (experimental parameter): > -24 > > > ICNTL(28) (use parallel or sequential > ordering): 1 > > > ICNTL(29) (parallel ordering): > 0 > > > ICNTL(30) (user-specified set of entries > in inv(A)): 0 > > > ICNTL(31) (factors is discarded in the > solve phase): 0 > > > ICNTL(33) (compute determinant): > 0 > > > CNTL(1) (relative pivoting threshold): > 0.01 > > > CNTL(2) (stopping criterion of > refinement): 1.49012e-08 > > > CNTL(3) (absolute pivoting threshold): > 0. > > > CNTL(4) (value of static pivoting): > -1. > > > CNTL(5) (fixation for null pivots): > 0. > > > RINFO(1) (local estimated flops for the > elimination after analysis): > > > [0] 29394. > > > RINFO(2) (local estimated flops for the > assembly after factorization): > > > [0] 1092. > > > RINFO(3) (local estimated flops for the > elimination after factorization): > > > [0] 29394. > > > INFO(15) (estimated size of (in MB) MUMPS > internal data for running numerical factorization): > > > [0] 1 > > > INFO(16) (size of (in MB) MUMPS internal > data used during numerical factorization): > > > [0] 1 > > > INFO(23) (num of pivots eliminated on this > processor after factorization): > > > [0] 324 > > > RINFOG(1) (global estimated flops for the > elimination after analysis): 29394. > > > RINFOG(2) (global estimated flops for the > assembly after factorization): 1092. > > > RINFOG(3) (global estimated flops for the > elimination after factorization): 29394. 
> > > (RINFOG(12) RINFOG(13))*2^INFOG(34) > (determinant): (0.,0.)*(2^0) > > > INFOG(3) (estimated real workspace for > factors on all processors after analysis): 3888 > > > INFOG(4) (estimated integer workspace for > factors on all processors after analysis): 2067 > > > INFOG(5) (estimated maximum front size in > the complete tree): 12 > > > INFOG(6) (number of nodes in the complete > tree): 53 > > > INFOG(7) (ordering option effectively use > after analysis): 2 > > > INFOG(8) (structural symmetry in percent > of the permuted matrix after analysis): 100 > > > INFOG(9) (total real/complex workspace to > store the matrix factors after factorization): 3888 > > > INFOG(10) (total integer space store the > matrix factors after factorization): 2067 > > > INFOG(11) (order of largest frontal matrix > after factorization): 12 > > > INFOG(12) (number of off-diagonal pivots): > 0 > > > INFOG(13) (number of delayed pivots after > factorization): 0 > > > INFOG(14) (number of memory compress after > factorization): 0 > > > INFOG(15) (number of steps of iterative > refinement after solution): 0 > > > INFOG(16) (estimated size (in MB) of all > MUMPS internal data for factorization after analysis: value on the most > memory consuming processor): 1 > > > INFOG(17) (estimated size of all MUMPS > internal data for factorization after analysis: sum over all processors): 1 > > > INFOG(18) (size of all MUMPS internal data > allocated during factorization: value on the most memory consuming > processor): 1 > > > INFOG(19) (size of all MUMPS internal data > allocated during factorization: sum over all processors): 1 > > > INFOG(20) (estimated number of entries in > the factors): 3042 > > > INFOG(21) (size in MB of memory > effectively used during factorization - value on the most memory consuming > processor): 1 > > > INFOG(22) (size in MB of memory > effectively used during factorization - sum over all processors): 1 > > > INFOG(23) (after analysis: value of > ICNTL(6) effectively used): 
5 > > > INFOG(24) (after analysis: value of > ICNTL(12) effectively used): 1 > > > INFOG(25) (after factorization: number of > pivots modified by static pivoting): 0 > > > INFOG(28) (after factorization: number of > null pivots encountered): 0 > > > INFOG(29) (after factorization: effective > number of entries in the factors (sum over all processors)): 3042 > > > INFOG(30, 31) (after solution: size in > Mbytes of memory used during solution phase): 0, 0 > > > INFOG(32) (after analysis: type of > analysis done): 1 > > > INFOG(33) (value used for ICNTL(8)): -2 > > > INFOG(34) (exponent of the determinant if > determinant is requested): 0 > > > linear system matrix = precond matrix: > > > Mat Object: (fieldsplit_RB_split_) > 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > total: nonzeros=5760, allocated nonzeros=5760 > > > total number of mallocs used during MatSetValues > calls =0 > > > using I-node routines: found 108 nodes, limit used > is 5 > > > A01 > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=28476 > > > total: nonzeros=936, allocated nonzeros=936 > > > total number of mallocs used during MatSetValues calls > =0 > > > using I-node routines: found 67 nodes, limit used is > 5 > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1037052, allocated nonzeros=1037052 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9489 nodes, limit used is 5 > > > linear system matrix = precond matrix: > > > Mat Object: () 1 MPI processes > > > type: seqaij > > > rows=28800, cols=28800 > > > total: nonzeros=1024686, allocated nonzeros=1024794 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9600 nodes, limit used is 5 > > > > > > ---------------------------------------------- PETSc Performance > Summary: ---------------------------------------------- > > > > > > 
/home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a > arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 > 17:22:10 2017 > > > Using Petsc Release Version 3.7.3, unknown > > > > > > Max Max/Min Avg Total > > > Time (sec): 9.638e+01 1.00000 9.638e+01 > > > Objects: 2.030e+02 1.00000 2.030e+02 > > > Flops: 1.732e+11 1.00000 1.732e+11 1.732e+11 > > > Flops/sec: 1.797e+09 1.00000 1.797e+09 1.797e+09 > > > MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > MPI Reductions: 0.000e+00 0.00000 > > > > > > Flop counting convention: 1 flop = 1 real number operation of type > (multiply/divide/add/subtract) > > > e.g., VecAXPY() for real vectors of length > N --> 2N flops > > > and VecAXPY() for complex vectors of > length N --> 8N flops > > > > > > Summary of Stages: ----- Time ------ ----- Flops ----- --- > Messages --- -- Message Lengths -- -- Reductions -- > > > Avg %Total Avg %Total counts > %Total Avg %Total counts %Total > > > 0: Main Stage: 9.6379e+01 100.0% 1.7318e+11 100.0% 0.000e+00 > 0.0% 0.000e+00 0.0% 0.000e+00 0.0% > > > > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > See the 'Profiling' chapter of the users' manual for details on > interpreting output. > > > Phase summary info: > > > Count: number of times phase was executed > > > Time and Flops: Max - maximum over all processors > > > Ratio - ratio of maximum to minimum over all > processors > > > Mess: number of messages sent > > > Avg. len: average message length (bytes) > > > Reduct: number of global reductions > > > Global: entire computation > > > Stage: stages of a computation. Set stages with PetscLogStagePush() > and PetscLogStagePop(). 
> > > %T - percent time in this phase %F - percent flops in > this phase > > > %M - percent messages in this phase %L - percent message > lengths in this phase > > > %R - percent reductions in this phase > > > Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time > over all processors) > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > Event Count Time (sec) Flops > --- Global --- --- Stage --- Total > > > Max Ratio Max Ratio Max Ratio Mess Avg > len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > > > > --- Event Stage 0: Main Stage > > > > > > VecDot 42 1.0 2.2411e-05 1.0 8.53e+03 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 380 > > > VecTDot 77761 1.0 1.4294e+00 1.0 4.43e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3098 > > > VecNorm 38894 1.0 9.1002e-01 1.0 2.22e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2434 > > > VecScale 38882 1.0 3.7314e-01 1.0 1.11e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2967 > > > VecCopy 38908 1.0 2.1655e-02 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecSet 77887 1.0 3.2034e-01 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecAXPY 77777 1.0 1.8382e+00 1.0 4.43e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2409 > > > VecAYPX 38875 1.0 1.2884e+00 1.0 2.21e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 1718 > > > VecAssemblyBegin 68 1.0 1.9407e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecAssemblyEnd 68 1.0 2.6941e-05 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecScatterBegin 48 1.0 4.6349e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatMult 38891 1.0 4.3045e+01 1.0 8.03e+10 1.0 0.0e+00 > 0.0e+00 0.0e+00 45 46 0 0 0 45 
46 0 0 0 1866 > > > MatMultAdd 38889 1.0 3.5360e+01 1.0 7.91e+10 1.0 0.0e+00 > 0.0e+00 0.0e+00 37 46 0 0 0 37 46 0 0 0 2236 > > > MatSolve 77769 1.0 4.8780e+01 1.0 7.95e+10 1.0 0.0e+00 > 0.0e+00 0.0e+00 51 46 0 0 0 51 46 0 0 0 1631 > > > MatLUFactorNum 1 1.0 1.9575e-02 1.0 2.49e+07 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1274 > > > MatCholFctrSym 1 1.0 9.4891e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatCholFctrNum 1 1.0 3.7885e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatILUFactorSym 1 1.0 4.1780e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatConvert 1 1.0 3.0041e-05 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatScale 2 1.0 2.7180e-05 1.0 2.53e+04 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 930 > > > MatAssemblyBegin 32 1.0 4.0531e-06 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatAssemblyEnd 32 1.0 1.2032e-02 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetRow 114978 1.0 5.9254e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetRowIJ 2 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetSubMatrice 6 1.0 1.5707e-02 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetOrdering 2 1.0 3.2425e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatZeroEntries 6 1.0 3.0580e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatView 7 1.0 3.5119e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatAXPY 1 1.0 1.9384e-02 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatMatMult 1 1.0 2.7120e-03 1.0 3.16e+05 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 117 > > > MatMatMultSym 1 1.0 1.8010e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatMatMultNum 1 1.0 
6.1703e-04 1.0 3.16e+05 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 513 > > > KSPSetUp 4 1.0 9.8944e-05 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > KSPSolve 1 1.0 9.3380e+01 1.0 1.73e+11 1.0 0.0e+00 > 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > > PCSetUp 4 1.0 6.6326e-02 1.0 2.53e+07 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 381 > > > PCSetUpOnBlocks 5 1.0 2.4082e-02 1.0 2.49e+07 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1036 > > > PCApply 5 1.0 9.3376e+01 1.0 1.73e+11 1.0 0.0e+00 > 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > > KSPSolve_FS_0 5 1.0 7.0214e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > KSPSolve_FS_Schu 5 1.0 9.3372e+01 1.0 1.73e+11 1.0 0.0e+00 > 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > > KSPSolve_FS_Low 5 1.0 2.1377e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > > > > Memory usage is given in bytes: > > > > > > Object Type Creations Destructions Memory Descendants' > Mem. > > > Reports information only for process 0. > > > > > > --- Event Stage 0: Main Stage > > > > > > Vector 92 92 9698040 0. > > > Vector Scatter 24 24 15936 0. > > > Index Set 51 51 537876 0. > > > IS L to G Mapping 3 3 240408 0. > > > Matrix 16 16 77377776 0. > > > Krylov Solver 6 6 7888 0. > > > Preconditioner 6 6 6288 0. > > > Viewer 1 0 0 0. > > > Distributed Mesh 1 1 4624 0. > > > Star Forest Bipartite Graph 2 2 1616 0. > > > Discrete System 1 1 872 0. > > > ============================================================ > ============================================================ > > > Average time to get PetscTime(): 0. 
> > > #PETSc Option Table entries: > > > -ksp_monitor > > > -ksp_view > > > -log_view > > > #End of PETSc Option Table entries > > > Compiled without FORTRAN kernels > > > Compiled with full precision matrices (default) > > > sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 > sizeof(PetscScalar) 8 sizeof(PetscInt) 4 > > > Configure options: --with-shared-libraries=1 --with-debugging=0 > --download-suitesparse --download-blacs --download-ptscotch=yes > --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl > --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps > --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc > --download-hypre --download-ml > > > ----------------------------------------- > > > Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo > > > Machine characteristics: Linux-4.4.0-38-generic-x86_64- > with-Ubuntu-16.04-xenial > > > Using PETSc directory: /home/dknez/software/petsc-src > > > Using PETSc arch: arch-linux2-c-opt > > > ----------------------------------------- > > > > > > Using C compiler: mpicc -fPIC -Wall -Wwrite-strings > -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O > ${COPTFLAGS} ${CFLAGS} > > > Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 > -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} > > > ----------------------------------------- > > > > > > Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include > -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include > -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include > -I/home/dknez/software/libmesh_install/opt_real/petsc/include > -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent > -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include > -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi > > > ----------------------------------------- > > > > > > Using C 
linker: mpicc > > > Using Fortran linker: mpif90 > > > Using libraries: -Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib > -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc > -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib > -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps > -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE > -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib > -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu > -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx > -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod > -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig > -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 > -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 > -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch > -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 > -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm > -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz > -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib > -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu > -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl > -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl > > > ----------------------------------------- > > > > > > > > > > > > > > > On Wed, Jan 11, 2017 at 4:49 PM, Dave May > wrote: > > > It looks like the Schur solve is requiring a huge number of iterates > to converge (based on the instances of MatMult). > > > This is killing the performance. > > > > > > Are you sure that A11 is a good approximation to S? 
You might consider > trying the selfp option > > > > > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/ > PCFieldSplitSetSchurPre.html#PCFieldSplitSetSchurPre > > > > > > Note that the best approx to S is likely both problem and > discretisation dependent so if selfp is also terrible, you might want to > consider coding up your own approx to S for your specific system. > > > > > > > > > Thanks, > > > Dave > > > > > > > > > On Wed, 11 Jan 2017 at 22:34, David Knezevic < > david.knezevic at akselos.com> wrote: > > > I have a definite block 2x2 system and I figured it'd be good to apply > the PCFIELDSPLIT functionality with Schur complement, as described in > Section 4.5 of the manual. > > > > > > The A00 block of my matrix is very small so I figured I'd specify a > direct solver (i.e. MUMPS) for that block. > > > > > > So I did the following: > > > - PCFieldSplitSetIS to specify the indices of the two splits > > > - PCFieldSplitGetSubKSP to get the two KSP objects, and to set the > solver and PC types for each (MUMPS for A00, ILU+CG for A11) > > > - I set -pc_fieldsplit_schur_fact_type full > > > > > > Below I have pasted the output of "-ksp_view -ksp_monitor -log_view" > for a test case. It seems to converge well, but I'm concerned about the > speed (about 90 seconds, vs. about 1 second if I use a direct solver for > the entire system). I just wanted to check if I'm setting this up in a good > way? > > > > > > Many thanks, > > > David > > > > > > ------------------------------------------------------------ > ----------------------- > > > > > > 0 KSP Residual norm 5.405774214400e+04 > > > 1 KSP Residual norm 1.849649014371e+02 > > > 2 KSP Residual norm 7.462775074989e-02 > > > 3 KSP Residual norm 2.680497175260e-04 > > > KSP Object: 1 MPI processes > > > type: cg > > > maximum iterations=1000 > > > tolerances: relative=1e-06, absolute=1e-50, divergence=10000. 
> > > left preconditioning > > > using nonzero initial guess > > > using PRECONDITIONED norm type for convergence test > > > PC Object: 1 MPI processes > > > type: fieldsplit > > > FieldSplit with Schur preconditioner, factorization FULL > > > Preconditioner for the Schur complement formed from A11 > > > Split info: > > > Split number 0 Defined by IS > > > Split number 1 Defined by IS > > > KSP solver for A00 block > > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: cholesky > > > Cholesky: out-of-place factorization > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 0., needed 0. > > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > package used to perform factorization: mumps > > > total: nonzeros=3042, allocated nonzeros=3042 > > > total number of mallocs used during MatSetValues calls > =0 > > > MUMPS run parameters: > > > SYM (matrix type): 2 > > > PAR (host participation): 1 > > > ICNTL(1) (output for error): 6 > > > ICNTL(2) (output of diagnostic msg): 0 > > > ICNTL(3) (output for global info): 0 > > > ICNTL(4) (level of printing): 0 > > > ICNTL(5) (input mat struct): 0 > > > ICNTL(6) (matrix prescaling): 7 > > > ICNTL(7) (sequentia matrix ordering):7 > > > ICNTL(8) (scalling strategy): 77 > > > ICNTL(10) (max num of refinements): 0 > > > ICNTL(11) (error analysis): 0 > > > ICNTL(12) (efficiency control): > 0 > > > ICNTL(13) (efficiency control): > 0 > > > ICNTL(14) (percentage of estimated workspace > increase): 20 > > > ICNTL(18) (input mat struct): > 0 > > > ICNTL(19) (Shur complement info): > 0 > > > ICNTL(20) (rhs sparse pattern): > 0 > > > ICNTL(21) 
(solution struct): > 0 > > > ICNTL(22) (in-core/out-of-core facility): > 0 > > > ICNTL(23) (max size of memory can be allocated > locally):0 > > > ICNTL(24) (detection of null pivot rows): > 0 > > > ICNTL(25) (computation of a null space basis): > 0 > > > ICNTL(26) (Schur options for rhs or solution): > 0 > > > ICNTL(27) (experimental parameter): > -24 > > > ICNTL(28) (use parallel or sequential ordering): > 1 > > > ICNTL(29) (parallel ordering): > 0 > > > ICNTL(30) (user-specified set of entries in > inv(A)): 0 > > > ICNTL(31) (factors is discarded in the solve > phase): 0 > > > ICNTL(33) (compute determinant): > 0 > > > CNTL(1) (relative pivoting threshold): 0.01 > > > CNTL(2) (stopping criterion of refinement): > 1.49012e-08 > > > CNTL(3) (absolute pivoting threshold): 0. > > > CNTL(4) (value of static pivoting): -1. > > > CNTL(5) (fixation for null pivots): 0. > > > RINFO(1) (local estimated flops for the > elimination after analysis): > > > [0] 29394. > > > RINFO(2) (local estimated flops for the assembly > after factorization): > > > [0] 1092. > > > RINFO(3) (local estimated flops for the > elimination after factorization): > > > [0] 29394. > > > INFO(15) (estimated size of (in MB) MUMPS internal > data for running numerical factorization): > > > [0] 1 > > > INFO(16) (size of (in MB) MUMPS internal data used > during numerical factorization): > > > [0] 1 > > > INFO(23) (num of pivots eliminated on this > processor after factorization): > > > [0] 324 > > > RINFOG(1) (global estimated flops for the > elimination after analysis): 29394. > > > RINFOG(2) (global estimated flops for the assembly > after factorization): 1092. > > > RINFOG(3) (global estimated flops for the > elimination after factorization): 29394. 
> > > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): > (0.,0.)*(2^0) > > > INFOG(3) (estimated real workspace for factors on > all processors after analysis): 3888 > > > INFOG(4) (estimated integer workspace for factors > on all processors after analysis): 2067 > > > INFOG(5) (estimated maximum front size in the > complete tree): 12 > > > INFOG(6) (number of nodes in the complete tree): 53 > > > INFOG(7) (ordering option effectively use after > analysis): 2 > > > INFOG(8) (structural symmetry in percent of the > permuted matrix after analysis): 100 > > > INFOG(9) (total real/complex workspace to store > the matrix factors after factorization): 3888 > > > INFOG(10) (total integer space store the matrix > factors after factorization): 2067 > > > INFOG(11) (order of largest frontal matrix after > factorization): 12 > > > INFOG(12) (number of off-diagonal pivots): 0 > > > INFOG(13) (number of delayed pivots after > factorization): 0 > > > INFOG(14) (number of memory compress after > factorization): 0 > > > INFOG(15) (number of steps of iterative refinement > after solution): 0 > > > INFOG(16) (estimated size (in MB) of all MUMPS > internal data for factorization after analysis: value on the most memory > consuming processor): 1 > > > INFOG(17) (estimated size of all MUMPS internal > data for factorization after analysis: sum over all processors): 1 > > > INFOG(18) (size of all MUMPS internal data > allocated during factorization: value on the most memory consuming > processor): 1 > > > INFOG(19) (size of all MUMPS internal data > allocated during factorization: sum over all processors): 1 > > > INFOG(20) (estimated number of entries in the > factors): 3042 > > > INFOG(21) (size in MB of memory effectively used > during factorization - value on the most memory consuming processor): 1 > > > INFOG(22) (size in MB of memory effectively used > during factorization - sum over all processors): 1 > > > INFOG(23) (after analysis: value of ICNTL(6) > effectively used): 5 > > 
> INFOG(24) (after analysis: value of ICNTL(12) > effectively used): 1 > > > INFOG(25) (after factorization: number of pivots > modified by static pivoting): 0 > > > INFOG(28) (after factorization: number of null > pivots encountered): 0 > > > INFOG(29) (after factorization: effective number > of entries in the factors (sum over all processors)): 3042 > > > INFOG(30, 31) (after solution: size in Mbytes of > memory used during solution phase): 0, 0 > > > INFOG(32) (after analysis: type of analysis done): > 1 > > > INFOG(33) (value used for ICNTL(8)): -2 > > > INFOG(34) (exponent of the determinant if > determinant is requested): 0 > > > linear system matrix = precond matrix: > > > Mat Object: (fieldsplit_RB_split_) 1 MPI > processes > > > type: seqaij > > > rows=324, cols=324 > > > total: nonzeros=5760, allocated nonzeros=5760 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 108 nodes, limit used is 5 > > > KSP solver for S = A11 - A10 inv(A00) A01 > > > KSP Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: cg > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > left preconditioning > > > using PRECONDITIONED norm type for convergence test > > > PC Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: bjacobi > > > block Jacobi: number of blocks = 1 > > > Local solve is same for all blocks, in the following KSP and > PC objects: > > > KSP Object: (fieldsplit_FE_split_sub_) 1 > MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000. 
> > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_FE_split_sub_) 1 > MPI processes > > > type: ilu > > > ILU: out-of-place factorization > > > 0 levels of fill > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 1., needed 1. > > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > package used to perform factorization: petsc > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > total number of mallocs used during MatSetValues > calls =0 > > > using I-node routines: found 9492 nodes, limit > used is 5 > > > linear system matrix = precond matrix: > > > Mat Object: (fieldsplit_FE_split_) > 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9492 nodes, limit used is > 5 > > > linear system matrix followed by preconditioner matrix: > > > Mat Object: (fieldsplit_FE_split_) 1 MPI > processes > > > type: schurcomplement > > > rows=28476, cols=28476 > > > Schur complement A11 - A10 inv(A00) A01 > > > A11 > > > Mat Object: (fieldsplit_FE_split_) > 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > total number of mallocs used during MatSetValues calls > =0 > > > using I-node routines: found 9492 nodes, limit used > is 5 > > > A10 > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=324 > > > total: nonzeros=936, allocated nonzeros=936 > > > total number of mallocs used during MatSetValues calls > =0 > > > using I-node routines: found 5717 nodes, limit used > is 5 > > > KSP of A00 > > > KSP Object: (fieldsplit_RB_split_) > 1 MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: 
relative=1e-05, absolute=1e-50, > divergence=10000. > > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_RB_split_) > 1 MPI processes > > > type: cholesky > > > Cholesky: out-of-place factorization > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 0., needed 0. > > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > package used to perform factorization: mumps > > > total: nonzeros=3042, allocated nonzeros=3042 > > > total number of mallocs used during > MatSetValues calls =0 > > > MUMPS run parameters: > > > SYM (matrix type): 2 > > > PAR (host participation): 1 > > > ICNTL(1) (output for error): 6 > > > ICNTL(2) (output of diagnostic msg): 0 > > > ICNTL(3) (output for global info): 0 > > > ICNTL(4) (level of printing): 0 > > > ICNTL(5) (input mat struct): 0 > > > ICNTL(6) (matrix prescaling): 7 > > > ICNTL(7) (sequentia matrix ordering):7 > > > ICNTL(8) (scalling strategy): 77 > > > ICNTL(10) (max num of refinements): 0 > > > ICNTL(11) (error analysis): 0 > > > ICNTL(12) (efficiency control): > 0 > > > ICNTL(13) (efficiency control): > 0 > > > ICNTL(14) (percentage of estimated > workspace increase): 20 > > > ICNTL(18) (input mat struct): > 0 > > > ICNTL(19) (Shur complement info): > 0 > > > ICNTL(20) (rhs sparse pattern): > 0 > > > ICNTL(21) (solution struct): > 0 > > > ICNTL(22) (in-core/out-of-core facility): > 0 > > > ICNTL(23) (max size of memory can be > allocated locally):0 > > > ICNTL(24) (detection of null pivot rows): > 0 > > > ICNTL(25) (computation of a null space > basis): 0 > > > ICNTL(26) (Schur options for rhs or > solution): 0 > > > ICNTL(27) (experimental parameter): > -24 > > > ICNTL(28) (use parallel or sequential > ordering): 1 > > > ICNTL(29) (parallel ordering): > 0 > > > ICNTL(30) (user-specified set of entries > in inv(A)): 0 > > > ICNTL(31) (factors is discarded in the > solve 
phase): 0 > > > ICNTL(33) (compute determinant): > 0 > > > CNTL(1) (relative pivoting threshold): > 0.01 > > > CNTL(2) (stopping criterion of > refinement): 1.49012e-08 > > > CNTL(3) (absolute pivoting threshold): > 0. > > > CNTL(4) (value of static pivoting): > -1. > > > CNTL(5) (fixation for null pivots): > 0. > > > RINFO(1) (local estimated flops for the > elimination after analysis): > > > [0] 29394. > > > RINFO(2) (local estimated flops for the > assembly after factorization): > > > [0] 1092. > > > RINFO(3) (local estimated flops for the > elimination after factorization): > > > [0] 29394. > > > INFO(15) (estimated size of (in MB) MUMPS > internal data for running numerical factorization): > > > [0] 1 > > > INFO(16) (size of (in MB) MUMPS internal > data used during numerical factorization): > > > [0] 1 > > > INFO(23) (num of pivots eliminated on this > processor after factorization): > > > [0] 324 > > > RINFOG(1) (global estimated flops for the > elimination after analysis): 29394. > > > RINFOG(2) (global estimated flops for the > assembly after factorization): 1092. > > > RINFOG(3) (global estimated flops for the > elimination after factorization): 29394. 
> > > (RINFOG(12) RINFOG(13))*2^INFOG(34) > (determinant): (0.,0.)*(2^0) > > > INFOG(3) (estimated real workspace for > factors on all processors after analysis): 3888 > > > INFOG(4) (estimated integer workspace for > factors on all processors after analysis): 2067 > > > INFOG(5) (estimated maximum front size in > the complete tree): 12 > > > INFOG(6) (number of nodes in the complete > tree): 53 > > > INFOG(7) (ordering option effectively use > after analysis): 2 > > > INFOG(8) (structural symmetry in percent > of the permuted matrix after analysis): 100 > > > INFOG(9) (total real/complex workspace to > store the matrix factors after factorization): 3888 > > > INFOG(10) (total integer space store the > matrix factors after factorization): 2067 > > > INFOG(11) (order of largest frontal matrix > after factorization): 12 > > > INFOG(12) (number of off-diagonal pivots): > 0 > > > INFOG(13) (number of delayed pivots after > factorization): 0 > > > INFOG(14) (number of memory compress after > factorization): 0 > > > INFOG(15) (number of steps of iterative > refinement after solution): 0 > > > INFOG(16) (estimated size (in MB) of all > MUMPS internal data for factorization after analysis: value on the most > memory consuming processor): 1 > > > INFOG(17) (estimated size of all MUMPS > internal data for factorization after analysis: sum over all processors): 1 > > > INFOG(18) (size of all MUMPS internal data > allocated during factorization: value on the most memory consuming > processor): 1 > > > INFOG(19) (size of all MUMPS internal data > allocated during factorization: sum over all processors): 1 > > > INFOG(20) (estimated number of entries in > the factors): 3042 > > > INFOG(21) (size in MB of memory > effectively used during factorization - value on the most memory consuming > processor): 1 > > > INFOG(22) (size in MB of memory > effectively used during factorization - sum over all processors): 1 > > > INFOG(23) (after analysis: value of > ICNTL(6) effectively used): 
5 > > > INFOG(24) (after analysis: value of > ICNTL(12) effectively used): 1 > > > INFOG(25) (after factorization: number of > pivots modified by static pivoting): 0 > > > INFOG(28) (after factorization: number of > null pivots encountered): 0 > > > INFOG(29) (after factorization: effective > number of entries in the factors (sum over all processors)): 3042 > > > INFOG(30, 31) (after solution: size in > Mbytes of memory used during solution phase): 0, 0 > > > INFOG(32) (after analysis: type of > analysis done): 1 > > > INFOG(33) (value used for ICNTL(8)): -2 > > > INFOG(34) (exponent of the determinant if > determinant is requested): 0 > > > linear system matrix = precond matrix: > > > Mat Object: (fieldsplit_RB_split_) > 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > total: nonzeros=5760, allocated nonzeros=5760 > > > total number of mallocs used during MatSetValues > calls =0 > > > using I-node routines: found 108 nodes, limit used > is 5 > > > A01 > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=28476 > > > total: nonzeros=936, allocated nonzeros=936 > > > total number of mallocs used during MatSetValues calls > =0 > > > using I-node routines: found 67 nodes, limit used is > 5 > > > Mat Object: (fieldsplit_FE_split_) 1 MPI > processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9492 nodes, limit used is 5 > > > linear system matrix = precond matrix: > > > Mat Object: () 1 MPI processes > > > type: seqaij > > > rows=28800, cols=28800 > > > total: nonzeros=1024686, allocated nonzeros=1024794 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9600 nodes, limit used is 5 > > > > > > > > > ---------------------------------------------- PETSc Performance > Summary: ---------------------------------------------- > > > > 
> > /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a > arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 > 16:16:47 2017 > > > Using Petsc Release Version 3.7.3, unknown > > > > > > Max Max/Min Avg Total > > > Time (sec): 9.179e+01 1.00000 9.179e+01 > > > Objects: 1.990e+02 1.00000 1.990e+02 > > > Flops: 1.634e+11 1.00000 1.634e+11 1.634e+11 > > > Flops/sec: 1.780e+09 1.00000 1.780e+09 1.780e+09 > > > MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > MPI Reductions: 0.000e+00 0.00000 > > > > > > Flop counting convention: 1 flop = 1 real number operation of type > (multiply/divide/add/subtract) > > > e.g., VecAXPY() for real vectors of length > N --> 2N flops > > > and VecAXPY() for complex vectors of > length N --> 8N flops > > > > > > Summary of Stages: ----- Time ------ ----- Flops ----- --- > Messages --- -- Message Lengths -- -- Reductions -- > > > Avg %Total Avg %Total counts > %Total Avg %Total counts %Total > > > 0: Main Stage: 9.1787e+01 100.0% 1.6336e+11 100.0% 0.000e+00 > 0.0% 0.000e+00 0.0% 0.000e+00 0.0% > > > > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > See the 'Profiling' chapter of the users' manual for details on > interpreting output. > > > Phase summary info: > > > Count: number of times phase was executed > > > Time and Flops: Max - maximum over all processors > > > Ratio - ratio of maximum to minimum over all > processors > > > Mess: number of messages sent > > > Avg. len: average message length (bytes) > > > Reduct: number of global reductions > > > Global: entire computation > > > Stage: stages of a computation. Set stages with PetscLogStagePush() > and PetscLogStagePop(). 
> > > %T - percent time in this phase %F - percent flops in > this phase > > > %M - percent messages in this phase %L - percent message > lengths in this phase > > > %R - percent reductions in this phase > > > Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time > over all processors) > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > Event Count Time (sec) Flops > --- Global --- --- Stage --- Total > > > Max Ratio Max Ratio Max Ratio Mess Avg > len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > > > > --- Event Stage 0: Main Stage > > > > > > VecDot 42 1.0 2.4080e-05 1.0 8.53e+03 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 354 > > > VecTDot 74012 1.0 1.2440e+00 1.0 4.22e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3388 > > > VecNorm 37020 1.0 8.3580e-01 1.0 2.11e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2523 > > > VecScale 37008 1.0 3.5800e-01 1.0 1.05e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2944 > > > VecCopy 37034 1.0 2.5754e-02 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecSet 74137 1.0 3.0537e-01 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecAXPY 74029 1.0 1.7233e+00 1.0 4.22e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2446 > > > VecAYPX 37001 1.0 1.2214e+00 1.0 2.11e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 1725 > > > VecAssemblyBegin 68 1.0 2.0432e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecAssemblyEnd 68 1.0 2.5988e-05 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecScatterBegin 48 1.0 4.6921e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatMult 37017 1.0 4.1269e+01 1.0 7.65e+10 1.0 0.0e+00 > 0.0e+00 0.0e+00 45 47 0 0 0 45 
47 0 0 0 1853 > > > MatMultAdd 37015 1.0 3.3638e+01 1.0 7.53e+10 1.0 0.0e+00 > 0.0e+00 0.0e+00 37 46 0 0 0 37 46 0 0 0 2238 > > > MatSolve 74021 1.0 4.6602e+01 1.0 7.42e+10 1.0 0.0e+00 > 0.0e+00 0.0e+00 51 45 0 0 0 51 45 0 0 0 1593 > > > MatLUFactorNum 1 1.0 1.7209e-02 1.0 2.44e+07 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1420 > > > MatCholFctrSym 1 1.0 8.8310e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatCholFctrNum 1 1.0 3.6907e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatILUFactorSym 1 1.0 3.7372e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatAssemblyBegin 29 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatAssemblyEnd 29 1.0 9.9473e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetRow 58026 1.0 2.8155e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetRowIJ 2 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetSubMatrice 6 1.0 1.5399e-02 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetOrdering 2 1.0 3.0112e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatZeroEntries 6 1.0 2.9490e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatView 7 1.0 3.4356e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > KSPSetUp 4 1.0 9.4891e-05 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > KSPSolve 1 1.0 8.8793e+01 1.0 1.63e+11 1.0 0.0e+00 > 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > > > PCSetUp 4 1.0 3.8375e-02 1.0 2.44e+07 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 637 > > > PCSetUpOnBlocks 5 1.0 2.1250e-02 1.0 2.44e+07 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1150 > > > PCApply 5 1.0 8.8789e+01 1.0 1.63e+11 1.0 0.0e+00 > 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > > > KSPSolve_FS_0 5 
1.0 7.5364e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > KSPSolve_FS_Schu 5 1.0 8.8785e+01 1.0 1.63e+11 1.0 0.0e+00 > 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > > > KSPSolve_FS_Low 5 1.0 2.1019e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > > > > Memory usage is given in bytes: > > > > > > Object Type Creations Destructions Memory Descendants' > Mem. > > > Reports information only for process 0. > > > > > > --- Event Stage 0: Main Stage > > > > > > Vector 91 91 9693912 0. > > > Vector Scatter 24 24 15936 0. > > > Index Set 51 51 537888 0. > > > IS L to G Mapping 3 3 240408 0. > > > Matrix 13 13 64097868 0. > > > Krylov Solver 6 6 7888 0. > > > Preconditioner 6 6 6288 0. > > > Viewer 1 0 0 0. > > > Distributed Mesh 1 1 4624 0. > > > Star Forest Bipartite Graph 2 2 1616 0. > > > Discrete System 1 1 872 0. > > > ============================================================ > ============================================================ > > > Average time to get PetscTime(): 0. 
> > > #PETSc Option Table entries: > > > -ksp_monitor > > > -ksp_view > > > -log_view > > > #End of PETSc Option Table entries > > > Compiled without FORTRAN kernels > > > Compiled with full precision matrices (default) > > > sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 > sizeof(PetscScalar) 8 sizeof(PetscInt) 4 > > > Configure options: --with-shared-libraries=1 --with-debugging=0 > --download-suitesparse --download-blacs --download-ptscotch=yes > --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl > --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps > --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc > --download-hypre --download-ml > > > ----------------------------------------- > > > Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo > > > Machine characteristics: Linux-4.4.0-38-generic-x86_64- > with-Ubuntu-16.04-xenial > > > Using PETSc directory: /home/dknez/software/petsc-src > > > Using PETSc arch: arch-linux2-c-opt > > > ----------------------------------------- > > > > > > Using C compiler: mpicc -fPIC -Wall -Wwrite-strings > -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O > ${COPTFLAGS} ${CFLAGS} > > > Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 > -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} > > > ----------------------------------------- > > > > > > Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include > -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include > -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include > -I/home/dknez/software/libmesh_install/opt_real/petsc/include > -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent > -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include > -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi > > > ----------------------------------------- > > > > > > Using C 
linker: mpicc > > > Using Fortran linker: mpif90 > > > Using libraries: -Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib > -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc > -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib > -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps > -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE > -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib > -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu > -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx > -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod > -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig > -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 > -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 > -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch > -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 > -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm > -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz > -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib > -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu > -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl > -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl > > > ----------------------------------------- > > > > > > > > > > > > > > > > > > > > > > > > > > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From jed at jedbrown.org Wed Jan 11 21:31:31 2017 From: jed at jedbrown.org (Jed Brown) Date: Wed, 11 Jan 2017 21:31:31 -0600 Subject: [petsc-users] malconfigured gamg In-Reply-To: <61854A5B-AE25-4386-A36C-6DB72D079214@mcs.anl.gov> References: <735d76e6-3875-05f1-4f2d-1ab0158d2846@sintef.no> <87mvexz2n7.fsf@jedbrown.org> <61854A5B-AE25-4386-A36C-6DB72D079214@mcs.anl.gov> Message-ID: <87eg09ymwc.fsf@jedbrown.org> Barry Smith writes: >> On Jan 11, 2017, at 3:51 PM, Jed Brown wrote: >> >> Arne Morten Kvarving writes: >> >>> hi, >>> >>> first, this was a user error and i totally acknowledge this, but i >>> wonder if this might be an oversight in your error checking: if you >>> configure gamg with ilu/asm smoothing, and are stupid enough to have set >>> the number of smoother cycles to 0, your program churns along and >>> apparently converges just fine (towards garbage, but apparently 'sane' >>> garbage (not 0, not nan, not inf)) >> >> My concern here is that skipping smoothing actually makes sense, e.g., >> for Kaskade cycles (no pre-smoothing). I would suggest checking the >> unpreconditioned (or true) residual in order to notice when a singular >> preconditioner causes stagnation (instead of misdiagnosing it as >> convergence due to the preconditioned residual dropping). > > Jed, > > Yeah, but what about checking that the sum of the number of pre and post smooths is >= 1? Usually fine, but one potential use case is that someone wants to test a more aggressive coarsening strategy. For example, using zero smooths on odd levels would be double-rate coarsening and might be more convenient to implement than the direct operators. (In the strong-scaling limit, it might also be a good communication pattern for reducing the process set.) -------------- next part -------------- A non-text attachment was scrubbed... 
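[Editor's note] Jed's point above — a singular preconditioner can drive the preconditioned residual to zero while the true residual stagnates, so monitoring only the preconditioned norm misdiagnoses convergence — can be demonstrated with a toy Richardson iteration. This is an illustrative sketch in plain Python, not PETSc code; the 2x2 identity system and the diagonal "preconditioner" are invented for the example.

```python
# Toy sketch (not PETSc code): with a singular preconditioner,
# Richardson iteration drives the *preconditioned* residual to zero
# while the *true* residual stagnates at a nonzero value.
# System: A = I (2x2), b = (1, 1); "preconditioner" M^{-1} = diag(1, 0).

def richardson(b, minv_diag, iters):
    x = [0.0, 0.0]
    r = list(b)
    z = list(b)
    for _ in range(iters):
        r = [b[i] - x[i] for i in range(2)]          # true residual (A = I)
        z = [minv_diag[i] * r[i] for i in range(2)]  # preconditioned residual
        x = [x[i] + z[i] for i in range(2)]          # Richardson update
    return r, z

r, z = richardson([1.0, 1.0], [1.0, 0.0], 5)
prec_norm = sum(v * v for v in z) ** 0.5  # what left preconditioning monitors
true_norm = sum(v * v for v in r) ** 0.5  # nonzero: the solve has stagnated
print(prec_norm, true_norm)  # 0.0 1.0
```

In PETSc itself the same check is available without any code changes via the `-ksp_monitor_true_residual` option, which prints both the preconditioned and the true residual norm at each iteration.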
Name: signature.asc Type: application/pgp-signature Size: 800 bytes Desc: not available URL: From bsmith at mcs.anl.gov Wed Jan 11 21:35:49 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 11 Jan 2017 21:35:49 -0600 Subject: [petsc-users] malconfigured gamg In-Reply-To: <87eg09ymwc.fsf@jedbrown.org> References: <735d76e6-3875-05f1-4f2d-1ab0158d2846@sintef.no> <87mvexz2n7.fsf@jedbrown.org> <61854A5B-AE25-4386-A36C-6DB72D079214@mcs.anl.gov> <87eg09ymwc.fsf@jedbrown.org> Message-ID: Ok, how about just checking there is at least one smoothing on the finest level? This would catch most simple user errors, or do you know about oddball cases with no smoothing on the finest level? > On Jan 11, 2017, at 9:31 PM, Jed Brown wrote: > > Barry Smith writes: > >>> On Jan 11, 2017, at 3:51 PM, Jed Brown wrote: >>> >>> Arne Morten Kvarving writes: >>> >>>> hi, >>>> >>>> first, this was an user error and i totally acknowledge this, but i >>>> wonder if this might be an oversight in your error checking: if you >>>> configure gamg with ilu/asm smoothing, and are stupid enough to have set >>>> the number of smoother cycles to 0, your program churns along and >>>> apparently converges just fine (towards garbage, but apparently 'sane' >>>> garbage (not 0, not nan, not inf)) >>> >>> My concern here is that skipping smoothing actually makes sense, e.g., >>> for Kaskade cycles (no pre-smoothing). I would suggest checking the >>> unpreconditioned (or true) residual in order to notice when a singular >>> preconditioner causes stagnation (instead of misdiagnosing it as >>> convergence due to the preconditioned residual dropping). >> >> Jed, >> >> Yeah but what about checking that the sum of the number of pre and post smooths >=1 ? > > Usually fine, but what one potential use case is if someone wants to > test a more aggressive coarsening strategy. 
For example, using zero > smooths on odd levels would be double-rate coarsening and might be more > convenient to implement than the direct operators. (In the > strong-scaling limit, it might also be a good communication pattern for > reducing the process set.) From bsmith at mcs.anl.gov Wed Jan 11 21:37:04 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 11 Jan 2017 21:37:04 -0600 Subject: [petsc-users] Using PCFIELDSPLIT with -pc_fieldsplit_type schur In-Reply-To: References: <9641EA68-FC6B-48DC-8A5E-A8B75E54EA64@mcs.anl.gov> <8DBB9010-9A29-4028-A9C1-98DB695602C1@mcs.anl.gov> Message-ID: <347FB913-969D-47A6-8EC3-9AF95BE1A026@mcs.anl.gov> > On Jan 11, 2017, at 9:21 PM, Matthew Knepley wrote: > > On Wed, Jan 11, 2017 at 8:31 PM, Barry Smith wrote: > > Thanks, this is very useful information. It means that > > 1) the approximate Sp is actually a very good approximation to the true Schur complement S, since using Sp^-1 to precondition S gives iteration counts from 8 to 13. > > 2) using ilu(0) as a preconditioner for Sp is not good, since replacing Sp^-1 with ilu(0) of Sp gives absurd iteration counts. This is actually not super surprising since ilu(0) is generally "not so good" for elasticity. > > So the next step is to try using -fieldsplit_FE_split_ksp_monitor -fieldsplit_FE_split_pc_type gamg > > the one open question is whether any options should be passed to the gamg to tell it that the underlying problem comes from "elasticity"; that is something about the null space. > > Mark Adams, since the GAMG is coming from inside another preconditioner it may not be easy for the user to attach the near null space to that inner matrix. Would it make sense for there to be a GAMG command line option to indicate that it is a 3d elasticity problem so GAMG could set up the near null space for itself? Or does that not make sense? > > We could do that if somehow we knew the problem geometry, which is the origin of Mark's PCSetCoordinates() interface. 
Ah, so conveying Mat coordinates down to sub matrices? > > Matt > > Barry > > > > > On Jan 11, 2017, at 7:47 PM, David Knezevic wrote: > > > > I've attached the two log files. Using cholesky for "FE_split" seems to have helped a lot! > > > > David > > > > > > -- > > David J. Knezevic | CTO > > Akselos | 210 Broadway, #201 | Cambridge, MA | 02139 > > > > Phone: +1-617-599-4755 > > > > This e-mail and any attachments may contain confidential material for the sole use of the intended recipient(s). Any review or distribution by others is strictly prohibited. If you are not the intended recipient, please contact the sender and delete all copies. > > > > On Wed, Jan 11, 2017 at 8:32 PM, Barry Smith wrote: > > > > Can you please run with all the monitoring on? So we can see the convergence of all the inner solvers > > -fieldsplit_FE_split_ksp_monitor > > > > Then run again with > > > > -fieldsplit_FE_split_ksp_monitor -fieldsplit_FE_split_pc_type cholesky > > > > > > and send both sets of results > > > > Barry > > > > > > > On Jan 11, 2017, at 6:32 PM, David Knezevic wrote: > > > > > > On Wed, Jan 11, 2017 at 5:52 PM, Dave May wrote: > > > so I gather that I'll have to look into a user-defined approximation to S. > > > > > > Where does the 2x2 block system come from? > > > Maybe someone on the list knows the right approximation to use for S. > > > > > > The model is 3D linear elasticity using a finite element discretization. I applied substructuring to part of the system to "condense" it, and that results in the small A00 block. The A11 block is just standard 3D elasticity; no substructuring was applied there. There are constraints to connect the degrees of freedom on the interface of the substructured and non-substructured regions. > > > > > > If anyone has suggestions for a good way to precondition this type of system, I'd be most appreciative! 
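[Editor's note] The Schur-complement preconditioner described in the KSP view below is assembled as Sp = A11 - A10 diag(A00)^{-1} A01, i.e. the true A00^{-1} is replaced by the inverse of A00's diagonal. A minimal plain-Python sketch of that construction follows (toy block sizes invented for illustration, not PETSc code); note that when A00 happens to be exactly diagonal, Sp coincides with the true Schur complement S.

```python
# Toy sketch (not PETSc code): the assembled Schur-complement
# approximation Sp = A11 - A10 * diag(A00)^{-1} * A01.
# Block sizes here are invented for illustration.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matsub(A, B):
    return [[A[i][j] - B[i][j] for j in range(len(A[0]))]
            for i in range(len(A))]

A00 = [[2.0, 0.0], [0.0, 4.0]]  # (0,0) block; here exactly diagonal
A01 = [[1.0], [2.0]]            # coupling blocks
A10 = [[1.0, 2.0]]
A11 = [[5.0]]                   # (1,1) block

# Inverse of diag(A00): the cheap stand-in for A00^{-1}
Dinv = [[1.0 / A00[i][i] if i == j else 0.0 for j in range(2)]
        for i in range(2)]

Sp = matsub(A11, matmul(A10, matmul(Dinv, A01)))
# Because A00 is diagonal here, Sp is the exact Schur complement:
# 5 - (1*0.5*1 + 2*0.25*2) = 3.5
print(Sp)  # [[3.5]]
```

I believe this is the construction selected by `-pc_fieldsplit_schur_precondition selfp`; how good an approximation Sp is to S depends on how well diag(A00) captures A00, which is why the 8-13 iteration counts reported above are encouraging.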
> > > > > > Thanks, > > > David > > > > > > > > > > > > ----------------------------------------- > > > > > > 0 KSP Residual norm 5.405528187695e+04 > > > 1 KSP Residual norm 2.187814910803e+02 > > > 2 KSP Residual norm 1.019051577515e-01 > > > 3 KSP Residual norm 4.370464012859e-04 > > > KSP Object: 1 MPI processes > > > type: cg > > > maximum iterations=1000 > > > tolerances: relative=1e-06, absolute=1e-50, divergence=10000. > > > left preconditioning > > > using nonzero initial guess > > > using PRECONDITIONED norm type for convergence test > > > PC Object: 1 MPI processes > > > type: fieldsplit > > > FieldSplit with Schur preconditioner, factorization FULL > > > Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse > > > Split info: > > > Split number 0 Defined by IS > > > Split number 1 Defined by IS > > > KSP solver for A00 block > > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: cholesky > > > Cholesky: out-of-place factorization > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 0., needed 0. 
> > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > package used to perform factorization: mumps > > > total: nonzeros=3042, allocated nonzeros=3042 > > > total number of mallocs used during MatSetValues calls =0 > > > MUMPS run parameters: > > > SYM (matrix type): 2 > > > PAR (host participation): 1 > > > ICNTL(1) (output for error): 6 > > > ICNTL(2) (output of diagnostic msg): 0 > > > ICNTL(3) (output for global info): 0 > > > ICNTL(4) (level of printing): 0 > > > ICNTL(5) (input mat struct): 0 > > > ICNTL(6) (matrix prescaling): 7 > > > ICNTL(7) (sequentia matrix ordering):7 > > > ICNTL(8) (scalling strategy): 77 > > > ICNTL(10) (max num of refinements): 0 > > > ICNTL(11) (error analysis): 0 > > > ICNTL(12) (efficiency control): 0 > > > ICNTL(13) (efficiency control): 0 > > > ICNTL(14) (percentage of estimated workspace increase): 20 > > > ICNTL(18) (input mat struct): 0 > > > ICNTL(19) (Shur complement info): 0 > > > ICNTL(20) (rhs sparse pattern): 0 > > > ICNTL(21) (solution struct): 0 > > > ICNTL(22) (in-core/out-of-core facility): 0 > > > ICNTL(23) (max size of memory can be allocated locally):0 > > > ICNTL(24) (detection of null pivot rows): 0 > > > ICNTL(25) (computation of a null space basis): 0 > > > ICNTL(26) (Schur options for rhs or solution): 0 > > > ICNTL(27) (experimental parameter): -24 > > > ICNTL(28) (use parallel or sequential ordering): 1 > > > ICNTL(29) (parallel ordering): 0 > > > ICNTL(30) (user-specified set of entries in inv(A)): 0 > > > ICNTL(31) (factors is discarded in the solve phase): 0 > > > ICNTL(33) (compute determinant): 0 > > > CNTL(1) (relative pivoting threshold): 0.01 > > > CNTL(2) (stopping criterion of refinement): 1.49012e-08 > > > CNTL(3) (absolute pivoting threshold): 0. > > > CNTL(4) (value of static pivoting): -1. > > > CNTL(5) (fixation for null pivots): 0. > > > RINFO(1) (local estimated flops for the elimination after analysis): > > > [0] 29394. 
> > > RINFO(2) (local estimated flops for the assembly after factorization): > > > [0] 1092. > > > RINFO(3) (local estimated flops for the elimination after factorization): > > > [0] 29394. > > > INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): > > > [0] 1 > > > INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): > > > [0] 1 > > > INFO(23) (num of pivots eliminated on this processor after factorization): > > > [0] 324 > > > RINFOG(1) (global estimated flops for the elimination after analysis): 29394. > > > RINFOG(2) (global estimated flops for the assembly after factorization): 1092. > > > RINFOG(3) (global estimated flops for the elimination after factorization): 29394. > > > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) > > > INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 > > > INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 > > > INFOG(5) (estimated maximum front size in the complete tree): 12 > > > INFOG(6) (number of nodes in the complete tree): 53 > > > INFOG(7) (ordering option effectively use after analysis): 2 > > > INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 > > > INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 > > > INFOG(10) (total integer space store the matrix factors after factorization): 2067 > > > INFOG(11) (order of largest frontal matrix after factorization): 12 > > > INFOG(12) (number of off-diagonal pivots): 0 > > > INFOG(13) (number of delayed pivots after factorization): 0 > > > INFOG(14) (number of memory compress after factorization): 0 > > > INFOG(15) (number of steps of iterative refinement after solution): 0 > > > INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 > > > INFOG(17) 
(estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 > > > INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 > > > INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 > > > INFOG(20) (estimated number of entries in the factors): 3042 > > > INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 > > > INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 > > > INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 > > > INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 > > > INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 > > > INFOG(28) (after factorization: number of null pivots encountered): 0 > > > INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 > > > INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 > > > INFOG(32) (after analysis: type of analysis done): 1 > > > INFOG(33) (value used for ICNTL(8)): -2 > > > INFOG(34) (exponent of the determinant if determinant is requested): 0 > > > linear system matrix = precond matrix: > > > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > total: nonzeros=5760, allocated nonzeros=5760 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 108 nodes, limit used is 5 > > > KSP solver for S = A11 - A10 inv(A00) A01 > > > KSP Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: cg > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
> > > left preconditioning > > > using PRECONDITIONED norm type for convergence test > > > PC Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: bjacobi > > > block Jacobi: number of blocks = 1 > > > Local solve is same for all blocks, in the following KSP and PC objects: > > > KSP Object: (fieldsplit_FE_split_sub_) 1 MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_FE_split_sub_) 1 MPI processes > > > type: ilu > > > ILU: out-of-place factorization > > > 0 levels of fill > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 1., needed 1. > > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > package used to perform factorization: petsc > > > total: nonzeros=1037052, allocated nonzeros=1037052 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9489 nodes, limit used is 5 > > > linear system matrix = precond matrix: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1037052, allocated nonzeros=1037052 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9489 nodes, limit used is 5 > > > linear system matrix followed by preconditioner matrix: > > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: schurcomplement > > > rows=28476, cols=28476 > > > Schur complement A11 - A10 inv(A00) A01 > > > A11 > > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9492 nodes, limit used is 5 > 
> > A10 > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=324 > > > total: nonzeros=936, allocated nonzeros=936 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 5717 nodes, limit used is 5 > > > KSP of A00 > > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: cholesky > > > Cholesky: out-of-place factorization > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 0., needed 0. > > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > package used to perform factorization: mumps > > > total: nonzeros=3042, allocated nonzeros=3042 > > > total number of mallocs used during MatSetValues calls =0 > > > MUMPS run parameters: > > > SYM (matrix type): 2 > > > PAR (host participation): 1 > > > ICNTL(1) (output for error): 6 > > > ICNTL(2) (output of diagnostic msg): 0 > > > ICNTL(3) (output for global info): 0 > > > ICNTL(4) (level of printing): 0 > > > ICNTL(5) (input mat struct): 0 > > > ICNTL(6) (matrix prescaling): 7 > > > ICNTL(7) (sequentia matrix ordering):7 > > > ICNTL(8) (scalling strategy): 77 > > > ICNTL(10) (max num of refinements): 0 > > > ICNTL(11) (error analysis): 0 > > > ICNTL(12) (efficiency control): 0 > > > ICNTL(13) (efficiency control): 0 > > > ICNTL(14) (percentage of estimated workspace increase): 20 > > > ICNTL(18) (input mat struct): 0 > > > ICNTL(19) (Shur complement info): 0 > > > ICNTL(20) (rhs sparse pattern): 0 > > > ICNTL(21) (solution struct): 0 > > > ICNTL(22) (in-core/out-of-core facility): 0 > > > ICNTL(23) (max size of memory can be allocated locally):0 > > > ICNTL(24) 
(detection of null pivot rows): 0 > > > ICNTL(25) (computation of a null space basis): 0 > > > ICNTL(26) (Schur options for rhs or solution): 0 > > > ICNTL(27) (experimental parameter): -24 > > > ICNTL(28) (use parallel or sequential ordering): 1 > > > ICNTL(29) (parallel ordering): 0 > > > ICNTL(30) (user-specified set of entries in inv(A)): 0 > > > ICNTL(31) (factors is discarded in the solve phase): 0 > > > ICNTL(33) (compute determinant): 0 > > > CNTL(1) (relative pivoting threshold): 0.01 > > > CNTL(2) (stopping criterion of refinement): 1.49012e-08 > > > CNTL(3) (absolute pivoting threshold): 0. > > > CNTL(4) (value of static pivoting): -1. > > > CNTL(5) (fixation for null pivots): 0. > > > RINFO(1) (local estimated flops for the elimination after analysis): > > > [0] 29394. > > > RINFO(2) (local estimated flops for the assembly after factorization): > > > [0] 1092. > > > RINFO(3) (local estimated flops for the elimination after factorization): > > > [0] 29394. > > > INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): > > > [0] 1 > > > INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): > > > [0] 1 > > > INFO(23) (num of pivots eliminated on this processor after factorization): > > > [0] 324 > > > RINFOG(1) (global estimated flops for the elimination after analysis): 29394. > > > RINFOG(2) (global estimated flops for the assembly after factorization): 1092. > > > RINFOG(3) (global estimated flops for the elimination after factorization): 29394. 
> > > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) > > > INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 > > > INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 > > > INFOG(5) (estimated maximum front size in the complete tree): 12 > > > INFOG(6) (number of nodes in the complete tree): 53 > > > INFOG(7) (ordering option effectively use after analysis): 2 > > > INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 > > > INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 > > > INFOG(10) (total integer space store the matrix factors after factorization): 2067 > > > INFOG(11) (order of largest frontal matrix after factorization): 12 > > > INFOG(12) (number of off-diagonal pivots): 0 > > > INFOG(13) (number of delayed pivots after factorization): 0 > > > INFOG(14) (number of memory compress after factorization): 0 > > > INFOG(15) (number of steps of iterative refinement after solution): 0 > > > INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 > > > INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 > > > INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 > > > INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 > > > INFOG(20) (estimated number of entries in the factors): 3042 > > > INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 > > > INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 > > > INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 > > > INFOG(24) (after analysis: value of 
ICNTL(12) effectively used): 1 > > > INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 > > > INFOG(28) (after factorization: number of null pivots encountered): 0 > > > INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 > > > INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 > > > INFOG(32) (after analysis: type of analysis done): 1 > > > INFOG(33) (value used for ICNTL(8)): -2 > > > INFOG(34) (exponent of the determinant if determinant is requested): 0 > > > linear system matrix = precond matrix: > > > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > total: nonzeros=5760, allocated nonzeros=5760 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 108 nodes, limit used is 5 > > > A01 > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=28476 > > > total: nonzeros=936, allocated nonzeros=936 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 67 nodes, limit used is 5 > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1037052, allocated nonzeros=1037052 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9489 nodes, limit used is 5 > > > linear system matrix = precond matrix: > > > Mat Object: () 1 MPI processes > > > type: seqaij > > > rows=28800, cols=28800 > > > total: nonzeros=1024686, allocated nonzeros=1024794 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9600 nodes, limit used is 5 > > > > > > ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- > > > > > > /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a arch-linux2-c-opt named 
david-Lenovo with 1 processor, by dknez Wed Jan 11 17:22:10 2017 > > > Using Petsc Release Version 3.7.3, unknown > > > > > > Max Max/Min Avg Total > > > Time (sec): 9.638e+01 1.00000 9.638e+01 > > > Objects: 2.030e+02 1.00000 2.030e+02 > > > Flops: 1.732e+11 1.00000 1.732e+11 1.732e+11 > > > Flops/sec: 1.797e+09 1.00000 1.797e+09 1.797e+09 > > > MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > MPI Reductions: 0.000e+00 0.00000 > > > > > > Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) > > > e.g., VecAXPY() for real vectors of length N --> 2N flops > > > and VecAXPY() for complex vectors of length N --> 8N flops > > > > > > Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- > > > Avg %Total Avg %Total counts %Total Avg %Total counts %Total > > > 0: Main Stage: 9.6379e+01 100.0% 1.7318e+11 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% > > > > > > ------------------------------------------------------------------------------------------------------------------------ > > > See the 'Profiling' chapter of the users' manual for details on interpreting output. > > > Phase summary info: > > > Count: number of times phase was executed > > > Time and Flops: Max - maximum over all processors > > > Ratio - ratio of maximum to minimum over all processors > > > Mess: number of messages sent > > > Avg. len: average message length (bytes) > > > Reduct: number of global reductions > > > Global: entire computation > > > Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
> > > %T - percent time in this phase %F - percent flops in this phase > > > %M - percent messages in this phase %L - percent message lengths in this phase > > > %R - percent reductions in this phase > > > Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) > > > ------------------------------------------------------------------------------------------------------------------------ > > > Event Count Time (sec) Flops --- Global --- --- Stage --- Total > > > Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s > > > ------------------------------------------------------------------------------------------------------------------------ > > > > > > --- Event Stage 0: Main Stage > > > > > > VecDot 42 1.0 2.2411e-05 1.0 8.53e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 380 > > > VecTDot 77761 1.0 1.4294e+00 1.0 4.43e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3098 > > > VecNorm 38894 1.0 9.1002e-01 1.0 2.22e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2434 > > > VecScale 38882 1.0 3.7314e-01 1.0 1.11e+09 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2967 > > > VecCopy 38908 1.0 2.1655e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecSet 77887 1.0 3.2034e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecAXPY 77777 1.0 1.8382e+00 1.0 4.43e+09 1.0 0.0e+00 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2409 > > > VecAYPX 38875 1.0 1.2884e+00 1.0 2.21e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 1718 > > > VecAssemblyBegin 68 1.0 1.9407e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecAssemblyEnd 68 1.0 2.6941e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecScatterBegin 48 1.0 4.6349e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatMult 38891 1.0 4.3045e+01 1.0 8.03e+10 1.0 0.0e+00 0.0e+00 0.0e+00 45 46 0 0 0 45 46 0 0 0 1866 > > > MatMultAdd 38889 1.0 
3.5360e+01 1.0 7.91e+10 1.0 0.0e+00 0.0e+00 0.0e+00 37 46 0 0 0 37 46 0 0 0 2236 > > > MatSolve 77769 1.0 4.8780e+01 1.0 7.95e+10 1.0 0.0e+00 0.0e+00 0.0e+00 51 46 0 0 0 51 46 0 0 0 1631 > > > MatLUFactorNum 1 1.0 1.9575e-02 1.0 2.49e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1274 > > > MatCholFctrSym 1 1.0 9.4891e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatCholFctrNum 1 1.0 3.7885e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatILUFactorSym 1 1.0 4.1780e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatConvert 1 1.0 3.0041e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatScale 2 1.0 2.7180e-05 1.0 2.53e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 930 > > > MatAssemblyBegin 32 1.0 4.0531e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatAssemblyEnd 32 1.0 1.2032e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetRow 114978 1.0 5.9254e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetRowIJ 2 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetSubMatrice 6 1.0 1.5707e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetOrdering 2 1.0 3.2425e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatZeroEntries 6 1.0 3.0580e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatView 7 1.0 3.5119e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatAXPY 1 1.0 1.9384e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatMatMult 1 1.0 2.7120e-03 1.0 3.16e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 117 > > > MatMatMultSym 1 1.0 1.8010e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatMatMultNum 1 1.0 6.1703e-04 1.0 3.16e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 513 > > > 
KSPSetUp 4 1.0 9.8944e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > KSPSolve 1 1.0 9.3380e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > > PCSetUp 4 1.0 6.6326e-02 1.0 2.53e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 381 > > > PCSetUpOnBlocks 5 1.0 2.4082e-02 1.0 2.49e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1036 > > > PCApply 5 1.0 9.3376e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > > KSPSolve_FS_0 5 1.0 7.0214e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > KSPSolve_FS_Schu 5 1.0 9.3372e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > > KSPSolve_FS_Low 5 1.0 2.1377e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > ------------------------------------------------------------------------------------------------------------------------ > > > > > > Memory usage is given in bytes: > > > > > > Object Type Creations Destructions Memory Descendants' Mem. > > > Reports information only for process 0. > > > > > > --- Event Stage 0: Main Stage > > > > > > Vector 92 92 9698040 0. > > > Vector Scatter 24 24 15936 0. > > > Index Set 51 51 537876 0. > > > IS L to G Mapping 3 3 240408 0. > > > Matrix 16 16 77377776 0. > > > Krylov Solver 6 6 7888 0. > > > Preconditioner 6 6 6288 0. > > > Viewer 1 0 0 0. > > > Distributed Mesh 1 1 4624 0. > > > Star Forest Bipartite Graph 2 2 1616 0. > > > Discrete System 1 1 872 0. > > > ======================================================================================================================== > > > Average time to get PetscTime(): 0. 
> > > #PETSc Option Table entries: > > > -ksp_monitor > > > -ksp_view > > > -log_view > > > #End of PETSc Option Table entries > > > Compiled without FORTRAN kernels > > > Compiled with full precision matrices (default) > > > sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 > > > Configure options: --with-shared-libraries=1 --with-debugging=0 --download-suitesparse --download-blacs --download-ptscotch=yes --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc --download-hypre --download-ml > > > ----------------------------------------- > > > Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo > > > Machine characteristics: Linux-4.4.0-38-generic-x86_64-with-Ubuntu-16.04-xenial > > > Using PETSc directory: /home/dknez/software/petsc-src > > > Using PETSc arch: arch-linux2-c-opt > > > ----------------------------------------- > > > > > > Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O ${COPTFLAGS} ${CFLAGS} > > > Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} > > > ----------------------------------------- > > > > > > Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/libmesh_install/opt_real/petsc/include -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi > > > ----------------------------------------- > > > > > > Using C linker: mpicc > > > Using Fortran 
linker: mpif90 > > > Using libraries: -Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl > > > ----------------------------------------- > > > > > > > > > > > > > > > On Wed, Jan 11, 2017 at 4:49 PM, Dave May wrote: > > > It looks like the Schur solve is requiring a huge number of iterates to converge (based on the instances of MatMult). > > > This is killing the performance. > > > > > > Are you sure that A11 is a good approximation to S? 
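One quick way to probe whether A11 is a good approximation to S is to swap the Schur preconditioner from the command line. Below is a hedged sketch of the relevant options (names as in PETSc 3.7; see the PCFieldSplitSetSchurPre manual page and verify against your installed version):

```
# Hypothetical command-line / options-file fragment; option names from PETSc 3.7
-pc_type fieldsplit
-pc_fieldsplit_type schur
-pc_fieldsplit_schur_fact_type full
# How the preconditioner for S = A11 - A10 inv(A00) A01 is built:
#   a11   - use the A11 block itself (what is being used here)
#   selfp - assemble A11 - A10 inv(diag(A00)) A01 explicitly
#   full  - use the exact Schur complement (expensive; mainly a diagnostic)
-pc_fieldsplit_schur_precondition selfp
```

If selfp cuts the MatMult-dominated Schur iteration count dramatically, that points to A11 being a poor approximation to S for this discretisation.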
You might consider trying the selfp option > > > > > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCFieldSplitSetSchurPre.html#PCFieldSplitSetSchurPre > > > > > > Note that the best approx to S is likely both problem and discretisation dependent so if selfp is also terrible, you might want to consider coding up your own approx to S for your specific system. > > > > > > > > > Thanks, > > > Dave > > > > > > > > > On Wed, 11 Jan 2017 at 22:34, David Knezevic wrote: > > > I have a definite block 2x2 system and I figured it'd be good to apply the PCFIELDSPLIT functionality with Schur complement, as described in Section 4.5 of the manual. > > > > > > The A00 block of my matrix is very small so I figured I'd specify a direct solver (i.e. MUMPS) for that block. > > > > > > So I did the following: > > > - PCFieldSplitSetIS to specify the indices of the two splits > > > - PCFieldSplitGetSubKSP to get the two KSP objects, and to set the solver and PC types for each (MUMPS for A00, ILU+CG for A11) > > > - I set -pc_fieldsplit_schur_fact_type full > > > > > > Below I have pasted the output of "-ksp_view -ksp_monitor -log_view" for a test case. It seems to converge well, but I'm concerned about the speed (about 90 seconds, vs. about 1 second if I use a direct solver for the entire system). I just wanted to check if I'm setting this up in a good way? > > > > > > Many thanks, > > > David > > > > > > ----------------------------------------------------------------------------------- > > > > > > 0 KSP Residual norm 5.405774214400e+04 > > > 1 KSP Residual norm 1.849649014371e+02 > > > 2 KSP Residual norm 7.462775074989e-02 > > > 3 KSP Residual norm 2.680497175260e-04 > > > KSP Object: 1 MPI processes > > > type: cg > > > maximum iterations=1000 > > > tolerances: relative=1e-06, absolute=1e-50, divergence=10000. 
> > > left preconditioning > > > using nonzero initial guess > > > using PRECONDITIONED norm type for convergence test > > > PC Object: 1 MPI processes > > > type: fieldsplit > > > FieldSplit with Schur preconditioner, factorization FULL > > > Preconditioner for the Schur complement formed from A11 > > > Split info: > > > Split number 0 Defined by IS > > > Split number 1 Defined by IS > > > KSP solver for A00 block > > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: cholesky > > > Cholesky: out-of-place factorization > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 0., needed 0. > > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > package used to perform factorization: mumps > > > total: nonzeros=3042, allocated nonzeros=3042 > > > total number of mallocs used during MatSetValues calls =0 > > > MUMPS run parameters: > > > SYM (matrix type): 2 > > > PAR (host participation): 1 > > > ICNTL(1) (output for error): 6 > > > ICNTL(2) (output of diagnostic msg): 0 > > > ICNTL(3) (output for global info): 0 > > > ICNTL(4) (level of printing): 0 > > > ICNTL(5) (input mat struct): 0 > > > ICNTL(6) (matrix prescaling): 7 > > > ICNTL(7) (sequentia matrix ordering):7 > > > ICNTL(8) (scalling strategy): 77 > > > ICNTL(10) (max num of refinements): 0 > > > ICNTL(11) (error analysis): 0 > > > ICNTL(12) (efficiency control): 0 > > > ICNTL(13) (efficiency control): 0 > > > ICNTL(14) (percentage of estimated workspace increase): 20 > > > ICNTL(18) (input mat struct): 0 > > > ICNTL(19) (Shur complement info): 0 > > > ICNTL(20) (rhs sparse pattern): 0 > > > ICNTL(21) (solution struct): 0 
> > > ICNTL(22) (in-core/out-of-core facility): 0 > > > ICNTL(23) (max size of memory can be allocated locally):0 > > > ICNTL(24) (detection of null pivot rows): 0 > > > ICNTL(25) (computation of a null space basis): 0 > > > ICNTL(26) (Schur options for rhs or solution): 0 > > > ICNTL(27) (experimental parameter): -24 > > > ICNTL(28) (use parallel or sequential ordering): 1 > > > ICNTL(29) (parallel ordering): 0 > > > ICNTL(30) (user-specified set of entries in inv(A)): 0 > > > ICNTL(31) (factors is discarded in the solve phase): 0 > > > ICNTL(33) (compute determinant): 0 > > > CNTL(1) (relative pivoting threshold): 0.01 > > > CNTL(2) (stopping criterion of refinement): 1.49012e-08 > > > CNTL(3) (absolute pivoting threshold): 0. > > > CNTL(4) (value of static pivoting): -1. > > > CNTL(5) (fixation for null pivots): 0. > > > RINFO(1) (local estimated flops for the elimination after analysis): > > > [0] 29394. > > > RINFO(2) (local estimated flops for the assembly after factorization): > > > [0] 1092. > > > RINFO(3) (local estimated flops for the elimination after factorization): > > > [0] 29394. > > > INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): > > > [0] 1 > > > INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): > > > [0] 1 > > > INFO(23) (num of pivots eliminated on this processor after factorization): > > > [0] 324 > > > RINFOG(1) (global estimated flops for the elimination after analysis): 29394. > > > RINFOG(2) (global estimated flops for the assembly after factorization): 1092. > > > RINFOG(3) (global estimated flops for the elimination after factorization): 29394. 
> > > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) > > > INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 > > > INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 > > > INFOG(5) (estimated maximum front size in the complete tree): 12 > > > INFOG(6) (number of nodes in the complete tree): 53 > > > INFOG(7) (ordering option effectively use after analysis): 2 > > > INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 > > > INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 > > > INFOG(10) (total integer space store the matrix factors after factorization): 2067 > > > INFOG(11) (order of largest frontal matrix after factorization): 12 > > > INFOG(12) (number of off-diagonal pivots): 0 > > > INFOG(13) (number of delayed pivots after factorization): 0 > > > INFOG(14) (number of memory compress after factorization): 0 > > > INFOG(15) (number of steps of iterative refinement after solution): 0 > > > INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 > > > INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 > > > INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 > > > INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 > > > INFOG(20) (estimated number of entries in the factors): 3042 > > > INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 > > > INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 > > > INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 > > > INFOG(24) (after analysis: value of 
ICNTL(12) effectively used): 1 > > > INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 > > > INFOG(28) (after factorization: number of null pivots encountered): 0 > > > INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 > > > INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 > > > INFOG(32) (after analysis: type of analysis done): 1 > > > INFOG(33) (value used for ICNTL(8)): -2 > > > INFOG(34) (exponent of the determinant if determinant is requested): 0 > > > linear system matrix = precond matrix: > > > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > total: nonzeros=5760, allocated nonzeros=5760 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 108 nodes, limit used is 5 > > > KSP solver for S = A11 - A10 inv(A00) A01 > > > KSP Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: cg > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > left preconditioning > > > using PRECONDITIONED norm type for convergence test > > > PC Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: bjacobi > > > block Jacobi: number of blocks = 1 > > > Local solve is same for all blocks, in the following KSP and PC objects: > > > KSP Object: (fieldsplit_FE_split_sub_) 1 MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_FE_split_sub_) 1 MPI processes > > > type: ilu > > > ILU: out-of-place factorization > > > 0 levels of fill > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 1., needed 1. 
> > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > package used to perform factorization: petsc > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9492 nodes, limit used is 5 > > > linear system matrix = precond matrix: > > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9492 nodes, limit used is 5 > > > linear system matrix followed by preconditioner matrix: > > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: schurcomplement > > > rows=28476, cols=28476 > > > Schur complement A11 - A10 inv(A00) A01 > > > A11 > > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9492 nodes, limit used is 5 > > > A10 > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=28476, cols=324 > > > total: nonzeros=936, allocated nonzeros=936 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 5717 nodes, limit used is 5 > > > KSP of A00 > > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: preonly > > > maximum iterations=10000, initial guess is zero > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
> > > left preconditioning > > > using NONE norm type for convergence test > > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: cholesky > > > Cholesky: out-of-place factorization > > > tolerance for zero pivot 2.22045e-14 > > > matrix ordering: natural > > > factor fill ratio given 0., needed 0. > > > Factored matrix follows: > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > package used to perform factorization: mumps > > > total: nonzeros=3042, allocated nonzeros=3042 > > > total number of mallocs used during MatSetValues calls =0 > > > MUMPS run parameters: > > > SYM (matrix type): 2 > > > PAR (host participation): 1 > > > ICNTL(1) (output for error): 6 > > > ICNTL(2) (output of diagnostic msg): 0 > > > ICNTL(3) (output for global info): 0 > > > ICNTL(4) (level of printing): 0 > > > ICNTL(5) (input mat struct): 0 > > > ICNTL(6) (matrix prescaling): 7 > > > ICNTL(7) (sequentia matrix ordering):7 > > > ICNTL(8) (scalling strategy): 77 > > > ICNTL(10) (max num of refinements): 0 > > > ICNTL(11) (error analysis): 0 > > > ICNTL(12) (efficiency control): 0 > > > ICNTL(13) (efficiency control): 0 > > > ICNTL(14) (percentage of estimated workspace increase): 20 > > > ICNTL(18) (input mat struct): 0 > > > ICNTL(19) (Shur complement info): 0 > > > ICNTL(20) (rhs sparse pattern): 0 > > > ICNTL(21) (solution struct): 0 > > > ICNTL(22) (in-core/out-of-core facility): 0 > > > ICNTL(23) (max size of memory can be allocated locally):0 > > > ICNTL(24) (detection of null pivot rows): 0 > > > ICNTL(25) (computation of a null space basis): 0 > > > ICNTL(26) (Schur options for rhs or solution): 0 > > > ICNTL(27) (experimental parameter): -24 > > > ICNTL(28) (use parallel or sequential ordering): 1 > > > ICNTL(29) (parallel ordering): 0 > > > ICNTL(30) (user-specified set of entries in inv(A)): 0 > > > ICNTL(31) (factors is discarded in the solve phase): 0 > > > ICNTL(33) (compute determinant): 0 > > > CNTL(1) (relative pivoting 
threshold): 0.01 > > > CNTL(2) (stopping criterion of refinement): 1.49012e-08 > > > CNTL(3) (absolute pivoting threshold): 0. > > > CNTL(4) (value of static pivoting): -1. > > > CNTL(5) (fixation for null pivots): 0. > > > RINFO(1) (local estimated flops for the elimination after analysis): > > > [0] 29394. > > > RINFO(2) (local estimated flops for the assembly after factorization): > > > [0] 1092. > > > RINFO(3) (local estimated flops for the elimination after factorization): > > > [0] 29394. > > > INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): > > > [0] 1 > > > INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): > > > [0] 1 > > > INFO(23) (num of pivots eliminated on this processor after factorization): > > > [0] 324 > > > RINFOG(1) (global estimated flops for the elimination after analysis): 29394. > > > RINFOG(2) (global estimated flops for the assembly after factorization): 1092. > > > RINFOG(3) (global estimated flops for the elimination after factorization): 29394. 
> > > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) > > > INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 > > > INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 > > > INFOG(5) (estimated maximum front size in the complete tree): 12 > > > INFOG(6) (number of nodes in the complete tree): 53 > > > INFOG(7) (ordering option effectively use after analysis): 2 > > > INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 > > > INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 > > > INFOG(10) (total integer space store the matrix factors after factorization): 2067 > > > INFOG(11) (order of largest frontal matrix after factorization): 12 > > > INFOG(12) (number of off-diagonal pivots): 0 > > > INFOG(13) (number of delayed pivots after factorization): 0 > > > INFOG(14) (number of memory compress after factorization): 0 > > > INFOG(15) (number of steps of iterative refinement after solution): 0 > > > INFOG(16) (estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 > > > INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 > > > INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 > > > INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 > > > INFOG(20) (estimated number of entries in the factors): 3042 > > > INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 > > > INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 > > > INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 > > > INFOG(24) (after analysis: value of 
ICNTL(12) effectively used): 1 > > > INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 > > > INFOG(28) (after factorization: number of null pivots encountered): 0 > > > INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 > > > INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 > > > INFOG(32) (after analysis: type of analysis done): 1 > > > INFOG(33) (value used for ICNTL(8)): -2 > > > INFOG(34) (exponent of the determinant if determinant is requested): 0 > > > linear system matrix = precond matrix: > > > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > > > type: seqaij > > > rows=324, cols=324 > > > total: nonzeros=5760, allocated nonzeros=5760 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 108 nodes, limit used is 5 > > > A01 > > > Mat Object: 1 MPI processes > > > type: seqaij > > > rows=324, cols=28476 > > > total: nonzeros=936, allocated nonzeros=936 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 67 nodes, limit used is 5 > > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > > type: seqaij > > > rows=28476, cols=28476 > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9492 nodes, limit used is 5 > > > linear system matrix = precond matrix: > > > Mat Object: () 1 MPI processes > > > type: seqaij > > > rows=28800, cols=28800 > > > total: nonzeros=1024686, allocated nonzeros=1024794 > > > total number of mallocs used during MatSetValues calls =0 > > > using I-node routines: found 9600 nodes, limit used is 5 > > > > > > > > > ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- > > > > > > /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a 
arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 16:16:47 2017 > > > Using Petsc Release Version 3.7.3, unknown > > > > > > Max Max/Min Avg Total > > > Time (sec): 9.179e+01 1.00000 9.179e+01 > > > Objects: 1.990e+02 1.00000 1.990e+02 > > > Flops: 1.634e+11 1.00000 1.634e+11 1.634e+11 > > > Flops/sec: 1.780e+09 1.00000 1.780e+09 1.780e+09 > > > MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > MPI Reductions: 0.000e+00 0.00000 > > > > > > Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) > > > e.g., VecAXPY() for real vectors of length N --> 2N flops > > > and VecAXPY() for complex vectors of length N --> 8N flops > > > > > > Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- > > > Avg %Total Avg %Total counts %Total Avg %Total counts %Total > > > 0: Main Stage: 9.1787e+01 100.0% 1.6336e+11 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% > > > > > > ------------------------------------------------------------------------------------------------------------------------ > > > See the 'Profiling' chapter of the users' manual for details on interpreting output. > > > Phase summary info: > > > Count: number of times phase was executed > > > Time and Flops: Max - maximum over all processors > > > Ratio - ratio of maximum to minimum over all processors > > > Mess: number of messages sent > > > Avg. len: average message length (bytes) > > > Reduct: number of global reductions > > > Global: entire computation > > > Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). 
> > > %T - percent time in this phase %F - percent flops in this phase > > > %M - percent messages in this phase %L - percent message lengths in this phase > > > %R - percent reductions in this phase > > > Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) > > > ------------------------------------------------------------------------------------------------------------------------ > > > Event Count Time (sec) Flops --- Global --- --- Stage --- Total > > > Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s > > > ------------------------------------------------------------------------------------------------------------------------ > > > > > > --- Event Stage 0: Main Stage > > > > > > VecDot 42 1.0 2.4080e-05 1.0 8.53e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 354 > > > VecTDot 74012 1.0 1.2440e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3388 > > > VecNorm 37020 1.0 8.3580e-01 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2523 > > > VecScale 37008 1.0 3.5800e-01 1.0 1.05e+09 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2944 > > > VecCopy 37034 1.0 2.5754e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecSet 74137 1.0 3.0537e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecAXPY 74029 1.0 1.7233e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2446 > > > VecAYPX 37001 1.0 1.2214e+00 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 1725 > > > VecAssemblyBegin 68 1.0 2.0432e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecAssemblyEnd 68 1.0 2.5988e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > VecScatterBegin 48 1.0 4.6921e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatMult 37017 1.0 4.1269e+01 1.0 7.65e+10 1.0 0.0e+00 0.0e+00 0.0e+00 45 47 0 0 0 45 47 0 0 0 1853 > > > MatMultAdd 37015 1.0 
3.3638e+01 1.0 7.53e+10 1.0 0.0e+00 0.0e+00 0.0e+00 37 46 0 0 0 37 46 0 0 0 2238 > > > MatSolve 74021 1.0 4.6602e+01 1.0 7.42e+10 1.0 0.0e+00 0.0e+00 0.0e+00 51 45 0 0 0 51 45 0 0 0 1593 > > > MatLUFactorNum 1 1.0 1.7209e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1420 > > > MatCholFctrSym 1 1.0 8.8310e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatCholFctrNum 1 1.0 3.6907e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatILUFactorSym 1 1.0 3.7372e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatAssemblyBegin 29 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatAssemblyEnd 29 1.0 9.9473e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetRow 58026 1.0 2.8155e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetRowIJ 2 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetSubMatrice 6 1.0 1.5399e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatGetOrdering 2 1.0 3.0112e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatZeroEntries 6 1.0 2.9490e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > MatView 7 1.0 3.4356e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > KSPSetUp 4 1.0 9.4891e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > KSPSolve 1 1.0 8.8793e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > > > PCSetUp 4 1.0 3.8375e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 637 > > > PCSetUpOnBlocks 5 1.0 2.1250e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1150 > > > PCApply 5 1.0 8.8789e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > > > KSPSolve_FS_0 5 1.0 7.5364e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > 
> > KSPSolve_FS_Schu 5 1.0 8.8785e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > > > KSPSolve_FS_Low 5 1.0 2.1019e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > ------------------------------------------------------------------------------------------------------------------------ > > > > > > Memory usage is given in bytes: > > > > > > Object Type Creations Destructions Memory Descendants' Mem. > > > Reports information only for process 0. > > > > > > --- Event Stage 0: Main Stage > > > > > > Vector 91 91 9693912 0. > > > Vector Scatter 24 24 15936 0. > > > Index Set 51 51 537888 0. > > > IS L to G Mapping 3 3 240408 0. > > > Matrix 13 13 64097868 0. > > > Krylov Solver 6 6 7888 0. > > > Preconditioner 6 6 6288 0. > > > Viewer 1 0 0 0. > > > Distributed Mesh 1 1 4624 0. > > > Star Forest Bipartite Graph 2 2 1616 0. > > > Discrete System 1 1 872 0. > > > ======================================================================================================================== > > > Average time to get PetscTime(): 0. 
> > > #PETSc Option Table entries: > > > -ksp_monitor > > > -ksp_view > > > -log_view > > > #End of PETSc Option Table entries > > > Compiled without FORTRAN kernels > > > Compiled with full precision matrices (default) > > > sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 > > > Configure options: --with-shared-libraries=1 --with-debugging=0 --download-suitesparse --download-blacs --download-ptscotch=yes --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc --download-hypre --download-ml > > > ----------------------------------------- > > > Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo > > > Machine characteristics: Linux-4.4.0-38-generic-x86_64-with-Ubuntu-16.04-xenial > > > Using PETSc directory: /home/dknez/software/petsc-src > > > Using PETSc arch: arch-linux2-c-opt > > > ----------------------------------------- > > > > > > Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O ${COPTFLAGS} ${CFLAGS} > > > Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} > > > ----------------------------------------- > > > > > > Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/libmesh_install/opt_real/petsc/include -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi > > > ----------------------------------------- > > > > > > Using C linker: mpicc > > > Using Fortran 
linker: mpif90 > > > Using libraries: -Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl > > > ----------------------------------------- > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. 
> -- Norbert Wiener From jed at jedbrown.org Wed Jan 11 22:22:42 2017 From: jed at jedbrown.org (Jed Brown) Date: Wed, 11 Jan 2017 22:22:42 -0600 Subject: [petsc-users] malconfigured gamg In-Reply-To: References: <735d76e6-3875-05f1-4f2d-1ab0158d2846@sintef.no> <87mvexz2n7.fsf@jedbrown.org> <61854A5B-AE25-4386-A36C-6DB72D079214@mcs.anl.gov> <87eg09ymwc.fsf@jedbrown.org> Message-ID: <8737gozz3h.fsf@jedbrown.org> Barry Smith writes: > Ok, how about just checking there is at least one smoothing step on the finest level? This would catch most simple user errors, or do you know about oddball cases with no smoothing on the finest level? No idea; what if it's inside a PCComposite SPECIAL and the user only wants it to provide a coarse grid correction? -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 800 bytes Desc: not available URL: From david.knezevic at akselos.com Wed Jan 11 22:37:06 2017 From: david.knezevic at akselos.com (David Knezevic) Date: Wed, 11 Jan 2017 23:37:06 -0500 Subject: [petsc-users] Using PCFIELDSPLIT with -pc_fieldsplit_type schur In-Reply-To: References: <9641EA68-FC6B-48DC-8A5E-A8B75E54EA64@mcs.anl.gov> <8DBB9010-9A29-4028-A9C1-98DB695602C1@mcs.anl.gov> Message-ID: On Wed, Jan 11, 2017 at 9:55 PM, Barry Smith wrote: > > That is disappointing, > > Please try using > > -pc_fieldsplit_schur_precondition full > > with the two cases of -fieldsplit_FE_split_pc_type gamg and > -fieldsplit_FE_split_pc_type cholesky > > One more data point: The initial mesh I tried had some fairly poor-quality elements. I tried some other cases that have better-conditioned meshes (nicely shaped hex elements), and using ILU(0) for A11 worked well in those cases. So certainly the conditioning of A11 appears to play a significant role here (not surprising). 
Regarding -pc_fieldsplit_schur_precondition full: I must have my SetFromOptions call in the wrong place, because I can't get "-pc_fieldsplit_schur_precondition full" to have an effect (I'll look into that some more). However, I was able to use PCFieldSplitSetSchurPre to set the schur_precondition property via code. That worked with A11, SELF, and SELFP, but when I did: PCFieldSplitSetSchurPre(pc, PC_FIELDSPLIT_SCHUR_PRE_FULL, NULL); I got an error: [0]PETSC ERROR: No support for this operation for this object type [0]PETSC ERROR: Not yet implemented for Schur complements with non-vanishing D David > On Jan 11, 2017, at 8:49 PM, David Knezevic > wrote: > > OK, that's encouraging. However, > regarding this: > > > > So the next step is to try using -fieldsplit_FE_split_ksp_monitor > -fieldsplit_FE_split_pc_type gamg > > > > I tried this and it didn't converge at all (it hit the 10000 iteration > max in the output from -fieldsplit_FE_split_ksp_monitor). So I guess I'd > need to attach the near nullspace to make this work reasonably, as you > said. Sounds like that may not be easy to do in this case though? I'll try > some other preconditioners in the meantime. > > > > Thanks, > > David > > > > > > On Wed, Jan 11, 2017 at 9:31 PM, Barry Smith wrote: > > > > Thanks, this is very useful information. It means that > > > > 1) the approximate Sp is actually a very good approximation to the true > Schur complement S, since using Sp^-1 to precondition S gives iteration > counts from 8 to 13. > > > > 2) using ilu(0) as a preconditioner for Sp is not good, since replacing > Sp^-1 with ilu(0) of Sp gives absurd iteration counts. This is actually not > super surprising since ilu(0) is generally "not so good" for elasticity. 
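As a minimal sketch, the combination Barry suggests can be collected in a PETSc options file; the "fieldsplit_FE_split_" prefix assumes the split name used in this thread, and option availability may differ between PETSc versions:

```
# Hypothetical options-file sketch for the suggested experiment;
# prefixes assume the "FE_split" field name used in this thread.
-pc_type fieldsplit
-pc_fieldsplit_type schur
-pc_fieldsplit_schur_precondition full
-fieldsplit_FE_split_ksp_monitor
-fieldsplit_FE_split_pc_type gamg
```

Swapping -fieldsplit_FE_split_pc_type gamg for cholesky gives the second case Barry asks for.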
> > > > So the next step is to try using -fieldsplit_FE_split_ksp_monitor > -fieldsplit_FE_split_pc_type gamg > > > > the one open question is whether any options should be passed to the gamg to > tell it that the underlying problem comes from "elasticity"; that is something > about the null space. > > > > Mark Adams, since the GAMG is coming from inside another > preconditioner it may not be easy for the user to attach the > near null space to that inner matrix. Would it make sense for there to be a > GAMG command line option to indicate that it is a 3d elasticity problem so > GAMG could set up the near null space for itself? Or does that not make > sense? > > > > Barry > > > > > > > > > On Jan 11, 2017, at 7:47 PM, David Knezevic < > david.knezevic at akselos.com> wrote: > > > > > > I've attached the two log files. Using cholesky for "FE_split" seems > to have helped a lot! > > > > > > David > > > > > > > > > -- > > > David J. Knezevic | CTO > > > Akselos | 210 Broadway, #201 | Cambridge, MA | 02139 > > > > > > Phone: +1-617-599-4755 > > > > > > This e-mail and any attachments may contain confidential material for > the sole use of the intended recipient(s). Any review or distribution by > others is strictly prohibited. If you are not the intended recipient, > please contact the sender and delete all copies. > > > > > > On Wed, Jan 11, 2017 at 8:32 PM, Barry Smith > wrote: > > > > > > Can you please run with all the monitoring on, so we can see the > convergence of all the inner solvers: > > > -fieldsplit_FE_split_ksp_monitor > > > > > > Then run again with > > > > > > -fieldsplit_FE_split_ksp_monitor -fieldsplit_FE_split_pc_type > cholesky > > > > > > > > > and send both sets of results > > > > > > Barry > > > > > > > > > > On Jan 11, 2017, at 6:32 PM, David Knezevic < > david.knezevic at akselos.com> wrote: > > > > > > > > On Wed, Jan 11, 2017 at 5:52 PM, Dave May > wrote: > > > > so I gather that I'll have to look into a user-defined approximation > to S. 
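On Barry's point about attaching the near null space: for 3D elasticity it is spanned by the six rigid-body modes, which PETSc can build from nodal coordinates via MatNullSpaceCreateRigidBody and attach with MatSetNearNullSpace. As a standalone sketch of what those vectors contain (the helper below is illustrative, not a PETSc API):

```python
import numpy as np

def rigid_body_modes(coords):
    """Build the 6 rigid-body modes (3 translations + 3 rotations)
    of a 3D elasticity problem from nodal coordinates.

    coords: (n_nodes, 3) array. Returns a (3*n_nodes, 6) array whose
    columns span the near null space that GAMG wants for elasticity,
    with degrees of freedom interlaced as (ux, uy, uz) per node.
    """
    n = coords.shape[0]
    modes = np.zeros((3 * n, 6))
    x, y, z = coords[:, 0], coords[:, 1], coords[:, 2]
    # Translations: unit displacement in x, y, z.
    for d in range(3):
        modes[d::3, d] = 1.0
    # Infinitesimal rotations u = e_axis x r about each axis.
    modes[1::3, 3] = -z; modes[2::3, 3] = y    # about x
    modes[0::3, 4] = z;  modes[2::3, 4] = -x   # about y
    modes[0::3, 5] = -y; modes[1::3, 5] = x    # about z
    return modes
```

For three non-collinear nodes the six columns are linearly independent, which is exactly the property the smoothed-aggregation setup relies on.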
> > > > > > > > Where does the 2x2 block system come from? > > > > Maybe someone on the list knows the right approximation to use for S. > > > > > > > > The model is 3D linear elasticity using a finite element > discretization. I applied substructuring to part of the system to > "condense" it, and that results in the small A00 block. The A11 block is > just standard 3D elasticity; no substructuring was applied there. There are > constraints to connect the degrees of freedom on the interface of the > substructured and non-substructured regions. > > > > > > > > If anyone has suggestions for a good way to precondition this type > of system, I'd be most appreciative! > > > > > > > > Thanks, > > > > David > > > > > > > > > > > > > > > > ----------------------------------------- > > > > > > > > 0 KSP Residual norm 5.405528187695e+04 > > > > 1 KSP Residual norm 2.187814910803e+02 > > > > 2 KSP Residual norm 1.019051577515e-01 > > > > 3 KSP Residual norm 4.370464012859e-04 > > > > KSP Object: 1 MPI processes > > > > type: cg > > > > maximum iterations=1000 > > > > tolerances: relative=1e-06, absolute=1e-50, divergence=10000. > > > > left preconditioning > > > > using nonzero initial guess > > > > using PRECONDITIONED norm type for convergence test > > > > PC Object: 1 MPI processes > > > > type: fieldsplit > > > > FieldSplit with Schur preconditioner, factorization FULL > > > > Preconditioner for the Schur complement formed from Sp, an > assembled approximation to S, which uses (lumped, if requested) A00's > diagonal's inverse > > > > Split info: > > > > Split number 0 Defined by IS > > > > Split number 1 Defined by IS > > > > KSP solver for A00 block > > > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > > > type: preonly > > > > maximum iterations=10000, initial guess is zero > > > > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000. 
> > > > left preconditioning > > > > using NONE norm type for convergence test > > > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > > > type: cholesky > > > > Cholesky: out-of-place factorization > > > > tolerance for zero pivot 2.22045e-14 > > > > matrix ordering: natural > > > > factor fill ratio given 0., needed 0. > > > > Factored matrix follows: > > > > Mat Object: 1 MPI processes > > > > type: seqaij > > > > rows=324, cols=324 > > > > package used to perform factorization: mumps > > > > total: nonzeros=3042, allocated nonzeros=3042 > > > > total number of mallocs used during MatSetValues > calls =0 > > > > MUMPS run parameters: > > > > SYM (matrix type): 2 > > > > PAR (host participation): 1 > > > > ICNTL(1) (output for error): 6 > > > > ICNTL(2) (output of diagnostic msg): 0 > > > > ICNTL(3) (output for global info): 0 > > > > ICNTL(4) (level of printing): 0 > > > > ICNTL(5) (input mat struct): 0 > > > > ICNTL(6) (matrix prescaling): 7 > > > > ICNTL(7) (sequentia matrix ordering):7 > > > > ICNTL(8) (scalling strategy): 77 > > > > ICNTL(10) (max num of refinements): 0 > > > > ICNTL(11) (error analysis): 0 > > > > ICNTL(12) (efficiency control): > 0 > > > > ICNTL(13) (efficiency control): > 0 > > > > ICNTL(14) (percentage of estimated workspace > increase): 20 > > > > ICNTL(18) (input mat struct): > 0 > > > > ICNTL(19) (Shur complement info): > 0 > > > > ICNTL(20) (rhs sparse pattern): > 0 > > > > ICNTL(21) (solution struct): > 0 > > > > ICNTL(22) (in-core/out-of-core facility): > 0 > > > > ICNTL(23) (max size of memory can be allocated > locally):0 > > > > ICNTL(24) (detection of null pivot rows): > 0 > > > > ICNTL(25) (computation of a null space basis): > 0 > > > > ICNTL(26) (Schur options for rhs or solution): > 0 > > > > ICNTL(27) (experimental parameter): > -24 > > > > ICNTL(28) (use parallel or sequential > ordering): 1 > > > > ICNTL(29) (parallel ordering): > 0 > > > > ICNTL(30) (user-specified set of entries in > inv(A)): 0 > > > > ICNTL(31) 
(factors is discarded in the solve > phase): 0 > > > > ICNTL(33) (compute determinant): > 0 > > > > CNTL(1) (relative pivoting threshold): 0.01 > > > > CNTL(2) (stopping criterion of refinement): > 1.49012e-08 > > > > CNTL(3) (absolute pivoting threshold): 0. > > > > CNTL(4) (value of static pivoting): -1. > > > > CNTL(5) (fixation for null pivots): 0. > > > > RINFO(1) (local estimated flops for the > elimination after analysis): > > > > [0] 29394. > > > > RINFO(2) (local estimated flops for the assembly > after factorization): > > > > [0] 1092. > > > > RINFO(3) (local estimated flops for the > elimination after factorization): > > > > [0] 29394. > > > > INFO(15) (estimated size of (in MB) MUMPS > internal data for running numerical factorization): > > > > [0] 1 > > > > INFO(16) (size of (in MB) MUMPS internal data > used during numerical factorization): > > > > [0] 1 > > > > INFO(23) (num of pivots eliminated on this > processor after factorization): > > > > [0] 324 > > > > RINFOG(1) (global estimated flops for the > elimination after analysis): 29394. > > > > RINFOG(2) (global estimated flops for the > assembly after factorization): 1092. > > > > RINFOG(3) (global estimated flops for the > elimination after factorization): 29394. 
> > > > (RINFOG(12) RINFOG(13))*2^INFOG(34) > (determinant): (0.,0.)*(2^0) > > > > INFOG(3) (estimated real workspace for factors > on all processors after analysis): 3888 > > > > INFOG(4) (estimated integer workspace for > factors on all processors after analysis): 2067 > > > > INFOG(5) (estimated maximum front size in the > complete tree): 12 > > > > INFOG(6) (number of nodes in the complete tree): > 53 > > > > INFOG(7) (ordering option effectively use after > analysis): 2 > > > > INFOG(8) (structural symmetry in percent of the > permuted matrix after analysis): 100 > > > > INFOG(9) (total real/complex workspace to store > the matrix factors after factorization): 3888 > > > > INFOG(10) (total integer space store the matrix > factors after factorization): 2067 > > > > INFOG(11) (order of largest frontal matrix after > factorization): 12 > > > > INFOG(12) (number of off-diagonal pivots): 0 > > > > INFOG(13) (number of delayed pivots after > factorization): 0 > > > > INFOG(14) (number of memory compress after > factorization): 0 > > > > INFOG(15) (number of steps of iterative > refinement after solution): 0 > > > > INFOG(16) (estimated size (in MB) of all MUMPS > internal data for factorization after analysis: value on the most memory > consuming processor): 1 > > > > INFOG(17) (estimated size of all MUMPS internal > data for factorization after analysis: sum over all processors): 1 > > > > INFOG(18) (size of all MUMPS internal data > allocated during factorization: value on the most memory consuming > processor): 1 > > > > INFOG(19) (size of all MUMPS internal data > allocated during factorization: sum over all processors): 1 > > > > INFOG(20) (estimated number of entries in the > factors): 3042 > > > > INFOG(21) (size in MB of memory effectively used > during factorization - value on the most memory consuming processor): 1 > > > > INFOG(22) (size in MB of memory effectively used > during factorization - sum over all processors): 1 > > > > INFOG(23) (after 
analysis: value of ICNTL(6) > effectively used): 5 > > > > INFOG(24) (after analysis: value of ICNTL(12) > effectively used): 1 > > > > INFOG(25) (after factorization: number of pivots > modified by static pivoting): 0 > > > > INFOG(28) (after factorization: number of null > pivots encountered): 0 > > > > INFOG(29) (after factorization: effective number > of entries in the factors (sum over all processors)): 3042 > > > > INFOG(30, 31) (after solution: size in Mbytes of > memory used during solution phase): 0, 0 > > > > INFOG(32) (after analysis: type of analysis > done): 1 > > > > INFOG(33) (value used for ICNTL(8)): -2 > > > > INFOG(34) (exponent of the determinant if > determinant is requested): 0 > > > > linear system matrix = precond matrix: > > > > Mat Object: (fieldsplit_RB_split_) 1 MPI > processes > > > > type: seqaij > > > > rows=324, cols=324 > > > > total: nonzeros=5760, allocated nonzeros=5760 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 108 nodes, limit used is 5 > > > > KSP solver for S = A11 - A10 inv(A00) A01 > > > > KSP Object: (fieldsplit_FE_split_) 1 MPI processes > > > > type: cg > > > > maximum iterations=10000, initial guess is zero > > > > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000. > > > > left preconditioning > > > > using PRECONDITIONED norm type for convergence test > > > > PC Object: (fieldsplit_FE_split_) 1 MPI processes > > > > type: bjacobi > > > > block Jacobi: number of blocks = 1 > > > > Local solve is same for all blocks, in the following KSP > and PC objects: > > > > KSP Object: (fieldsplit_FE_split_sub_) > 1 MPI processes > > > > type: preonly > > > > maximum iterations=10000, initial guess is zero > > > > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000. 
> > > > left preconditioning > > > > using NONE norm type for convergence test > > > > PC Object: (fieldsplit_FE_split_sub_) 1 > MPI processes > > > > type: ilu > > > > ILU: out-of-place factorization > > > > 0 levels of fill > > > > tolerance for zero pivot 2.22045e-14 > > > > matrix ordering: natural > > > > factor fill ratio given 1., needed 1. > > > > Factored matrix follows: > > > > Mat Object: 1 MPI processes > > > > type: seqaij > > > > rows=28476, cols=28476 > > > > package used to perform factorization: petsc > > > > total: nonzeros=1037052, allocated > nonzeros=1037052 > > > > total number of mallocs used during MatSetValues > calls =0 > > > > using I-node routines: found 9489 nodes, limit > used is 5 > > > > linear system matrix = precond matrix: > > > > Mat Object: 1 MPI processes > > > > type: seqaij > > > > rows=28476, cols=28476 > > > > total: nonzeros=1037052, allocated nonzeros=1037052 > > > > total number of mallocs used during MatSetValues calls > =0 > > > > using I-node routines: found 9489 nodes, limit used > is 5 > > > > linear system matrix followed by preconditioner matrix: > > > > Mat Object: (fieldsplit_FE_split_) 1 MPI > processes > > > > type: schurcomplement > > > > rows=28476, cols=28476 > > > > Schur complement A11 - A10 inv(A00) A01 > > > > A11 > > > > Mat Object: (fieldsplit_FE_split_) > 1 MPI processes > > > > type: seqaij > > > > rows=28476, cols=28476 > > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > > total number of mallocs used during MatSetValues > calls =0 > > > > using I-node routines: found 9492 nodes, limit > used is 5 > > > > A10 > > > > Mat Object: 1 MPI processes > > > > type: seqaij > > > > rows=28476, cols=324 > > > > total: nonzeros=936, allocated nonzeros=936 > > > > total number of mallocs used during MatSetValues > calls =0 > > > > using I-node routines: found 5717 nodes, limit > used is 5 > > > > KSP of A00 > > > > KSP Object: (fieldsplit_RB_split_) > 1 MPI processes > > > > type: preonly > > > > 
maximum iterations=10000, initial guess is zero > > > > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000. > > > > left preconditioning > > > > using NONE norm type for convergence test > > > > PC Object: (fieldsplit_RB_split_) > 1 MPI processes > > > > type: cholesky > > > > Cholesky: out-of-place factorization > > > > tolerance for zero pivot 2.22045e-14 > > > > matrix ordering: natural > > > > factor fill ratio given 0., needed 0. > > > > Factored matrix follows: > > > > Mat Object: 1 MPI > processes > > > > type: seqaij > > > > rows=324, cols=324 > > > > package used to perform factorization: mumps > > > > total: nonzeros=3042, allocated nonzeros=3042 > > > > total number of mallocs used during > MatSetValues calls =0 > > > > MUMPS run parameters: > > > > SYM (matrix type): 2 > > > > PAR (host participation): 1 > > > > ICNTL(1) (output for error): 6 > > > > ICNTL(2) (output of diagnostic msg): 0 > > > > ICNTL(3) (output for global info): 0 > > > > ICNTL(4) (level of printing): 0 > > > > ICNTL(5) (input mat struct): 0 > > > > ICNTL(6) (matrix prescaling): 7 > > > > ICNTL(7) (sequentia matrix ordering):7 > > > > ICNTL(8) (scalling strategy): 77 > > > > ICNTL(10) (max num of refinements): 0 > > > > ICNTL(11) (error analysis): 0 > > > > ICNTL(12) (efficiency control): > 0 > > > > ICNTL(13) (efficiency control): > 0 > > > > ICNTL(14) (percentage of estimated > workspace increase): 20 > > > > ICNTL(18) (input mat struct): > 0 > > > > ICNTL(19) (Shur complement info): > 0 > > > > ICNTL(20) (rhs sparse pattern): > 0 > > > > ICNTL(21) (solution struct): > 0 > > > > ICNTL(22) (in-core/out-of-core > facility): 0 > > > > ICNTL(23) (max size of memory can be > allocated locally):0 > > > > ICNTL(24) (detection of null pivot > rows): 0 > > > > ICNTL(25) (computation of a null space > basis): 0 > > > > ICNTL(26) (Schur options for rhs or > solution): 0 > > > > ICNTL(27) (experimental parameter): > -24 > > > > ICNTL(28) (use parallel or sequential > ordering): 1 > 
> > > ICNTL(29) (parallel ordering): > 0 > > > > ICNTL(30) (user-specified set of entries > in inv(A)): 0 > > > > ICNTL(31) (factors is discarded in the > solve phase): 0 > > > > ICNTL(33) (compute determinant): > 0 > > > > CNTL(1) (relative pivoting threshold): > 0.01 > > > > CNTL(2) (stopping criterion of > refinement): 1.49012e-08 > > > > CNTL(3) (absolute pivoting threshold): > 0. > > > > CNTL(4) (value of static pivoting): > -1. > > > > CNTL(5) (fixation for null pivots): > 0. > > > > RINFO(1) (local estimated flops for the > elimination after analysis): > > > > [0] 29394. > > > > RINFO(2) (local estimated flops for the > assembly after factorization): > > > > [0] 1092. > > > > RINFO(3) (local estimated flops for the > elimination after factorization): > > > > [0] 29394. > > > > INFO(15) (estimated size of (in MB) > MUMPS internal data for running numerical factorization): > > > > [0] 1 > > > > INFO(16) (size of (in MB) MUMPS internal > data used during numerical factorization): > > > > [0] 1 > > > > INFO(23) (num of pivots eliminated on > this processor after factorization): > > > > [0] 324 > > > > RINFOG(1) (global estimated flops for > the elimination after analysis): 29394. > > > > RINFOG(2) (global estimated flops for > the assembly after factorization): 1092. > > > > RINFOG(3) (global estimated flops for > the elimination after factorization): 29394. 
> > > > (RINFOG(12) RINFOG(13))*2^INFOG(34) > (determinant): (0.,0.)*(2^0) > > > > INFOG(3) (estimated real workspace for > factors on all processors after analysis): 3888 > > > > INFOG(4) (estimated integer workspace > for factors on all processors after analysis): 2067 > > > > INFOG(5) (estimated maximum front size > in the complete tree): 12 > > > > INFOG(6) (number of nodes in the > complete tree): 53 > > > > INFOG(7) (ordering option effectively > use after analysis): 2 > > > > INFOG(8) (structural symmetry in percent > of the permuted matrix after analysis): 100 > > > > INFOG(9) (total real/complex workspace > to store the matrix factors after factorization): 3888 > > > > INFOG(10) (total integer space store the > matrix factors after factorization): 2067 > > > > INFOG(11) (order of largest frontal > matrix after factorization): 12 > > > > INFOG(12) (number of off-diagonal > pivots): 0 > > > > INFOG(13) (number of delayed pivots > after factorization): 0 > > > > INFOG(14) (number of memory compress > after factorization): 0 > > > > INFOG(15) (number of steps of iterative > refinement after solution): 0 > > > > INFOG(16) (estimated size (in MB) of all > MUMPS internal data for factorization after analysis: value on the most > memory consuming processor): 1 > > > > INFOG(17) (estimated size of all MUMPS > internal data for factorization after analysis: sum over all processors): 1 > > > > INFOG(18) (size of all MUMPS internal > data allocated during factorization: value on the most memory consuming > processor): 1 > > > > INFOG(19) (size of all MUMPS internal > data allocated during factorization: sum over all processors): 1 > > > > INFOG(20) (estimated number of entries > in the factors): 3042 > > > > INFOG(21) (size in MB of memory > effectively used during factorization - value on the most memory consuming > processor): 1 > > > > INFOG(22) (size in MB of memory > effectively used during factorization - sum over all processors): 1 > > > > INFOG(23) (after 
analysis: value of > ICNTL(6) effectively used): 5 > > > > INFOG(24) (after analysis: value of > ICNTL(12) effectively used): 1 > > > > INFOG(25) (after factorization: number > of pivots modified by static pivoting): 0 > > > > INFOG(28) (after factorization: number > of null pivots encountered): 0 > > > > INFOG(29) (after factorization: > effective number of entries in the factors (sum over all processors)): 3042 > > > > INFOG(30, 31) (after solution: size in > Mbytes of memory used during solution phase): 0, 0 > > > > INFOG(32) (after analysis: type of > analysis done): 1 > > > > INFOG(33) (value used for ICNTL(8)): -2 > > > > INFOG(34) (exponent of the determinant > if determinant is requested): 0 > > > > linear system matrix = precond matrix: > > > > Mat Object: (fieldsplit_RB_split_) > 1 MPI processes > > > > type: seqaij > > > > rows=324, cols=324 > > > > total: nonzeros=5760, allocated nonzeros=5760 > > > > total number of mallocs used during MatSetValues > calls =0 > > > > using I-node routines: found 108 nodes, limit > used is 5 > > > > A01 > > > > Mat Object: 1 MPI processes > > > > type: seqaij > > > > rows=324, cols=28476 > > > > total: nonzeros=936, allocated nonzeros=936 > > > > total number of mallocs used during MatSetValues > calls =0 > > > > using I-node routines: found 67 nodes, limit used > is 5 > > > > Mat Object: 1 MPI processes > > > > type: seqaij > > > > rows=28476, cols=28476 > > > > total: nonzeros=1037052, allocated nonzeros=1037052 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 9489 nodes, limit used is 5 > > > > linear system matrix = precond matrix: > > > > Mat Object: () 1 MPI processes > > > > type: seqaij > > > > rows=28800, cols=28800 > > > > total: nonzeros=1024686, allocated nonzeros=1024794 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 9600 nodes, limit used is 5 > > > > > > > > 
---------------------------------------------- PETSc Performance > Summary: ---------------------------------------------- > > > > > > > > /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a > arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 > 17:22:10 2017 > > > > Using Petsc Release Version 3.7.3, unknown > > > > > > > > Max Max/Min Avg Total > > > > Time (sec): 9.638e+01 1.00000 9.638e+01 > > > > Objects: 2.030e+02 1.00000 2.030e+02 > > > > Flops: 1.732e+11 1.00000 1.732e+11 1.732e+11 > > > > Flops/sec: 1.797e+09 1.00000 1.797e+09 1.797e+09 > > > > MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > > MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > > MPI Reductions: 0.000e+00 0.00000 > > > > > > > > Flop counting convention: 1 flop = 1 real number operation of type > (multiply/divide/add/subtract) > > > > e.g., VecAXPY() for real vectors of > length N --> 2N flops > > > > and VecAXPY() for complex vectors of > length N --> 8N flops > > > > > > > > Summary of Stages: ----- Time ------ ----- Flops ----- --- > Messages --- -- Message Lengths -- -- Reductions -- > > > > Avg %Total Avg %Total counts > %Total Avg %Total counts %Total > > > > 0: Main Stage: 9.6379e+01 100.0% 1.7318e+11 100.0% > 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% > > > > > > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > > See the 'Profiling' chapter of the users' manual for details on > interpreting output. > > > > Phase summary info: > > > > Count: number of times phase was executed > > > > Time and Flops: Max - maximum over all processors > > > > Ratio - ratio of maximum to minimum over all > processors > > > > Mess: number of messages sent > > > > Avg. len: average message length (bytes) > > > > Reduct: number of global reductions > > > > Global: entire computation > > > > Stage: stages of a computation. 
Set stages with > PetscLogStagePush() and PetscLogStagePop(). > > > > %T - percent time in this phase %F - percent flops in > this phase > > > > %M - percent messages in this phase %L - percent message > lengths in this phase > > > > %R - percent reductions in this phase > > > > Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max > time over all processors) > > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > > Event Count Time (sec) Flops > --- Global --- --- Stage --- Total > > > > Max Ratio Max Ratio Max Ratio Mess Avg > len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s > > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > > > > > > --- Event Stage 0: Main Stage > > > > > > > > VecDot 42 1.0 2.2411e-05 1.0 8.53e+03 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 380 > > > > VecTDot 77761 1.0 1.4294e+00 1.0 4.43e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3098 > > > > VecNorm 38894 1.0 9.1002e-01 1.0 2.22e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2434 > > > > VecScale 38882 1.0 3.7314e-01 1.0 1.11e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2967 > > > > VecCopy 38908 1.0 2.1655e-02 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > VecSet 77887 1.0 3.2034e-01 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > VecAXPY 77777 1.0 1.8382e+00 1.0 4.43e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2409 > > > > VecAYPX 38875 1.0 1.2884e+00 1.0 2.21e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 1718 > > > > VecAssemblyBegin 68 1.0 1.9407e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > VecAssemblyEnd 68 1.0 2.6941e-05 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > VecScatterBegin 48 1.0 4.6349e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 
0 0 0 0 0 0 > > > > MatMult 38891 1.0 4.3045e+01 1.0 8.03e+10 1.0 0.0e+00 > 0.0e+00 0.0e+00 45 46 0 0 0 45 46 0 0 0 1866 > > > > MatMultAdd 38889 1.0 3.5360e+01 1.0 7.91e+10 1.0 0.0e+00 > 0.0e+00 0.0e+00 37 46 0 0 0 37 46 0 0 0 2236 > > > > MatSolve 77769 1.0 4.8780e+01 1.0 7.95e+10 1.0 0.0e+00 > 0.0e+00 0.0e+00 51 46 0 0 0 51 46 0 0 0 1631 > > > > MatLUFactorNum 1 1.0 1.9575e-02 1.0 2.49e+07 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1274 > > > > MatCholFctrSym 1 1.0 9.4891e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatCholFctrNum 1 1.0 3.7885e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatILUFactorSym 1 1.0 4.1780e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatConvert 1 1.0 3.0041e-05 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatScale 2 1.0 2.7180e-05 1.0 2.53e+04 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 930 > > > > MatAssemblyBegin 32 1.0 4.0531e-06 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatAssemblyEnd 32 1.0 1.2032e-02 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatGetRow 114978 1.0 5.9254e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatGetRowIJ 2 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatGetSubMatrice 6 1.0 1.5707e-02 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatGetOrdering 2 1.0 3.2425e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatZeroEntries 6 1.0 3.0580e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatView 7 1.0 3.5119e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatAXPY 1 1.0 1.9384e-02 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatMatMult 1 1.0 2.7120e-03 1.0 3.16e+05 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 
0 117 > > > > MatMatMultSym 1 1.0 1.8010e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatMatMultNum 1 1.0 6.1703e-04 1.0 3.16e+05 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 513 > > > > KSPSetUp 4 1.0 9.8944e-05 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > KSPSolve 1 1.0 9.3380e+01 1.0 1.73e+11 1.0 0.0e+00 > 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > > > PCSetUp 4 1.0 6.6326e-02 1.0 2.53e+07 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 381 > > > > PCSetUpOnBlocks 5 1.0 2.4082e-02 1.0 2.49e+07 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1036 > > > > PCApply 5 1.0 9.3376e+01 1.0 1.73e+11 1.0 0.0e+00 > 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > > > KSPSolve_FS_0 5 1.0 7.0214e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > KSPSolve_FS_Schu 5 1.0 9.3372e+01 1.0 1.73e+11 1.0 0.0e+00 > 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > > > KSPSolve_FS_Low 5 1.0 2.1377e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > > > > > > Memory usage is given in bytes: > > > > > > > > Object Type Creations Destructions Memory > Descendants' Mem. > > > > Reports information only for process 0. > > > > > > > > --- Event Stage 0: Main Stage > > > > > > > > Vector 92 92 9698040 0. > > > > Vector Scatter 24 24 15936 0. > > > > Index Set 51 51 537876 0. > > > > IS L to G Mapping 3 3 240408 0. > > > > Matrix 16 16 77377776 0. > > > > Krylov Solver 6 6 7888 0. > > > > Preconditioner 6 6 6288 0. > > > > Viewer 1 0 0 0. > > > > Distributed Mesh 1 1 4624 0. > > > > Star Forest Bipartite Graph 2 2 1616 0. > > > > Discrete System 1 1 872 0. > > > > ============================================================ > ============================================================ > > > > Average time to get PetscTime(): 0. 
> > > > #PETSc Option Table entries: > > > > -ksp_monitor > > > > -ksp_view > > > > -log_view > > > > #End of PETSc Option Table entries > > > > Compiled without FORTRAN kernels > > > > Compiled with full precision matrices (default) > > > > sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 > sizeof(PetscScalar) 8 sizeof(PetscInt) 4 > > > > Configure options: --with-shared-libraries=1 --with-debugging=0 > --download-suitesparse --download-blacs --download-ptscotch=yes > --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl > --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps > --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc > --download-hypre --download-ml > > > > ----------------------------------------- > > > > Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo > > > > Machine characteristics: Linux-4.4.0-38-generic-x86_64- > with-Ubuntu-16.04-xenial > > > > Using PETSc directory: /home/dknez/software/petsc-src > > > > Using PETSc arch: arch-linux2-c-opt > > > > ----------------------------------------- > > > > > > > > Using C compiler: mpicc -fPIC -Wall -Wwrite-strings > -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O > ${COPTFLAGS} ${CFLAGS} > > > > Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 > -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} > > > > ----------------------------------------- > > > > > > > > Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include > -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include > -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include > -I/home/dknez/software/libmesh_install/opt_real/petsc/include > -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent > -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include > -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi > > > > 
----------------------------------------- > > > > > > > > Using C linker: mpicc > > > > Using Fortran linker: mpif90 > > > > Using libraries: -Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib > -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc > -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib > -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps > -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE > -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib > -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu > -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx > -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod > -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig > -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 > -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 > -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch > -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 > -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm > -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz > -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib > -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu > -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl > -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl > > > > ----------------------------------------- > > > > > > > > > > > > > > > > > > > > On Wed, Jan 11, 2017 at 4:49 PM, Dave May > wrote: > > > > It looks like the Schur solve is requiring a huge number of iterates > to converge (based on the instances of MatMult). > > > > This is killing the performance. 
> > > > > > > > Are you sure that A11 is a good approximation to S? You might > consider trying the selfp option > > > > > > > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/ > PCFieldSplitSetSchurPre.html#PCFieldSplitSetSchurPre > > > > > > > > Note that the best approx to S is likely both problem and > discretisation dependent so if selfp is also terrible, you might want to > consider coding up your own approx to S for your specific system. > > > > > > > > > > > > Thanks, > > > > Dave > > > > > > > > > > > > On Wed, 11 Jan 2017 at 22:34, David Knezevic < > david.knezevic at akselos.com> wrote: > > > > I have a definite block 2x2 system and I figured it'd be good to > apply the PCFIELDSPLIT functionality with Schur complement, as described in > Section 4.5 of the manual. > > > > > > > > The A00 block of my matrix is very small so I figured I'd specify a > direct solver (i.e. MUMPS) for that block. > > > > > > > > So I did the following: > > > > - PCFieldSplitSetIS to specify the indices of the two splits > > > > - PCFieldSplitGetSubKSP to get the two KSP objects, and to set the > solver and PC types for each (MUMPS for A00, ILU+CG for A11) > > > > - I set -pc_fieldsplit_schur_fact_type full > > > > > > > > Below I have pasted the output of "-ksp_view -ksp_monitor -log_view" > for a test case. It seems to converge well, but I'm concerned about the > speed (about 90 seconds, vs. about 1 second if I use a direct solver for > the entire system). I just wanted to check if I'm setting this up in a good > way? 
> > > > > > > > Many thanks, > > > > David > > > > > > > > ------------------------------------------------------------ > ----------------------- > > > > > > > > 0 KSP Residual norm 5.405774214400e+04 > > > > 1 KSP Residual norm 1.849649014371e+02 > > > > 2 KSP Residual norm 7.462775074989e-02 > > > > 3 KSP Residual norm 2.680497175260e-04 > > > > KSP Object: 1 MPI processes > > > > type: cg > > > > maximum iterations=1000 > > > > tolerances: relative=1e-06, absolute=1e-50, divergence=10000. > > > > left preconditioning > > > > using nonzero initial guess > > > > using PRECONDITIONED norm type for convergence test > > > > PC Object: 1 MPI processes > > > > type: fieldsplit > > > > FieldSplit with Schur preconditioner, factorization FULL > > > > Preconditioner for the Schur complement formed from A11 > > > > Split info: > > > > Split number 0 Defined by IS > > > > Split number 1 Defined by IS > > > > KSP solver for A00 block > > > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > > > type: preonly > > > > maximum iterations=10000, initial guess is zero > > > > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000. > > > > left preconditioning > > > > using NONE norm type for convergence test > > > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > > > type: cholesky > > > > Cholesky: out-of-place factorization > > > > tolerance for zero pivot 2.22045e-14 > > > > matrix ordering: natural > > > > factor fill ratio given 0., needed 0. 
> > > > Factored matrix follows: > > > > Mat Object: 1 MPI processes > > > > type: seqaij > > > > rows=324, cols=324 > > > > package used to perform factorization: mumps > > > > total: nonzeros=3042, allocated nonzeros=3042 > > > > total number of mallocs used during MatSetValues > calls =0 > > > > MUMPS run parameters: > > > > SYM (matrix type): 2 > > > > PAR (host participation): 1 > > > > ICNTL(1) (output for error): 6 > > > > ICNTL(2) (output of diagnostic msg): 0 > > > > ICNTL(3) (output for global info): 0 > > > > ICNTL(4) (level of printing): 0 > > > > ICNTL(5) (input mat struct): 0 > > > > ICNTL(6) (matrix prescaling): 7 > > > > ICNTL(7) (sequentia matrix ordering):7 > > > > ICNTL(8) (scalling strategy): 77 > > > > ICNTL(10) (max num of refinements): 0 > > > > ICNTL(11) (error analysis): 0 > > > > ICNTL(12) (efficiency control): > 0 > > > > ICNTL(13) (efficiency control): > 0 > > > > ICNTL(14) (percentage of estimated workspace > increase): 20 > > > > ICNTL(18) (input mat struct): > 0 > > > > ICNTL(19) (Shur complement info): > 0 > > > > ICNTL(20) (rhs sparse pattern): > 0 > > > > ICNTL(21) (solution struct): > 0 > > > > ICNTL(22) (in-core/out-of-core facility): > 0 > > > > ICNTL(23) (max size of memory can be allocated > locally):0 > > > > ICNTL(24) (detection of null pivot rows): > 0 > > > > ICNTL(25) (computation of a null space basis): > 0 > > > > ICNTL(26) (Schur options for rhs or solution): > 0 > > > > ICNTL(27) (experimental parameter): > -24 > > > > ICNTL(28) (use parallel or sequential > ordering): 1 > > > > ICNTL(29) (parallel ordering): > 0 > > > > ICNTL(30) (user-specified set of entries in > inv(A)): 0 > > > > ICNTL(31) (factors is discarded in the solve > phase): 0 > > > > ICNTL(33) (compute determinant): > 0 > > > > CNTL(1) (relative pivoting threshold): 0.01 > > > > CNTL(2) (stopping criterion of refinement): > 1.49012e-08 > > > > CNTL(3) (absolute pivoting threshold): 0. > > > > CNTL(4) (value of static pivoting): -1. 
> > > > CNTL(5) (fixation for null pivots): 0. > > > > RINFO(1) (local estimated flops for the > elimination after analysis): > > > > [0] 29394. > > > > RINFO(2) (local estimated flops for the assembly > after factorization): > > > > [0] 1092. > > > > RINFO(3) (local estimated flops for the > elimination after factorization): > > > > [0] 29394. > > > > INFO(15) (estimated size of (in MB) MUMPS > internal data for running numerical factorization): > > > > [0] 1 > > > > INFO(16) (size of (in MB) MUMPS internal data > used during numerical factorization): > > > > [0] 1 > > > > INFO(23) (num of pivots eliminated on this > processor after factorization): > > > > [0] 324 > > > > RINFOG(1) (global estimated flops for the > elimination after analysis): 29394. > > > > RINFOG(2) (global estimated flops for the > assembly after factorization): 1092. > > > > RINFOG(3) (global estimated flops for the > elimination after factorization): 29394. > > > > (RINFOG(12) RINFOG(13))*2^INFOG(34) > (determinant): (0.,0.)*(2^0) > > > > INFOG(3) (estimated real workspace for factors > on all processors after analysis): 3888 > > > > INFOG(4) (estimated integer workspace for > factors on all processors after analysis): 2067 > > > > INFOG(5) (estimated maximum front size in the > complete tree): 12 > > > > INFOG(6) (number of nodes in the complete tree): > 53 > > > > INFOG(7) (ordering option effectively use after > analysis): 2 > > > > INFOG(8) (structural symmetry in percent of the > permuted matrix after analysis): 100 > > > > INFOG(9) (total real/complex workspace to store > the matrix factors after factorization): 3888 > > > > INFOG(10) (total integer space store the matrix > factors after factorization): 2067 > > > > INFOG(11) (order of largest frontal matrix after > factorization): 12 > > > > INFOG(12) (number of off-diagonal pivots): 0 > > > > INFOG(13) (number of delayed pivots after > factorization): 0 > > > > INFOG(14) (number of memory compress after > factorization): 0 > > > > 
INFOG(15) (number of steps of iterative > refinement after solution): 0 > > > > INFOG(16) (estimated size (in MB) of all MUMPS > internal data for factorization after analysis: value on the most memory > consuming processor): 1 > > > > INFOG(17) (estimated size of all MUMPS internal > data for factorization after analysis: sum over all processors): 1 > > > > INFOG(18) (size of all MUMPS internal data > allocated during factorization: value on the most memory consuming > processor): 1 > > > > INFOG(19) (size of all MUMPS internal data > allocated during factorization: sum over all processors): 1 > > > > INFOG(20) (estimated number of entries in the > factors): 3042 > > > > INFOG(21) (size in MB of memory effectively used > during factorization - value on the most memory consuming processor): 1 > > > > INFOG(22) (size in MB of memory effectively used > during factorization - sum over all processors): 1 > > > > INFOG(23) (after analysis: value of ICNTL(6) > effectively used): 5 > > > > INFOG(24) (after analysis: value of ICNTL(12) > effectively used): 1 > > > > INFOG(25) (after factorization: number of pivots > modified by static pivoting): 0 > > > > INFOG(28) (after factorization: number of null > pivots encountered): 0 > > > > INFOG(29) (after factorization: effective number > of entries in the factors (sum over all processors)): 3042 > > > > INFOG(30, 31) (after solution: size in Mbytes of > memory used during solution phase): 0, 0 > > > > INFOG(32) (after analysis: type of analysis > done): 1 > > > > INFOG(33) (value used for ICNTL(8)): -2 > > > > INFOG(34) (exponent of the determinant if > determinant is requested): 0 > > > > linear system matrix = precond matrix: > > > > Mat Object: (fieldsplit_RB_split_) 1 MPI > processes > > > > type: seqaij > > > > rows=324, cols=324 > > > > total: nonzeros=5760, allocated nonzeros=5760 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 108 nodes, limit used is 5 > > > > KSP 
solver for S = A11 - A10 inv(A00) A01 > > > > KSP Object: (fieldsplit_FE_split_) 1 MPI processes > > > > type: cg > > > > maximum iterations=10000, initial guess is zero > > > > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000. > > > > left preconditioning > > > > using PRECONDITIONED norm type for convergence test > > > > PC Object: (fieldsplit_FE_split_) 1 MPI processes > > > > type: bjacobi > > > > block Jacobi: number of blocks = 1 > > > > Local solve is same for all blocks, in the following KSP > and PC objects: > > > > KSP Object: (fieldsplit_FE_split_sub_) > 1 MPI processes > > > > type: preonly > > > > maximum iterations=10000, initial guess is zero > > > > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000. > > > > left preconditioning > > > > using NONE norm type for convergence test > > > > PC Object: (fieldsplit_FE_split_sub_) 1 > MPI processes > > > > type: ilu > > > > ILU: out-of-place factorization > > > > 0 levels of fill > > > > tolerance for zero pivot 2.22045e-14 > > > > matrix ordering: natural > > > > factor fill ratio given 1., needed 1. 
> > > > Factored matrix follows: > > > > Mat Object: 1 MPI processes > > > > type: seqaij > > > > rows=28476, cols=28476 > > > > package used to perform factorization: petsc > > > > total: nonzeros=1017054, allocated > nonzeros=1017054 > > > > total number of mallocs used during MatSetValues > calls =0 > > > > using I-node routines: found 9492 nodes, limit > used is 5 > > > > linear system matrix = precond matrix: > > > > Mat Object: (fieldsplit_FE_split_) > 1 MPI processes > > > > type: seqaij > > > > rows=28476, cols=28476 > > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > > total number of mallocs used during MatSetValues calls > =0 > > > > using I-node routines: found 9492 nodes, limit used > is 5 > > > > linear system matrix followed by preconditioner matrix: > > > > Mat Object: (fieldsplit_FE_split_) 1 MPI > processes > > > > type: schurcomplement > > > > rows=28476, cols=28476 > > > > Schur complement A11 - A10 inv(A00) A01 > > > > A11 > > > > Mat Object: (fieldsplit_FE_split_) > 1 MPI processes > > > > type: seqaij > > > > rows=28476, cols=28476 > > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > > total number of mallocs used during MatSetValues > calls =0 > > > > using I-node routines: found 9492 nodes, limit > used is 5 > > > > A10 > > > > Mat Object: 1 MPI processes > > > > type: seqaij > > > > rows=28476, cols=324 > > > > total: nonzeros=936, allocated nonzeros=936 > > > > total number of mallocs used during MatSetValues > calls =0 > > > > using I-node routines: found 5717 nodes, limit > used is 5 > > > > KSP of A00 > > > > KSP Object: (fieldsplit_RB_split_) > 1 MPI processes > > > > type: preonly > > > > maximum iterations=10000, initial guess is zero > > > > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000. 
> > > > left preconditioning > > > > using NONE norm type for convergence test > > > > PC Object: (fieldsplit_RB_split_) > 1 MPI processes > > > > type: cholesky > > > > Cholesky: out-of-place factorization > > > > tolerance for zero pivot 2.22045e-14 > > > > matrix ordering: natural > > > > factor fill ratio given 0., needed 0. > > > > Factored matrix follows: > > > > Mat Object: 1 MPI > processes > > > > type: seqaij > > > > rows=324, cols=324 > > > > package used to perform factorization: mumps > > > > total: nonzeros=3042, allocated nonzeros=3042 > > > > total number of mallocs used during > MatSetValues calls =0 > > > > MUMPS run parameters: > > > > SYM (matrix type): 2 > > > > PAR (host participation): 1 > > > > ICNTL(1) (output for error): 6 > > > > ICNTL(2) (output of diagnostic msg): 0 > > > > ICNTL(3) (output for global info): 0 > > > > ICNTL(4) (level of printing): 0 > > > > ICNTL(5) (input mat struct): 0 > > > > ICNTL(6) (matrix prescaling): 7 > > > > ICNTL(7) (sequentia matrix ordering):7 > > > > ICNTL(8) (scalling strategy): 77 > > > > ICNTL(10) (max num of refinements): 0 > > > > ICNTL(11) (error analysis): 0 > > > > ICNTL(12) (efficiency control): > 0 > > > > ICNTL(13) (efficiency control): > 0 > > > > ICNTL(14) (percentage of estimated > workspace increase): 20 > > > > ICNTL(18) (input mat struct): > 0 > > > > ICNTL(19) (Shur complement info): > 0 > > > > ICNTL(20) (rhs sparse pattern): > 0 > > > > ICNTL(21) (solution struct): > 0 > > > > ICNTL(22) (in-core/out-of-core > facility): 0 > > > > ICNTL(23) (max size of memory can be > allocated locally):0 > > > > ICNTL(24) (detection of null pivot > rows): 0 > > > > ICNTL(25) (computation of a null space > basis): 0 > > > > ICNTL(26) (Schur options for rhs or > solution): 0 > > > > ICNTL(27) (experimental parameter): > -24 > > > > ICNTL(28) (use parallel or sequential > ordering): 1 > > > > ICNTL(29) (parallel ordering): > 0 > > > > ICNTL(30) (user-specified set of entries > in inv(A)): 0 > > > > 
ICNTL(31) (factors is discarded in the > solve phase): 0 > > > > ICNTL(33) (compute determinant): > 0 > > > > CNTL(1) (relative pivoting threshold): > 0.01 > > > > CNTL(2) (stopping criterion of > refinement): 1.49012e-08 > > > > CNTL(3) (absolute pivoting threshold): > 0. > > > > CNTL(4) (value of static pivoting): > -1. > > > > CNTL(5) (fixation for null pivots): > 0. > > > > RINFO(1) (local estimated flops for the > elimination after analysis): > > > > [0] 29394. > > > > RINFO(2) (local estimated flops for the > assembly after factorization): > > > > [0] 1092. > > > > RINFO(3) (local estimated flops for the > elimination after factorization): > > > > [0] 29394. > > > > INFO(15) (estimated size of (in MB) > MUMPS internal data for running numerical factorization): > > > > [0] 1 > > > > INFO(16) (size of (in MB) MUMPS internal > data used during numerical factorization): > > > > [0] 1 > > > > INFO(23) (num of pivots eliminated on > this processor after factorization): > > > > [0] 324 > > > > RINFOG(1) (global estimated flops for > the elimination after analysis): 29394. > > > > RINFOG(2) (global estimated flops for > the assembly after factorization): 1092. > > > > RINFOG(3) (global estimated flops for > the elimination after factorization): 29394. 
> > > > (RINFOG(12) RINFOG(13))*2^INFOG(34) > (determinant): (0.,0.)*(2^0) > > > > INFOG(3) (estimated real workspace for > factors on all processors after analysis): 3888 > > > > INFOG(4) (estimated integer workspace > for factors on all processors after analysis): 2067 > > > > INFOG(5) (estimated maximum front size > in the complete tree): 12 > > > > INFOG(6) (number of nodes in the > complete tree): 53 > > > > INFOG(7) (ordering option effectively > use after analysis): 2 > > > > INFOG(8) (structural symmetry in percent > of the permuted matrix after analysis): 100 > > > > INFOG(9) (total real/complex workspace > to store the matrix factors after factorization): 3888 > > > > INFOG(10) (total integer space store the > matrix factors after factorization): 2067 > > > > INFOG(11) (order of largest frontal > matrix after factorization): 12 > > > > INFOG(12) (number of off-diagonal > pivots): 0 > > > > INFOG(13) (number of delayed pivots > after factorization): 0 > > > > INFOG(14) (number of memory compress > after factorization): 0 > > > > INFOG(15) (number of steps of iterative > refinement after solution): 0 > > > > INFOG(16) (estimated size (in MB) of all > MUMPS internal data for factorization after analysis: value on the most > memory consuming processor): 1 > > > > INFOG(17) (estimated size of all MUMPS > internal data for factorization after analysis: sum over all processors): 1 > > > > INFOG(18) (size of all MUMPS internal > data allocated during factorization: value on the most memory consuming > processor): 1 > > > > INFOG(19) (size of all MUMPS internal > data allocated during factorization: sum over all processors): 1 > > > > INFOG(20) (estimated number of entries > in the factors): 3042 > > > > INFOG(21) (size in MB of memory > effectively used during factorization - value on the most memory consuming > processor): 1 > > > > INFOG(22) (size in MB of memory > effectively used during factorization - sum over all processors): 1 > > > > INFOG(23) (after 
analysis: value of > ICNTL(6) effectively used): 5 > > > > INFOG(24) (after analysis: value of > ICNTL(12) effectively used): 1 > > > > INFOG(25) (after factorization: number > of pivots modified by static pivoting): 0 > > > > INFOG(28) (after factorization: number > of null pivots encountered): 0 > > > > INFOG(29) (after factorization: > effective number of entries in the factors (sum over all processors)): 3042 > > > > INFOG(30, 31) (after solution: size in > Mbytes of memory used during solution phase): 0, 0 > > > > INFOG(32) (after analysis: type of > analysis done): 1 > > > > INFOG(33) (value used for ICNTL(8)): -2 > > > > INFOG(34) (exponent of the determinant > if determinant is requested): 0 > > > > linear system matrix = precond matrix: > > > > Mat Object: (fieldsplit_RB_split_) > 1 MPI processes > > > > type: seqaij > > > > rows=324, cols=324 > > > > total: nonzeros=5760, allocated nonzeros=5760 > > > > total number of mallocs used during MatSetValues > calls =0 > > > > using I-node routines: found 108 nodes, limit > used is 5 > > > > A01 > > > > Mat Object: 1 MPI processes > > > > type: seqaij > > > > rows=324, cols=28476 > > > > total: nonzeros=936, allocated nonzeros=936 > > > > total number of mallocs used during MatSetValues > calls =0 > > > > using I-node routines: found 67 nodes, limit used > is 5 > > > > Mat Object: (fieldsplit_FE_split_) 1 MPI > processes > > > > type: seqaij > > > > rows=28476, cols=28476 > > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 9492 nodes, limit used is 5 > > > > linear system matrix = precond matrix: > > > > Mat Object: () 1 MPI processes > > > > type: seqaij > > > > rows=28800, cols=28800 > > > > total: nonzeros=1024686, allocated nonzeros=1024794 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 9600 nodes, limit used is 5 > > > > > > > > > > > > 
---------------------------------------------- PETSc Performance > Summary: ---------------------------------------------- > > > > > > > > /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a > arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 > 16:16:47 2017 > > > > Using Petsc Release Version 3.7.3, unknown > > > > > > > > Max Max/Min Avg Total > > > > Time (sec): 9.179e+01 1.00000 9.179e+01 > > > > Objects: 1.990e+02 1.00000 1.990e+02 > > > > Flops: 1.634e+11 1.00000 1.634e+11 1.634e+11 > > > > Flops/sec: 1.780e+09 1.00000 1.780e+09 1.780e+09 > > > > MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > > MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > > MPI Reductions: 0.000e+00 0.00000 > > > > > > > > Flop counting convention: 1 flop = 1 real number operation of type > (multiply/divide/add/subtract) > > > > e.g., VecAXPY() for real vectors of > length N --> 2N flops > > > > and VecAXPY() for complex vectors of > length N --> 8N flops > > > > > > > > Summary of Stages: ----- Time ------ ----- Flops ----- --- > Messages --- -- Message Lengths -- -- Reductions -- > > > > Avg %Total Avg %Total counts > %Total Avg %Total counts %Total > > > > 0: Main Stage: 9.1787e+01 100.0% 1.6336e+11 100.0% > 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% > > > > > > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > > See the 'Profiling' chapter of the users' manual for details on > interpreting output. > > > > Phase summary info: > > > > Count: number of times phase was executed > > > > Time and Flops: Max - maximum over all processors > > > > Ratio - ratio of maximum to minimum over all > processors > > > > Mess: number of messages sent > > > > Avg. len: average message length (bytes) > > > > Reduct: number of global reductions > > > > Global: entire computation > > > > Stage: stages of a computation. 
Set stages with > PetscLogStagePush() and PetscLogStagePop(). > > > > %T - percent time in this phase %F - percent flops in > this phase > > > > %M - percent messages in this phase %L - percent message > lengths in this phase > > > > %R - percent reductions in this phase > > > > Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max > time over all processors) > > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > > Event Count Time (sec) Flops > --- Global --- --- Stage --- Total > > > > Max Ratio Max Ratio Max Ratio Mess Avg > len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s > > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > > > > > > --- Event Stage 0: Main Stage > > > > > > > > VecDot 42 1.0 2.4080e-05 1.0 8.53e+03 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 354 > > > > VecTDot 74012 1.0 1.2440e+00 1.0 4.22e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3388 > > > > VecNorm 37020 1.0 8.3580e-01 1.0 2.11e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2523 > > > > VecScale 37008 1.0 3.5800e-01 1.0 1.05e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2944 > > > > VecCopy 37034 1.0 2.5754e-02 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > VecSet 74137 1.0 3.0537e-01 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > VecAXPY 74029 1.0 1.7233e+00 1.0 4.22e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2446 > > > > VecAYPX 37001 1.0 1.2214e+00 1.0 2.11e+09 1.0 0.0e+00 > 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 1725 > > > > VecAssemblyBegin 68 1.0 2.0432e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > VecAssemblyEnd 68 1.0 2.5988e-05 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > VecScatterBegin 48 1.0 4.6921e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 
0 0 0 0 0 0 > > > > MatMult 37017 1.0 4.1269e+01 1.0 7.65e+10 1.0 0.0e+00 > 0.0e+00 0.0e+00 45 47 0 0 0 45 47 0 0 0 1853 > > > > MatMultAdd 37015 1.0 3.3638e+01 1.0 7.53e+10 1.0 0.0e+00 > 0.0e+00 0.0e+00 37 46 0 0 0 37 46 0 0 0 2238 > > > > MatSolve 74021 1.0 4.6602e+01 1.0 7.42e+10 1.0 0.0e+00 > 0.0e+00 0.0e+00 51 45 0 0 0 51 45 0 0 0 1593 > > > > MatLUFactorNum 1 1.0 1.7209e-02 1.0 2.44e+07 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1420 > > > > MatCholFctrSym 1 1.0 8.8310e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatCholFctrNum 1 1.0 3.6907e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatILUFactorSym 1 1.0 3.7372e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatAssemblyBegin 29 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatAssemblyEnd 29 1.0 9.9473e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatGetRow 58026 1.0 2.8155e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatGetRowIJ 2 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatGetSubMatrice 6 1.0 1.5399e-02 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatGetOrdering 2 1.0 3.0112e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatZeroEntries 6 1.0 2.9490e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatView 7 1.0 3.4356e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > KSPSetUp 4 1.0 9.4891e-05 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > KSPSolve 1 1.0 8.8793e+01 1.0 1.63e+11 1.0 0.0e+00 > 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > > > > PCSetUp 4 1.0 3.8375e-02 1.0 2.44e+07 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 637 > > > > PCSetUpOnBlocks 5 1.0 2.1250e-02 1.0 2.44e+07 1.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 
0 0 0 0 0 0 1150 > > > > PCApply 5 1.0 8.8789e+01 1.0 1.63e+11 1.0 0.0e+00 > 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > > > > KSPSolve_FS_0 5 1.0 7.5364e-04 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > KSPSolve_FS_Schu 5 1.0 8.8785e+01 1.0 1.63e+11 1.0 0.0e+00 > 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > > > > KSPSolve_FS_Low 5 1.0 2.1019e-03 1.0 0.00e+00 0.0 0.0e+00 > 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > ------------------------------------------------------------ > ------------------------------------------------------------ > > > > > > > > Memory usage is given in bytes: > > > > > > > > Object Type Creations Destructions Memory > Descendants' Mem. > > > > Reports information only for process 0. > > > > > > > > --- Event Stage 0: Main Stage > > > > > > > > Vector 91 91 9693912 0. > > > > Vector Scatter 24 24 15936 0. > > > > Index Set 51 51 537888 0. > > > > IS L to G Mapping 3 3 240408 0. > > > > Matrix 13 13 64097868 0. > > > > Krylov Solver 6 6 7888 0. > > > > Preconditioner 6 6 6288 0. > > > > Viewer 1 0 0 0. > > > > Distributed Mesh 1 1 4624 0. > > > > Star Forest Bipartite Graph 2 2 1616 0. > > > > Discrete System 1 1 872 0. > > > > ============================================================ > ============================================================ > > > > Average time to get PetscTime(): 0. 
> > > > #PETSc Option Table entries: > > > > -ksp_monitor > > > > -ksp_view > > > > -log_view > > > > #End of PETSc Option Table entries > > > > Compiled without FORTRAN kernels > > > > Compiled with full precision matrices (default) > > > > sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 > sizeof(PetscScalar) 8 sizeof(PetscInt) 4 > > > > Configure options: --with-shared-libraries=1 --with-debugging=0 > --download-suitesparse --download-blacs --download-ptscotch=yes > --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl > --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps > --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc > --download-hypre --download-ml > > > > ----------------------------------------- > > > > Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo > > > > Machine characteristics: Linux-4.4.0-38-generic-x86_64- > with-Ubuntu-16.04-xenial > > > > Using PETSc directory: /home/dknez/software/petsc-src > > > > Using PETSc arch: arch-linux2-c-opt > > > > ----------------------------------------- > > > > > > > > Using C compiler: mpicc -fPIC -Wall -Wwrite-strings > -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O > ${COPTFLAGS} ${CFLAGS} > > > > Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 > -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} > > > > ----------------------------------------- > > > > > > > > Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include > -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include > -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include > -I/home/dknez/software/libmesh_install/opt_real/petsc/include > -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent > -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include > -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi > > > > 
----------------------------------------- > > > > > > > > Using C linker: mpicc > > > > Using Fortran linker: mpif90 > > > > Using libraries: -Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib > -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc > -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib > -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps > -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE > -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib > -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu > -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx > -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod > -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig > -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 > -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 > -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch > -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 > -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm > -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz > -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib > -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu > -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu > -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl > -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl > > > > ----------------------------------------- > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Thu Jan 12 05:16:19 2017 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 12 Jan 2017 05:16:19 -0600 Subject: [petsc-users] Using PCFIELDSPLIT with -pc_fieldsplit_type schur In-Reply-To: References: <9641EA68-FC6B-48DC-8A5E-A8B75E54EA64@mcs.anl.gov> <8DBB9010-9A29-4028-A9C1-98DB695602C1@mcs.anl.gov> Message-ID: On Wed, Jan 11, 2017 at 10:37 PM, David Knezevic wrote: > On Wed, Jan 11, 2017 at 9:55 PM, Barry Smith wrote: > >> >> That is disappointing, >> >> Please try using >> >> -pc_fieldsplit_schur_precondition full >> >> with the two cases of -fieldsplit_FE_split_pc_type gamg and >> -fieldsplit_FE_split_pc_type cholesky >> >> > One more data point: The initial mesh I tried had some somewhat bad > quality elements. I tried some other cases that have better-conditioned > meshes (nicely shaped hex elements), and using ILU(0) for A11 worked well > in those cases. So certainly the conditioning of A11 appears to play a > significant role here (not surprising). > > > Regarding -pc_fieldsplit_schur_precondition full: I must have my > SetFromOptions call in the wrong place, because I can't get > "-pc_fieldsplit_schur_precondition full" to have an effect (I'll look > into that some more). > > However, I was able to use PCFieldSplitSetSchurPre to set the > schur_precondition property via code. That worked with A11, SELF, SELFP, > but when I did: > > PCFieldSplitSetSchurPre(pc, PC_FIELDSPLIT_SCHUR_PRE_FULL, NULL); > > I got an error: > [0]PETSC ERROR: No support for this operation for this object type > [0]PETSC ERROR: Not yet implemented for Schur complements with > non-vanishing D > Barry fixed this, but it might only be in the 3.7.5 release. Thanks, Matt > David > > > > On Jan 11, 2017, at 8:49 PM, David Knezevic >> wrote: > > >> > OK, that's encouraging.
However, >> regarding this: >> > >> > So the next step is to try using -fieldsplit_FE_split_ksp_monitor >> -fieldsplit_FE_split_pc_type gamg >> > >> > I tried this and it didn't converge at all (it hit the 10000-iteration >> max in the output from -fieldsplit_FE_split_ksp_monitor). So I guess >> I'd need to attach the near nullspace to make this work reasonably, as you >> said. Sounds like that may not be easy to do in this case though? I'll try >> some other preconditioners in the meantime. >> > >> > Thanks, >> > David >> > >> > >> > On Wed, Jan 11, 2017 at 9:31 PM, Barry Smith >> wrote: >> > >> > Thanks, this is very useful information. It means that >> > >> > 1) the approximate Sp is actually a very good approximation to the true >> Schur complement S, since using Sp^-1 to precondition S gives iteration >> counts from 8 to 13. >> > >> > 2) using ilu(0) as a preconditioner for Sp is not good, since >> replacing Sp^-1 with ilu(0) of Sp gives absurd iteration counts. This is >> actually not super surprising, since ilu(0) is generally "not so good" for >> elasticity. >> > >> > So the next step is to try using -fieldsplit_FE_split_ksp_monitor >> -fieldsplit_FE_split_pc_type gamg >> > >> > The one open question is whether any options should be passed to the gamg to >> tell it that the underlying problem comes from "elasticity"; that is, something >> about the null space. >> > >> > Mark Adams, since the GAMG is coming from inside another >> preconditioner, it may not be easy for the user to attach the >> near null space to that inner matrix. Would it make sense for there to be a >> GAMG command line option to indicate that it is a 3D elasticity problem so >> GAMG could set up the near null space for itself? Or does that not make >> sense? >> > >> > Barry >> > >> > >> > >> > > On Jan 11, 2017, at 7:47 PM, David Knezevic < >> david.knezevic at akselos.com> wrote: >> > > >> > > I've attached the two log files. Using cholesky for "FE_split" seems >> to have helped a lot!
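[Editor's note: the Schur-complement options being toggled in the exchange above can be collected in a PETSc options file (passed via -options_file or picked up from ~/.petscrc). A minimal sketch, using only option names quoted in this thread; whether the "full" value is accepted depends on having Barry's fix (3.7.5 / master):]

```text
# Sketch of a PETSc options file for the fieldsplit solver discussed above.
# Option names are taken from the thread; availability depends on PETSc version.
-pc_type fieldsplit
-pc_fieldsplit_type schur
-pc_fieldsplit_schur_fact_type full
# How to precondition the Schur complement: a11, self, selfp, or full.
# "full" needs the fix mentioned above; older versions raise the
# "Not yet implemented for Schur complements with non-vanishing D" error.
-pc_fieldsplit_schur_precondition full
```

[PCFieldSplitSetSchurPre(pc, PC_FIELDSPLIT_SCHUR_PRE_FULL, NULL) sets the same thing from code, as David did; the command-line form having no effect is consistent with KSPSetFromOptions() being called before the fieldsplit PC was configured.]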
>> > > >> > > David >> > > >> > > >> > > -- >> > > David J. Knezevic | CTO >> > > Akselos | 210 Broadway, #201 | Cambridge, MA | 02139 >> > > >> > > Phone: +1-617-599-4755 >> > > >> > > This e-mail and any attachments may contain confidential material for >> the sole use of the intended recipient(s). Any review or distribution by >> others is strictly prohibited. If you are not the intended recipient, >> please contact the sender and delete all copies. >> > > >> > > On Wed, Jan 11, 2017 at 8:32 PM, Barry Smith >> wrote: >> > > >> > > Can you please run with all the monitoring on? So we can see the >> convergence of all the inner solvers >> > > -fieldsplit_FE_split_ksp_monitor >> > > >> > > Then run again with >> > > >> > > -fieldsplit_FE_split_ksp_monitor -fieldsplit_FE_split_pc_type >> cholesky >> > > >> > > >> > > and send both sets of results >> > > >> > > Barry >> > > >> > > >> > > > On Jan 11, 2017, at 6:32 PM, David Knezevic < >> david.knezevic at akselos.com> wrote: >> > > > >> > > > On Wed, Jan 11, 2017 at 5:52 PM, Dave May >> wrote: >> > > > so I gather that I'll have to look into a user-defined >> approximation to S. >> > > > >> > > > Where does the 2x2 block system come from? >> > > > Maybe someone on the list knows the right approximation to use for >> S. >> > > > >> > > > The model is 3D linear elasticity using a finite element >> discretization. I applied substructuring to part of the system to >> "condense" it, and that results in the small A00 block. The A11 block is >> just standard 3D elasticity; no substructuring was applied there. There are >> constraints to connect the degrees of freedom on the interface of the >> substructured and non-substructured regions. >> > > > >> > > > If anyone has suggestions for a good way to precondition this type >> of system, I'd be most appreciative! 
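[Editor's note: Barry's two requested runs above amount to adding inner-solver monitoring on top of the options already in use. A sketch of the two option sets, assuming the program otherwise runs with the options shown in the log below:]

```text
# Run 1: monitor the inner FE_split solve with the existing (bjacobi/ilu) PC
-ksp_monitor
-ksp_view
-log_view
-fieldsplit_FE_split_ksp_monitor

# Run 2: same options, but replace the inner PC with a direct Cholesky solve
-fieldsplit_FE_split_pc_type cholesky
```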
>> > > > >> > > > Thanks, >> > > > David >> > > > >> > > > >> > > > >> > > > ----------------------------------------- >> > > > >> > > > 0 KSP Residual norm 5.405528187695e+04 >> > > > 1 KSP Residual norm 2.187814910803e+02 >> > > > 2 KSP Residual norm 1.019051577515e-01 >> > > > 3 KSP Residual norm 4.370464012859e-04 >> > > > KSP Object: 1 MPI processes >> > > > type: cg >> > > > maximum iterations=1000 >> > > > tolerances: relative=1e-06, absolute=1e-50, divergence=10000. >> > > > left preconditioning >> > > > using nonzero initial guess >> > > > using PRECONDITIONED norm type for convergence test >> > > > PC Object: 1 MPI processes >> > > > type: fieldsplit >> > > > FieldSplit with Schur preconditioner, factorization FULL >> > > > Preconditioner for the Schur complement formed from Sp, an >> assembled approximation to S, which uses (lumped, if requested) A00's >> diagonal's inverse >> > > > Split info: >> > > > Split number 0 Defined by IS >> > > > Split number 1 Defined by IS >> > > > KSP solver for A00 block >> > > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes >> > > > type: preonly >> > > > maximum iterations=10000, initial guess is zero >> > > > tolerances: relative=1e-05, absolute=1e-50, >> divergence=10000. >> > > > left preconditioning >> > > > using NONE norm type for convergence test >> > > > PC Object: (fieldsplit_RB_split_) 1 MPI processes >> > > > type: cholesky >> > > > Cholesky: out-of-place factorization >> > > > tolerance for zero pivot 2.22045e-14 >> > > > matrix ordering: natural >> > > > factor fill ratio given 0., needed 0. 
>> > > > Factored matrix follows: >> > > > Mat Object: 1 MPI processes >> > > > type: seqaij >> > > > rows=324, cols=324 >> > > > package used to perform factorization: mumps >> > > > total: nonzeros=3042, allocated nonzeros=3042 >> > > > total number of mallocs used during MatSetValues >> calls =0 >> > > > MUMPS run parameters: >> > > > SYM (matrix type): 2 >> > > > PAR (host participation): 1 >> > > > ICNTL(1) (output for error): 6 >> > > > ICNTL(2) (output of diagnostic msg): 0 >> > > > ICNTL(3) (output for global info): 0 >> > > > ICNTL(4) (level of printing): 0 >> > > > ICNTL(5) (input mat struct): 0 >> > > > ICNTL(6) (matrix prescaling): 7 >> > > > ICNTL(7) (sequentia matrix ordering):7 >> > > > ICNTL(8) (scalling strategy): 77 >> > > > ICNTL(10) (max num of refinements): 0 >> > > > ICNTL(11) (error analysis): 0 >> > > > ICNTL(12) (efficiency control): >> 0 >> > > > ICNTL(13) (efficiency control): >> 0 >> > > > ICNTL(14) (percentage of estimated workspace >> increase): 20 >> > > > ICNTL(18) (input mat struct): >> 0 >> > > > ICNTL(19) (Shur complement info): >> 0 >> > > > ICNTL(20) (rhs sparse pattern): >> 0 >> > > > ICNTL(21) (solution struct): >> 0 >> > > > ICNTL(22) (in-core/out-of-core facility): >> 0 >> > > > ICNTL(23) (max size of memory can be allocated >> locally):0 >> > > > ICNTL(24) (detection of null pivot rows): >> 0 >> > > > ICNTL(25) (computation of a null space basis): >> 0 >> > > > ICNTL(26) (Schur options for rhs or solution): >> 0 >> > > > ICNTL(27) (experimental parameter): >> -24 >> > > > ICNTL(28) (use parallel or sequential >> ordering): 1 >> > > > ICNTL(29) (parallel ordering): >> 0 >> > > > ICNTL(30) (user-specified set of entries in >> inv(A)): 0 >> > > > ICNTL(31) (factors is discarded in the solve >> phase): 0 >> > > > ICNTL(33) (compute determinant): >> 0 >> > > > CNTL(1) (relative pivoting threshold): 0.01 >> > > > CNTL(2) (stopping criterion of refinement): >> 1.49012e-08 >> > > > CNTL(3) (absolute pivoting threshold): 0. 
>> > > > CNTL(4) (value of static pivoting): -1. >> > > > CNTL(5) (fixation for null pivots): 0. >> > > > RINFO(1) (local estimated flops for the >> elimination after analysis): >> > > > [0] 29394. >> > > > RINFO(2) (local estimated flops for the >> assembly after factorization): >> > > > [0] 1092. >> > > > RINFO(3) (local estimated flops for the >> elimination after factorization): >> > > > [0] 29394. >> > > > INFO(15) (estimated size of (in MB) MUMPS >> internal data for running numerical factorization): >> > > > [0] 1 >> > > > INFO(16) (size of (in MB) MUMPS internal data >> used during numerical factorization): >> > > > [0] 1 >> > > > INFO(23) (num of pivots eliminated on this >> processor after factorization): >> > > > [0] 324 >> > > > RINFOG(1) (global estimated flops for the >> elimination after analysis): 29394. >> > > > RINFOG(2) (global estimated flops for the >> assembly after factorization): 1092. >> > > > RINFOG(3) (global estimated flops for the >> elimination after factorization): 29394. 
>> > > > (RINFOG(12) RINFOG(13))*2^INFOG(34) >> (determinant): (0.,0.)*(2^0) >> > > > INFOG(3) (estimated real workspace for factors >> on all processors after analysis): 3888 >> > > > INFOG(4) (estimated integer workspace for >> factors on all processors after analysis): 2067 >> > > > INFOG(5) (estimated maximum front size in the >> complete tree): 12 >> > > > INFOG(6) (number of nodes in the complete >> tree): 53 >> > > > INFOG(7) (ordering option effectively use after >> analysis): 2 >> > > > INFOG(8) (structural symmetry in percent of the >> permuted matrix after analysis): 100 >> > > > INFOG(9) (total real/complex workspace to store >> the matrix factors after factorization): 3888 >> > > > INFOG(10) (total integer space store the matrix >> factors after factorization): 2067 >> > > > INFOG(11) (order of largest frontal matrix >> after factorization): 12 >> > > > INFOG(12) (number of off-diagonal pivots): 0 >> > > > INFOG(13) (number of delayed pivots after >> factorization): 0 >> > > > INFOG(14) (number of memory compress after >> factorization): 0 >> > > > INFOG(15) (number of steps of iterative >> refinement after solution): 0 >> > > > INFOG(16) (estimated size (in MB) of all MUMPS >> internal data for factorization after analysis: value on the most memory >> consuming processor): 1 >> > > > INFOG(17) (estimated size of all MUMPS internal >> data for factorization after analysis: sum over all processors): 1 >> > > > INFOG(18) (size of all MUMPS internal data >> allocated during factorization: value on the most memory consuming >> processor): 1 >> > > > INFOG(19) (size of all MUMPS internal data >> allocated during factorization: sum over all processors): 1 >> > > > INFOG(20) (estimated number of entries in the >> factors): 3042 >> > > > INFOG(21) (size in MB of memory effectively >> used during factorization - value on the most memory consuming processor): 1 >> > > > INFOG(22) (size in MB of memory effectively >> used during factorization - sum over all 
processors): 1 >> > > > INFOG(23) (after analysis: value of ICNTL(6) >> effectively used): 5 >> > > > INFOG(24) (after analysis: value of ICNTL(12) >> effectively used): 1 >> > > > INFOG(25) (after factorization: number of >> pivots modified by static pivoting): 0 >> > > > INFOG(28) (after factorization: number of null >> pivots encountered): 0 >> > > > INFOG(29) (after factorization: effective >> number of entries in the factors (sum over all processors)): 3042 >> > > > INFOG(30, 31) (after solution: size in Mbytes >> of memory used during solution phase): 0, 0 >> > > > INFOG(32) (after analysis: type of analysis >> done): 1 >> > > > INFOG(33) (value used for ICNTL(8)): -2 >> > > > INFOG(34) (exponent of the determinant if >> determinant is requested): 0 >> > > > linear system matrix = precond matrix: >> > > > Mat Object: (fieldsplit_RB_split_) 1 MPI >> processes >> > > > type: seqaij >> > > > rows=324, cols=324 >> > > > total: nonzeros=5760, allocated nonzeros=5760 >> > > > total number of mallocs used during MatSetValues calls =0 >> > > > using I-node routines: found 108 nodes, limit used is 5 >> > > > KSP solver for S = A11 - A10 inv(A00) A01 >> > > > KSP Object: (fieldsplit_FE_split_) 1 MPI processes >> > > > type: cg >> > > > maximum iterations=10000, initial guess is zero >> > > > tolerances: relative=1e-05, absolute=1e-50, >> divergence=10000. >> > > > left preconditioning >> > > > using PRECONDITIONED norm type for convergence test >> > > > PC Object: (fieldsplit_FE_split_) 1 MPI processes >> > > > type: bjacobi >> > > > block Jacobi: number of blocks = 1 >> > > > Local solve is same for all blocks, in the following KSP >> and PC objects: >> > > > KSP Object: (fieldsplit_FE_split_sub_) >> 1 MPI processes >> > > > type: preonly >> > > > maximum iterations=10000, initial guess is zero >> > > > tolerances: relative=1e-05, absolute=1e-50, >> divergence=10000. 
>> > > > left preconditioning >> > > > using NONE norm type for convergence test >> > > > PC Object: (fieldsplit_FE_split_sub_) >> 1 MPI processes >> > > > type: ilu >> > > > ILU: out-of-place factorization >> > > > 0 levels of fill >> > > > tolerance for zero pivot 2.22045e-14 >> > > > matrix ordering: natural >> > > > factor fill ratio given 1., needed 1. >> > > > Factored matrix follows: >> > > > Mat Object: 1 MPI processes >> > > > type: seqaij >> > > > rows=28476, cols=28476 >> > > > package used to perform factorization: petsc >> > > > total: nonzeros=1037052, allocated >> nonzeros=1037052 >> > > > total number of mallocs used during >> MatSetValues calls =0 >> > > > using I-node routines: found 9489 nodes, >> limit used is 5 >> > > > linear system matrix = precond matrix: >> > > > Mat Object: 1 MPI processes >> > > > type: seqaij >> > > > rows=28476, cols=28476 >> > > > total: nonzeros=1037052, allocated nonzeros=1037052 >> > > > total number of mallocs used during MatSetValues >> calls =0 >> > > > using I-node routines: found 9489 nodes, limit used >> is 5 >> > > > linear system matrix followed by preconditioner matrix: >> > > > Mat Object: (fieldsplit_FE_split_) 1 MPI >> processes >> > > > type: schurcomplement >> > > > rows=28476, cols=28476 >> > > > Schur complement A11 - A10 inv(A00) A01 >> > > > A11 >> > > > Mat Object: (fieldsplit_FE_split_) >> 1 MPI processes >> > > > type: seqaij >> > > > rows=28476, cols=28476 >> > > > total: nonzeros=1017054, allocated nonzeros=1017054 >> > > > total number of mallocs used during MatSetValues >> calls =0 >> > > > using I-node routines: found 9492 nodes, limit >> used is 5 >> > > > A10 >> > > > Mat Object: 1 MPI processes >> > > > type: seqaij >> > > > rows=28476, cols=324 >> > > > total: nonzeros=936, allocated nonzeros=936 >> > > > total number of mallocs used during MatSetValues >> calls =0 >> > > > using I-node routines: found 5717 nodes, limit >> used is 5 >> > > > KSP of A00 >> > > > KSP Object: 
(fieldsplit_RB_split_) >> 1 MPI processes >> > > > type: preonly >> > > > maximum iterations=10000, initial guess is zero >> > > > tolerances: relative=1e-05, absolute=1e-50, >> divergence=10000. >> > > > left preconditioning >> > > > using NONE norm type for convergence test >> > > > PC Object: (fieldsplit_RB_split_) >> 1 MPI processes >> > > > type: cholesky >> > > > Cholesky: out-of-place factorization >> > > > tolerance for zero pivot 2.22045e-14 >> > > > matrix ordering: natural >> > > > factor fill ratio given 0., needed 0. >> > > > Factored matrix follows: >> > > > Mat Object: 1 MPI >> processes >> > > > type: seqaij >> > > > rows=324, cols=324 >> > > > package used to perform factorization: mumps >> > > > total: nonzeros=3042, allocated >> nonzeros=3042 >> > > > total number of mallocs used during >> MatSetValues calls =0 >> > > > MUMPS run parameters: >> > > > SYM (matrix type): 2 >> > > > PAR (host participation): 1 >> > > > ICNTL(1) (output for error): 6 >> > > > ICNTL(2) (output of diagnostic msg): 0 >> > > > ICNTL(3) (output for global info): 0 >> > > > ICNTL(4) (level of printing): 0 >> > > > ICNTL(5) (input mat struct): 0 >> > > > ICNTL(6) (matrix prescaling): 7 >> > > > ICNTL(7) (sequentia matrix ordering):7 >> > > > ICNTL(8) (scalling strategy): 77 >> > > > ICNTL(10) (max num of refinements): 0 >> > > > ICNTL(11) (error analysis): 0 >> > > > ICNTL(12) (efficiency control): >> 0 >> > > > ICNTL(13) (efficiency control): >> 0 >> > > > ICNTL(14) (percentage of estimated >> workspace increase): 20 >> > > > ICNTL(18) (input mat struct): >> 0 >> > > > ICNTL(19) (Shur complement info): >> 0 >> > > > ICNTL(20) (rhs sparse pattern): >> 0 >> > > > ICNTL(21) (solution struct): >> 0 >> > > > ICNTL(22) (in-core/out-of-core >> facility): 0 >> > > > ICNTL(23) (max size of memory can be >> allocated locally):0 >> > > > ICNTL(24) (detection of null pivot >> rows): 0 >> > > > ICNTL(25) (computation of a null space >> basis): 0 >> > > > ICNTL(26) (Schur options for 
rhs or >> solution): 0 >> > > > ICNTL(27) (experimental parameter): >> -24 >> > > > ICNTL(28) (use parallel or sequential >> ordering): 1 >> > > > ICNTL(29) (parallel ordering): >> 0 >> > > > ICNTL(30) (user-specified set of >> entries in inv(A)): 0 >> > > > ICNTL(31) (factors is discarded in the >> solve phase): 0 >> > > > ICNTL(33) (compute determinant): >> 0 >> > > > CNTL(1) (relative pivoting threshold): >> 0.01 >> > > > CNTL(2) (stopping criterion of >> refinement): 1.49012e-08 >> > > > CNTL(3) (absolute pivoting threshold): >> 0. >> > > > CNTL(4) (value of static pivoting): >> -1. >> > > > CNTL(5) (fixation for null pivots): >> 0. >> > > > RINFO(1) (local estimated flops for the >> elimination after analysis): >> > > > [0] 29394. >> > > > RINFO(2) (local estimated flops for the >> assembly after factorization): >> > > > [0] 1092. >> > > > RINFO(3) (local estimated flops for the >> elimination after factorization): >> > > > [0] 29394. >> > > > INFO(15) (estimated size of (in MB) >> MUMPS internal data for running numerical factorization): >> > > > [0] 1 >> > > > INFO(16) (size of (in MB) MUMPS >> internal data used during numerical factorization): >> > > > [0] 1 >> > > > INFO(23) (num of pivots eliminated on >> this processor after factorization): >> > > > [0] 324 >> > > > RINFOG(1) (global estimated flops for >> the elimination after analysis): 29394. >> > > > RINFOG(2) (global estimated flops for >> the assembly after factorization): 1092. >> > > > RINFOG(3) (global estimated flops for >> the elimination after factorization): 29394. 
>> > > > (RINFOG(12) RINFOG(13))*2^INFOG(34) >> (determinant): (0.,0.)*(2^0) >> > > > INFOG(3) (estimated real workspace for >> factors on all processors after analysis): 3888 >> > > > INFOG(4) (estimated integer workspace >> for factors on all processors after analysis): 2067 >> > > > INFOG(5) (estimated maximum front size >> in the complete tree): 12 >> > > > INFOG(6) (number of nodes in the >> complete tree): 53 >> > > > INFOG(7) (ordering option effectively >> use after analysis): 2 >> > > > INFOG(8) (structural symmetry in >> percent of the permuted matrix after analysis): 100 >> > > > INFOG(9) (total real/complex workspace >> to store the matrix factors after factorization): 3888 >> > > > INFOG(10) (total integer space store >> the matrix factors after factorization): 2067 >> > > > INFOG(11) (order of largest frontal >> matrix after factorization): 12 >> > > > INFOG(12) (number of off-diagonal >> pivots): 0 >> > > > INFOG(13) (number of delayed pivots >> after factorization): 0 >> > > > INFOG(14) (number of memory compress >> after factorization): 0 >> > > > INFOG(15) (number of steps of iterative >> refinement after solution): 0 >> > > > INFOG(16) (estimated size (in MB) of >> all MUMPS internal data for factorization after analysis: value on the most >> memory consuming processor): 1 >> > > > INFOG(17) (estimated size of all MUMPS >> internal data for factorization after analysis: sum over all processors): 1 >> > > > INFOG(18) (size of all MUMPS internal >> data allocated during factorization: value on the most memory consuming >> processor): 1 >> > > > INFOG(19) (size of all MUMPS internal >> data allocated during factorization: sum over all processors): 1 >> > > > INFOG(20) (estimated number of entries >> in the factors): 3042 >> > > > INFOG(21) (size in MB of memory >> effectively used during factorization - value on the most memory consuming >> processor): 1 >> > > > INFOG(22) (size in MB of memory >> effectively used during factorization - sum over all 
processors): 1 >> > > > INFOG(23) (after analysis: value of >> ICNTL(6) effectively used): 5 >> > > > INFOG(24) (after analysis: value of >> ICNTL(12) effectively used): 1 >> > > > INFOG(25) (after factorization: number >> of pivots modified by static pivoting): 0 >> > > > INFOG(28) (after factorization: number >> of null pivots encountered): 0 >> > > > INFOG(29) (after factorization: >> effective number of entries in the factors (sum over all processors)): 3042 >> > > > INFOG(30, 31) (after solution: size in >> Mbytes of memory used during solution phase): 0, 0 >> > > > INFOG(32) (after analysis: type of >> analysis done): 1 >> > > > INFOG(33) (value used for ICNTL(8)): -2 >> > > > INFOG(34) (exponent of the determinant >> if determinant is requested): 0 >> > > > linear system matrix = precond matrix: >> > > > Mat Object: (fieldsplit_RB_split_) >> 1 MPI processes >> > > > type: seqaij >> > > > rows=324, cols=324 >> > > > total: nonzeros=5760, allocated nonzeros=5760 >> > > > total number of mallocs used during MatSetValues >> calls =0 >> > > > using I-node routines: found 108 nodes, limit >> used is 5 >> > > > A01 >> > > > Mat Object: 1 MPI processes >> > > > type: seqaij >> > > > rows=324, cols=28476 >> > > > total: nonzeros=936, allocated nonzeros=936 >> > > > total number of mallocs used during MatSetValues >> calls =0 >> > > > using I-node routines: found 67 nodes, limit used >> is 5 >> > > > Mat Object: 1 MPI processes >> > > > type: seqaij >> > > > rows=28476, cols=28476 >> > > > total: nonzeros=1037052, allocated nonzeros=1037052 >> > > > total number of mallocs used during MatSetValues calls =0 >> > > > using I-node routines: found 9489 nodes, limit used is 5 >> > > > linear system matrix = precond matrix: >> > > > Mat Object: () 1 MPI processes >> > > > type: seqaij >> > > > rows=28800, cols=28800 >> > > > total: nonzeros=1024686, allocated nonzeros=1024794 >> > > > total number of mallocs used during MatSetValues calls =0 >> > > > using I-node routines: 
found 9600 nodes, limit used is 5 >> > > > >> > > > ---------------------------------------------- PETSc Performance >> Summary: ---------------------------------------------- >> > > > >> > > > /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a >> arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 >> 17:22:10 2017 >> > > > Using Petsc Release Version 3.7.3, unknown >> > > > >> > > > Max Max/Min Avg Total >> > > > Time (sec): 9.638e+01 1.00000 9.638e+01 >> > > > Objects: 2.030e+02 1.00000 2.030e+02 >> > > > Flops: 1.732e+11 1.00000 1.732e+11 1.732e+11 >> > > > Flops/sec: 1.797e+09 1.00000 1.797e+09 1.797e+09 >> > > > MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 >> > > > MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 >> > > > MPI Reductions: 0.000e+00 0.00000 >> > > > >> > > > Flop counting convention: 1 flop = 1 real number operation of type >> (multiply/divide/add/subtract) >> > > > e.g., VecAXPY() for real vectors of >> length N --> 2N flops >> > > > and VecAXPY() for complex vectors of >> length N --> 8N flops >> > > > >> > > > Summary of Stages: ----- Time ------ ----- Flops ----- --- >> Messages --- -- Message Lengths -- -- Reductions -- >> > > > Avg %Total Avg %Total counts >> %Total Avg %Total counts %Total >> > > > 0: Main Stage: 9.6379e+01 100.0% 1.7318e+11 100.0% >> 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% >> > > > >> > > > ------------------------------------------------------------ >> ------------------------------------------------------------ >> > > > See the 'Profiling' chapter of the users' manual for details on >> interpreting output. >> > > > Phase summary info: >> > > > Count: number of times phase was executed >> > > > Time and Flops: Max - maximum over all processors >> > > > Ratio - ratio of maximum to minimum over all >> processors >> > > > Mess: number of messages sent >> > > > Avg. 
len: average message length (bytes) >> > > > Reduct: number of global reductions >> > > > Global: entire computation >> > > > Stage: stages of a computation. Set stages with >> PetscLogStagePush() and PetscLogStagePop(). >> > > > %T - percent time in this phase %F - percent flops in >> this phase >> > > > %M - percent messages in this phase %L - percent message >> lengths in this phase >> > > > %R - percent reductions in this phase >> > > > Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max >> time over all processors) >> > > > ------------------------------------------------------------ >> ------------------------------------------------------------ >> > > > Event Count Time (sec) Flops >> --- Global --- --- Stage --- Total >> > > > Max Ratio Max Ratio Max Ratio Mess >> Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s >> > > > ------------------------------------------------------------ >> ------------------------------------------------------------ >> > > > >> > > > --- Event Stage 0: Main Stage >> > > > >> > > > VecDot 42 1.0 2.2411e-05 1.0 8.53e+03 1.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 380 >> > > > VecTDot 77761 1.0 1.4294e+00 1.0 4.43e+09 1.0 0.0e+00 >> 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3098 >> > > > VecNorm 38894 1.0 9.1002e-01 1.0 2.22e+09 1.0 0.0e+00 >> 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2434 >> > > > VecScale 38882 1.0 3.7314e-01 1.0 1.11e+09 1.0 0.0e+00 >> 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2967 >> > > > VecCopy 38908 1.0 2.1655e-02 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > VecSet 77887 1.0 3.2034e-01 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > VecAXPY 77777 1.0 1.8382e+00 1.0 4.43e+09 1.0 0.0e+00 >> 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2409 >> > > > VecAYPX 38875 1.0 1.2884e+00 1.0 2.21e+09 1.0 0.0e+00 >> 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 1718 >> > > > VecAssemblyBegin 68 1.0 1.9407e-04 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > 
> VecAssemblyEnd 68 1.0 2.6941e-05 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > VecScatterBegin 48 1.0 4.6349e-04 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatMult 38891 1.0 4.3045e+01 1.0 8.03e+10 1.0 0.0e+00 >> 0.0e+00 0.0e+00 45 46 0 0 0 45 46 0 0 0 1866 >> > > > MatMultAdd 38889 1.0 3.5360e+01 1.0 7.91e+10 1.0 0.0e+00 >> 0.0e+00 0.0e+00 37 46 0 0 0 37 46 0 0 0 2236 >> > > > MatSolve 77769 1.0 4.8780e+01 1.0 7.95e+10 1.0 0.0e+00 >> 0.0e+00 0.0e+00 51 46 0 0 0 51 46 0 0 0 1631 >> > > > MatLUFactorNum 1 1.0 1.9575e-02 1.0 2.49e+07 1.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1274 >> > > > MatCholFctrSym 1 1.0 9.4891e-04 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatCholFctrNum 1 1.0 3.7885e-04 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatILUFactorSym 1 1.0 4.1780e-03 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatConvert 1 1.0 3.0041e-05 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatScale 2 1.0 2.7180e-05 1.0 2.53e+04 1.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 930 >> > > > MatAssemblyBegin 32 1.0 4.0531e-06 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatAssemblyEnd 32 1.0 1.2032e-02 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatGetRow 114978 1.0 5.9254e-03 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatGetRowIJ 2 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatGetSubMatrice 6 1.0 1.5707e-02 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatGetOrdering 2 1.0 3.2425e-04 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatZeroEntries 6 1.0 3.0580e-03 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatView 7 1.0 3.5119e-03 1.0 0.00e+00 0.0 0.0e+00 >> 
0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatAXPY 1 1.0 1.9384e-02 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatMatMult 1 1.0 2.7120e-03 1.0 3.16e+05 1.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 117 >> > > > MatMatMultSym 1 1.0 1.8010e-03 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatMatMultNum 1 1.0 6.1703e-04 1.0 3.16e+05 1.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 513 >> > > > KSPSetUp 4 1.0 9.8944e-05 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > KSPSolve 1 1.0 9.3380e+01 1.0 1.73e+11 1.0 0.0e+00 >> 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 >> > > > PCSetUp 4 1.0 6.6326e-02 1.0 2.53e+07 1.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 381 >> > > > PCSetUpOnBlocks 5 1.0 2.4082e-02 1.0 2.49e+07 1.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1036 >> > > > PCApply 5 1.0 9.3376e+01 1.0 1.73e+11 1.0 0.0e+00 >> 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 >> > > > KSPSolve_FS_0 5 1.0 7.0214e-04 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > KSPSolve_FS_Schu 5 1.0 9.3372e+01 1.0 1.73e+11 1.0 0.0e+00 >> 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 >> > > > KSPSolve_FS_Low 5 1.0 2.1377e-03 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > ------------------------------------------------------------ >> ------------------------------------------------------------ >> > > > >> > > > Memory usage is given in bytes: >> > > > >> > > > Object Type Creations Destructions Memory >> Descendants' Mem. >> > > > Reports information only for process 0. >> > > > >> > > > --- Event Stage 0: Main Stage >> > > > >> > > > Vector 92 92 9698040 0. >> > > > Vector Scatter 24 24 15936 0. >> > > > Index Set 51 51 537876 0. >> > > > IS L to G Mapping 3 3 240408 0. >> > > > Matrix 16 16 77377776 0. >> > > > Krylov Solver 6 6 7888 0. >> > > > Preconditioner 6 6 6288 0. >> > > > Viewer 1 0 0 0. 
>> > > > Distributed Mesh 1 1 4624 0. >> > > > Star Forest Bipartite Graph 2 2 1616 0. >> > > > Discrete System 1 1 872 0. >> > > > ============================================================ >> ============================================================ >> > > > Average time to get PetscTime(): 0. >> > > > #PETSc Option Table entries: >> > > > -ksp_monitor >> > > > -ksp_view >> > > > -log_view >> > > > #End of PETSc Option Table entries >> > > > Compiled without FORTRAN kernels >> > > > Compiled with full precision matrices (default) >> > > > sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 >> sizeof(PetscScalar) 8 sizeof(PetscInt) 4 >> > > > Configure options: --with-shared-libraries=1 --with-debugging=0 >> --download-suitesparse --download-blacs --download-ptscotch=yes >> --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl >> --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps >> --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc >> --download-hypre --download-ml >> > > > ----------------------------------------- >> > > > Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo >> > > > Machine characteristics: Linux-4.4.0-38-generic-x86_64- >> with-Ubuntu-16.04-xenial >> > > > Using PETSc directory: /home/dknez/software/petsc-src >> > > > Using PETSc arch: arch-linux2-c-opt >> > > > ----------------------------------------- >> > > > >> > > > Using C compiler: mpicc -fPIC -Wall -Wwrite-strings >> -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O >> ${COPTFLAGS} ${CFLAGS} >> > > > Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 >> -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} >> > > > ----------------------------------------- >> > > > >> > > > Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include >> -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include >> 
-I/home/dknez/software/petsc-src/arch-linux2-c-opt/include >> -I/home/dknez/software/libmesh_install/opt_real/petsc/include >> -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent >> -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include >> -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi >> > > > ----------------------------------------- >> > > > >> > > > Using C linker: mpicc >> > > > Using Fortran linker: mpif90 >> > > > Using libraries: -Wl,-rpath,/home/dknez/softwar >> e/petsc-src/arch-linux2-c-opt/lib -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib >> -lpetsc -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib >> -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps >> -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE >> -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib >> -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 >> -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu >> -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx >> -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod >> -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig >> -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 >> -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 >> -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch >> -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 >> -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm >> -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz >> -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib >> -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 >> -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu >> -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu >> -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl 
>> -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl >> > > > ----------------------------------------- >> > > > >> > > > >> > > > >> > > > >> > > > On Wed, Jan 11, 2017 at 4:49 PM, Dave May >> wrote: >> > > > It looks like the Schur solve is requiring a huge number of >> iterates to converge (based on the instances of MatMult). >> > > > This is killing the performance. >> > > > >> > > > Are you sure that A11 is a good approximation to S? You might >> consider trying the selfp option >> > > > >> > > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/ >> PC/PCFieldSplitSetSchurPre.html#PCFieldSplitSetSchurPre >> > > > >> > > > Note that the best approx to S is likely both problem and >> discretisation dependent, so if selfp is also terrible, you might want to >> consider coding up your own approx to S for your specific system. >> > > > >> > > > >> > > > Thanks, >> > > > Dave >> > > > >> > > > >> > > > On Wed, 11 Jan 2017 at 22:34, David Knezevic < >> david.knezevic at akselos.com> wrote: >> > > > I have a definite block 2x2 system and I figured it'd be good to >> apply the PCFIELDSPLIT functionality with Schur complement, as described in >> Section 4.5 of the manual. >> > > > >> > > > The A00 block of my matrix is very small so I figured I'd specify a >> direct solver (i.e. MUMPS) for that block. >> > > > >> > > > So I did the following: >> > > > - PCFieldSplitSetIS to specify the indices of the two splits >> > > > - PCFieldSplitGetSubKSP to get the two KSP objects, and to set the >> solver and PC types for each (MUMPS for A00, ILU+CG for A11) >> > > > - I set -pc_fieldsplit_schur_fact_type full >> > > > >> > > > Below I have pasted the output of "-ksp_view -ksp_monitor >> -log_view" for a test case. It seems to converge well, but I'm concerned >> about the speed (about 90 seconds, vs. about 1 second if I use a direct >> solver for the entire system). I just wanted to check whether I'm setting this >> up in a good way.
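[Editor's note: for reference, the setup David describes (splits set via PCFieldSplitSetIS in code) corresponds roughly to the following runtime options. This is a sketch, not David's actual configuration: the fieldsplit_RB_split_/fieldsplit_FE_split_ prefixes are taken from the -ksp_view output below, and the last option is the selfp variant Dave suggests, built from A10 inv(diag(A00)) A01 rather than A11.]

```
-pc_type fieldsplit
-pc_fieldsplit_type schur
-pc_fieldsplit_schur_fact_type full
# A00 block: direct solve via MUMPS (Cholesky, since the view reports SYM=2)
-fieldsplit_RB_split_ksp_type preonly
-fieldsplit_RB_split_pc_type cholesky
-fieldsplit_RB_split_pc_factor_mat_solver_package mumps
# Schur block: CG with block-Jacobi/ILU, as in the view below
-fieldsplit_FE_split_ksp_type cg
-fieldsplit_FE_split_pc_type bjacobi
# Dave's suggestion: precondition S with the "selfp" approximation
-pc_fieldsplit_schur_precondition selfp
```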
>> > > > >> > > > Many thanks, >> > > > David >> > > > >> > > > ------------------------------------------------------------ >> ----------------------- >> > > > >> > > > 0 KSP Residual norm 5.405774214400e+04 >> > > > 1 KSP Residual norm 1.849649014371e+02 >> > > > 2 KSP Residual norm 7.462775074989e-02 >> > > > 3 KSP Residual norm 2.680497175260e-04 >> > > > KSP Object: 1 MPI processes >> > > > type: cg >> > > > maximum iterations=1000 >> > > > tolerances: relative=1e-06, absolute=1e-50, divergence=10000. >> > > > left preconditioning >> > > > using nonzero initial guess >> > > > using PRECONDITIONED norm type for convergence test >> > > > PC Object: 1 MPI processes >> > > > type: fieldsplit >> > > > FieldSplit with Schur preconditioner, factorization FULL >> > > > Preconditioner for the Schur complement formed from A11 >> > > > Split info: >> > > > Split number 0 Defined by IS >> > > > Split number 1 Defined by IS >> > > > KSP solver for A00 block >> > > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes >> > > > type: preonly >> > > > maximum iterations=10000, initial guess is zero >> > > > tolerances: relative=1e-05, absolute=1e-50, >> divergence=10000. >> > > > left preconditioning >> > > > using NONE norm type for convergence test >> > > > PC Object: (fieldsplit_RB_split_) 1 MPI processes >> > > > type: cholesky >> > > > Cholesky: out-of-place factorization >> > > > tolerance for zero pivot 2.22045e-14 >> > > > matrix ordering: natural >> > > > factor fill ratio given 0., needed 0. 
>> > > > Factored matrix follows: >> > > > Mat Object: 1 MPI processes >> > > > type: seqaij >> > > > rows=324, cols=324 >> > > > package used to perform factorization: mumps >> > > > total: nonzeros=3042, allocated nonzeros=3042 >> > > > total number of mallocs used during MatSetValues >> calls =0 >> > > > MUMPS run parameters: >> > > > SYM (matrix type): 2 >> > > > PAR (host participation): 1 >> > > > ICNTL(1) (output for error): 6 >> > > > ICNTL(2) (output of diagnostic msg): 0 >> > > > ICNTL(3) (output for global info): 0 >> > > > ICNTL(4) (level of printing): 0 >> > > > ICNTL(5) (input mat struct): 0 >> > > > ICNTL(6) (matrix prescaling): 7 >> > > > ICNTL(7) (sequentia matrix ordering):7 >> > > > ICNTL(8) (scalling strategy): 77 >> > > > ICNTL(10) (max num of refinements): 0 >> > > > ICNTL(11) (error analysis): 0 >> > > > ICNTL(12) (efficiency control): >> 0 >> > > > ICNTL(13) (efficiency control): >> 0 >> > > > ICNTL(14) (percentage of estimated workspace >> increase): 20 >> > > > ICNTL(18) (input mat struct): >> 0 >> > > > ICNTL(19) (Shur complement info): >> 0 >> > > > ICNTL(20) (rhs sparse pattern): >> 0 >> > > > ICNTL(21) (solution struct): >> 0 >> > > > ICNTL(22) (in-core/out-of-core facility): >> 0 >> > > > ICNTL(23) (max size of memory can be allocated >> locally):0 >> > > > ICNTL(24) (detection of null pivot rows): >> 0 >> > > > ICNTL(25) (computation of a null space basis): >> 0 >> > > > ICNTL(26) (Schur options for rhs or solution): >> 0 >> > > > ICNTL(27) (experimental parameter): >> -24 >> > > > ICNTL(28) (use parallel or sequential >> ordering): 1 >> > > > ICNTL(29) (parallel ordering): >> 0 >> > > > ICNTL(30) (user-specified set of entries in >> inv(A)): 0 >> > > > ICNTL(31) (factors is discarded in the solve >> phase): 0 >> > > > ICNTL(33) (compute determinant): >> 0 >> > > > CNTL(1) (relative pivoting threshold): 0.01 >> > > > CNTL(2) (stopping criterion of refinement): >> 1.49012e-08 >> > > > CNTL(3) (absolute pivoting threshold): 0. 
>> > > > CNTL(4) (value of static pivoting): -1. >> > > > CNTL(5) (fixation for null pivots): 0. >> > > > RINFO(1) (local estimated flops for the >> elimination after analysis): >> > > > [0] 29394. >> > > > RINFO(2) (local estimated flops for the >> assembly after factorization): >> > > > [0] 1092. >> > > > RINFO(3) (local estimated flops for the >> elimination after factorization): >> > > > [0] 29394. >> > > > INFO(15) (estimated size of (in MB) MUMPS >> internal data for running numerical factorization): >> > > > [0] 1 >> > > > INFO(16) (size of (in MB) MUMPS internal data >> used during numerical factorization): >> > > > [0] 1 >> > > > INFO(23) (num of pivots eliminated on this >> processor after factorization): >> > > > [0] 324 >> > > > RINFOG(1) (global estimated flops for the >> elimination after analysis): 29394. >> > > > RINFOG(2) (global estimated flops for the >> assembly after factorization): 1092. >> > > > RINFOG(3) (global estimated flops for the >> elimination after factorization): 29394. 
>> > > > (RINFOG(12) RINFOG(13))*2^INFOG(34) >> (determinant): (0.,0.)*(2^0) >> > > > INFOG(3) (estimated real workspace for factors >> on all processors after analysis): 3888 >> > > > INFOG(4) (estimated integer workspace for >> factors on all processors after analysis): 2067 >> > > > INFOG(5) (estimated maximum front size in the >> complete tree): 12 >> > > > INFOG(6) (number of nodes in the complete >> tree): 53 >> > > > INFOG(7) (ordering option effectively use after >> analysis): 2 >> > > > INFOG(8) (structural symmetry in percent of the >> permuted matrix after analysis): 100 >> > > > INFOG(9) (total real/complex workspace to store >> the matrix factors after factorization): 3888 >> > > > INFOG(10) (total integer space store the matrix >> factors after factorization): 2067 >> > > > INFOG(11) (order of largest frontal matrix >> after factorization): 12 >> > > > INFOG(12) (number of off-diagonal pivots): 0 >> > > > INFOG(13) (number of delayed pivots after >> factorization): 0 >> > > > INFOG(14) (number of memory compress after >> factorization): 0 >> > > > INFOG(15) (number of steps of iterative >> refinement after solution): 0 >> > > > INFOG(16) (estimated size (in MB) of all MUMPS >> internal data for factorization after analysis: value on the most memory >> consuming processor): 1 >> > > > INFOG(17) (estimated size of all MUMPS internal >> data for factorization after analysis: sum over all processors): 1 >> > > > INFOG(18) (size of all MUMPS internal data >> allocated during factorization: value on the most memory consuming >> processor): 1 >> > > > INFOG(19) (size of all MUMPS internal data >> allocated during factorization: sum over all processors): 1 >> > > > INFOG(20) (estimated number of entries in the >> factors): 3042 >> > > > INFOG(21) (size in MB of memory effectively >> used during factorization - value on the most memory consuming processor): 1 >> > > > INFOG(22) (size in MB of memory effectively >> used during factorization - sum over all 
processors): 1 >> > > > INFOG(23) (after analysis: value of ICNTL(6) >> effectively used): 5 >> > > > INFOG(24) (after analysis: value of ICNTL(12) >> effectively used): 1 >> > > > INFOG(25) (after factorization: number of >> pivots modified by static pivoting): 0 >> > > > INFOG(28) (after factorization: number of null >> pivots encountered): 0 >> > > > INFOG(29) (after factorization: effective >> number of entries in the factors (sum over all processors)): 3042 >> > > > INFOG(30, 31) (after solution: size in Mbytes >> of memory used during solution phase): 0, 0 >> > > > INFOG(32) (after analysis: type of analysis >> done): 1 >> > > > INFOG(33) (value used for ICNTL(8)): -2 >> > > > INFOG(34) (exponent of the determinant if >> determinant is requested): 0 >> > > > linear system matrix = precond matrix: >> > > > Mat Object: (fieldsplit_RB_split_) 1 MPI >> processes >> > > > type: seqaij >> > > > rows=324, cols=324 >> > > > total: nonzeros=5760, allocated nonzeros=5760 >> > > > total number of mallocs used during MatSetValues calls =0 >> > > > using I-node routines: found 108 nodes, limit used is 5 >> > > > KSP solver for S = A11 - A10 inv(A00) A01 >> > > > KSP Object: (fieldsplit_FE_split_) 1 MPI processes >> > > > type: cg >> > > > maximum iterations=10000, initial guess is zero >> > > > tolerances: relative=1e-05, absolute=1e-50, >> divergence=10000. >> > > > left preconditioning >> > > > using PRECONDITIONED norm type for convergence test >> > > > PC Object: (fieldsplit_FE_split_) 1 MPI processes >> > > > type: bjacobi >> > > > block Jacobi: number of blocks = 1 >> > > > Local solve is same for all blocks, in the following KSP >> and PC objects: >> > > > KSP Object: (fieldsplit_FE_split_sub_) >> 1 MPI processes >> > > > type: preonly >> > > > maximum iterations=10000, initial guess is zero >> > > > tolerances: relative=1e-05, absolute=1e-50, >> divergence=10000. 
>> > > > left preconditioning >> > > > using NONE norm type for convergence test >> > > > PC Object: (fieldsplit_FE_split_sub_) >> 1 MPI processes >> > > > type: ilu >> > > > ILU: out-of-place factorization >> > > > 0 levels of fill >> > > > tolerance for zero pivot 2.22045e-14 >> > > > matrix ordering: natural >> > > > factor fill ratio given 1., needed 1. >> > > > Factored matrix follows: >> > > > Mat Object: 1 MPI processes >> > > > type: seqaij >> > > > rows=28476, cols=28476 >> > > > package used to perform factorization: petsc >> > > > total: nonzeros=1017054, allocated >> nonzeros=1017054 >> > > > total number of mallocs used during >> MatSetValues calls =0 >> > > > using I-node routines: found 9492 nodes, >> limit used is 5 >> > > > linear system matrix = precond matrix: >> > > > Mat Object: (fieldsplit_FE_split_) >> 1 MPI processes >> > > > type: seqaij >> > > > rows=28476, cols=28476 >> > > > total: nonzeros=1017054, allocated nonzeros=1017054 >> > > > total number of mallocs used during MatSetValues >> calls =0 >> > > > using I-node routines: found 9492 nodes, limit used >> is 5 >> > > > linear system matrix followed by preconditioner matrix: >> > > > Mat Object: (fieldsplit_FE_split_) 1 MPI >> processes >> > > > type: schurcomplement >> > > > rows=28476, cols=28476 >> > > > Schur complement A11 - A10 inv(A00) A01 >> > > > A11 >> > > > Mat Object: (fieldsplit_FE_split_) >> 1 MPI processes >> > > > type: seqaij >> > > > rows=28476, cols=28476 >> > > > total: nonzeros=1017054, allocated nonzeros=1017054 >> > > > total number of mallocs used during MatSetValues >> calls =0 >> > > > using I-node routines: found 9492 nodes, limit >> used is 5 >> > > > A10 >> > > > Mat Object: 1 MPI processes >> > > > type: seqaij >> > > > rows=28476, cols=324 >> > > > total: nonzeros=936, allocated nonzeros=936 >> > > > total number of mallocs used during MatSetValues >> calls =0 >> > > > using I-node routines: found 5717 nodes, limit >> used is 5 >> > > > KSP of A00 >> > > > 
KSP Object: (fieldsplit_RB_split_) >> 1 MPI processes >> > > > type: preonly >> > > > maximum iterations=10000, initial guess is zero >> > > > tolerances: relative=1e-05, absolute=1e-50, >> divergence=10000. >> > > > left preconditioning >> > > > using NONE norm type for convergence test >> > > > PC Object: (fieldsplit_RB_split_) >> 1 MPI processes >> > > > type: cholesky >> > > > Cholesky: out-of-place factorization >> > > > tolerance for zero pivot 2.22045e-14 >> > > > matrix ordering: natural >> > > > factor fill ratio given 0., needed 0. >> > > > Factored matrix follows: >> > > > Mat Object: 1 MPI >> processes >> > > > type: seqaij >> > > > rows=324, cols=324 >> > > > package used to perform factorization: mumps >> > > > total: nonzeros=3042, allocated >> nonzeros=3042 >> > > > total number of mallocs used during >> MatSetValues calls =0 >> > > > MUMPS run parameters: >> > > > SYM (matrix type): 2 >> > > > PAR (host participation): 1 >> > > > ICNTL(1) (output for error): 6 >> > > > ICNTL(2) (output of diagnostic msg): 0 >> > > > ICNTL(3) (output for global info): 0 >> > > > ICNTL(4) (level of printing): 0 >> > > > ICNTL(5) (input mat struct): 0 >> > > > ICNTL(6) (matrix prescaling): 7 >> > > > ICNTL(7) (sequentia matrix ordering):7 >> > > > ICNTL(8) (scalling strategy): 77 >> > > > ICNTL(10) (max num of refinements): 0 >> > > > ICNTL(11) (error analysis): 0 >> > > > ICNTL(12) (efficiency control): >> 0 >> > > > ICNTL(13) (efficiency control): >> 0 >> > > > ICNTL(14) (percentage of estimated >> workspace increase): 20 >> > > > ICNTL(18) (input mat struct): >> 0 >> > > > ICNTL(19) (Shur complement info): >> 0 >> > > > ICNTL(20) (rhs sparse pattern): >> 0 >> > > > ICNTL(21) (solution struct): >> 0 >> > > > ICNTL(22) (in-core/out-of-core >> facility): 0 >> > > > ICNTL(23) (max size of memory can be >> allocated locally):0 >> > > > ICNTL(24) (detection of null pivot >> rows): 0 >> > > > ICNTL(25) (computation of a null space >> basis): 0 >> > > > ICNTL(26) (Schur 
options for rhs or >> solution): 0 >> > > > ICNTL(27) (experimental parameter): >> -24 >> > > > ICNTL(28) (use parallel or sequential >> ordering): 1 >> > > > ICNTL(29) (parallel ordering): >> 0 >> > > > ICNTL(30) (user-specified set of >> entries in inv(A)): 0 >> > > > ICNTL(31) (factors is discarded in the >> solve phase): 0 >> > > > ICNTL(33) (compute determinant): >> 0 >> > > > CNTL(1) (relative pivoting threshold): >> 0.01 >> > > > CNTL(2) (stopping criterion of >> refinement): 1.49012e-08 >> > > > CNTL(3) (absolute pivoting threshold): >> 0. >> > > > CNTL(4) (value of static pivoting): >> -1. >> > > > CNTL(5) (fixation for null pivots): >> 0. >> > > > RINFO(1) (local estimated flops for the >> elimination after analysis): >> > > > [0] 29394. >> > > > RINFO(2) (local estimated flops for the >> assembly after factorization): >> > > > [0] 1092. >> > > > RINFO(3) (local estimated flops for the >> elimination after factorization): >> > > > [0] 29394. >> > > > INFO(15) (estimated size of (in MB) >> MUMPS internal data for running numerical factorization): >> > > > [0] 1 >> > > > INFO(16) (size of (in MB) MUMPS >> internal data used during numerical factorization): >> > > > [0] 1 >> > > > INFO(23) (num of pivots eliminated on >> this processor after factorization): >> > > > [0] 324 >> > > > RINFOG(1) (global estimated flops for >> the elimination after analysis): 29394. >> > > > RINFOG(2) (global estimated flops for >> the assembly after factorization): 1092. >> > > > RINFOG(3) (global estimated flops for >> the elimination after factorization): 29394. 
>> > > > (RINFOG(12) RINFOG(13))*2^INFOG(34) >> (determinant): (0.,0.)*(2^0) >> > > > INFOG(3) (estimated real workspace for >> factors on all processors after analysis): 3888 >> > > > INFOG(4) (estimated integer workspace >> for factors on all processors after analysis): 2067 >> > > > INFOG(5) (estimated maximum front size >> in the complete tree): 12 >> > > > INFOG(6) (number of nodes in the >> complete tree): 53 >> > > > INFOG(7) (ordering option effectively >> use after analysis): 2 >> > > > INFOG(8) (structural symmetry in >> percent of the permuted matrix after analysis): 100 >> > > > INFOG(9) (total real/complex workspace >> to store the matrix factors after factorization): 3888 >> > > > INFOG(10) (total integer space store >> the matrix factors after factorization): 2067 >> > > > INFOG(11) (order of largest frontal >> matrix after factorization): 12 >> > > > INFOG(12) (number of off-diagonal >> pivots): 0 >> > > > INFOG(13) (number of delayed pivots >> after factorization): 0 >> > > > INFOG(14) (number of memory compress >> after factorization): 0 >> > > > INFOG(15) (number of steps of iterative >> refinement after solution): 0 >> > > > INFOG(16) (estimated size (in MB) of >> all MUMPS internal data for factorization after analysis: value on the most >> memory consuming processor): 1 >> > > > INFOG(17) (estimated size of all MUMPS >> internal data for factorization after analysis: sum over all processors): 1 >> > > > INFOG(18) (size of all MUMPS internal >> data allocated during factorization: value on the most memory consuming >> processor): 1 >> > > > INFOG(19) (size of all MUMPS internal >> data allocated during factorization: sum over all processors): 1 >> > > > INFOG(20) (estimated number of entries >> in the factors): 3042 >> > > > INFOG(21) (size in MB of memory >> effectively used during factorization - value on the most memory consuming >> processor): 1 >> > > > INFOG(22) (size in MB of memory >> effectively used during factorization - sum over all 
processors): 1 >> > > > INFOG(23) (after analysis: value of >> ICNTL(6) effectively used): 5 >> > > > INFOG(24) (after analysis: value of >> ICNTL(12) effectively used): 1 >> > > > INFOG(25) (after factorization: number >> of pivots modified by static pivoting): 0 >> > > > INFOG(28) (after factorization: number >> of null pivots encountered): 0 >> > > > INFOG(29) (after factorization: >> effective number of entries in the factors (sum over all processors)): 3042 >> > > > INFOG(30, 31) (after solution: size in >> Mbytes of memory used during solution phase): 0, 0 >> > > > INFOG(32) (after analysis: type of >> analysis done): 1 >> > > > INFOG(33) (value used for ICNTL(8)): -2 >> > > > INFOG(34) (exponent of the determinant >> if determinant is requested): 0 >> > > > linear system matrix = precond matrix: >> > > > Mat Object: (fieldsplit_RB_split_) >> 1 MPI processes >> > > > type: seqaij >> > > > rows=324, cols=324 >> > > > total: nonzeros=5760, allocated nonzeros=5760 >> > > > total number of mallocs used during MatSetValues >> calls =0 >> > > > using I-node routines: found 108 nodes, limit >> used is 5 >> > > > A01 >> > > > Mat Object: 1 MPI processes >> > > > type: seqaij >> > > > rows=324, cols=28476 >> > > > total: nonzeros=936, allocated nonzeros=936 >> > > > total number of mallocs used during MatSetValues >> calls =0 >> > > > using I-node routines: found 67 nodes, limit used >> is 5 >> > > > Mat Object: (fieldsplit_FE_split_) 1 MPI >> processes >> > > > type: seqaij >> > > > rows=28476, cols=28476 >> > > > total: nonzeros=1017054, allocated nonzeros=1017054 >> > > > total number of mallocs used during MatSetValues calls =0 >> > > > using I-node routines: found 9492 nodes, limit used is 5 >> > > > linear system matrix = precond matrix: >> > > > Mat Object: () 1 MPI processes >> > > > type: seqaij >> > > > rows=28800, cols=28800 >> > > > total: nonzeros=1024686, allocated nonzeros=1024794 >> > > > total number of mallocs used during MatSetValues calls =0 >> > > 
> using I-node routines: found 9600 nodes, limit used is 5 >> > > > >> > > > >> > > > ---------------------------------------------- PETSc Performance >> Summary: ---------------------------------------------- >> > > > >> > > > /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a >> arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 >> 16:16:47 2017 >> > > > Using Petsc Release Version 3.7.3, unknown >> > > > >> > > > Max Max/Min Avg Total >> > > > Time (sec): 9.179e+01 1.00000 9.179e+01 >> > > > Objects: 1.990e+02 1.00000 1.990e+02 >> > > > Flops: 1.634e+11 1.00000 1.634e+11 1.634e+11 >> > > > Flops/sec: 1.780e+09 1.00000 1.780e+09 1.780e+09 >> > > > MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 >> > > > MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 >> > > > MPI Reductions: 0.000e+00 0.00000 >> > > > >> > > > Flop counting convention: 1 flop = 1 real number operation of type >> (multiply/divide/add/subtract) >> > > > e.g., VecAXPY() for real vectors of >> length N --> 2N flops >> > > > and VecAXPY() for complex vectors of >> length N --> 8N flops >> > > > >> > > > Summary of Stages: ----- Time ------ ----- Flops ----- --- >> Messages --- -- Message Lengths -- -- Reductions -- >> > > > Avg %Total Avg %Total counts >> %Total Avg %Total counts %Total >> > > > 0: Main Stage: 9.1787e+01 100.0% 1.6336e+11 100.0% >> 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% >> > > > >> > > > ------------------------------------------------------------ >> ------------------------------------------------------------ >> > > > See the 'Profiling' chapter of the users' manual for details on >> interpreting output. >> > > > Phase summary info: >> > > > Count: number of times phase was executed >> > > > Time and Flops: Max - maximum over all processors >> > > > Ratio - ratio of maximum to minimum over all >> processors >> > > > Mess: number of messages sent >> > > > Avg. 
len: average message length (bytes) >> > > > Reduct: number of global reductions >> > > > Global: entire computation >> > > > Stage: stages of a computation. Set stages with >> PetscLogStagePush() and PetscLogStagePop(). >> > > > %T - percent time in this phase %F - percent flops in >> this phase >> > > > %M - percent messages in this phase %L - percent message >> lengths in this phase >> > > > %R - percent reductions in this phase >> > > > Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max >> time over all processors) >> > > > ------------------------------------------------------------ >> ------------------------------------------------------------ >> > > > Event Count Time (sec) Flops >> --- Global --- --- Stage --- Total >> > > > Max Ratio Max Ratio Max Ratio Mess >> Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s >> > > > ------------------------------------------------------------ >> ------------------------------------------------------------ >> > > > >> > > > --- Event Stage 0: Main Stage >> > > > >> > > > VecDot 42 1.0 2.4080e-05 1.0 8.53e+03 1.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 354 >> > > > VecTDot 74012 1.0 1.2440e+00 1.0 4.22e+09 1.0 0.0e+00 >> 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3388 >> > > > VecNorm 37020 1.0 8.3580e-01 1.0 2.11e+09 1.0 0.0e+00 >> 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2523 >> > > > VecScale 37008 1.0 3.5800e-01 1.0 1.05e+09 1.0 0.0e+00 >> 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2944 >> > > > VecCopy 37034 1.0 2.5754e-02 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > VecSet 74137 1.0 3.0537e-01 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > VecAXPY 74029 1.0 1.7233e+00 1.0 4.22e+09 1.0 0.0e+00 >> 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2446 >> > > > VecAYPX 37001 1.0 1.2214e+00 1.0 2.11e+09 1.0 0.0e+00 >> 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 1725 >> > > > VecAssemblyBegin 68 1.0 2.0432e-04 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > 
> VecAssemblyEnd 68 1.0 2.5988e-05 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > VecScatterBegin 48 1.0 4.6921e-04 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatMult 37017 1.0 4.1269e+01 1.0 7.65e+10 1.0 0.0e+00 >> 0.0e+00 0.0e+00 45 47 0 0 0 45 47 0 0 0 1853 >> > > > MatMultAdd 37015 1.0 3.3638e+01 1.0 7.53e+10 1.0 0.0e+00 >> 0.0e+00 0.0e+00 37 46 0 0 0 37 46 0 0 0 2238 >> > > > MatSolve 74021 1.0 4.6602e+01 1.0 7.42e+10 1.0 0.0e+00 >> 0.0e+00 0.0e+00 51 45 0 0 0 51 45 0 0 0 1593 >> > > > MatLUFactorNum 1 1.0 1.7209e-02 1.0 2.44e+07 1.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1420 >> > > > MatCholFctrSym 1 1.0 8.8310e-04 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatCholFctrNum 1 1.0 3.6907e-04 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatILUFactorSym 1 1.0 3.7372e-03 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatAssemblyBegin 29 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatAssemblyEnd 29 1.0 9.9473e-03 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatGetRow 58026 1.0 2.8155e-03 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatGetRowIJ 2 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatGetSubMatrice 6 1.0 1.5399e-02 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatGetOrdering 2 1.0 3.0112e-04 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatZeroEntries 6 1.0 2.9490e-03 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > MatView 7 1.0 3.4356e-03 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > KSPSetUp 4 1.0 9.4891e-05 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > KSPSolve 1 1.0 8.8793e+01 1.0 1.63e+11 1.0 0.0e+00 >> 
0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 >> > > > PCSetUp 4 1.0 3.8375e-02 1.0 2.44e+07 1.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 637 >> > > > PCSetUpOnBlocks 5 1.0 2.1250e-02 1.0 2.44e+07 1.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1150 >> > > > PCApply 5 1.0 8.8789e+01 1.0 1.63e+11 1.0 0.0e+00 >> 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 >> > > > KSPSolve_FS_0 5 1.0 7.5364e-04 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > KSPSolve_FS_Schu 5 1.0 8.8785e+01 1.0 1.63e+11 1.0 0.0e+00 >> 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 >> > > > KSPSolve_FS_Low 5 1.0 2.1019e-03 1.0 0.00e+00 0.0 0.0e+00 >> 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 >> > > > ------------------------------------------------------------ >> ------------------------------------------------------------ >> > > > >> > > > Memory usage is given in bytes: >> > > > >> > > > Object Type Creations Destructions Memory >> Descendants' Mem. >> > > > Reports information only for process 0. >> > > > >> > > > --- Event Stage 0: Main Stage >> > > > >> > > > Vector 91 91 9693912 0. >> > > > Vector Scatter 24 24 15936 0. >> > > > Index Set 51 51 537888 0. >> > > > IS L to G Mapping 3 3 240408 0. >> > > > Matrix 13 13 64097868 0. >> > > > Krylov Solver 6 6 7888 0. >> > > > Preconditioner 6 6 6288 0. >> > > > Viewer 1 0 0 0. >> > > > Distributed Mesh 1 1 4624 0. >> > > > Star Forest Bipartite Graph 2 2 1616 0. >> > > > Discrete System 1 1 872 0. >> > > > ============================================================ >> ============================================================ >> > > > Average time to get PetscTime(): 0. 
>> > > > #PETSc Option Table entries: >> > > > -ksp_monitor >> > > > -ksp_view >> > > > -log_view >> > > > #End of PETSc Option Table entries >> > > > Compiled without FORTRAN kernels >> > > > Compiled with full precision matrices (default) >> > > > sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 >> sizeof(PetscScalar) 8 sizeof(PetscInt) 4 >> > > > Configure options: --with-shared-libraries=1 --with-debugging=0 >> --download-suitesparse --download-blacs --download-ptscotch=yes >> --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl >> --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps >> --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc >> --download-hypre --download-ml >> > > > ----------------------------------------- >> > > > Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo >> > > > Machine characteristics: Linux-4.4.0-38-generic-x86_64- >> with-Ubuntu-16.04-xenial >> > > > Using PETSc directory: /home/dknez/software/petsc-src >> > > > Using PETSc arch: arch-linux2-c-opt >> > > > ----------------------------------------- >> > > > >> > > > Using C compiler: mpicc -fPIC -Wall -Wwrite-strings >> -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O >> ${COPTFLAGS} ${CFLAGS} >> > > > Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 >> -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} >> > > > ----------------------------------------- >> > > > >> > > > Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include >> -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include >> -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include >> -I/home/dknez/software/libmesh_install/opt_real/petsc/include >> -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent >> -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include >> -I/usr/lib/openmpi/include 
-I/usr/lib/openmpi/include/openmpi >> > > > ----------------------------------------- >> > > > >> > > > Using C linker: mpicc >> > > > Using Fortran linker: mpif90 >> > > > Using libraries: -Wl,-rpath,/home/dknez/softwar >> e/petsc-src/arch-linux2-c-opt/lib -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib >> -lpetsc -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib >> -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps >> -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE >> -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib >> -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 >> -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu >> -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx >> -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod >> -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig >> -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 >> -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 >> -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch >> -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 >> -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm >> -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz >> -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib >> -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 >> -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu >> -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu >> -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl >> -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl >> > > > ----------------------------------------- >> > > > >> > > > >> > > > >> > > > >> > > > >> > > > >> > > > >> > > >> > > >> > > >> > >> > >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely 
more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Thu Jan 12 10:30:52 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 12 Jan 2017 10:30:52 -0600 Subject: [petsc-users] Using PCFIELDSPLIT with -pc_fieldsplit_type schur In-Reply-To: References: <9641EA68-FC6B-48DC-8A5E-A8B75E54EA64@mcs.anl.gov> <8DBB9010-9A29-4028-A9C1-98DB695602C1@mcs.anl.gov> Message-ID: <1642E4B0-E8B4-48A5-B160-DAC7548F942F@mcs.anl.gov> > On Jan 12, 2017, at 5:16 AM, Matthew Knepley wrote: > > On Wed, Jan 11, 2017 at 10:37 PM, David Knezevic wrote: > On Wed, Jan 11, 2017 at 9:55 PM, Barry Smith wrote: > > That is disappointing, > > Please try using > > PCFieldSplitSetSchurPre(pc, PC_FIELDSPLIT_SCHUR_PRE_FULL, NULL); > > I got an error: > [0]PETSC ERROR: No support for this operation for this object type > [0]PETSC ERROR: Not yet implemented for Schur complements with non-vanishing D > > Barry fixed this, but it might only be in the 3.7.5 release. It is only supported in master. Barry > > Thanks, > > Matt > > David > > > > On Jan 11, 2017, at 8:49 PM, David Knezevic wrote: > > > > OK, that's encouraging. However, regarding this: > > > > So the next step is to try using -fieldsplit_FE_split_ksp_monitor -fieldsplit_FE_split_pc_type gamg > > > > I tried this and it didn't converge at all (it hit the 10000 iteration max in the output from -fieldsplit_FE_split_ksp_monitor). So I guess I'd need to attach the near nullspace to make this work reasonably, as you said. Sounds like that may not be easy to do in this case though? I'll try some other preconditioners in the meantime. > > > > Thanks, > > David > > > > > > On Wed, Jan 11, 2017 at 9:31 PM, Barry Smith wrote: > > > > Thanks, this is very useful information.
It means that > > > > 1) the approximate Sp is actually a very good approximation to the true Schur complement S, since using Sp^-1 to precondition S gives iteration counts from 8 to 13. > > > > 2) using ilu(0) as a preconditioner for Sp is not good, since replacing Sp^-1 with ilu(0) of Sp gives absurd iteration counts. This is actually not super surprising since ilu(0) is generally "not so good" for elasticity. > > > > So the next step is to try using -fieldsplit_FE_split_ksp_monitor -fieldsplit_FE_split_pc_type gamg > > > > The one open question is whether any options should be passed to the gamg to tell it that the underlying problem comes from "elasticity"; that is, something about the null space. > > > > Mark Adams, since the GAMG is coming from inside another preconditioner it may not be easy for the user to attach the near null space to that inner matrix. Would it make sense for there to be a GAMG command line option to indicate that it is a 3d elasticity problem so GAMG could set up the near null space for itself? Or does that not make sense? > > > > Barry > > > > > > > > > On Jan 11, 2017, at 7:47 PM, David Knezevic wrote: > > > > > > I've attached the two log files. Using cholesky for "FE_split" seems to have helped a lot! > > > > > > David > > > > > > > > > -- > > > David J. Knezevic | CTO > > > Akselos | 210 Broadway, #201 | Cambridge, MA | 02139 > > > > > > Phone: +1-617-599-4755 > > > > > > This e-mail and any attachments may contain confidential material for the sole use of the intended recipient(s). Any review or distribution by others is strictly prohibited. If you are not the intended recipient, please contact the sender and delete all copies. > > > > > > On Wed, Jan 11, 2017 at 8:32 PM, Barry Smith wrote: > > > > > > Can you please run with all the monitoring on?
So we can see the convergence of all the inner solvers > > > -fieldsplit_FE_split_ksp_monitor > > > > > > Then run again with > > > > > > -fieldsplit_FE_split_ksp_monitor -fieldsplit_FE_split_pc_type cholesky > > > > > > > > > and send both sets of results > > > > > > Barry > > > > > > > > > > On Jan 11, 2017, at 6:32 PM, David Knezevic wrote: > > > > > > > > On Wed, Jan 11, 2017 at 5:52 PM, Dave May wrote: > > > > so I gather that I'll have to look into a user-defined approximation to S. > > > > > > > > Where does the 2x2 block system come from? > > > > Maybe someone on the list knows the right approximation to use for S. > > > > > > > > The model is 3D linear elasticity using a finite element discretization. I applied substructuring to part of the system to "condense" it, and that results in the small A00 block. The A11 block is just standard 3D elasticity; no substructuring was applied there. There are constraints to connect the degrees of freedom on the interface of the substructured and non-substructured regions. > > > > > > > > If anyone has suggestions for a good way to precondition this type of system, I'd be most appreciative! > > > > > > > > Thanks, > > > > David > > > > > > > > > > > > > > > > ----------------------------------------- > > > > > > > > 0 KSP Residual norm 5.405528187695e+04 > > > > 1 KSP Residual norm 2.187814910803e+02 > > > > 2 KSP Residual norm 1.019051577515e-01 > > > > 3 KSP Residual norm 4.370464012859e-04 > > > > KSP Object: 1 MPI processes > > > > type: cg > > > > maximum iterations=1000 > > > > tolerances: relative=1e-06, absolute=1e-50, divergence=10000. 
> > > > left preconditioning > > > > using nonzero initial guess > > > > using PRECONDITIONED norm type for convergence test > > > > PC Object: 1 MPI processes > > > > type: fieldsplit > > > > FieldSplit with Schur preconditioner, factorization FULL > > > > Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse > > > > Split info: > > > > Split number 0 Defined by IS > > > > Split number 1 Defined by IS > > > > KSP solver for A00 block > > > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > > > type: preonly > > > > maximum iterations=10000, initial guess is zero > > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > > left preconditioning > > > > using NONE norm type for convergence test > > > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > > > type: cholesky > > > > Cholesky: out-of-place factorization > > > > tolerance for zero pivot 2.22045e-14 > > > > matrix ordering: natural > > > > factor fill ratio given 0., needed 0. 
> > > > Factored matrix follows: > > > > Mat Object: 1 MPI processes > > > > type: seqaij > > > > rows=324, cols=324 > > > > package used to perform factorization: mumps > > > > total: nonzeros=3042, allocated nonzeros=3042 > > > > total number of mallocs used during MatSetValues calls =0 > > > > MUMPS run parameters: > > > > SYM (matrix type): 2 > > > > PAR (host participation): 1 > > > > ICNTL(1) (output for error): 6 > > > > ICNTL(2) (output of diagnostic msg): 0 > > > > ICNTL(3) (output for global info): 0 > > > > ICNTL(4) (level of printing): 0 > > > > ICNTL(5) (input mat struct): 0 > > > > ICNTL(6) (matrix prescaling): 7 > > > > ICNTL(7) (sequentia matrix ordering):7 > > > > ICNTL(8) (scalling strategy): 77 > > > > ICNTL(10) (max num of refinements): 0 > > > > ICNTL(11) (error analysis): 0 > > > > ICNTL(12) (efficiency control): 0 > > > > ICNTL(13) (efficiency control): 0 > > > > ICNTL(14) (percentage of estimated workspace increase): 20 > > > > ICNTL(18) (input mat struct): 0 > > > > ICNTL(19) (Shur complement info): 0 > > > > ICNTL(20) (rhs sparse pattern): 0 > > > > ICNTL(21) (solution struct): 0 > > > > ICNTL(22) (in-core/out-of-core facility): 0 > > > > ICNTL(23) (max size of memory can be allocated locally):0 > > > > ICNTL(24) (detection of null pivot rows): 0 > > > > ICNTL(25) (computation of a null space basis): 0 > > > > ICNTL(26) (Schur options for rhs or solution): 0 > > > > ICNTL(27) (experimental parameter): -24 > > > > ICNTL(28) (use parallel or sequential ordering): 1 > > > > ICNTL(29) (parallel ordering): 0 > > > > ICNTL(30) (user-specified set of entries in inv(A)): 0 > > > > ICNTL(31) (factors is discarded in the solve phase): 0 > > > > ICNTL(33) (compute determinant): 0 > > > > CNTL(1) (relative pivoting threshold): 0.01 > > > > CNTL(2) (stopping criterion of refinement): 1.49012e-08 > > > > CNTL(3) (absolute pivoting threshold): 0. > > > > CNTL(4) (value of static pivoting): -1. > > > > CNTL(5) (fixation for null pivots): 0. 
> > > > RINFO(1) (local estimated flops for the elimination after analysis): > > > > [0] 29394. > > > > RINFO(2) (local estimated flops for the assembly after factorization): > > > > [0] 1092. > > > > RINFO(3) (local estimated flops for the elimination after factorization): > > > > [0] 29394. > > > > INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): > > > > [0] 1 > > > > INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): > > > > [0] 1 > > > > INFO(23) (num of pivots eliminated on this processor after factorization): > > > > [0] 324 > > > > RINFOG(1) (global estimated flops for the elimination after analysis): 29394. > > > > RINFOG(2) (global estimated flops for the assembly after factorization): 1092. > > > > RINFOG(3) (global estimated flops for the elimination after factorization): 29394. > > > > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) > > > > INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 > > > > INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 > > > > INFOG(5) (estimated maximum front size in the complete tree): 12 > > > > INFOG(6) (number of nodes in the complete tree): 53 > > > > INFOG(7) (ordering option effectively use after analysis): 2 > > > > INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 > > > > INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 > > > > INFOG(10) (total integer space store the matrix factors after factorization): 2067 > > > > INFOG(11) (order of largest frontal matrix after factorization): 12 > > > > INFOG(12) (number of off-diagonal pivots): 0 > > > > INFOG(13) (number of delayed pivots after factorization): 0 > > > > INFOG(14) (number of memory compress after factorization): 0 > > > > INFOG(15) (number of steps of iterative refinement after solution): 0 > > > > INFOG(16) 
(estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 > > > > INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 > > > > INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 > > > > INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 > > > > INFOG(20) (estimated number of entries in the factors): 3042 > > > > INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 > > > > INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 > > > > INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 > > > > INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 > > > > INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 > > > > INFOG(28) (after factorization: number of null pivots encountered): 0 > > > > INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 > > > > INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 > > > > INFOG(32) (after analysis: type of analysis done): 1 > > > > INFOG(33) (value used for ICNTL(8)): -2 > > > > INFOG(34) (exponent of the determinant if determinant is requested): 0 > > > > linear system matrix = precond matrix: > > > > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > > > > type: seqaij > > > > rows=324, cols=324 > > > > total: nonzeros=5760, allocated nonzeros=5760 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 108 nodes, limit used is 5 > > > > KSP solver for S = A11 - A10 inv(A00) A01 > > > > KSP Object: (fieldsplit_FE_split_) 1 MPI processes > > > > type: cg > > > > 
maximum iterations=10000, initial guess is zero > > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > > left preconditioning > > > > using PRECONDITIONED norm type for convergence test > > > > PC Object: (fieldsplit_FE_split_) 1 MPI processes > > > > type: bjacobi > > > > block Jacobi: number of blocks = 1 > > > > Local solve is same for all blocks, in the following KSP and PC objects: > > > > KSP Object: (fieldsplit_FE_split_sub_) 1 MPI processes > > > > type: preonly > > > > maximum iterations=10000, initial guess is zero > > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > > left preconditioning > > > > using NONE norm type for convergence test > > > > PC Object: (fieldsplit_FE_split_sub_) 1 MPI processes > > > > type: ilu > > > > ILU: out-of-place factorization > > > > 0 levels of fill > > > > tolerance for zero pivot 2.22045e-14 > > > > matrix ordering: natural > > > > factor fill ratio given 1., needed 1. > > > > Factored matrix follows: > > > > Mat Object: 1 MPI processes > > > > type: seqaij > > > > rows=28476, cols=28476 > > > > package used to perform factorization: petsc > > > > total: nonzeros=1037052, allocated nonzeros=1037052 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 9489 nodes, limit used is 5 > > > > linear system matrix = precond matrix: > > > > Mat Object: 1 MPI processes > > > > type: seqaij > > > > rows=28476, cols=28476 > > > > total: nonzeros=1037052, allocated nonzeros=1037052 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 9489 nodes, limit used is 5 > > > > linear system matrix followed by preconditioner matrix: > > > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > > > type: schurcomplement > > > > rows=28476, cols=28476 > > > > Schur complement A11 - A10 inv(A00) A01 > > > > A11 > > > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > > > type: seqaij > > > > 
rows=28476, cols=28476 > > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 9492 nodes, limit used is 5 > > > > A10 > > > > Mat Object: 1 MPI processes > > > > type: seqaij > > > > rows=28476, cols=324 > > > > total: nonzeros=936, allocated nonzeros=936 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 5717 nodes, limit used is 5 > > > > KSP of A00 > > > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > > > type: preonly > > > > maximum iterations=10000, initial guess is zero > > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > > left preconditioning > > > > using NONE norm type for convergence test > > > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > > > type: cholesky > > > > Cholesky: out-of-place factorization > > > > tolerance for zero pivot 2.22045e-14 > > > > matrix ordering: natural > > > > factor fill ratio given 0., needed 0. 
> > > > Factored matrix follows: > > > > Mat Object: 1 MPI processes > > > > type: seqaij > > > > rows=324, cols=324 > > > > package used to perform factorization: mumps > > > > total: nonzeros=3042, allocated nonzeros=3042 > > > > total number of mallocs used during MatSetValues calls =0 > > > > MUMPS run parameters: > > > > SYM (matrix type): 2 > > > > PAR (host participation): 1 > > > > ICNTL(1) (output for error): 6 > > > > ICNTL(2) (output of diagnostic msg): 0 > > > > ICNTL(3) (output for global info): 0 > > > > ICNTL(4) (level of printing): 0 > > > > ICNTL(5) (input mat struct): 0 > > > > ICNTL(6) (matrix prescaling): 7 > > > > ICNTL(7) (sequentia matrix ordering):7 > > > > ICNTL(8) (scalling strategy): 77 > > > > ICNTL(10) (max num of refinements): 0 > > > > ICNTL(11) (error analysis): 0 > > > > ICNTL(12) (efficiency control): 0 > > > > ICNTL(13) (efficiency control): 0 > > > > ICNTL(14) (percentage of estimated workspace increase): 20 > > > > ICNTL(18) (input mat struct): 0 > > > > ICNTL(19) (Shur complement info): 0 > > > > ICNTL(20) (rhs sparse pattern): 0 > > > > ICNTL(21) (solution struct): 0 > > > > ICNTL(22) (in-core/out-of-core facility): 0 > > > > ICNTL(23) (max size of memory can be allocated locally):0 > > > > ICNTL(24) (detection of null pivot rows): 0 > > > > ICNTL(25) (computation of a null space basis): 0 > > > > ICNTL(26) (Schur options for rhs or solution): 0 > > > > ICNTL(27) (experimental parameter): -24 > > > > ICNTL(28) (use parallel or sequential ordering): 1 > > > > ICNTL(29) (parallel ordering): 0 > > > > ICNTL(30) (user-specified set of entries in inv(A)): 0 > > > > ICNTL(31) (factors is discarded in the solve phase): 0 > > > > ICNTL(33) (compute determinant): 0 > > > > CNTL(1) (relative pivoting threshold): 0.01 > > > > CNTL(2) (stopping criterion of refinement): 1.49012e-08 > > > > CNTL(3) (absolute pivoting threshold): 0. > > > > CNTL(4) (value of static pivoting): -1. > > > > CNTL(5) (fixation for null pivots): 0. 
> > > > RINFO(1) (local estimated flops for the elimination after analysis): > > > > [0] 29394. > > > > RINFO(2) (local estimated flops for the assembly after factorization): > > > > [0] 1092. > > > > RINFO(3) (local estimated flops for the elimination after factorization): > > > > [0] 29394. > > > > INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): > > > > [0] 1 > > > > INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): > > > > [0] 1 > > > > INFO(23) (num of pivots eliminated on this processor after factorization): > > > > [0] 324 > > > > RINFOG(1) (global estimated flops for the elimination after analysis): 29394. > > > > RINFOG(2) (global estimated flops for the assembly after factorization): 1092. > > > > RINFOG(3) (global estimated flops for the elimination after factorization): 29394. > > > > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) > > > > INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 > > > > INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 > > > > INFOG(5) (estimated maximum front size in the complete tree): 12 > > > > INFOG(6) (number of nodes in the complete tree): 53 > > > > INFOG(7) (ordering option effectively use after analysis): 2 > > > > INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 > > > > INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 > > > > INFOG(10) (total integer space store the matrix factors after factorization): 2067 > > > > INFOG(11) (order of largest frontal matrix after factorization): 12 > > > > INFOG(12) (number of off-diagonal pivots): 0 > > > > INFOG(13) (number of delayed pivots after factorization): 0 > > > > INFOG(14) (number of memory compress after factorization): 0 > > > > INFOG(15) (number of steps of iterative refinement after solution): 0 > > > > INFOG(16) 
(estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 > > > > INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 > > > > INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 > > > > INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 > > > > INFOG(20) (estimated number of entries in the factors): 3042 > > > > INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 > > > > INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 > > > > INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 > > > > INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 > > > > INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 > > > > INFOG(28) (after factorization: number of null pivots encountered): 0 > > > > INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 > > > > INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 > > > > INFOG(32) (after analysis: type of analysis done): 1 > > > > INFOG(33) (value used for ICNTL(8)): -2 > > > > INFOG(34) (exponent of the determinant if determinant is requested): 0 > > > > linear system matrix = precond matrix: > > > > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > > > > type: seqaij > > > > rows=324, cols=324 > > > > total: nonzeros=5760, allocated nonzeros=5760 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 108 nodes, limit used is 5 > > > > A01 > > > > Mat Object: 1 MPI processes > > > > type: seqaij > > > > rows=324, cols=28476 > > > > total: nonzeros=936, allocated 
nonzeros=936 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 67 nodes, limit used is 5 > > > > Mat Object: 1 MPI processes > > > > type: seqaij > > > > rows=28476, cols=28476 > > > > total: nonzeros=1037052, allocated nonzeros=1037052 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 9489 nodes, limit used is 5 > > > > linear system matrix = precond matrix: > > > > Mat Object: () 1 MPI processes > > > > type: seqaij > > > > rows=28800, cols=28800 > > > > total: nonzeros=1024686, allocated nonzeros=1024794 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 9600 nodes, limit used is 5 > > > > > > > > ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- > > > > > > > > /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 17:22:10 2017 > > > > Using Petsc Release Version 3.7.3, unknown > > > > > > > > Max Max/Min Avg Total > > > > Time (sec): 9.638e+01 1.00000 9.638e+01 > > > > Objects: 2.030e+02 1.00000 2.030e+02 > > > > Flops: 1.732e+11 1.00000 1.732e+11 1.732e+11 > > > > Flops/sec: 1.797e+09 1.00000 1.797e+09 1.797e+09 > > > > MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > > MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > > MPI Reductions: 0.000e+00 0.00000 > > > > > > > > Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) > > > > e.g., VecAXPY() for real vectors of length N --> 2N flops > > > > and VecAXPY() for complex vectors of length N --> 8N flops > > > > > > > > Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- > > > > Avg %Total Avg %Total counts %Total Avg %Total counts %Total > > > > 0: Main Stage: 9.6379e+01 
100.0% 1.7318e+11 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% > > > > > > > > ------------------------------------------------------------------------------------------------------------------------ > > > > See the 'Profiling' chapter of the users' manual for details on interpreting output. > > > > Phase summary info: > > > > Count: number of times phase was executed > > > > Time and Flops: Max - maximum over all processors > > > > Ratio - ratio of maximum to minimum over all processors > > > > Mess: number of messages sent > > > > Avg. len: average message length (bytes) > > > > Reduct: number of global reductions > > > > Global: entire computation > > > > Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). > > > > %T - percent time in this phase %F - percent flops in this phase > > > > %M - percent messages in this phase %L - percent message lengths in this phase > > > > %R - percent reductions in this phase > > > > Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) > > > > ------------------------------------------------------------------------------------------------------------------------ > > > > Event Count Time (sec) Flops --- Global --- --- Stage --- Total > > > > Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s > > > > ------------------------------------------------------------------------------------------------------------------------ > > > > > > > > --- Event Stage 0: Main Stage > > > > > > > > VecDot 42 1.0 2.2411e-05 1.0 8.53e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 380 > > > > VecTDot 77761 1.0 1.4294e+00 1.0 4.43e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3098 > > > > VecNorm 38894 1.0 9.1002e-01 1.0 2.22e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2434 > > > > VecScale 38882 1.0 3.7314e-01 1.0 1.11e+09 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2967 > > > > VecCopy 38908 1.0 2.1655e-02 1.0 
0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > VecSet 77887 1.0 3.2034e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > VecAXPY 77777 1.0 1.8382e+00 1.0 4.43e+09 1.0 0.0e+00 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2409 > > > > VecAYPX 38875 1.0 1.2884e+00 1.0 2.21e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 1718 > > > > VecAssemblyBegin 68 1.0 1.9407e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > VecAssemblyEnd 68 1.0 2.6941e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > VecScatterBegin 48 1.0 4.6349e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatMult 38891 1.0 4.3045e+01 1.0 8.03e+10 1.0 0.0e+00 0.0e+00 0.0e+00 45 46 0 0 0 45 46 0 0 0 1866 > > > > MatMultAdd 38889 1.0 3.5360e+01 1.0 7.91e+10 1.0 0.0e+00 0.0e+00 0.0e+00 37 46 0 0 0 37 46 0 0 0 2236 > > > > MatSolve 77769 1.0 4.8780e+01 1.0 7.95e+10 1.0 0.0e+00 0.0e+00 0.0e+00 51 46 0 0 0 51 46 0 0 0 1631 > > > > MatLUFactorNum 1 1.0 1.9575e-02 1.0 2.49e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1274 > > > > MatCholFctrSym 1 1.0 9.4891e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatCholFctrNum 1 1.0 3.7885e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatILUFactorSym 1 1.0 4.1780e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatConvert 1 1.0 3.0041e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatScale 2 1.0 2.7180e-05 1.0 2.53e+04 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 930 > > > > MatAssemblyBegin 32 1.0 4.0531e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatAssemblyEnd 32 1.0 1.2032e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatGetRow 114978 1.0 5.9254e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatGetRowIJ 2 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 
0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatGetSubMatrice 6 1.0 1.5707e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatGetOrdering 2 1.0 3.2425e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatZeroEntries 6 1.0 3.0580e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatView 7 1.0 3.5119e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatAXPY 1 1.0 1.9384e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatMatMult 1 1.0 2.7120e-03 1.0 3.16e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 117 > > > > MatMatMultSym 1 1.0 1.8010e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatMatMultNum 1 1.0 6.1703e-04 1.0 3.16e+05 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 513 > > > > KSPSetUp 4 1.0 9.8944e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > KSPSolve 1 1.0 9.3380e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > > > PCSetUp 4 1.0 6.6326e-02 1.0 2.53e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 381 > > > > PCSetUpOnBlocks 5 1.0 2.4082e-02 1.0 2.49e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1036 > > > > PCApply 5 1.0 9.3376e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > > > KSPSolve_FS_0 5 1.0 7.0214e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > KSPSolve_FS_Schu 5 1.0 9.3372e+01 1.0 1.73e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1855 > > > > KSPSolve_FS_Low 5 1.0 2.1377e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > ------------------------------------------------------------------------------------------------------------------------ > > > > > > > > Memory usage is given in bytes: > > > > > > > > Object Type Creations Destructions Memory Descendants' Mem. > > > > Reports information only for process 0. 
> > > > > > > > --- Event Stage 0: Main Stage > > > > > > > > Vector 92 92 9698040 0. > > > > Vector Scatter 24 24 15936 0. > > > > Index Set 51 51 537876 0. > > > > IS L to G Mapping 3 3 240408 0. > > > > Matrix 16 16 77377776 0. > > > > Krylov Solver 6 6 7888 0. > > > > Preconditioner 6 6 6288 0. > > > > Viewer 1 0 0 0. > > > > Distributed Mesh 1 1 4624 0. > > > > Star Forest Bipartite Graph 2 2 1616 0. > > > > Discrete System 1 1 872 0. > > > > ======================================================================================================================== > > > > Average time to get PetscTime(): 0. > > > > #PETSc Option Table entries: > > > > -ksp_monitor > > > > -ksp_view > > > > -log_view > > > > #End of PETSc Option Table entries > > > > Compiled without FORTRAN kernels > > > > Compiled with full precision matrices (default) > > > > sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 > > > > Configure options: --with-shared-libraries=1 --with-debugging=0 --download-suitesparse --download-blacs --download-ptscotch=yes --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc --download-hypre --download-ml > > > > ----------------------------------------- > > > > Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo > > > > Machine characteristics: Linux-4.4.0-38-generic-x86_64-with-Ubuntu-16.04-xenial > > > > Using PETSc directory: /home/dknez/software/petsc-src > > > > Using PETSc arch: arch-linux2-c-opt > > > > ----------------------------------------- > > > > > > > > Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O ${COPTFLAGS} ${CFLAGS} > > > > Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} > > > > 
----------------------------------------- > > > > > > > > Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/libmesh_install/opt_real/petsc/include -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi > > > > ----------------------------------------- > > > > > > > > Using C linker: mpicc > > > > Using Fortran linker: mpif90 > > > > Using libraries: -Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu 
-L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl > > > > ----------------------------------------- > > > > > > > > > > > > > > > > > > > > On Wed, Jan 11, 2017 at 4:49 PM, Dave May wrote: > > > > It looks like the Schur solve is requiring a huge number of iterates to converge (based on the instances of MatMult). > > > > This is killing the performance. > > > > > > > > Are you sure that A11 is a good approximation to S? You might consider trying the selfp option > > > > > > > > http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PC/PCFieldSplitSetSchurPre.html#PCFieldSplitSetSchurPre > > > > > > > > Note that the best approx to S is likely both problem and discretisation dependent so if selfp is also terrible, you might want to consider coding up your own approx to S for your specific system. > > > > > > > > > > > > Thanks, > > > > Dave > > > > > > > > > > > > On Wed, 11 Jan 2017 at 22:34, David Knezevic wrote: > > > > I have a definite block 2x2 system and I figured it'd be good to apply the PCFIELDSPLIT functionality with Schur complement, as described in Section 4.5 of the manual. > > > > > > > > The A00 block of my matrix is very small so I figured I'd specify a direct solver (i.e. MUMPS) for that block. > > > > > > > > So I did the following: > > > > - PCFieldSplitSetIS to specify the indices of the two splits > > > > - PCFieldSplitGetSubKSP to get the two KSP objects, and to set the solver and PC types for each (MUMPS for A00, ILU+CG for A11) > > > > - I set -pc_fieldsplit_schur_fact_type full > > > > > > > > Below I have pasted the output of "-ksp_view -ksp_monitor -log_view" for a test case. It seems to converge well, but I'm concerned about the speed (about 90 seconds, vs. about 1 second if I use a direct solver for the entire system). 
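[Editorial aside, not part of the original thread.] To make the Schur-complement algebra in Dave's suggestion concrete: the exact Schur complement is S = A11 - A10 inv(A00) A01, while a "selfp"-style preconditioner replaces inv(A00) by inv(diag(A00)). Below is a minimal pure-Python sketch with made-up 2x2 blocks (the block names follow the `-ksp_view` output above; this is illustrative arithmetic only, not PETSc code, and the matrices are hypothetical):

```python
from fractions import Fraction as F

def matmul(A, B):
    # Dense matrix product of two lists-of-lists.
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matsub(A, B):
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def inv2(A):
    # Inverse of a 2x2 matrix.
    (a, b), (c, d) = A
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

# Hypothetical 2x2 blocks of a block system [[A00, A01], [A10, A11]].
A00 = [[F(4), F(1)], [F(1), F(3)]]
A01 = [[F(1), F(0)], [F(0), F(2)]]
A10 = [[F(1), F(0)], [F(0), F(2)]]   # A10 = A01^T in this symmetric example
A11 = [[F(5), F(0)], [F(0), F(5)]]

# Exact Schur complement: S = A11 - A10 inv(A00) A01
S = matsub(A11, matmul(matmul(A10, inv2(A00)), A01))
# -> [[52/11, 2/11], [2/11, 39/11]]

# "selfp"-style approximation: use inv(diag(A00)) instead of inv(A00)
invdiag = [[F(1, 4), F(0)], [F(0), F(1, 3)]]
Sp = matsub(A11, matmul(matmul(A10, invdiag), A01))
# -> [[19/4, 0], [0, 11/3]]

print(S)
print(Sp)
```

The point of the comparison: Sp is cheap to form explicitly (so it can be handed to ILU or AMG as the Schur preconditioner), but it only approximates S; how well depends on how diagonally dominant A00 is, which is why the quality of selfp is problem-dependent, as Dave notes.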
I just wanted to check if I'm setting this up in a good way? > > > > > > > > Many thanks, > > > > David > > > > > > > > ----------------------------------------------------------------------------------- > > > > > > > > 0 KSP Residual norm 5.405774214400e+04 > > > > 1 KSP Residual norm 1.849649014371e+02 > > > > 2 KSP Residual norm 7.462775074989e-02 > > > > 3 KSP Residual norm 2.680497175260e-04 > > > > KSP Object: 1 MPI processes > > > > type: cg > > > > maximum iterations=1000 > > > > tolerances: relative=1e-06, absolute=1e-50, divergence=10000. > > > > left preconditioning > > > > using nonzero initial guess > > > > using PRECONDITIONED norm type for convergence test > > > > PC Object: 1 MPI processes > > > > type: fieldsplit > > > > FieldSplit with Schur preconditioner, factorization FULL > > > > Preconditioner for the Schur complement formed from A11 > > > > Split info: > > > > Split number 0 Defined by IS > > > > Split number 1 Defined by IS > > > > KSP solver for A00 block > > > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > > > type: preonly > > > > maximum iterations=10000, initial guess is zero > > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > > left preconditioning > > > > using NONE norm type for convergence test > > > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > > > type: cholesky > > > > Cholesky: out-of-place factorization > > > > tolerance for zero pivot 2.22045e-14 > > > > matrix ordering: natural > > > > factor fill ratio given 0., needed 0. 
> > > > Factored matrix follows: > > > > Mat Object: 1 MPI processes > > > > type: seqaij > > > > rows=324, cols=324 > > > > package used to perform factorization: mumps > > > > total: nonzeros=3042, allocated nonzeros=3042 > > > > total number of mallocs used during MatSetValues calls =0 > > > > MUMPS run parameters: > > > > SYM (matrix type): 2 > > > > PAR (host participation): 1 > > > > ICNTL(1) (output for error): 6 > > > > ICNTL(2) (output of diagnostic msg): 0 > > > > ICNTL(3) (output for global info): 0 > > > > ICNTL(4) (level of printing): 0 > > > > ICNTL(5) (input mat struct): 0 > > > > ICNTL(6) (matrix prescaling): 7 > > > > ICNTL(7) (sequentia matrix ordering):7 > > > > ICNTL(8) (scalling strategy): 77 > > > > ICNTL(10) (max num of refinements): 0 > > > > ICNTL(11) (error analysis): 0 > > > > ICNTL(12) (efficiency control): 0 > > > > ICNTL(13) (efficiency control): 0 > > > > ICNTL(14) (percentage of estimated workspace increase): 20 > > > > ICNTL(18) (input mat struct): 0 > > > > ICNTL(19) (Shur complement info): 0 > > > > ICNTL(20) (rhs sparse pattern): 0 > > > > ICNTL(21) (solution struct): 0 > > > > ICNTL(22) (in-core/out-of-core facility): 0 > > > > ICNTL(23) (max size of memory can be allocated locally):0 > > > > ICNTL(24) (detection of null pivot rows): 0 > > > > ICNTL(25) (computation of a null space basis): 0 > > > > ICNTL(26) (Schur options for rhs or solution): 0 > > > > ICNTL(27) (experimental parameter): -24 > > > > ICNTL(28) (use parallel or sequential ordering): 1 > > > > ICNTL(29) (parallel ordering): 0 > > > > ICNTL(30) (user-specified set of entries in inv(A)): 0 > > > > ICNTL(31) (factors is discarded in the solve phase): 0 > > > > ICNTL(33) (compute determinant): 0 > > > > CNTL(1) (relative pivoting threshold): 0.01 > > > > CNTL(2) (stopping criterion of refinement): 1.49012e-08 > > > > CNTL(3) (absolute pivoting threshold): 0. > > > > CNTL(4) (value of static pivoting): -1. > > > > CNTL(5) (fixation for null pivots): 0. 
> > > > RINFO(1) (local estimated flops for the elimination after analysis): > > > > [0] 29394. > > > > RINFO(2) (local estimated flops for the assembly after factorization): > > > > [0] 1092. > > > > RINFO(3) (local estimated flops for the elimination after factorization): > > > > [0] 29394. > > > > INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): > > > > [0] 1 > > > > INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): > > > > [0] 1 > > > > INFO(23) (num of pivots eliminated on this processor after factorization): > > > > [0] 324 > > > > RINFOG(1) (global estimated flops for the elimination after analysis): 29394. > > > > RINFOG(2) (global estimated flops for the assembly after factorization): 1092. > > > > RINFOG(3) (global estimated flops for the elimination after factorization): 29394. > > > > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) > > > > INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 > > > > INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 > > > > INFOG(5) (estimated maximum front size in the complete tree): 12 > > > > INFOG(6) (number of nodes in the complete tree): 53 > > > > INFOG(7) (ordering option effectively use after analysis): 2 > > > > INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 > > > > INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 > > > > INFOG(10) (total integer space store the matrix factors after factorization): 2067 > > > > INFOG(11) (order of largest frontal matrix after factorization): 12 > > > > INFOG(12) (number of off-diagonal pivots): 0 > > > > INFOG(13) (number of delayed pivots after factorization): 0 > > > > INFOG(14) (number of memory compress after factorization): 0 > > > > INFOG(15) (number of steps of iterative refinement after solution): 0 > > > > INFOG(16) 
(estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 > > > > INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 > > > > INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 > > > > INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 > > > > INFOG(20) (estimated number of entries in the factors): 3042 > > > > INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 > > > > INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 > > > > INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 > > > > INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 > > > > INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 > > > > INFOG(28) (after factorization: number of null pivots encountered): 0 > > > > INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 > > > > INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 > > > > INFOG(32) (after analysis: type of analysis done): 1 > > > > INFOG(33) (value used for ICNTL(8)): -2 > > > > INFOG(34) (exponent of the determinant if determinant is requested): 0 > > > > linear system matrix = precond matrix: > > > > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > > > > type: seqaij > > > > rows=324, cols=324 > > > > total: nonzeros=5760, allocated nonzeros=5760 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 108 nodes, limit used is 5 > > > > KSP solver for S = A11 - A10 inv(A00) A01 > > > > KSP Object: (fieldsplit_FE_split_) 1 MPI processes > > > > type: cg > > > > 
maximum iterations=10000, initial guess is zero > > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > > left preconditioning > > > > using PRECONDITIONED norm type for convergence test > > > > PC Object: (fieldsplit_FE_split_) 1 MPI processes > > > > type: bjacobi > > > > block Jacobi: number of blocks = 1 > > > > Local solve is same for all blocks, in the following KSP and PC objects: > > > > KSP Object: (fieldsplit_FE_split_sub_) 1 MPI processes > > > > type: preonly > > > > maximum iterations=10000, initial guess is zero > > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > > left preconditioning > > > > using NONE norm type for convergence test > > > > PC Object: (fieldsplit_FE_split_sub_) 1 MPI processes > > > > type: ilu > > > > ILU: out-of-place factorization > > > > 0 levels of fill > > > > tolerance for zero pivot 2.22045e-14 > > > > matrix ordering: natural > > > > factor fill ratio given 1., needed 1. > > > > Factored matrix follows: > > > > Mat Object: 1 MPI processes > > > > type: seqaij > > > > rows=28476, cols=28476 > > > > package used to perform factorization: petsc > > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 9492 nodes, limit used is 5 > > > > linear system matrix = precond matrix: > > > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > > > type: seqaij > > > > rows=28476, cols=28476 > > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 9492 nodes, limit used is 5 > > > > linear system matrix followed by preconditioner matrix: > > > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > > > type: schurcomplement > > > > rows=28476, cols=28476 > > > > Schur complement A11 - A10 inv(A00) A01 > > > > A11 > > > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > > > 
type: seqaij > > > > rows=28476, cols=28476 > > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 9492 nodes, limit used is 5 > > > > A10 > > > > Mat Object: 1 MPI processes > > > > type: seqaij > > > > rows=28476, cols=324 > > > > total: nonzeros=936, allocated nonzeros=936 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 5717 nodes, limit used is 5 > > > > KSP of A00 > > > > KSP Object: (fieldsplit_RB_split_) 1 MPI processes > > > > type: preonly > > > > maximum iterations=10000, initial guess is zero > > > > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > > > > left preconditioning > > > > using NONE norm type for convergence test > > > > PC Object: (fieldsplit_RB_split_) 1 MPI processes > > > > type: cholesky > > > > Cholesky: out-of-place factorization > > > > tolerance for zero pivot 2.22045e-14 > > > > matrix ordering: natural > > > > factor fill ratio given 0., needed 0. 
> > > > Factored matrix follows: > > > > Mat Object: 1 MPI processes > > > > type: seqaij > > > > rows=324, cols=324 > > > > package used to perform factorization: mumps > > > > total: nonzeros=3042, allocated nonzeros=3042 > > > > total number of mallocs used during MatSetValues calls =0 > > > > MUMPS run parameters: > > > > SYM (matrix type): 2 > > > > PAR (host participation): 1 > > > > ICNTL(1) (output for error): 6 > > > > ICNTL(2) (output of diagnostic msg): 0 > > > > ICNTL(3) (output for global info): 0 > > > > ICNTL(4) (level of printing): 0 > > > > ICNTL(5) (input mat struct): 0 > > > > ICNTL(6) (matrix prescaling): 7 > > > > ICNTL(7) (sequentia matrix ordering):7 > > > > ICNTL(8) (scalling strategy): 77 > > > > ICNTL(10) (max num of refinements): 0 > > > > ICNTL(11) (error analysis): 0 > > > > ICNTL(12) (efficiency control): 0 > > > > ICNTL(13) (efficiency control): 0 > > > > ICNTL(14) (percentage of estimated workspace increase): 20 > > > > ICNTL(18) (input mat struct): 0 > > > > ICNTL(19) (Shur complement info): 0 > > > > ICNTL(20) (rhs sparse pattern): 0 > > > > ICNTL(21) (solution struct): 0 > > > > ICNTL(22) (in-core/out-of-core facility): 0 > > > > ICNTL(23) (max size of memory can be allocated locally):0 > > > > ICNTL(24) (detection of null pivot rows): 0 > > > > ICNTL(25) (computation of a null space basis): 0 > > > > ICNTL(26) (Schur options for rhs or solution): 0 > > > > ICNTL(27) (experimental parameter): -24 > > > > ICNTL(28) (use parallel or sequential ordering): 1 > > > > ICNTL(29) (parallel ordering): 0 > > > > ICNTL(30) (user-specified set of entries in inv(A)): 0 > > > > ICNTL(31) (factors is discarded in the solve phase): 0 > > > > ICNTL(33) (compute determinant): 0 > > > > CNTL(1) (relative pivoting threshold): 0.01 > > > > CNTL(2) (stopping criterion of refinement): 1.49012e-08 > > > > CNTL(3) (absolute pivoting threshold): 0. > > > > CNTL(4) (value of static pivoting): -1. > > > > CNTL(5) (fixation for null pivots): 0. 
> > > > RINFO(1) (local estimated flops for the elimination after analysis): > > > > [0] 29394. > > > > RINFO(2) (local estimated flops for the assembly after factorization): > > > > [0] 1092. > > > > RINFO(3) (local estimated flops for the elimination after factorization): > > > > [0] 29394. > > > > INFO(15) (estimated size of (in MB) MUMPS internal data for running numerical factorization): > > > > [0] 1 > > > > INFO(16) (size of (in MB) MUMPS internal data used during numerical factorization): > > > > [0] 1 > > > > INFO(23) (num of pivots eliminated on this processor after factorization): > > > > [0] 324 > > > > RINFOG(1) (global estimated flops for the elimination after analysis): 29394. > > > > RINFOG(2) (global estimated flops for the assembly after factorization): 1092. > > > > RINFOG(3) (global estimated flops for the elimination after factorization): 29394. > > > > (RINFOG(12) RINFOG(13))*2^INFOG(34) (determinant): (0.,0.)*(2^0) > > > > INFOG(3) (estimated real workspace for factors on all processors after analysis): 3888 > > > > INFOG(4) (estimated integer workspace for factors on all processors after analysis): 2067 > > > > INFOG(5) (estimated maximum front size in the complete tree): 12 > > > > INFOG(6) (number of nodes in the complete tree): 53 > > > > INFOG(7) (ordering option effectively use after analysis): 2 > > > > INFOG(8) (structural symmetry in percent of the permuted matrix after analysis): 100 > > > > INFOG(9) (total real/complex workspace to store the matrix factors after factorization): 3888 > > > > INFOG(10) (total integer space store the matrix factors after factorization): 2067 > > > > INFOG(11) (order of largest frontal matrix after factorization): 12 > > > > INFOG(12) (number of off-diagonal pivots): 0 > > > > INFOG(13) (number of delayed pivots after factorization): 0 > > > > INFOG(14) (number of memory compress after factorization): 0 > > > > INFOG(15) (number of steps of iterative refinement after solution): 0 > > > > INFOG(16) 
(estimated size (in MB) of all MUMPS internal data for factorization after analysis: value on the most memory consuming processor): 1 > > > > INFOG(17) (estimated size of all MUMPS internal data for factorization after analysis: sum over all processors): 1 > > > > INFOG(18) (size of all MUMPS internal data allocated during factorization: value on the most memory consuming processor): 1 > > > > INFOG(19) (size of all MUMPS internal data allocated during factorization: sum over all processors): 1 > > > > INFOG(20) (estimated number of entries in the factors): 3042 > > > > INFOG(21) (size in MB of memory effectively used during factorization - value on the most memory consuming processor): 1 > > > > INFOG(22) (size in MB of memory effectively used during factorization - sum over all processors): 1 > > > > INFOG(23) (after analysis: value of ICNTL(6) effectively used): 5 > > > > INFOG(24) (after analysis: value of ICNTL(12) effectively used): 1 > > > > INFOG(25) (after factorization: number of pivots modified by static pivoting): 0 > > > > INFOG(28) (after factorization: number of null pivots encountered): 0 > > > > INFOG(29) (after factorization: effective number of entries in the factors (sum over all processors)): 3042 > > > > INFOG(30, 31) (after solution: size in Mbytes of memory used during solution phase): 0, 0 > > > > INFOG(32) (after analysis: type of analysis done): 1 > > > > INFOG(33) (value used for ICNTL(8)): -2 > > > > INFOG(34) (exponent of the determinant if determinant is requested): 0 > > > > linear system matrix = precond matrix: > > > > Mat Object: (fieldsplit_RB_split_) 1 MPI processes > > > > type: seqaij > > > > rows=324, cols=324 > > > > total: nonzeros=5760, allocated nonzeros=5760 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 108 nodes, limit used is 5 > > > > A01 > > > > Mat Object: 1 MPI processes > > > > type: seqaij > > > > rows=324, cols=28476 > > > > total: nonzeros=936, allocated 
nonzeros=936 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 67 nodes, limit used is 5 > > > > Mat Object: (fieldsplit_FE_split_) 1 MPI processes > > > > type: seqaij > > > > rows=28476, cols=28476 > > > > total: nonzeros=1017054, allocated nonzeros=1017054 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 9492 nodes, limit used is 5 > > > > linear system matrix = precond matrix: > > > > Mat Object: () 1 MPI processes > > > > type: seqaij > > > > rows=28800, cols=28800 > > > > total: nonzeros=1024686, allocated nonzeros=1024794 > > > > total number of mallocs used during MatSetValues calls =0 > > > > using I-node routines: found 9600 nodes, limit used is 5 > > > > > > > > > > > > ---------------------------------------------- PETSc Performance Summary: ---------------------------------------------- > > > > > > > > /home/dknez/akselos-dev/scrbe/build/bin/fe_solver-opt_real on a arch-linux2-c-opt named david-Lenovo with 1 processor, by dknez Wed Jan 11 16:16:47 2017 > > > > Using Petsc Release Version 3.7.3, unknown > > > > > > > > Max Max/Min Avg Total > > > > Time (sec): 9.179e+01 1.00000 9.179e+01 > > > > Objects: 1.990e+02 1.00000 1.990e+02 > > > > Flops: 1.634e+11 1.00000 1.634e+11 1.634e+11 > > > > Flops/sec: 1.780e+09 1.00000 1.780e+09 1.780e+09 > > > > MPI Messages: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > > MPI Message Lengths: 0.000e+00 0.00000 0.000e+00 0.000e+00 > > > > MPI Reductions: 0.000e+00 0.00000 > > > > > > > > Flop counting convention: 1 flop = 1 real number operation of type (multiply/divide/add/subtract) > > > > e.g., VecAXPY() for real vectors of length N --> 2N flops > > > > and VecAXPY() for complex vectors of length N --> 8N flops > > > > > > > > Summary of Stages: ----- Time ------ ----- Flops ----- --- Messages --- -- Message Lengths -- -- Reductions -- > > > > Avg %Total Avg %Total counts %Total Avg %Total counts %Total > > > > 
0: Main Stage: 9.1787e+01 100.0% 1.6336e+11 100.0% 0.000e+00 0.0% 0.000e+00 0.0% 0.000e+00 0.0% > > > > > > > > ------------------------------------------------------------------------------------------------------------------------ > > > > See the 'Profiling' chapter of the users' manual for details on interpreting output. > > > > Phase summary info: > > > > Count: number of times phase was executed > > > > Time and Flops: Max - maximum over all processors > > > > Ratio - ratio of maximum to minimum over all processors > > > > Mess: number of messages sent > > > > Avg. len: average message length (bytes) > > > > Reduct: number of global reductions > > > > Global: entire computation > > > > Stage: stages of a computation. Set stages with PetscLogStagePush() and PetscLogStagePop(). > > > > %T - percent time in this phase %F - percent flops in this phase > > > > %M - percent messages in this phase %L - percent message lengths in this phase > > > > %R - percent reductions in this phase > > > > Total Mflop/s: 10e-6 * (sum of flops over all processors)/(max time over all processors) > > > > ------------------------------------------------------------------------------------------------------------------------ > > > > Event Count Time (sec) Flops --- Global --- --- Stage --- Total > > > > Max Ratio Max Ratio Max Ratio Mess Avg len Reduct %T %F %M %L %R %T %F %M %L %R Mflop/s > > > > ------------------------------------------------------------------------------------------------------------------------ > > > > > > > > --- Event Stage 0: Main Stage > > > > > > > > VecDot 42 1.0 2.4080e-05 1.0 8.53e+03 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 354 > > > > VecTDot 74012 1.0 1.2440e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 3 0 0 0 1 3 0 0 0 3388 > > > > VecNorm 37020 1.0 8.3580e-01 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 2523 > > > > VecScale 37008 1.0 3.5800e-01 1.0 1.05e+09 1.0 0.0e+00 0.0e+00 0.0e+00 0 1 0 0 0 0 1 0 0 0 2944 > > > > VecCopy 
37034 1.0 2.5754e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > VecSet 74137 1.0 3.0537e-01 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > VecAXPY 74029 1.0 1.7233e+00 1.0 4.22e+09 1.0 0.0e+00 0.0e+00 0.0e+00 2 3 0 0 0 2 3 0 0 0 2446 > > > > VecAYPX 37001 1.0 1.2214e+00 1.0 2.11e+09 1.0 0.0e+00 0.0e+00 0.0e+00 1 1 0 0 0 1 1 0 0 0 1725 > > > > VecAssemblyBegin 68 1.0 2.0432e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > VecAssemblyEnd 68 1.0 2.5988e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > VecScatterBegin 48 1.0 4.6921e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatMult 37017 1.0 4.1269e+01 1.0 7.65e+10 1.0 0.0e+00 0.0e+00 0.0e+00 45 47 0 0 0 45 47 0 0 0 1853 > > > > MatMultAdd 37015 1.0 3.3638e+01 1.0 7.53e+10 1.0 0.0e+00 0.0e+00 0.0e+00 37 46 0 0 0 37 46 0 0 0 2238 > > > > MatSolve 74021 1.0 4.6602e+01 1.0 7.42e+10 1.0 0.0e+00 0.0e+00 0.0e+00 51 45 0 0 0 51 45 0 0 0 1593 > > > > MatLUFactorNum 1 1.0 1.7209e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1420 > > > > MatCholFctrSym 1 1.0 8.8310e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatCholFctrNum 1 1.0 3.6907e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatILUFactorSym 1 1.0 3.7372e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatAssemblyBegin 29 1.0 2.1458e-06 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatAssemblyEnd 29 1.0 9.9473e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatGetRow 58026 1.0 2.8155e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatGetRowIJ 2 1.0 0.0000e+00 0.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatGetSubMatrice 6 1.0 1.5399e-02 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatGetOrdering 2 1.0 
3.0112e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatZeroEntries 6 1.0 2.9490e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > MatView 7 1.0 3.4356e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > KSPSetUp 4 1.0 9.4891e-05 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > KSPSolve 1 1.0 8.8793e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > > > > PCSetUp 4 1.0 3.8375e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 637 > > > > PCSetUpOnBlocks 5 1.0 2.1250e-02 1.0 2.44e+07 1.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 1150 > > > > PCApply 5 1.0 8.8789e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > > > > KSPSolve_FS_0 5 1.0 7.5364e-04 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > KSPSolve_FS_Schu 5 1.0 8.8785e+01 1.0 1.63e+11 1.0 0.0e+00 0.0e+00 0.0e+00 97100 0 0 0 97100 0 0 0 1840 > > > > KSPSolve_FS_Low 5 1.0 2.1019e-03 1.0 0.00e+00 0.0 0.0e+00 0.0e+00 0.0e+00 0 0 0 0 0 0 0 0 0 0 0 > > > > ------------------------------------------------------------------------------------------------------------------------ > > > > > > > > Memory usage is given in bytes: > > > > > > > > Object Type Creations Destructions Memory Descendants' Mem. > > > > Reports information only for process 0. > > > > > > > > --- Event Stage 0: Main Stage > > > > > > > > Vector 91 91 9693912 0. > > > > Vector Scatter 24 24 15936 0. > > > > Index Set 51 51 537888 0. > > > > IS L to G Mapping 3 3 240408 0. > > > > Matrix 13 13 64097868 0. > > > > Krylov Solver 6 6 7888 0. > > > > Preconditioner 6 6 6288 0. > > > > Viewer 1 0 0 0. > > > > Distributed Mesh 1 1 4624 0. > > > > Star Forest Bipartite Graph 2 2 1616 0. > > > > Discrete System 1 1 872 0. 
> > > > ======================================================================================================================== > > > > Average time to get PetscTime(): 0. > > > > #PETSc Option Table entries: > > > > -ksp_monitor > > > > -ksp_view > > > > -log_view > > > > #End of PETSc Option Table entries > > > > Compiled without FORTRAN kernels > > > > Compiled with full precision matrices (default) > > > > sizeof(short) 2 sizeof(int) 4 sizeof(long) 8 sizeof(void*) 8 sizeof(PetscScalar) 8 sizeof(PetscInt) 4 > > > > Configure options: --with-shared-libraries=1 --with-debugging=0 --download-suitesparse --download-blacs --download-ptscotch=yes --with-blas-lapack-dir=/opt/intel/system_studio_2015.2.050/mkl --CXXFLAGS=-Wl,--no-as-needed --download-scalapack --download-mumps --download-metis --prefix=/home/dknez/software/libmesh_install/opt_real/petsc --download-hypre --download-ml > > > > ----------------------------------------- > > > > Libraries compiled on Wed Sep 21 17:38:52 2016 on david-Lenovo > > > > Machine characteristics: Linux-4.4.0-38-generic-x86_64-with-Ubuntu-16.04-xenial > > > > Using PETSc directory: /home/dknez/software/petsc-src > > > > Using PETSc arch: arch-linux2-c-opt > > > > ----------------------------------------- > > > > > > > > Using C compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -O ${COPTFLAGS} ${CFLAGS} > > > > Using Fortran compiler: mpif90 -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -O ${FOPTFLAGS} ${FFLAGS} > > > > ----------------------------------------- > > > > > > > > Using include paths: -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/include -I/home/dknez/software/petsc-src/arch-linux2-c-opt/include -I/home/dknez/software/libmesh_install/opt_real/petsc/include -I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent 
-I/usr/lib/openmpi/include/openmpi/opal/mca/event/libevent2021/libevent/include -I/usr/lib/openmpi/include -I/usr/lib/openmpi/include/openmpi > > > > ----------------------------------------- > > > > > > > > Using C linker: mpicc > > > > Using Fortran linker: mpif90 > > > > Using libraries: -Wl,-rpath,/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -L/home/dknez/software/petsc-src/arch-linux2-c-opt/lib -lpetsc -Wl,-rpath,/home/dknez/software/libmesh_install/opt_real/petsc/lib -L/home/dknez/software/libmesh_install/opt_real/petsc/lib -lcmumps -ldmumps -lsmumps -lzmumps -lmumps_common -lpord -lmetis -lHYPRE -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -lmpi_cxx -lstdc++ -lscalapack -lml -lmpi_cxx -lstdc++ -lumfpack -lklu -lcholmod -lbtf -lccolamd -lcolamd -lcamd -lamd -lsuitesparseconfig -Wl,-rpath,/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -L/opt/intel/system_studio_2015.2.050/mkl/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lpthread -lm -lhwloc -lptesmumps -lptscotch -lptscotcherr -lscotch -lscotcherr -lX11 -lm -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -lrt -lm -lpthread -lz -Wl,-rpath,/usr/lib/openmpi/lib -L/usr/lib/openmpi/lib -Wl,-rpath,/usr/lib/gcc/x86_64-linux-gnu/5 -L/usr/lib/gcc/x86_64-linux-gnu/5 -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -Wl,-rpath,/lib/x86_64-linux-gnu -L/lib/x86_64-linux-gnu -Wl,-rpath,/usr/lib/x86_64-linux-gnu -L/usr/lib/x86_64-linux-gnu -ldl -Wl,-rpath,/usr/lib/openmpi/lib -lmpi -lgcc_s -lpthread -ldl > > > > ----------------------------------------- > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > -- > What most experimenters take for granted before they begin 
their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener From C.Klaij at marin.nl Fri Jan 13 03:46:35 2017 From: C.Klaij at marin.nl (Klaij, Christiaan) Date: Fri, 13 Jan 2017 09:46:35 +0000 Subject: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 In-Reply-To: <9EA3A2C1-5372-44A8-B0B7-4ADAF1D89819@mcs.anl.gov> References: <1428666513941.72745@marin.nl> <75D79823-7AE0-47A7-BE9E-15AB81C3581E@mcs.anl.gov> <1428671243078.94@marin.nl>, <9EA3A2C1-5372-44A8-B0B7-4ADAF1D89819@mcs.anl.gov> Message-ID: <1484300795996.50804@marin.nl> Barry, It's been a while but I'm finally using this function in 3.7.4. Is it supposed to work with fieldsplit? Here's why. I'm solving a Navier-Stokes system with fieldsplit (pc has one velocity solve and one pressure solve) and trying to retrieve the totals like this: CALL KSPSolve(ksp_system,rr_system,xx_system,ierr); CHKERRQ(ierr) CALL PCFieldSplitGetSubKSP(pc_system,numsplit,subksp,ierr); CHKERRQ(ierr) CALL KSPGetTotalIterations(subksp(1),nusediter_vv,ierr); CHKERRQ(ierr) CALL KSPGetTotalIterations(subksp(2),nusediter_pp,ierr); CHKERRQ(ierr) print *, 'nusediter_vv', nusediter_vv print *, 'nusediter_pp', nusediter_pp Running the code shows this surprise: Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 Linear sys_fieldsplit_0_ solve converged 
due to CONVERGED_RTOL iterations 8 Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22 Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 6 Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 3 Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 nusediter_vv 37 nusediter_pp 37 So the value of nusediter_pp is indeed 37, but for nusediter_vv it should be 66. Any idea what went wrong? Chris dr. ir. Christiaan Klaij | CFD Researcher | Research & Development MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl MARIN news: http://www.marin.nl/web/News/News-items/MARIN-wishes-you-a-challenging-inspiring-2017.htm ________________________________________ From: Barry Smith Sent: Saturday, April 11, 2015 12:27 AM To: Klaij, Christiaan Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 Chris, I have added KSPGetTotalIterations() to the branch barry/add-ksp-total-iterations/master and next. After tests it will go into master Barry > On Apr 10, 2015, at 8:07 AM, Klaij, Christiaan wrote: > > Barry, > > Sure, I can call PCFieldSplitGetSubKSP() to get the fieldsplit_0 > ksp and then KSPGetIterationNumber, but what does this number > mean? > > It appears to be the number of iterations of the last time that > the subsystem was solved, right? If so, this corresponds to the > last iteration of the coupled system, how about all the previous > iterations? 
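[Archive note: the per-split totals discussed in this thread can be recovered by summing the -ksp_converged_reason lines directly, which at least confirms the expected values of 66 and 37. A minimal Python post-processing sketch; parse_totals is a hypothetical helper written for this note, not part of PETSc:]

```python
import re
from collections import defaultdict

# Sum per-split iteration counts from PETSc -ksp_converged_reason output.
# Each matching log line looks like:
#   Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7
LINE = re.compile(r"Linear (\S+) solve converged due to \S+ iterations (\d+)")

def parse_totals(log_text):
    """Return {ksp_prefix: total iterations} summed over all solves in the log."""
    totals = defaultdict(int)
    for m in LINE.finditer(log_text):
        totals[m.group(1)] += int(m.group(2))
    return dict(totals)

# The per-solve counts reported in the thread above:
log = "\n".join(
    [f"Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations {n}"
     for n in (1, 1, 2, 2, 7, 7, 7, 7, 8, 8, 8, 8)]
    + [f"Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations {n}"
       for n in (22, 6, 3, 2, 2, 2)]
)
totals = parse_totals(log)
print(totals)  # {'sys_fieldsplit_0_': 66, 'sys_fieldsplit_1_': 37}
```

Summed this way, the lines quoted in the thread give 66 iterations for fieldsplit 0 and 37 for fieldsplit 1, matching the totals Chris expects from KSPGetTotalIterations.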
> > Chris > ________________________________________ > From: Barry Smith > Sent: Friday, April 10, 2015 2:48 PM > To: Klaij, Christiaan > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 > > Chris, > > It appears you should call PCFieldSplitGetSubKSP() and then get the information you want out of the individual KSPs. If this doesn't work please let us know. > > Barry > >> On Apr 10, 2015, at 6:48 AM, Klaij, Christiaan wrote: >> >> A question when using PCFieldSplit: for each linear iteration of >> the system, how many iterations for fieldsplit 0 and 1? >> >> One way to find out is to run with -ksp_monitor, >> -fieldsplit_0_ksp_monitor and -fieldsplit_1_ksp_monitor. This >> gives the complete convergence history. >> >> Another way, suggested by Matt, is to use -ksp_monitor, >> -fieldsplit_0_ksp_converged_reason and >> -fieldsplit_1_ksp_converged_reason. This gives only the totals >> for fieldsplit 0 and 1 (but without saying for which one). >> >> Both ways require one to somehow process the output, which is a bit >> inconvenient. Could KSPGetResidualHistory perhaps return (some) >> information on the subsystems' convergence for processing inside >> the code? >> >> Chris >> >> >> dr. ir. Christiaan Klaij >> CFD Researcher >> Research & Development >> E mailto:C.Klaij at marin.nl >> T +31 317 49 33 44 >> >> >> MARIN >> 2, Haagsteeg, P.O. 
Box 28, 6700 AA Wageningen, The Netherlands >> T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl >> > From bsmith at mcs.anl.gov Fri Jan 13 12:51:00 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 13 Jan 2017 12:51:00 -0600 Subject: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 In-Reply-To: <1484300795996.50804@marin.nl> References: <1428666513941.72745@marin.nl> <75D79823-7AE0-47A7-BE9E-15AB81C3581E@mcs.anl.gov> <1428671243078.94@marin.nl> <9EA3A2C1-5372-44A8-B0B7-4ADAF1D89819@mcs.anl.gov> <1484300795996.50804@marin.nl> Message-ID: Yes, I would have expected this to work. Could you send the output from -ksp_view in this case? > On Jan 13, 2017, at 3:46 AM, Klaij, Christiaan wrote: > > Barry, > > It's been a while but I'm finally using this function in > 3.7.4. Is it supposed to work with fieldsplit? Here's why. > > I'm solving a Navier-Stokes system with fieldsplit (pc has one > velocity solve and one pressure solve) and trying to retrieve the > totals like this: > > CALL KSPSolve(ksp_system,rr_system,xx_system,ierr); CHKERRQ(ierr) > CALL PCFieldSplitGetSubKSP(pc_system,numsplit,subksp,ierr); CHKERRQ(ierr) > CALL KSPGetTotalIterations(subksp(1),nusediter_vv,ierr); CHKERRQ(ierr) > CALL KSPGetTotalIterations(subksp(2),nusediter_pp,ierr); CHKERRQ(ierr) > print *, 'nusediter_vv', nusediter_vv > print *, 'nusediter_pp', nusediter_pp > > Running the code shows this surprise: > > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 > Linear sys_fieldsplit_0_ 
solve converged due to CONVERGED_RTOL iterations 7 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 > > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 6 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 3 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 > > nusediter_vv 37 > nusediter_pp 37 > > So the value of nusediter_pp is indeed 37, but for nusediter_vv > it should be 66. Any idea what went wrong? > > Chris > > > > dr. ir. Christiaan Klaij | CFD Researcher | Research & Development > MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl > > MARIN news: http://www.marin.nl/web/News/News-items/MARIN-wishes-you-a-challenging-inspiring-2017.htm > > ________________________________________ > From: Barry Smith > Sent: Saturday, April 11, 2015 12:27 AM > To: Klaij, Christiaan > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 > > Chris, > > I have added KSPGetTotalIterations() to the branch barry/add-ksp-total-iterations/master and next. After tests it will go into master > > Barry > >> On Apr 10, 2015, at 8:07 AM, Klaij, Christiaan wrote: >> >> Barry, >> >> Sure, I can call PCFieldSplitGetSubKSP() to get the fieldsplit_0 >> ksp and then KSPGetIterationNumber, but what does this number >> mean? >> >> It appears to be the number of iterations of the last time that >> the subsystem was solved, right? 
If so, this corresponds to the >> last iteration of the coupled system, how about all the previous >> iterations? >> >> Chris >> ________________________________________ >> From: Barry Smith >> Sent: Friday, April 10, 2015 2:48 PM >> To: Klaij, Christiaan >> Cc: petsc-users at mcs.anl.gov >> Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 >> >> Chris, >> >> It appears you should call PCFieldSplitGetSubKSP() and then get the information you want out of the individual KSPs. If this doesn't work please let us know. >> >> Barry >> >>> On Apr 10, 2015, at 6:48 AM, Klaij, Christiaan wrote: >>> >>> A question when using PCFieldSplit: for each linear iteration of >>> the system, how many iterations for fieldsplit 0 and 1? >>> >>> One way to find out is to run with -ksp_monitor, >>> -fieldsplit_0_ksp_monitor and -fieldsplit_1_ksp_monitor. This >>> gives the complete convergence history. >>> >>> Another way, suggested by Matt, is to use -ksp_monitor, >>> -fieldsplit_0_ksp_converged_reason and >>> -fieldsplit_1_ksp_converged_reason. This gives only the totals >>> for fieldsplit 0 and 1 (but without saying for which one). >>> >>> Both ways require one to somehow process the output, which is a bit >>> inconvenient. Could KSPGetResidualHistory perhaps return (some) >>> information on the subsystems' convergence for processing inside >>> the code? >>> >>> Chris >>> >>> >>> dr. ir. Christiaan Klaij >>> CFD Researcher >>> Research & Development >>> E mailto:C.Klaij at marin.nl >>> T +31 317 49 33 44 >>> >>> >>> MARIN >>> 2, Haagsteeg, P.O. 
Box 28, 6700 AA Wageningen, The Netherlands >>> T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl >>> >> > From C.Klaij at marin.nl Mon Jan 16 01:47:33 2017 From: C.Klaij at marin.nl (Klaij, Christiaan) Date: Mon, 16 Jan 2017 07:47:33 +0000 Subject: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 In-Reply-To: References: <1428666513941.72745@marin.nl> <75D79823-7AE0-47A7-BE9E-15AB81C3581E@mcs.anl.gov> <1428671243078.94@marin.nl> <9EA3A2C1-5372-44A8-B0B7-4ADAF1D89819@mcs.anl.gov> <1484300795996.50804@marin.nl>, Message-ID: <1484552853408.13052@marin.nl> Barry, Sure, here's the output with: -sys_ksp_view -sys_ksp_converged_reason -sys_fieldsplit_0_ksp_converged_reason -sys_fieldsplit_1_ksp_converged_reason (In my previous email, I rearranged 0 & 1 for easy summing.) Chris Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22 Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 6 Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 3 Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 Linear sys_fieldsplit_0_ solve converged due to 
CONVERGED_RTOL iterations 8 Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 Linear sys_ solve converged due to CONVERGED_RTOL iterations 6 KSP Object:(sys_) 1 MPI processes type: fgmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=300, initial guess is zero tolerances: relative=0.01, absolute=1e-50, divergence=10000. right preconditioning using UNPRECONDITIONED norm type for convergence test PC Object:(sys_) 1 MPI processes type: fieldsplit FieldSplit with Schur preconditioner, factorization FULL Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse Split info: Split number 0 Defined by IS Split number 1 Defined by IS KSP solver for A00 block KSP Object: (sys_fieldsplit_0_) 1 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=0.01, absolute=1e-50, divergence=10000. right preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: (sys_fieldsplit_0_) 1 MPI processes type: ilu ILU: out-of-place factorization 0 levels of fill tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 1., needed 1. 
Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=9600, cols=9600 package used to perform factorization: petsc total: nonzeros=47280, allocated nonzeros=47280 total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system matrix = precond matrix: Mat Object: (sys_fieldsplit_0_) 1 MPI processes type: seqaij rows=9600, cols=9600 total: nonzeros=47280, allocated nonzeros=47280 total number of mallocs used during MatSetValues calls =0 not using I-node routines KSP solver for upper A00 in upper triangular factor KSP Object: (sys_fieldsplit_1_upper_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (sys_fieldsplit_1_upper_) 1 MPI processes type: jacobi linear system matrix = precond matrix: Mat Object: (sys_fieldsplit_0_) 1 MPI processes type: seqaij rows=9600, cols=9600 total: nonzeros=47280, allocated nonzeros=47280 total number of mallocs used during MatSetValues calls =0 not using I-node routines KSP solver for S = A11 - A10 inv(A00) A01 KSP Object: (sys_fieldsplit_1_) 1 MPI processes type: gmres GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement GMRES: happy breakdown tolerance 1e-30 maximum iterations=10000, initial guess is zero tolerances: relative=0.01, absolute=1e-50, divergence=10000. right preconditioning using UNPRECONDITIONED norm type for convergence test PC Object: (sys_fieldsplit_1_) 1 MPI processes type: ilu ILU: out-of-place factorization 0 levels of fill tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 1., needed 1. 
Factored matrix follows: Mat Object: 1 MPI processes type: seqaij rows=3200, cols=3200 package used to perform factorization: petsc total: nonzeros=40404, allocated nonzeros=40404 total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system matrix followed by preconditioner matrix: Mat Object: (sys_fieldsplit_1_) 1 MPI processes type: schurcomplement rows=3200, cols=3200 Schur complement A11 - A10 inv(A00) A01 A11 Mat Object: (sys_fieldsplit_1_) 1 MPI processes type: seqaij rows=3200, cols=3200 total: nonzeros=40404, allocated nonzeros=40404 total number of mallocs used during MatSetValues calls =0 not using I-node routines A10 Mat Object: 1 MPI processes type: seqaij rows=3200, cols=9600 total: nonzeros=47280, allocated nonzeros=47280 total number of mallocs used during MatSetValues calls =0 not using I-node routines KSP of A00 KSP Object: (sys_fieldsplit_1_inner_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
left preconditioning using NONE norm type for convergence test PC Object: (sys_fieldsplit_1_inner_) 1 MPI processes type: jacobi linear system matrix = precond matrix: Mat Object: (sys_fieldsplit_0_) 1 MPI processes type: seqaij rows=9600, cols=9600 total: nonzeros=47280, allocated nonzeros=47280 total number of mallocs used during MatSetValues calls =0 not using I-node routines A01 Mat Object: 1 MPI processes type: seqaij rows=9600, cols=3200 total: nonzeros=47280, allocated nonzeros=47280 total number of mallocs used during MatSetValues calls =0 not using I-node routines Mat Object: 1 MPI processes type: seqaij rows=3200, cols=3200 total: nonzeros=40404, allocated nonzeros=40404 total number of mallocs used during MatSetValues calls =0 not using I-node routines linear system matrix followed by preconditioner matrix: Mat Object: 1 MPI processes type: nest rows=12800, cols=12800 Matrix object: type=nest, rows=2, cols=2 MatNest structure: (0,0) : prefix="mom_", type=seqaij, rows=9600, cols=9600 (0,1) : prefix="grad_", type=seqaij, rows=9600, cols=3200 (1,0) : prefix="div_", type=seqaij, rows=3200, cols=9600 (1,1) : prefix="stab_", type=seqaij, rows=3200, cols=3200 Mat Object: 1 MPI processes type: nest rows=12800, cols=12800 Matrix object: type=nest, rows=2, cols=2 MatNest structure: (0,0) : prefix="sys_fieldsplit_0_", type=seqaij, rows=9600, cols=9600 (0,1) : type=seqaij, rows=9600, cols=3200 (1,0) : type=seqaij, rows=3200, cols=9600 (1,1) : prefix="sys_fieldsplit_1_", type=seqaij, rows=3200, cols=3200 nusediter_vv 37 nusediter_pp 37 dr. ir. 
Christiaan Klaij | CFD Researcher | Research & Development MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl MARIN news: http://www.marin.nl/web/News/News-items/The-Ocean-Cleanup-testing-continues.htm ________________________________________ From: Barry Smith Sent: Friday, January 13, 2017 7:51 PM To: Klaij, Christiaan Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 Yes, I would have expected this to work. Could you send the output from -ksp_view in this case? > On Jan 13, 2017, at 3:46 AM, Klaij, Christiaan wrote: > > Barry, > > It's been a while but I'm finally using this function in > 3.7.4. Is it supposed to work with fieldsplit? Here's why. > > I'm solving a Navier-Stokes system with fieldsplit (pc has one > velocity solve and one pressure solve) and trying to retrieve the > totals like this: > > CALL KSPSolve(ksp_system,rr_system,xx_system,ierr); CHKERRQ(ierr) > CALL PCFieldSplitGetSubKSP(pc_system,numsplit,subksp,ierr); CHKERRQ(ierr) > CALL KSPGetTotalIterations(subksp(1),nusediter_vv,ierr); CHKERRQ(ierr) > CALL KSPGetTotalIterations(subksp(2),nusediter_pp,ierr); CHKERRQ(ierr) > print *, 'nusediter_vv', nusediter_vv > print *, 'nusediter_pp', nusediter_pp > > Running the code shows this surprise: > > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 > Linear sys_fieldsplit_0_ solve converged due to 
CONVERGED_RTOL iterations 8 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 > > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 6 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 3 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 > > nusediter_vv 37 > nusediter_pp 37 > > So the value of nusediter_pp is indeed 37, but for nusediter_vv > it should be 66. Any idea what went wrong? > > Chris > > > > dr. ir. Christiaan Klaij | CFD Researcher | Research & Development > MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl > > MARIN news: http://www.marin.nl/web/News/News-items/MARIN-wishes-you-a-challenging-inspiring-2017.htm > > ________________________________________ > From: Barry Smith > Sent: Saturday, April 11, 2015 12:27 AM > To: Klaij, Christiaan > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 > > Chris, > > I have added KSPGetTotalIterations() to the branch barry/add-ksp-total-iterations/master and next. After tests it will go into master > > Barry > >> On Apr 10, 2015, at 8:07 AM, Klaij, Christiaan wrote: >> >> Barry, >> >> Sure, I can call PCFieldSplitGetSubKSP() to get the fieldsplit_0 >> ksp and then KSPGetIterationNumber, but what does this number >> mean? >> >> It appears to be the number of iterations of the last time that >> the subsystem was solved, right? If so, this corresponds to the >> last iteration of the coupled system, how about all the previous >> iterations? 
>> >> Chris >> ________________________________________ >> From: Barry Smith >> Sent: Friday, April 10, 2015 2:48 PM >> To: Klaij, Christiaan >> Cc: petsc-users at mcs.anl.gov >> Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 >> >> Chris, >> >> It appears you should call PCFieldSplitGetSubKSP() and then get the information you want out of the individual KSPs. If this doesn't work please let us know. >> >> Barry >> >>> On Apr 10, 2015, at 6:48 AM, Klaij, Christiaan wrote: >>> >>> A question when using PCFieldSplit: for each linear iteration of >>> the system, how many iterations for fieldsplit 0 and 1? >>> >>> One way to find out is to run with -ksp_monitor, >>> -fieldsplit_0_ksp_monitor and -fieldsplit_1_ksp_monitor. This >>> gives the complete convergence history. >>> >>> Another way, suggested by Matt, is to use -ksp_monitor, >>> -fieldsplit_0_ksp_converged_reason and >>> -fieldsplit_1_ksp_converged_reason. This gives only the totals >>> for fieldsplit 0 and 1 (but without saying for which one). >>> >>> Both ways require one to somehow process the output, which is a bit >>> inconvenient. Could KSPGetResidualHistory perhaps return (some) >>> information on the subsystems' convergence for processing inside >>> the code? >>> >>> Chris >>> >>> >>> dr. ir. Christiaan Klaij >>> CFD Researcher >>> Research & Development >>> E mailto:C.Klaij at marin.nl >>> T +31 317 49 33 44 >>> >>> >>> MARIN >>> 2, Haagsteeg, P.O. 
Box 28, 6700 AA Wageningen, The Netherlands >>> T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl >>> >> > From bsmith at mcs.anl.gov Mon Jan 16 14:28:25 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 16 Jan 2017 14:28:25 -0600 Subject: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 In-Reply-To: <1484552853408.13052@marin.nl> References: <1428666513941.72745@marin.nl> <75D79823-7AE0-47A7-BE9E-15AB81C3581E@mcs.anl.gov> <1428671243078.94@marin.nl> <9EA3A2C1-5372-44A8-B0B7-4ADAF1D89819@mcs.anl.gov> <1484300795996.50804@marin.nl> <1484552853408.13052@marin.nl> Message-ID: <04C3073A-84AB-419F-A143-ACF10D97B2DE@mcs.anl.gov> Please send all the command line options you use. > On Jan 16, 2017, at 1:47 AM, Klaij, Christiaan wrote: > > Barry, > > Sure, here's the output with: > > -sys_ksp_view -sys_ksp_converged_reason -sys_fieldsplit_0_ksp_converged_reason -sys_fieldsplit_1_ksp_converged_reason > > (In my previous email, I rearranged 0 & 1 for easy summing.) > > Chris > > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 6 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 3 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 > Linear sys_fieldsplit_0_ solve converged due to 
CONVERGED_RTOL iterations 8 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 > Linear sys_ solve converged due to CONVERGED_RTOL iterations 6 > KSP Object:(sys_) 1 MPI processes > type: fgmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=300, initial guess is zero > tolerances: relative=0.01, absolute=1e-50, divergence=10000. > right preconditioning > using UNPRECONDITIONED norm type for convergence test > PC Object:(sys_) 1 MPI processes > type: fieldsplit > FieldSplit with Schur preconditioner, factorization FULL > Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse > Split info: > Split number 0 Defined by IS > Split number 1 Defined by IS > KSP solver for A00 block > KSP Object: (sys_fieldsplit_0_) 1 MPI processes > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=0.01, absolute=1e-50, divergence=10000. > right preconditioning > using UNPRECONDITIONED norm type for convergence test > PC Object: (sys_fieldsplit_0_) 1 MPI processes > type: ilu > ILU: out-of-place factorization > 0 levels of fill > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 1., needed 1. 
> Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=9600, cols=9600 > package used to perform factorization: petsc > total: nonzeros=47280, allocated nonzeros=47280 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix = precond matrix: > Mat Object: (sys_fieldsplit_0_) 1 MPI processes > type: seqaij > rows=9600, cols=9600 > total: nonzeros=47280, allocated nonzeros=47280 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > KSP solver for upper A00 in upper triangular factor > KSP Object: (sys_fieldsplit_1_upper_) 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using NONE norm type for convergence test > PC Object: (sys_fieldsplit_1_upper_) 1 MPI processes > type: jacobi > linear system matrix = precond matrix: > Mat Object: (sys_fieldsplit_0_) 1 MPI processes > type: seqaij > rows=9600, cols=9600 > total: nonzeros=47280, allocated nonzeros=47280 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > KSP solver for S = A11 - A10 inv(A00) A01 > KSP Object: (sys_fieldsplit_1_) 1 MPI processes > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=0.01, absolute=1e-50, divergence=10000. > right preconditioning > using UNPRECONDITIONED norm type for convergence test > PC Object: (sys_fieldsplit_1_) 1 MPI processes > type: ilu > ILU: out-of-place factorization > 0 levels of fill > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 1., needed 1. 
> Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=3200, cols=3200 > package used to perform factorization: petsc > total: nonzeros=40404, allocated nonzeros=40404 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix followed by preconditioner matrix: > Mat Object: (sys_fieldsplit_1_) 1 MPI processes > type: schurcomplement > rows=3200, cols=3200 > Schur complement A11 - A10 inv(A00) A01 > A11 > Mat Object: (sys_fieldsplit_1_) 1 MPI processes > type: seqaij > rows=3200, cols=3200 > total: nonzeros=40404, allocated nonzeros=40404 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > A10 > Mat Object: 1 MPI processes > type: seqaij > rows=3200, cols=9600 > total: nonzeros=47280, allocated nonzeros=47280 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > KSP of A00 > KSP Object: (sys_fieldsplit_1_inner_) 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
> left preconditioning > using NONE norm type for convergence test > PC Object: (sys_fieldsplit_1_inner_) 1 MPI processes > type: jacobi > linear system matrix = precond matrix: > Mat Object: (sys_fieldsplit_0_) 1 MPI processes > type: seqaij > rows=9600, cols=9600 > total: nonzeros=47280, allocated nonzeros=47280 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > A01 > Mat Object: 1 MPI processes > type: seqaij > rows=9600, cols=3200 > total: nonzeros=47280, allocated nonzeros=47280 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > Mat Object: 1 MPI processes > type: seqaij > rows=3200, cols=3200 > total: nonzeros=40404, allocated nonzeros=40404 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix followed by preconditioner matrix: > Mat Object: 1 MPI processes > type: nest > rows=12800, cols=12800 > Matrix object: > type=nest, rows=2, cols=2 > MatNest structure: > (0,0) : prefix="mom_", type=seqaij, rows=9600, cols=9600 > (0,1) : prefix="grad_", type=seqaij, rows=9600, cols=3200 > (1,0) : prefix="div_", type=seqaij, rows=3200, cols=9600 > (1,1) : prefix="stab_", type=seqaij, rows=3200, cols=3200 > Mat Object: 1 MPI processes > type: nest > rows=12800, cols=12800 > Matrix object: > type=nest, rows=2, cols=2 > MatNest structure: > (0,0) : prefix="sys_fieldsplit_0_", type=seqaij, rows=9600, cols=9600 > (0,1) : type=seqaij, rows=9600, cols=3200 > (1,0) : type=seqaij, rows=3200, cols=9600 > (1,1) : prefix="sys_fieldsplit_1_", type=seqaij, rows=3200, cols=3200 > nusediter_vv 37 > nusediter_pp 37 > > > > dr. ir. 
Christiaan Klaij | CFD Researcher | Research & Development > MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl > > MARIN news: http://www.marin.nl/web/News/News-items/The-Ocean-Cleanup-testing-continues.htm > > ________________________________________ > From: Barry Smith > Sent: Friday, January 13, 2017 7:51 PM > To: Klaij, Christiaan > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 > > Yes, I would have expected this to work. Could you send the output from -ksp_view in this case? > > >> On Jan 13, 2017, at 3:46 AM, Klaij, Christiaan wrote: >> >> Barry, >> >> It's been a while but I'm finally using this function in >> 3.7.4. Is it supposed to work with fieldsplit? Here's why. >> >> I'm solving a Navier-Stokes system with fieldsplit (pc has one >> velocity solve and one pressure solve) and trying to retrieve the >> totals like this: >> >> CALL KSPSolve(ksp_system,rr_system,xx_system,ierr); CHKERRQ(ierr) >> CALL PCFieldSplitGetSubKSP(pc_system,numsplit,subksp,ierr); CHKERRQ(ierr) >> CALL KSPGetTotalIterations(subksp(1),nusediter_vv,ierr); CHKERRQ(ierr) >> CALL KSPGetTotalIterations(subksp(2),nusediter_pp,ierr); CHKERRQ(ierr) >> print *, 'nusediter_vv', nusediter_vv >> print *, 'nusediter_pp', nusediter_pp >> >> Running the code shows this surprise: >> >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >> 
Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >> >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 6 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 3 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 >> >> nusediter_vv 37 >> nusediter_pp 37 >> >> So the value of nusediter_pp is indeed 37, but for nusediter_vv >> it should be 66. Any idea what went wrong? >> >> Chris >> >> >> >> dr. ir. Christiaan Klaij | CFD Researcher | Research & Development >> MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl >> >> MARIN news: http://www.marin.nl/web/News/News-items/MARIN-wishes-you-a-challenging-inspiring-2017.htm >> >> ________________________________________ >> From: Barry Smith >> Sent: Saturday, April 11, 2015 12:27 AM >> To: Klaij, Christiaan >> Cc: petsc-users at mcs.anl.gov >> Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 >> >> Chris, >> >> I have added KSPGetTotalIterations() to the branch barry/add-ksp-total-iterations/master and next. After tests it will go into master >> >> Barry >> >>> On Apr 10, 2015, at 8:07 AM, Klaij, Christiaan wrote: >>> >>> Barry, >>> >>> Sure, I can call PCFieldSplitGetSubKSP() to get the fieldsplit_0 >>> ksp and then KSPGetIterationNumber, but what does this number >>> mean? >>> >>> It appears to be the number of iterations of the last time that >>> the subsystem was solved, right? 
If so, this corresponds to the >>> last iteration of the coupled system, how about all the previous >>> iterations? >>> >>> Chris >>> ________________________________________ >>> From: Barry Smith >>> Sent: Friday, April 10, 2015 2:48 PM >>> To: Klaij, Christiaan >>> Cc: petsc-users at mcs.anl.gov >>> Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 >>> >>> Chris, >>> >>> It appears you should call PCFieldSplitGetSubKSP() and then get the information you want out of the individual KSPs. If this doesn't work please let us know. >>> >>> Barry >>> >>>> On Apr 10, 2015, at 6:48 AM, Klaij, Christiaan wrote: >>>> >>>> A question when using PCFieldSplit: for each linear iteration of >>>> the system, how many iterations for fieldsplit 0 and 1? >>>> >>>> One way to find out is to run with -ksp_monitor, >>>> -fieldsplit_0_ksp_monitor and -fieldsplit_1_ksp_monitor. This >>>> gives the complete convergence history. >>>> >>>> Another way, suggested by Matt, is to use -ksp_monitor, >>>> -fieldsplit_0_ksp_converged_reason and >>>> -fieldsplit_1_ksp_converged_reason. This gives only the totals >>>> for fieldsplit 0 and 1 (but without saying for which one). >>>> >>>> Both ways require somehow processing the output, which is a bit >>>> inconvenient. Could KSPGetResidualHistory perhaps return (some) >>>> information on the subsystems' convergence for processing inside >>>> the code? >>>> >>>> Chris >>>> >>>> >>>> dr. ir. Christiaan Klaij >>>> CFD Researcher >>>> Research & Development >>>> E mailto:C.Klaij at marin.nl >>>> T +31 317 49 33 44 >>>> >>>> >>>> MARIN >>>> 2, Haagsteeg, P.O.
Box 28, 6700 AA Wageningen, The Netherlands >>>> T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl >>>> >>> >> > From bhatiamanav at gmail.com Mon Jan 16 20:35:16 2017 From: bhatiamanav at gmail.com (Manav Bhatia) Date: Mon, 16 Jan 2017 20:35:16 -0600 Subject: [petsc-users] xxxSetFromOptions Message-ID: <4A2AA8EE-9EB9-410F-BC8B-9B2E72035E68@gmail.com> Hi, If I am using SNES and am calling SNESSetFromOptions, do I need to still call the xxxSetFromOptions for mat, vec, ksp and pc? If so, what is the recommended order of these calls? Thanks, Manav From bsmith at mcs.anl.gov Mon Jan 16 20:42:19 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 16 Jan 2017 20:42:19 -0600 Subject: [petsc-users] xxxSetFromOptions In-Reply-To: <4A2AA8EE-9EB9-410F-BC8B-9B2E72035E68@gmail.com> References: <4A2AA8EE-9EB9-410F-BC8B-9B2E72035E68@gmail.com> Message-ID: > On Jan 16, 2017, at 8:35 PM, Manav Bhatia wrote: > > Hi, > > If I am using SNES and am calling SNESSetFromOptions, do I need to still call the xxxSetFromOptions for mat, vec, ksp and pc? If so, what is the recommended order of these calls? SNESSetFromOptions automatically calls it for KSP and PC underneath. If you wish to set Mat options you should call MatSetFromOptions after you have created the matrix and set its sizes. Barry > > Thanks, > Manav From C.Klaij at marin.nl Tue Jan 17 01:45:26 2017 From: C.Klaij at marin.nl (Klaij, Christiaan) Date: Tue, 17 Jan 2017 07:45:26 +0000 Subject: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 In-Reply-To: <04C3073A-84AB-419F-A143-ACF10D97B2DE@mcs.anl.gov> References: <1428666513941.72745@marin.nl> <75D79823-7AE0-47A7-BE9E-15AB81C3581E@mcs.anl.gov> <1428671243078.94@marin.nl> <9EA3A2C1-5372-44A8-B0B7-4ADAF1D89819@mcs.anl.gov> <1484300795996.50804@marin.nl> <1484552853408.13052@marin.nl>, <04C3073A-84AB-419F-A143-ACF10D97B2DE@mcs.anl.gov> Message-ID: <1484639126495.83463@marin.nl> Well, that's it, all the rest was hard coded.
Here's the relevant part of the code: CALL PCSetType(pc_system,PCFIELDSPLIT,ierr); CHKERRQ(ierr) CALL PCFieldSplitSetType(pc_system,PC_COMPOSITE_SCHUR,ierr); CHKERRQ(ierr) CALL PCFieldSplitSetIS(pc_system,"0",isgs(1),ierr); CHKERRQ(ierr) CALL PCFieldSplitSetIS(pc_system,"1",isgs(2),ierr); CHKERRQ(ierr) CALL PCFieldSplitSetSchurFactType(pc_system,PC_FIELDSPLIT_SCHUR_FACT_FULL,ierr);CHKERRQ(ierr) CALL PCFieldSplitSetSchurPre(pc_system,PC_FIELDSPLIT_SCHUR_PRE_SELFP,PETSC_NULL_OBJECT,ierr);CHKERRQ(ierr) CALL KSPSetTolerances(ksp_system,tol,PETSC_DEFAULT_REAL,PETSC_DEFAULT_REAL,maxiter,ierr); CHKERRQ(ierr) CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_0_ksp_rtol","0.01",ierr); CHKERRQ(ierr) CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_ksp_rtol","0.01",ierr); CHKERRQ(ierr) CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_0_ksp_pc_side","right",ierr); CHKERRQ(ierr) CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_ksp_pc_side","right",ierr); CHKERRQ(ierr) CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_0_ksp_type","gmres",ierr); CHKERRQ(ierr) CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_upper_ksp_type","preonly",ierr); CHKERRQ(ierr) CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_upper_pc_type","jacobi",ierr); CHKERRQ(ierr) CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_inner_ksp_type","preonly",ierr); CHKERRQ(ierr) CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_inner_pc_type","jacobi",ierr); CHKERRQ(ierr) dr. ir. 
Christiaan Klaij | CFD Researcher | Research & Development MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl MARIN news: http://www.marin.nl/web/News/News-items/Verification-and-validation-exercises-for-flow-around-KVLCC2-tanker.htm ________________________________________ From: Barry Smith Sent: Monday, January 16, 2017 9:28 PM To: Klaij, Christiaan Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 Please send all the command line options you use. > On Jan 16, 2017, at 1:47 AM, Klaij, Christiaan wrote: > > Barry, > > Sure, here's the output with: > > -sys_ksp_view -sys_ksp_converged_reason -sys_fieldsplit_0_ksp_converged_reason -sys_fieldsplit_1_ksp_converged_reason > > (In my previous email, I rearranged 0 & 1 for easy summing.) > > Chris > > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 6 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 3 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL 
iterations 8 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 > Linear sys_ solve converged due to CONVERGED_RTOL iterations 6 > KSP Object:(sys_) 1 MPI processes > type: fgmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=300, initial guess is zero > tolerances: relative=0.01, absolute=1e-50, divergence=10000. > right preconditioning > using UNPRECONDITIONED norm type for convergence test > PC Object:(sys_) 1 MPI processes > type: fieldsplit > FieldSplit with Schur preconditioner, factorization FULL > Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse > Split info: > Split number 0 Defined by IS > Split number 1 Defined by IS > KSP solver for A00 block > KSP Object: (sys_fieldsplit_0_) 1 MPI processes > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=0.01, absolute=1e-50, divergence=10000. > right preconditioning > using UNPRECONDITIONED norm type for convergence test > PC Object: (sys_fieldsplit_0_) 1 MPI processes > type: ilu > ILU: out-of-place factorization > 0 levels of fill > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 1., needed 1. 
> Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=9600, cols=9600 > package used to perform factorization: petsc > total: nonzeros=47280, allocated nonzeros=47280 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix = precond matrix: > Mat Object: (sys_fieldsplit_0_) 1 MPI processes > type: seqaij > rows=9600, cols=9600 > total: nonzeros=47280, allocated nonzeros=47280 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > KSP solver for upper A00 in upper triangular factor > KSP Object: (sys_fieldsplit_1_upper_) 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using NONE norm type for convergence test > PC Object: (sys_fieldsplit_1_upper_) 1 MPI processes > type: jacobi > linear system matrix = precond matrix: > Mat Object: (sys_fieldsplit_0_) 1 MPI processes > type: seqaij > rows=9600, cols=9600 > total: nonzeros=47280, allocated nonzeros=47280 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > KSP solver for S = A11 - A10 inv(A00) A01 > KSP Object: (sys_fieldsplit_1_) 1 MPI processes > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=0.01, absolute=1e-50, divergence=10000. > right preconditioning > using UNPRECONDITIONED norm type for convergence test > PC Object: (sys_fieldsplit_1_) 1 MPI processes > type: ilu > ILU: out-of-place factorization > 0 levels of fill > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 1., needed 1. 
> Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=3200, cols=3200 > package used to perform factorization: petsc > total: nonzeros=40404, allocated nonzeros=40404 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix followed by preconditioner matrix: > Mat Object: (sys_fieldsplit_1_) 1 MPI processes > type: schurcomplement > rows=3200, cols=3200 > Schur complement A11 - A10 inv(A00) A01 > A11 > Mat Object: (sys_fieldsplit_1_) 1 MPI processes > type: seqaij > rows=3200, cols=3200 > total: nonzeros=40404, allocated nonzeros=40404 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > A10 > Mat Object: 1 MPI processes > type: seqaij > rows=3200, cols=9600 > total: nonzeros=47280, allocated nonzeros=47280 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > KSP of A00 > KSP Object: (sys_fieldsplit_1_inner_) 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
> left preconditioning > using NONE norm type for convergence test > PC Object: (sys_fieldsplit_1_inner_) 1 MPI processes > type: jacobi > linear system matrix = precond matrix: > Mat Object: (sys_fieldsplit_0_) 1 MPI processes > type: seqaij > rows=9600, cols=9600 > total: nonzeros=47280, allocated nonzeros=47280 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > A01 > Mat Object: 1 MPI processes > type: seqaij > rows=9600, cols=3200 > total: nonzeros=47280, allocated nonzeros=47280 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > Mat Object: 1 MPI processes > type: seqaij > rows=3200, cols=3200 > total: nonzeros=40404, allocated nonzeros=40404 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix followed by preconditioner matrix: > Mat Object: 1 MPI processes > type: nest > rows=12800, cols=12800 > Matrix object: > type=nest, rows=2, cols=2 > MatNest structure: > (0,0) : prefix="mom_", type=seqaij, rows=9600, cols=9600 > (0,1) : prefix="grad_", type=seqaij, rows=9600, cols=3200 > (1,0) : prefix="div_", type=seqaij, rows=3200, cols=9600 > (1,1) : prefix="stab_", type=seqaij, rows=3200, cols=3200 > Mat Object: 1 MPI processes > type: nest > rows=12800, cols=12800 > Matrix object: > type=nest, rows=2, cols=2 > MatNest structure: > (0,0) : prefix="sys_fieldsplit_0_", type=seqaij, rows=9600, cols=9600 > (0,1) : type=seqaij, rows=9600, cols=3200 > (1,0) : type=seqaij, rows=3200, cols=9600 > (1,1) : prefix="sys_fieldsplit_1_", type=seqaij, rows=3200, cols=3200 > nusediter_vv 37 > nusediter_pp 37 > > > > dr. ir. 
Christiaan Klaij | CFD Researcher | Research & Development > MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl > > MARIN news: http://www.marin.nl/web/News/News-items/The-Ocean-Cleanup-testing-continues.htm > > ________________________________________ > From: Barry Smith > Sent: Friday, January 13, 2017 7:51 PM > To: Klaij, Christiaan > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 > > Yes, I would have expected this to work. Could you send the output from -ksp_view in this case? > > >> On Jan 13, 2017, at 3:46 AM, Klaij, Christiaan wrote: >> >> Barry, >> >> It's been a while but I'm finally using this function in >> 3.7.4. Is it supposed to work with fieldsplit? Here's why. >> >> I'm solving a Navier-Stokes system with fieldsplit (pc has one >> velocity solve and one pressure solve) and trying to retrieve the >> totals like this: >> >> CALL KSPSolve(ksp_system,rr_system,xx_system,ierr); CHKERRQ(ierr) >> CALL PCFieldSplitGetSubKSP(pc_system,numsplit,subksp,ierr); CHKERRQ(ierr) >> CALL KSPGetTotalIterations(subksp(1),nusediter_vv,ierr); CHKERRQ(ierr) >> CALL KSPGetTotalIterations(subksp(2),nusediter_pp,ierr); CHKERRQ(ierr) >> print *, 'nusediter_vv', nusediter_vv >> print *, 'nusediter_pp', nusediter_pp >> >> Running the code shows this surprise: >> >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >> 
Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >> >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 6 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 3 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 >> >> nusediter_vv 37 >> nusediter_pp 37 >> >> So the value of nusediter_pp is indeed 37, but for nusediter_vv >> it should be 66. Any idea what went wrong? >> >> Chris >> >> >> >> dr. ir. Christiaan Klaij | CFD Researcher | Research & Development >> MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl >> >> MARIN news: http://www.marin.nl/web/News/News-items/MARIN-wishes-you-a-challenging-inspiring-2017.htm >> >> ________________________________________ >> From: Barry Smith >> Sent: Saturday, April 11, 2015 12:27 AM >> To: Klaij, Christiaan >> Cc: petsc-users at mcs.anl.gov >> Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 >> >> Chris, >> >> I have added KSPGetTotalIterations() to the branch barry/add-ksp-total-iterations/master and next. After tests it will go into master >> >> Barry >> >>> On Apr 10, 2015, at 8:07 AM, Klaij, Christiaan wrote: >>> >>> Barry, >>> >>> Sure, I can call PCFieldSplitGetSubKSP() to get the fieldsplit_0 >>> ksp and then KSPGetIterationNumber, but what does this number >>> mean? >>> >>> It appears to be the number of iterations of the last time that >>> the subsystem was solved, right? 
If so, this corresponds to the >>> last iteration of the coupled system, how about all the previous >>> iterations? >>> >>> Chris >>> ________________________________________ >>> From: Barry Smith >>> Sent: Friday, April 10, 2015 2:48 PM >>> To: Klaij, Christiaan >>> Cc: petsc-users at mcs.anl.gov >>> Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 >>> >>> Chris, >>> >>> It appears you should call PCFieldSplitGetSubKSP() and then get the information you want out of the individual KSPs. If this doesn't work please let us know. >>> >>> Barry >>> >>>> On Apr 10, 2015, at 6:48 AM, Klaij, Christiaan wrote: >>>> >>>> A question when using PCFieldSplit: for each linear iteration of >>>> the system, how many iterations for fieldsplit 0 and 1? >>>> >>>> One way to find out is to run with -ksp_monitor, >>>> -fieldsplit_0_ksp_monitor and -fieldsplit_1_ksp_monitor. This >>>> gives the complete convergence history. >>>> >>>> Another way, suggested by Matt, is to use -ksp_monitor, >>>> -fieldsplit_0_ksp_converged_reason and >>>> -fieldsplit_1_ksp_converged_reason. This gives only the totals >>>> for fieldsplit 0 and 1 (but without saying for which one). >>>> >>>> Both ways require somehow processing the output, which is a bit >>>> inconvenient. Could KSPGetResidualHistory perhaps return (some) >>>> information on the subsystems' convergence for processing inside >>>> the code? >>>> >>>> Chris >>>> >>>> >>>> dr. ir. Christiaan Klaij >>>> CFD Researcher >>>> Research & Development >>>> E mailto:C.Klaij at marin.nl >>>> T +31 317 49 33 44 >>>> >>>> >>>> MARIN >>>> 2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands >>>> T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl >>>> >>> >> > From fangbowa at buffalo.edu Tue Jan 17 11:39:58 2017 From: fangbowa at buffalo.edu (Fangbo Wang) Date: Tue, 17 Jan 2017 12:39:58 -0500 Subject: [petsc-users] Problems on creating matrices based on different communicators other than MPI_COMM_WORLD.
Message-ID: Hi, I know how to define groups and communicators in MPI. 1. My question is how can I define matrices on small communicators (not MPI_COMM_WORLD)? I tried something like this but it does not work: for (int i=0; i<6; i++) { MatCreate(comm1,&general_detM[i]); MatSetSizes(general_detM[i],PETSC_DECIDE,PETSC_DECIDE, General_Dofs, General_Dofs); MatSetFromOptions(general_detM[i]); MatMPIAIJSetPreallocation(general_detM[i],300,NULL,300,NULL); MatSeqAIJSetPreallocation(general_detM[i],600,NULL); MatSetOption(general_detM[i], MAT_NEW_NONZERO_ALLOCATION_ERR, PETSC_FALSE); //to extend the preallocated memory for more values } If I change MatCreate(comm1,&general_detM[i]) to MatCreate(MPI_COMM_WORLD,&general_detM[i]), the code works. 2. Also, there is one more question: how can I use MatSetValues to insert values into a matrix not based on MPI_COMM_WORLD? Can anyone help me on this? Thank you very much! -- Fangbo Wang, PhD student Stochastic Geomechanics Research Group Department of Civil, Structural and Environmental Engineering University at Buffalo Email: fangbowa at buffalo.edu -------------- next part -------------- An HTML attachment was scrubbed... URL: From hzhang at mcs.anl.gov Tue Jan 17 12:02:47 2017 From: hzhang at mcs.anl.gov (Hong) Date: Tue, 17 Jan 2017 12:02:47 -0600 Subject: [petsc-users] Problems on creating matrices based on different communicators other than MPI_COMM_WORLD. In-Reply-To: References: Message-ID: Fangbo : > > > 1. My question is how can I define matrices on small communicators (not > MPI_COMM_WORLD)?
> I tried something like this but does not work: > > for (int i=0; i<6; i++) { > MatCreate(comm1,&general_detM[i]); > MatSetSizes(general_detM[i],PETSC_DECIDE,PETSC_DECIDE, > General_Dofs, General_Dofs); > MatSetFromOptions(general_detM[i]); > MatMPIAIJSetPreallocation(general_detM[i],300,NULL,300,NULL); > MatSeqAIJSetPreallocation(general_detM[i],600,NULL); > MatSetOption(general_detM[i], MAT_NEW_NONZERO_ALLOCATION_ERR, > PETSC_FALSE); //to extend the preallocatted memory for more values > } > > If I change MatCreate(comm1,&general_detM[i]) to > MatCreate(MPI_COMM_WORLD,&general_detM[i]), the code works. > > Only processors that belong to comm1 can call this block of code. > > 2. Also, there is one more question, how can I use MatSetValues to insert > values to a matrix not based on MPI_COMM_WORLD? > Again, processors in comm1 can call MatSetValues(). Hong > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mlohry at princeton.edu Tue Jan 17 13:32:27 2017 From: mlohry at princeton.edu (Mark W. Lohry) Date: Tue, 17 Jan 2017 19:32:27 +0000 Subject: [petsc-users] Modifying the solution vector at each TS and/or SNES step Message-ID: <1C4B04A3719F56479255009C095BE3B5931A5783@CSGMBX202W.pu.win.princeton.edu> I have an algorithm where I'd like to slightly modify the solution vector at certain stages (specifically applying a filtering operation), for example at the conclusion of each TS or SNES step. When TSMonitor is called, the solution vector is locked read-only. I'd also be concerned about modifying it there -- any kind of computation of the time derivative should be done with my altered solution vector, and similar for SNES although it's probably less critical there. Are there any hooks available to let me modify a solution vector at particular points like that? Thanks, Mark Lohry -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Tue Jan 17 14:03:27 2017 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 17 Jan 2017 14:03:27 -0600 Subject: [petsc-users] Modifying the solution vector at each TS and/or SNES step In-Reply-To: <1C4B04A3719F56479255009C095BE3B5931A5783@CSGMBX202W.pu.win.princeton.edu> References: <1C4B04A3719F56479255009C095BE3B5931A5783@CSGMBX202W.pu.win.princeton.edu> Message-ID: On Tue, Jan 17, 2017 at 1:32 PM, Mark W. Lohry wrote: > I have an algorithm where I'd like to slightly modify the solution vector > at certain stages (specifically applying a filtering operation), for > example at the conclusion of each TS or SNES step. > This is called at the beginning: http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/SNES/SNESSetUpdate.html but I used to put my modifications for Newton here http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/SNES/SNESLineSearchSetPostCheck.html#SNESLineSearchSetPostCheck For TS, there is http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/TS/TSSetPostStep.html Matt > When TSMonitor is called, the solution vector is locked read-only. I'd > also be concerned about modifying it there -- any kind of computation of > the time derivative should be done with my altered solution vector, and > similar for SNES although it's probably less critical there. > > Are there any hooks available to let me modify a solution vector at > particular points like that? > > Thanks, > Mark Lohry > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
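[Editor's note] The hooks Matt lists can be sketched as below, in the PETSc 3.7-era C style used elsewhere in this thread. This is a minimal illustration, not code from the thread itself; the callback name and the filtering call inside it are hypothetical.

```c
#include <petscts.h>

/* Post-step hook: TS calls this after each accepted time step, when
   the solution vector is no longer locked read-only, so it may be
   modified in place before the next step uses it. */
static PetscErrorCode FilterSolution(TS ts)
{
  Vec            u;
  PetscErrorCode ierr;

  PetscFunctionBegin;
  ierr = TSGetSolution(ts, &u);CHKERRQ(ierr);
  /* Apply the user's filtering operation to u here, e.g. a
     (hypothetical) FilterVec(u). */
  PetscFunctionReturn(0);
}

/* Registration, after TSCreate() and before TSSolve():
     ierr = TSSetPostStep(ts, FilterSolution);CHKERRQ(ierr);
   The SNES analogue with write access to the candidate iterate is
   SNESLineSearchSetPostCheck(). */
```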
URL: From C.Klaij at marin.nl Wed Jan 18 02:40:20 2017 From: C.Klaij at marin.nl (Klaij, Christiaan) Date: Wed, 18 Jan 2017 08:40:20 +0000 Subject: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 In-Reply-To: <1484639126495.83463@marin.nl> References: <1428666513941.72745@marin.nl> <75D79823-7AE0-47A7-BE9E-15AB81C3581E@mcs.anl.gov> <1428671243078.94@marin.nl> <9EA3A2C1-5372-44A8-B0B7-4ADAF1D89819@mcs.anl.gov> <1484300795996.50804@marin.nl> <1484552853408.13052@marin.nl>, <04C3073A-84AB-419F-A143-ACF10D97B2DE@mcs.anl.gov>, <1484639126495.83463@marin.nl> Message-ID: <1484728820045.70097@marin.nl> Barry, I've managed to replicate the problem with 3.7.4 snes/examples/tutorials/ex70.c. Basically I've added KSPGetTotalIterations to main (file is attached): $ diff -u ex70.c.bak ex70.c --- ex70.c.bak2017-01-18 09:25:46.286174830 +0100 +++ ex70.c2017-01-18 09:03:40.904483434 +0100 @@ -669,6 +669,10 @@ KSP ksp; PetscErrorCode ierr; + KSP *subksp; + PC pc; + PetscInt numsplit = 1, nusediter_vv, nusediter_pp; + ierr = PetscInitialize(&argc, &argv, NULL, help);CHKERRQ(ierr); s.nx = 4; s.ny = 6; @@ -690,6 +694,13 @@ ierr = StokesSetupPC(&s, ksp);CHKERRQ(ierr); ierr = KSPSolve(ksp, s.b, s.x);CHKERRQ(ierr); + ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr); + ierr = PCFieldSplitGetSubKSP(pc,&numsplit,&subksp); CHKERRQ(ierr); + ierr = KSPGetTotalIterations(subksp[0],&nusediter_vv); CHKERRQ(ierr); + ierr = KSPGetTotalIterations(subksp[1],&nusediter_pp); CHKERRQ(ierr); + ierr = PetscPrintf(PETSC_COMM_WORLD," total u solves = %i\n", nusediter_vv); CHKERRQ(ierr); + ierr = PetscPrintf(PETSC_COMM_WORLD," total p solves = %i\n", nusediter_pp); CHKERRQ(ierr); + /* don't trust, verify! 
*/
   ierr = StokesCalcResidual(&s);CHKERRQ(ierr);
   ierr = StokesCalcError(&s);CHKERRQ(ierr);

Now run as follows:

$ mpirun -n 2 ./ex70 -ksp_type fgmres -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_schur_fact_type lower -fieldsplit_0_ksp_type gmres -fieldsplit_0_pc_type bjacobi -fieldsplit_1_pc_type jacobi -fieldsplit_1_inner_ksp_type preonly -fieldsplit_1_inner_pc_type jacobi -fieldsplit_1_upper_ksp_type preonly -fieldsplit_1_upper_pc_type jacobi -fieldsplit_0_ksp_converged_reason -fieldsplit_1_ksp_converged_reason
  Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5
  Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 14
  Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5
  Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 14
  Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5
  Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 16
  Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5
  Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 16
  Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5
  Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 17
  Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5
  Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 18
  Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5
  Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 20
  Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5
  Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 21
  Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5
  Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 23
  Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5
  Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22
  Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5
  Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22
  Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5
  Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22
 total u solves = 225
 total p solves = 225
 residual u = 9.67257e-06
 residual p = 5.42082e-07
 residual [u,p] = 9.68775e-06
 discretization error u = 0.0106464
 discretization error p = 1.85907
 discretization error [u,p] = 1.8591

So here again the total of 225 is correct for p, but for u it
should be 60. Hope this helps you find the problem.

Chris

dr. ir. Christiaan Klaij | CFD Researcher | Research & Development
MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl

MARIN news: http://www.marin.nl/web/News/News-items/Few-places-left-for-Offshore-and-Ship-hydrodynamics-courses.htm

________________________________________
From: Klaij, Christiaan
Sent: Tuesday, January 17, 2017 8:45 AM
To: Barry Smith
Cc: petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1

Well, that's it; all the rest was hard coded.
Here's the relevant part of the code:

CALL PCSetType(pc_system,PCFIELDSPLIT,ierr); CHKERRQ(ierr)
CALL PCFieldSplitSetType(pc_system,PC_COMPOSITE_SCHUR,ierr); CHKERRQ(ierr)
CALL PCFieldSplitSetIS(pc_system,"0",isgs(1),ierr); CHKERRQ(ierr)
CALL PCFieldSplitSetIS(pc_system,"1",isgs(2),ierr); CHKERRQ(ierr)
CALL PCFieldSplitSetSchurFactType(pc_system,PC_FIELDSPLIT_SCHUR_FACT_FULL,ierr); CHKERRQ(ierr)
CALL PCFieldSplitSetSchurPre(pc_system,PC_FIELDSPLIT_SCHUR_PRE_SELFP,PETSC_NULL_OBJECT,ierr); CHKERRQ(ierr)

CALL KSPSetTolerances(ksp_system,tol,PETSC_DEFAULT_REAL,PETSC_DEFAULT_REAL,maxiter,ierr); CHKERRQ(ierr)
CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_0_ksp_rtol","0.01",ierr); CHKERRQ(ierr)
CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_ksp_rtol","0.01",ierr); CHKERRQ(ierr)

CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_0_ksp_pc_side","right",ierr); CHKERRQ(ierr)
CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_ksp_pc_side","right",ierr); CHKERRQ(ierr)

CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_0_ksp_type","gmres",ierr); CHKERRQ(ierr)
CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_upper_ksp_type","preonly",ierr); CHKERRQ(ierr)
CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_upper_pc_type","jacobi",ierr); CHKERRQ(ierr)
CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_inner_ksp_type","preonly",ierr); CHKERRQ(ierr)
CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_inner_pc_type","jacobi",ierr); CHKERRQ(ierr)

________________________________________
From: Barry Smith
Sent: Monday, January 16, 2017 9:28 PM
To: Klaij, Christiaan
Cc: petsc-users at mcs.anl.gov
Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1

   Please send all the command line options you use.
> On Jan 16, 2017, at 1:47 AM, Klaij, Christiaan wrote: > > Barry, > > Sure, here's the output with: > > -sys_ksp_view -sys_ksp_converged_reason -sys_fieldsplit_0_ksp_converged_reason -sys_fieldsplit_1_ksp_converged_reason > > (In my previous email, I rearranged 0 & 1 for easy summing.) > > Chris > > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 6 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 3 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 > Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 > Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 > Linear sys_ solve converged due to CONVERGED_RTOL iterations 6 > KSP Object:(sys_) 1 MPI processes > type: fgmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=300, 
initial guess is zero > tolerances: relative=0.01, absolute=1e-50, divergence=10000. > right preconditioning > using UNPRECONDITIONED norm type for convergence test > PC Object:(sys_) 1 MPI processes > type: fieldsplit > FieldSplit with Schur preconditioner, factorization FULL > Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse > Split info: > Split number 0 Defined by IS > Split number 1 Defined by IS > KSP solver for A00 block > KSP Object: (sys_fieldsplit_0_) 1 MPI processes > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=0.01, absolute=1e-50, divergence=10000. > right preconditioning > using UNPRECONDITIONED norm type for convergence test > PC Object: (sys_fieldsplit_0_) 1 MPI processes > type: ilu > ILU: out-of-place factorization > 0 levels of fill > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 1., needed 1. > Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=9600, cols=9600 > package used to perform factorization: petsc > total: nonzeros=47280, allocated nonzeros=47280 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix = precond matrix: > Mat Object: (sys_fieldsplit_0_) 1 MPI processes > type: seqaij > rows=9600, cols=9600 > total: nonzeros=47280, allocated nonzeros=47280 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > KSP solver for upper A00 in upper triangular factor > KSP Object: (sys_fieldsplit_1_upper_) 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
> left preconditioning > using NONE norm type for convergence test > PC Object: (sys_fieldsplit_1_upper_) 1 MPI processes > type: jacobi > linear system matrix = precond matrix: > Mat Object: (sys_fieldsplit_0_) 1 MPI processes > type: seqaij > rows=9600, cols=9600 > total: nonzeros=47280, allocated nonzeros=47280 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > KSP solver for S = A11 - A10 inv(A00) A01 > KSP Object: (sys_fieldsplit_1_) 1 MPI processes > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=0.01, absolute=1e-50, divergence=10000. > right preconditioning > using UNPRECONDITIONED norm type for convergence test > PC Object: (sys_fieldsplit_1_) 1 MPI processes > type: ilu > ILU: out-of-place factorization > 0 levels of fill > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 1., needed 1. 
> Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=3200, cols=3200 > package used to perform factorization: petsc > total: nonzeros=40404, allocated nonzeros=40404 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix followed by preconditioner matrix: > Mat Object: (sys_fieldsplit_1_) 1 MPI processes > type: schurcomplement > rows=3200, cols=3200 > Schur complement A11 - A10 inv(A00) A01 > A11 > Mat Object: (sys_fieldsplit_1_) 1 MPI processes > type: seqaij > rows=3200, cols=3200 > total: nonzeros=40404, allocated nonzeros=40404 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > A10 > Mat Object: 1 MPI processes > type: seqaij > rows=3200, cols=9600 > total: nonzeros=47280, allocated nonzeros=47280 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > KSP of A00 > KSP Object: (sys_fieldsplit_1_inner_) 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
> left preconditioning > using NONE norm type for convergence test > PC Object: (sys_fieldsplit_1_inner_) 1 MPI processes > type: jacobi > linear system matrix = precond matrix: > Mat Object: (sys_fieldsplit_0_) 1 MPI processes > type: seqaij > rows=9600, cols=9600 > total: nonzeros=47280, allocated nonzeros=47280 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > A01 > Mat Object: 1 MPI processes > type: seqaij > rows=9600, cols=3200 > total: nonzeros=47280, allocated nonzeros=47280 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > Mat Object: 1 MPI processes > type: seqaij > rows=3200, cols=3200 > total: nonzeros=40404, allocated nonzeros=40404 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix followed by preconditioner matrix: > Mat Object: 1 MPI processes > type: nest > rows=12800, cols=12800 > Matrix object: > type=nest, rows=2, cols=2 > MatNest structure: > (0,0) : prefix="mom_", type=seqaij, rows=9600, cols=9600 > (0,1) : prefix="grad_", type=seqaij, rows=9600, cols=3200 > (1,0) : prefix="div_", type=seqaij, rows=3200, cols=9600 > (1,1) : prefix="stab_", type=seqaij, rows=3200, cols=3200 > Mat Object: 1 MPI processes > type: nest > rows=12800, cols=12800 > Matrix object: > type=nest, rows=2, cols=2 > MatNest structure: > (0,0) : prefix="sys_fieldsplit_0_", type=seqaij, rows=9600, cols=9600 > (0,1) : type=seqaij, rows=9600, cols=3200 > (1,0) : type=seqaij, rows=3200, cols=9600 > (1,1) : prefix="sys_fieldsplit_1_", type=seqaij, rows=3200, cols=3200 > nusediter_vv 37 > nusediter_pp 37 > > > > dr. ir. 
Christiaan Klaij | CFD Researcher | Research & Development > MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl > > MARIN news: http://www.marin.nl/web/News/News-items/The-Ocean-Cleanup-testing-continues.htm > > ________________________________________ > From: Barry Smith > Sent: Friday, January 13, 2017 7:51 PM > To: Klaij, Christiaan > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 > > Yes, I would have expected this to work. Could you send the output from -ksp_view in this case? > > >> On Jan 13, 2017, at 3:46 AM, Klaij, Christiaan wrote: >> >> Barry, >> >> It's been a while but I'm finally using this function in >> 3.7.4. Is it supposed to work with fieldsplit? Here's why. >> >> I'm solving a Navier-Stokes system with fieldsplit (pc has one >> velocity solve and one pressure solve) and trying to retrieve the >> totals like this: >> >> CALL KSPSolve(ksp_system,rr_system,xx_system,ierr); CHKERRQ(ierr) >> CALL PCFieldSplitGetSubKSP(pc_system,numsplit,subksp,ierr); CHKERRQ(ierr) >> CALL KSPGetTotalIterations(subksp(1),nusediter_vv,ierr); CHKERRQ(ierr) >> CALL KSPGetTotalIterations(subksp(2),nusediter_pp,ierr); CHKERRQ(ierr) >> print *, 'nusediter_vv', nusediter_vv >> print *, 'nusediter_pp', nusediter_pp >> >> Running the code shows this surprise: >> >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >> 
Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >> >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 6 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 3 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 >> >> nusediter_vv 37 >> nusediter_pp 37 >> >> So the value of nusediter_pp is indeed 37, but for nusediter_vv >> it should be 66. Any idea what went wrong? >> >> Chris >> >> >> >> dr. ir. Christiaan Klaij | CFD Researcher | Research & Development >> MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl >> >> MARIN news: http://www.marin.nl/web/News/News-items/MARIN-wishes-you-a-challenging-inspiring-2017.htm >> >> ________________________________________ >> From: Barry Smith >> Sent: Saturday, April 11, 2015 12:27 AM >> To: Klaij, Christiaan >> Cc: petsc-users at mcs.anl.gov >> Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 >> >> Chris, >> >> I have added KSPGetTotalIterations() to the branch barry/add-ksp-total-iterations/master and next. After tests it will go into master >> >> Barry >> >>> On Apr 10, 2015, at 8:07 AM, Klaij, Christiaan wrote: >>> >>> Barry, >>> >>> Sure, I can call PCFieldSplitGetSubKSP() to get the fieldsplit_0 >>> ksp and then KSPGetIterationNumber, but what does this number >>> mean? >>> >>> It appears to be the number of iterations of the last time that >>> the subsystem was solved, right? 
If so, this corresponds to the >>> last iteration of the coupled system; how about all the previous >>> iterations? >>> >>> Chris >>> ________________________________________ >>> From: Barry Smith >>> Sent: Friday, April 10, 2015 2:48 PM >>> To: Klaij, Christiaan >>> Cc: petsc-users at mcs.anl.gov >>> Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 >>> >>> Chris, >>> >>> It appears you should call PCFieldSplitGetSubKSP() and then get the information you want out of the individual KSPs. If this doesn't work please let us know. >>> >>> Barry >>> >>>> On Apr 10, 2015, at 6:48 AM, Klaij, Christiaan wrote: >>>> >>>> A question when using PCFieldSplit: for each linear iteration of >>>> the system, how many iterations for fieldsplit 0 and 1? >>>> >>>> One way to find out is to run with -ksp_monitor, >>>> -fieldsplit_0_ksp_monitor and -fieldsplit_1_ksp_monitor. This >>>> gives the complete convergence history. >>>> >>>> Another way, suggested by Matt, is to use -ksp_monitor, >>>> -fieldsplit_0_ksp_converged_reason and >>>> -fieldsplit_1_ksp_converged_reason. This gives only the totals >>>> for fieldsplit 0 and 1 (but without saying for which one). >>>> >>>> Both ways require one to somehow process the output, which is a bit >>>> inconvenient. Could KSPGetResidualHistory perhaps return (some) >>>> information on the subsystems' convergence for processing inside >>>> the code? >>>> >>>> Chris >>>> >>>> >>>> dr. ir. Christiaan Klaij >>>> CFD Researcher >>>> Research & Development >>>> E mailto:C.Klaij at marin.nl >>>> T +31 317 49 33 44 >>>> >>>> >>>> MARIN >>>> 2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands >>>> T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl >>>> >>> >> > -------------- next part -------------- A non-text attachment was scrubbed...
Name: ex70.c
Type: text/x-csrc
Size: 28808 bytes
Desc: ex70.c
URL: 

From lawrence.mitchell at imperial.ac.uk  Wed Jan 18 03:59:22 2017
From: lawrence.mitchell at imperial.ac.uk (Lawrence Mitchell)
Date: Wed, 18 Jan 2017 09:59:22 +0000
Subject: [petsc-users] monitoring the convergence of fieldsplit 0 and 1
In-Reply-To: <1484728820045.70097@marin.nl>
References: <1428666513941.72745@marin.nl> <75D79823-7AE0-47A7-BE9E-15AB81C3581E@mcs.anl.gov> <1428671243078.94@marin.nl> <9EA3A2C1-5372-44A8-B0B7-4ADAF1D89819@mcs.anl.gov> <1484300795996.50804@marin.nl> <1484552853408.13052@marin.nl> <04C3073A-84AB-419F-A143-ACF10D97B2DE@mcs.anl.gov> <1484639126495.83463@marin.nl> <1484728820045.70097@marin.nl>
Message-ID: <1110db5d-ddec-d295-2eba-b15182033b69@imperial.ac.uk>

On 18/01/17 08:40, Klaij, Christiaan wrote:
> Barry,
> 
> I've managed to replicate the problem with 3.7.4
> snes/examples/tutorials/ex70.c. Basically I've added
> KSPGetTotalIterations to main (file is attached):

In the Schur case, PCFieldSplitGetSubKSP returns the KSP obtained from MatSchurComplementGetKSP(pc->schur, &ksp) in subksp[0], and the solver for pc->schur in subksp[1].

In your case, subksp[0] is the (preonly) approximation to A^{-1} *inside* S = D - C A_inner^{-1} B, and subksp[1] is the approximation to S^{-1}.

Since each application of S to a vector (required in computing S^{-1}) needs one application of A^{-1}, and you use 225 iterations in total to invert S, you also use 225 applications of the KSP on A_inner. There doesn't appear to be a way to get the KSP used for A^{-1} if you've asked for different approximations to A^{-1} in the (0,0) block and inside S.
Cheers, Lawrence > $ diff -u ex70.c.bak ex70.c > --- ex70.c.bak2017-01-18 09:25:46.286174830 +0100 > +++ ex70.c2017-01-18 09:03:40.904483434 +0100 > @@ -669,6 +669,10 @@ > KSP ksp; > PetscErrorCode ierr; > > + KSP *subksp; > + PC pc; > + PetscInt numsplit = 1, nusediter_vv, nusediter_pp; > + > ierr = PetscInitialize(&argc, &argv, NULL, help);CHKERRQ(ierr); > s.nx = 4; > s.ny = 6; > @@ -690,6 +694,13 @@ > ierr = StokesSetupPC(&s, ksp);CHKERRQ(ierr); > ierr = KSPSolve(ksp, s.b, s.x);CHKERRQ(ierr); > > + ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr); > + ierr = PCFieldSplitGetSubKSP(pc,&numsplit,&subksp); CHKERRQ(ierr); > + ierr = KSPGetTotalIterations(subksp[0],&nusediter_vv); CHKERRQ(ierr); > + ierr = KSPGetTotalIterations(subksp[1],&nusediter_pp); CHKERRQ(ierr); > + ierr = PetscPrintf(PETSC_COMM_WORLD," total u solves = %i\n", nusediter_vv); CHKERRQ(ierr); > + ierr = PetscPrintf(PETSC_COMM_WORLD," total p solves = %i\n", nusediter_pp); CHKERRQ(ierr); > + > /* don't trust, verify! */ > ierr = StokesCalcResidual(&s);CHKERRQ(ierr); > ierr = StokesCalcError(&s);CHKERRQ(ierr); > > Now run as follows: > > $ mpirun -n 2 ./ex70 -ksp_type fgmres -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_schur_fact_type lower -fieldsplit_0_ksp_type gmres -fieldsplit_0_pc_type bjacobi -fieldsplit_1_pc_type jacobi -fieldsplit_1_inner_ksp_type preonly -fieldsplit_1_inner_pc_type jacobi -fieldsplit_1_upper_ksp_type preonly -fieldsplit_1_upper_pc_type jacobi -fieldsplit_0_ksp_converged_reason -fieldsplit_1_ksp_converged_reason > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 14 > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 14 > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 16 > Linear 
fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 16 > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 17 > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 18 > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 20 > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 21 > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 23 > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22 > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22 > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22 > total u solves = 225 > total p solves = 225 > residual u = 9.67257e-06 > residual p = 5.42082e-07 > residual [u,p] = 9.68775e-06 > discretization error u = 0.0106464 > discretization error p = 1.85907 > discretization error [u,p] = 1.8591 > > So here again the total of 225 is correct for p, but for u it > should be 60. Hope this helps you find the problem. > > Chris > > > > dr. ir. 
Christiaan Klaij | CFD Researcher | Research & Development > MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl > > MARIN news: http://www.marin.nl/web/News/News-items/Few-places-left-for-Offshore-and-Ship-hydrodynamics-courses.htm > > ________________________________________ > From: Klaij, Christiaan > Sent: Tuesday, January 17, 2017 8:45 AM > To: Barry Smith > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 > > Well, that's it, all the rest was hard coded. Here's the relevant part of the code: > > CALL PCSetType(pc_system,PCFIELDSPLIT,ierr); CHKERRQ(ierr) > CALL PCFieldSplitSetType(pc_system,PC_COMPOSITE_SCHUR,ierr); CHKERRQ(ierr) > CALL PCFieldSplitSetIS(pc_system,"0",isgs(1),ierr); CHKERRQ(ierr) > CALL PCFieldSplitSetIS(pc_system,"1",isgs(2),ierr); CHKERRQ(ierr) > CALL PCFieldSplitSetSchurFactType(pc_system,PC_FIELDSPLIT_SCHUR_FACT_FULL,ierr);CHKERRQ(ierr) > CALL PCFieldSplitSetSchurPre(pc_system,PC_FIELDSPLIT_SCHUR_PRE_SELFP,PETSC_NULL_OBJECT,ierr);CHKERRQ(ierr) > > CALL KSPSetTolerances(ksp_system,tol,PETSC_DEFAULT_REAL,PETSC_DEFAULT_REAL,maxiter,ierr); CHKERRQ(ierr) > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_0_ksp_rtol","0.01",ierr); CHKERRQ(ierr) > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_ksp_rtol","0.01",ierr); CHKERRQ(ierr) > > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_0_ksp_pc_side","right",ierr); CHKERRQ(ierr) > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_ksp_pc_side","right",ierr); CHKERRQ(ierr) > > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_0_ksp_type","gmres",ierr); CHKERRQ(ierr) > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_upper_ksp_type","preonly",ierr); CHKERRQ(ierr) > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_upper_pc_type","jacobi",ierr); CHKERRQ(ierr) > > CALL 
PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_inner_ksp_type","preonly",ierr); CHKERRQ(ierr) > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_inner_pc_type","jacobi",ierr); CHKERRQ(ierr) > > ________________________________________ > From: Barry Smith > Sent: Monday, January 16, 2017 9:28 PM > To: Klaij, Christiaan > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 > > Please send all the command line options you use. > > >> On Jan 16, 2017, at 1:47 AM, Klaij, Christiaan wrote: >> >> Barry, >> >> Sure, here's the output with: >> >> -sys_ksp_view -sys_ksp_converged_reason -sys_fieldsplit_0_ksp_converged_reason -sys_fieldsplit_1_ksp_converged_reason >> >> (In my previous email, I rearranged 0 & 1 for easy summing.) >> >> Chris >> >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 6 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 3 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL 
iterations 8 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >> Linear sys_ solve converged due to CONVERGED_RTOL iterations 6 >> KSP Object:(sys_) 1 MPI processes >> type: fgmres >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >> GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=300, initial guess is zero >> tolerances: relative=0.01, absolute=1e-50, divergence=10000. >> right preconditioning >> using UNPRECONDITIONED norm type for convergence test >> PC Object:(sys_) 1 MPI processes >> type: fieldsplit >> FieldSplit with Schur preconditioner, factorization FULL >> Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse >> Split info: >> Split number 0 Defined by IS >> Split number 1 Defined by IS >> KSP solver for A00 block >> KSP Object: (sys_fieldsplit_0_) 1 MPI processes >> type: gmres >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >> GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10000, initial guess is zero >> tolerances: relative=0.01, absolute=1e-50, divergence=10000. >> right preconditioning >> using UNPRECONDITIONED norm type for convergence test >> PC Object: (sys_fieldsplit_0_) 1 MPI processes >> type: ilu >> ILU: out-of-place factorization >> 0 levels of fill >> tolerance for zero pivot 2.22045e-14 >> matrix ordering: natural >> factor fill ratio given 1., needed 1. 
>> Factored matrix follows: >> Mat Object: 1 MPI processes >> type: seqaij >> rows=9600, cols=9600 >> package used to perform factorization: petsc >> total: nonzeros=47280, allocated nonzeros=47280 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> linear system matrix = precond matrix: >> Mat Object: (sys_fieldsplit_0_) 1 MPI processes >> type: seqaij >> rows=9600, cols=9600 >> total: nonzeros=47280, allocated nonzeros=47280 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> KSP solver for upper A00 in upper triangular factor >> KSP Object: (sys_fieldsplit_1_upper_) 1 MPI processes >> type: preonly >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000. >> left preconditioning >> using NONE norm type for convergence test >> PC Object: (sys_fieldsplit_1_upper_) 1 MPI processes >> type: jacobi >> linear system matrix = precond matrix: >> Mat Object: (sys_fieldsplit_0_) 1 MPI processes >> type: seqaij >> rows=9600, cols=9600 >> total: nonzeros=47280, allocated nonzeros=47280 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> KSP solver for S = A11 - A10 inv(A00) A01 >> KSP Object: (sys_fieldsplit_1_) 1 MPI processes >> type: gmres >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >> GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10000, initial guess is zero >> tolerances: relative=0.01, absolute=1e-50, divergence=10000. >> right preconditioning >> using UNPRECONDITIONED norm type for convergence test >> PC Object: (sys_fieldsplit_1_) 1 MPI processes >> type: ilu >> ILU: out-of-place factorization >> 0 levels of fill >> tolerance for zero pivot 2.22045e-14 >> matrix ordering: natural >> factor fill ratio given 1., needed 1. 
>> Factored matrix follows: >> Mat Object: 1 MPI processes >> type: seqaij >> rows=3200, cols=3200 >> package used to perform factorization: petsc >> total: nonzeros=40404, allocated nonzeros=40404 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> linear system matrix followed by preconditioner matrix: >> Mat Object: (sys_fieldsplit_1_) 1 MPI processes >> type: schurcomplement >> rows=3200, cols=3200 >> Schur complement A11 - A10 inv(A00) A01 >> A11 >> Mat Object: (sys_fieldsplit_1_) 1 MPI processes >> type: seqaij >> rows=3200, cols=3200 >> total: nonzeros=40404, allocated nonzeros=40404 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> A10 >> Mat Object: 1 MPI processes >> type: seqaij >> rows=3200, cols=9600 >> total: nonzeros=47280, allocated nonzeros=47280 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> KSP of A00 >> KSP Object: (sys_fieldsplit_1_inner_) 1 MPI processes >> type: preonly >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
>> left preconditioning >> using NONE norm type for convergence test >> PC Object: (sys_fieldsplit_1_inner_) 1 MPI processes >> type: jacobi >> linear system matrix = precond matrix: >> Mat Object: (sys_fieldsplit_0_) 1 MPI processes >> type: seqaij >> rows=9600, cols=9600 >> total: nonzeros=47280, allocated nonzeros=47280 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> A01 >> Mat Object: 1 MPI processes >> type: seqaij >> rows=9600, cols=3200 >> total: nonzeros=47280, allocated nonzeros=47280 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> Mat Object: 1 MPI processes >> type: seqaij >> rows=3200, cols=3200 >> total: nonzeros=40404, allocated nonzeros=40404 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> linear system matrix followed by preconditioner matrix: >> Mat Object: 1 MPI processes >> type: nest >> rows=12800, cols=12800 >> Matrix object: >> type=nest, rows=2, cols=2 >> MatNest structure: >> (0,0) : prefix="mom_", type=seqaij, rows=9600, cols=9600 >> (0,1) : prefix="grad_", type=seqaij, rows=9600, cols=3200 >> (1,0) : prefix="div_", type=seqaij, rows=3200, cols=9600 >> (1,1) : prefix="stab_", type=seqaij, rows=3200, cols=3200 >> Mat Object: 1 MPI processes >> type: nest >> rows=12800, cols=12800 >> Matrix object: >> type=nest, rows=2, cols=2 >> MatNest structure: >> (0,0) : prefix="sys_fieldsplit_0_", type=seqaij, rows=9600, cols=9600 >> (0,1) : type=seqaij, rows=9600, cols=3200 >> (1,0) : type=seqaij, rows=3200, cols=9600 >> (1,1) : prefix="sys_fieldsplit_1_", type=seqaij, rows=3200, cols=3200 >> nusediter_vv 37 >> nusediter_pp 37 >> >> >> >> dr. ir. 
Christiaan Klaij | CFD Researcher | Research & Development >> MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl >> >> MARIN news: http://www.marin.nl/web/News/News-items/The-Ocean-Cleanup-testing-continues.htm >> >> ________________________________________ >> From: Barry Smith >> Sent: Friday, January 13, 2017 7:51 PM >> To: Klaij, Christiaan >> Cc: petsc-users at mcs.anl.gov >> Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 >> >> Yes, I would have expected this to work. Could you send the output from -ksp_view in this case? >> >> >>> On Jan 13, 2017, at 3:46 AM, Klaij, Christiaan wrote: >>> >>> Barry, >>> >>> It's been a while but I'm finally using this function in >>> 3.7.4. Is it supposed to work with fieldsplit? Here's why. >>> >>> I'm solving a Navier-Stokes system with fieldsplit (pc has one >>> velocity solve and one pressure solve) and trying to retrieve the >>> totals like this: >>> >>> CALL KSPSolve(ksp_system,rr_system,xx_system,ierr); CHKERRQ(ierr) >>> CALL PCFieldSplitGetSubKSP(pc_system,numsplit,subksp,ierr); CHKERRQ(ierr) >>> CALL KSPGetTotalIterations(subksp(1),nusediter_vv,ierr); CHKERRQ(ierr) >>> CALL KSPGetTotalIterations(subksp(2),nusediter_pp,ierr); CHKERRQ(ierr) >>> print *, 'nusediter_vv', nusediter_vv >>> print *, 'nusediter_pp', nusediter_pp >>> >>> Running the code shows this surprise: >>> >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >>> Linear sys_fieldsplit_0_ solve converged 
due to CONVERGED_RTOL iterations 7 >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >>> >>> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22 >>> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 6 >>> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 3 >>> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 >>> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 >>> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 >>> >>> nusediter_vv 37 >>> nusediter_pp 37 >>> >>> So the value of nusediter_pp is indeed 37, but for nusediter_vv >>> it should be 66. Any idea what went wrong? >>> >>> Chris >>> >>> >>> >>> dr. ir. Christiaan Klaij | CFD Researcher | Research & Development >>> MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl >>> >>> MARIN news: http://www.marin.nl/web/News/News-items/MARIN-wishes-you-a-challenging-inspiring-2017.htm >>> >>> ________________________________________ >>> From: Barry Smith >>> Sent: Saturday, April 11, 2015 12:27 AM >>> To: Klaij, Christiaan >>> Cc: petsc-users at mcs.anl.gov >>> Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 >>> >>> Chris, >>> >>> I have added KSPGetTotalIterations() to the branch barry/add-ksp-total-iterations/master and next. After tests it will go into master >>> >>> Barry >>> >>>> On Apr 10, 2015, at 8:07 AM, Klaij, Christiaan wrote: >>>> >>>> Barry, >>>> >>>> Sure, I can call PCFieldSplitGetSubKSP() to get the fieldsplit_0 >>>> ksp and then KSPGetIterationNumber, but what does this number >>>> mean? 
>>>> >>>> It appears to be the number of iterations of the last time that >>>> the subsystem was solved, right? If so, this corresponds to the >>>> last iteration of the coupled system; what about all the previous >>>> iterations? >>>> >>>> Chris >>>> ________________________________________ >>>> From: Barry Smith >>>> Sent: Friday, April 10, 2015 2:48 PM >>>> To: Klaij, Christiaan >>>> Cc: petsc-users at mcs.anl.gov >>>> Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 >>>> >>>> Chris, >>>> >>>> It appears you should call PCFieldSplitGetSubKSP() and then get the information you want out of the individual KSPs. If this doesn't work please let us know. >>>> >>>> Barry >>>> >>>>> On Apr 10, 2015, at 6:48 AM, Klaij, Christiaan wrote: >>>>> >>>>> A question when using PCFieldSplit: for each linear iteration of >>>>> the system, how many iterations for fieldsplit 0 and 1? >>>>> >>>>> One way to find out is to run with -ksp_monitor, >>>>> -fieldsplit_0_ksp_monitor and -fieldsplit_1_ksp_monitor. This >>>>> gives the complete convergence history. >>>>> >>>>> Another way, suggested by Matt, is to use -ksp_monitor, >>>>> -fieldsplit_0_ksp_converged_reason and >>>>> -fieldsplit_1_ksp_converged_reason. This gives only the totals >>>>> for fieldsplit 0 and 1 (but without saying for which one). >>>>> >>>>> Both ways require one to somehow process the output, which is a bit >>>>> inconvenient. Could KSPGetResidualHistory perhaps return (some) >>>>> information on the subsystems' convergence for processing inside >>>>> the code? >>>>> >>>>> Chris >>>>> >>>>> >>>>> dr. ir. Christiaan Klaij >>>>> CFD Researcher >>>>> Research & Development >>>>> E mailto:C.Klaij at marin.nl >>>>> T +31 317 49 33 44 >>>>> >>>>> >>>>> MARIN >>>>> 2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands >>>>> T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl >>>>> >>>> >>> >> > -------------- next part -------------- A non-text attachment was scrubbed... 
Name: signature.asc Type: application/pgp-signature Size: 473 bytes Desc: OpenPGP digital signature URL: From C.Klaij at marin.nl Wed Jan 18 04:42:12 2017 From: C.Klaij at marin.nl (Klaij, Christiaan) Date: Wed, 18 Jan 2017 10:42:12 +0000 Subject: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 In-Reply-To: <1110db5d-ddec-d295-2eba-b15182033b69@imperial.ac.uk> References: <1428666513941.72745@marin.nl> <75D79823-7AE0-47A7-BE9E-15AB81C3581E@mcs.anl.gov> <1428671243078.94@marin.nl> <9EA3A2C1-5372-44A8-B0B7-4ADAF1D89819@mcs.anl.gov> <1484300795996.50804@marin.nl> <1484552853408.13052@marin.nl> <04C3073A-84AB-419F-A143-ACF10D97B2DE@mcs.anl.gov> <1484639126495.83463@marin.nl> <1484728820045.70097@marin.nl>, <1110db5d-ddec-d295-2eba-b15182033b69@imperial.ac.uk> Message-ID: <1484736132149.87779@marin.nl> Thanks Lawrence, that nicely explains the unexpected behaviour! I guess in general there ought to be getters for the four ksp(A00)'s that occur in the full factorization. Chris dr. ir. Christiaan Klaij | CFD Researcher | Research & Development MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl MARIN news: http://www.marin.nl/web/News/News-items/Verification-and-validation-exercises-for-flow-around-KVLCC2-tanker.htm ________________________________________ From: Lawrence Mitchell Sent: Wednesday, January 18, 2017 10:59 AM To: petsc-users at mcs.anl.gov Cc: bsmith at mcs.anl.gov; Klaij, Christiaan Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 On 18/01/17 08:40, Klaij, Christiaan wrote: > Barry, > > I've managed to replicate the problem with 3.7.4 > snes/examples/tutorials/ex70.c. 
Basically I've added > KSPGetTotalIterations to main (file is attached): PCFieldSplitGetSubKSP returns, in the Schur case: MatSchurComplementGet(pc->schur, &ksp); in subksp[0] and pc->schur in subksp[1] In your case, subksp[0] is the (preonly) approximation to A^{-1} *inside* S = D - C A_inner^{-1} B And subksp[1] is the approximation to S^{-1}. Since each application of S to a vector (required in S^{-1}) requires one application of A^{-1}, because you use 225 iterations in total to invert S, you also use 225 applications of the KSP on A_inner. There doesn't appear to be a way to get the KSP used for A^{-1} if you've asked for different approximations to A^{-1} in the 0,0 block and inside S. Cheers, Lawrence > $ diff -u ex70.c.bak ex70.c > --- ex70.c.bak2017-01-18 09:25:46.286174830 +0100 > +++ ex70.c2017-01-18 09:03:40.904483434 +0100 > @@ -669,6 +669,10 @@ > KSP ksp; > PetscErrorCode ierr; > > + KSP *subksp; > + PC pc; > + PetscInt numsplit = 1, nusediter_vv, nusediter_pp; > + > ierr = PetscInitialize(&argc, &argv, NULL, help);CHKERRQ(ierr); > s.nx = 4; > s.ny = 6; > @@ -690,6 +694,13 @@ > ierr = StokesSetupPC(&s, ksp);CHKERRQ(ierr); > ierr = KSPSolve(ksp, s.b, s.x);CHKERRQ(ierr); > > + ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr); > + ierr = PCFieldSplitGetSubKSP(pc,&numsplit,&subksp); CHKERRQ(ierr); > + ierr = KSPGetTotalIterations(subksp[0],&nusediter_vv); CHKERRQ(ierr); > + ierr = KSPGetTotalIterations(subksp[1],&nusediter_pp); CHKERRQ(ierr); > + ierr = PetscPrintf(PETSC_COMM_WORLD," total u solves = %i\n", nusediter_vv); CHKERRQ(ierr); > + ierr = PetscPrintf(PETSC_COMM_WORLD," total p solves = %i\n", nusediter_pp); CHKERRQ(ierr); > + > /* don't trust, verify! 
*/ > ierr = StokesCalcResidual(&s);CHKERRQ(ierr); > ierr = StokesCalcError(&s);CHKERRQ(ierr); > > Now run as follows: > > $ mpirun -n 2 ./ex70 -ksp_type fgmres -pc_type fieldsplit -pc_fieldsplit_type schur -pc_fieldsplit_schur_fact_type lower -fieldsplit_0_ksp_type gmres -fieldsplit_0_pc_type bjacobi -fieldsplit_1_pc_type jacobi -fieldsplit_1_inner_ksp_type preonly -fieldsplit_1_inner_pc_type jacobi -fieldsplit_1_upper_ksp_type preonly -fieldsplit_1_upper_pc_type jacobi -fieldsplit_0_ksp_converged_reason -fieldsplit_1_ksp_converged_reason > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 14 > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 14 > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 16 > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 16 > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 17 > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 18 > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 20 > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 21 > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 23 > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged 
due to CONVERGED_RTOL iterations 22 > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22 > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22 > total u solves = 225 > total p solves = 225 > residual u = 9.67257e-06 > residual p = 5.42082e-07 > residual [u,p] = 9.68775e-06 > discretization error u = 0.0106464 > discretization error p = 1.85907 > discretization error [u,p] = 1.8591 > > So here again the total of 225 is correct for p, but for u it > should be 60. Hope this helps you find the problem. > > Chris > > > > dr. ir. Christiaan Klaij | CFD Researcher | Research & Development > MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl > > MARIN news: http://www.marin.nl/web/News/News-items/Few-places-left-for-Offshore-and-Ship-hydrodynamics-courses.htm > > ________________________________________ > From: Klaij, Christiaan > Sent: Tuesday, January 17, 2017 8:45 AM > To: Barry Smith > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 > > Well, that's it, all the rest was hard coded. 
Here's the relevant part of the code: > > CALL PCSetType(pc_system,PCFIELDSPLIT,ierr); CHKERRQ(ierr) > CALL PCFieldSplitSetType(pc_system,PC_COMPOSITE_SCHUR,ierr); CHKERRQ(ierr) > CALL PCFieldSplitSetIS(pc_system,"0",isgs(1),ierr); CHKERRQ(ierr) > CALL PCFieldSplitSetIS(pc_system,"1",isgs(2),ierr); CHKERRQ(ierr) > CALL PCFieldSplitSetSchurFactType(pc_system,PC_FIELDSPLIT_SCHUR_FACT_FULL,ierr);CHKERRQ(ierr) > CALL PCFieldSplitSetSchurPre(pc_system,PC_FIELDSPLIT_SCHUR_PRE_SELFP,PETSC_NULL_OBJECT,ierr);CHKERRQ(ierr) > > CALL KSPSetTolerances(ksp_system,tol,PETSC_DEFAULT_REAL,PETSC_DEFAULT_REAL,maxiter,ierr); CHKERRQ(ierr) > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_0_ksp_rtol","0.01",ierr); CHKERRQ(ierr) > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_ksp_rtol","0.01",ierr); CHKERRQ(ierr) > > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_0_ksp_pc_side","right",ierr); CHKERRQ(ierr) > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_ksp_pc_side","right",ierr); CHKERRQ(ierr) > > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_0_ksp_type","gmres",ierr); CHKERRQ(ierr) > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_upper_ksp_type","preonly",ierr); CHKERRQ(ierr) > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_upper_pc_type","jacobi",ierr); CHKERRQ(ierr) > > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_inner_ksp_type","preonly",ierr); CHKERRQ(ierr) > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_inner_pc_type","jacobi",ierr); CHKERRQ(ierr) > > ________________________________________ > From: Barry Smith > Sent: Monday, January 16, 2017 9:28 PM > To: Klaij, Christiaan > Cc: petsc-users at mcs.anl.gov > Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 > > Please send all the command line options you use. 
> > >> On Jan 16, 2017, at 1:47 AM, Klaij, Christiaan wrote: >> >> Barry, >> >> Sure, here's the output with: >> >> -sys_ksp_view -sys_ksp_converged_reason -sys_fieldsplit_0_ksp_converged_reason -sys_fieldsplit_1_ksp_converged_reason >> >> (In my previous email, I rearranged 0 & 1 for easy summing.) >> >> Chris >> >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 6 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 3 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >> Linear sys_ solve converged due to CONVERGED_RTOL iterations 6 >> KSP Object:(sys_) 1 MPI processes >> type: fgmres >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >> GMRES: happy breakdown 
tolerance 1e-30 >> maximum iterations=300, initial guess is zero >> tolerances: relative=0.01, absolute=1e-50, divergence=10000. >> right preconditioning >> using UNPRECONDITIONED norm type for convergence test >> PC Object:(sys_) 1 MPI processes >> type: fieldsplit >> FieldSplit with Schur preconditioner, factorization FULL >> Preconditioner for the Schur complement formed from Sp, an assembled approximation to S, which uses (lumped, if requested) A00's diagonal's inverse >> Split info: >> Split number 0 Defined by IS >> Split number 1 Defined by IS >> KSP solver for A00 block >> KSP Object: (sys_fieldsplit_0_) 1 MPI processes >> type: gmres >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >> GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10000, initial guess is zero >> tolerances: relative=0.01, absolute=1e-50, divergence=10000. >> right preconditioning >> using UNPRECONDITIONED norm type for convergence test >> PC Object: (sys_fieldsplit_0_) 1 MPI processes >> type: ilu >> ILU: out-of-place factorization >> 0 levels of fill >> tolerance for zero pivot 2.22045e-14 >> matrix ordering: natural >> factor fill ratio given 1., needed 1. 
>> Factored matrix follows: >> Mat Object: 1 MPI processes >> type: seqaij >> rows=9600, cols=9600 >> package used to perform factorization: petsc >> total: nonzeros=47280, allocated nonzeros=47280 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> linear system matrix = precond matrix: >> Mat Object: (sys_fieldsplit_0_) 1 MPI processes >> type: seqaij >> rows=9600, cols=9600 >> total: nonzeros=47280, allocated nonzeros=47280 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> KSP solver for upper A00 in upper triangular factor >> KSP Object: (sys_fieldsplit_1_upper_) 1 MPI processes >> type: preonly >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000. >> left preconditioning >> using NONE norm type for convergence test >> PC Object: (sys_fieldsplit_1_upper_) 1 MPI processes >> type: jacobi >> linear system matrix = precond matrix: >> Mat Object: (sys_fieldsplit_0_) 1 MPI processes >> type: seqaij >> rows=9600, cols=9600 >> total: nonzeros=47280, allocated nonzeros=47280 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> KSP solver for S = A11 - A10 inv(A00) A01 >> KSP Object: (sys_fieldsplit_1_) 1 MPI processes >> type: gmres >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt Orthogonalization with no iterative refinement >> GMRES: happy breakdown tolerance 1e-30 >> maximum iterations=10000, initial guess is zero >> tolerances: relative=0.01, absolute=1e-50, divergence=10000. >> right preconditioning >> using UNPRECONDITIONED norm type for convergence test >> PC Object: (sys_fieldsplit_1_) 1 MPI processes >> type: ilu >> ILU: out-of-place factorization >> 0 levels of fill >> tolerance for zero pivot 2.22045e-14 >> matrix ordering: natural >> factor fill ratio given 1., needed 1. 
>> Factored matrix follows: >> Mat Object: 1 MPI processes >> type: seqaij >> rows=3200, cols=3200 >> package used to perform factorization: petsc >> total: nonzeros=40404, allocated nonzeros=40404 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> linear system matrix followed by preconditioner matrix: >> Mat Object: (sys_fieldsplit_1_) 1 MPI processes >> type: schurcomplement >> rows=3200, cols=3200 >> Schur complement A11 - A10 inv(A00) A01 >> A11 >> Mat Object: (sys_fieldsplit_1_) 1 MPI processes >> type: seqaij >> rows=3200, cols=3200 >> total: nonzeros=40404, allocated nonzeros=40404 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> A10 >> Mat Object: 1 MPI processes >> type: seqaij >> rows=3200, cols=9600 >> total: nonzeros=47280, allocated nonzeros=47280 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> KSP of A00 >> KSP Object: (sys_fieldsplit_1_inner_) 1 MPI processes >> type: preonly >> maximum iterations=10000, initial guess is zero >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
>> left preconditioning >> using NONE norm type for convergence test >> PC Object: (sys_fieldsplit_1_inner_) 1 MPI processes >> type: jacobi >> linear system matrix = precond matrix: >> Mat Object: (sys_fieldsplit_0_) 1 MPI processes >> type: seqaij >> rows=9600, cols=9600 >> total: nonzeros=47280, allocated nonzeros=47280 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> A01 >> Mat Object: 1 MPI processes >> type: seqaij >> rows=9600, cols=3200 >> total: nonzeros=47280, allocated nonzeros=47280 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> Mat Object: 1 MPI processes >> type: seqaij >> rows=3200, cols=3200 >> total: nonzeros=40404, allocated nonzeros=40404 >> total number of mallocs used during MatSetValues calls =0 >> not using I-node routines >> linear system matrix followed by preconditioner matrix: >> Mat Object: 1 MPI processes >> type: nest >> rows=12800, cols=12800 >> Matrix object: >> type=nest, rows=2, cols=2 >> MatNest structure: >> (0,0) : prefix="mom_", type=seqaij, rows=9600, cols=9600 >> (0,1) : prefix="grad_", type=seqaij, rows=9600, cols=3200 >> (1,0) : prefix="div_", type=seqaij, rows=3200, cols=9600 >> (1,1) : prefix="stab_", type=seqaij, rows=3200, cols=3200 >> Mat Object: 1 MPI processes >> type: nest >> rows=12800, cols=12800 >> Matrix object: >> type=nest, rows=2, cols=2 >> MatNest structure: >> (0,0) : prefix="sys_fieldsplit_0_", type=seqaij, rows=9600, cols=9600 >> (0,1) : type=seqaij, rows=9600, cols=3200 >> (1,0) : type=seqaij, rows=3200, cols=9600 >> (1,1) : prefix="sys_fieldsplit_1_", type=seqaij, rows=3200, cols=3200 >> nusediter_vv 37 >> nusediter_pp 37 >> >> >> >> dr. ir. 
Christiaan Klaij | CFD Researcher | Research & Development >> MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl >> >> MARIN news: http://www.marin.nl/web/News/News-items/The-Ocean-Cleanup-testing-continues.htm >> >> ________________________________________ >> From: Barry Smith >> Sent: Friday, January 13, 2017 7:51 PM >> To: Klaij, Christiaan >> Cc: petsc-users at mcs.anl.gov >> Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 >> >> Yes, I would have expected this to work. Could you send the output from -ksp_view in this case? >> >> >>> On Jan 13, 2017, at 3:46 AM, Klaij, Christiaan wrote: >>> >>> Barry, >>> >>> It's been a while but I'm finally using this function in >>> 3.7.4. Is it supposed to work with fieldsplit? Here's why. >>> >>> I'm solving a Navier-Stokes system with fieldsplit (pc has one >>> velocity solve and one pressure solve) and trying to retrieve the >>> totals like this: >>> >>> CALL KSPSolve(ksp_system,rr_system,xx_system,ierr); CHKERRQ(ierr) >>> CALL PCFieldSplitGetSubKSP(pc_system,numsplit,subksp,ierr); CHKERRQ(ierr) >>> CALL KSPGetTotalIterations(subksp(1),nusediter_vv,ierr); CHKERRQ(ierr) >>> CALL KSPGetTotalIterations(subksp(2),nusediter_pp,ierr); CHKERRQ(ierr) >>> print *, 'nusediter_vv', nusediter_vv >>> print *, 'nusediter_pp', nusediter_pp >>> >>> Running the code shows this surprise: >>> >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 1 >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 2 >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 7 >>> Linear sys_fieldsplit_0_ solve converged 
due to CONVERGED_RTOL iterations 7 >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 8 >>> >>> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 22 >>> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 6 >>> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 3 >>> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 >>> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 >>> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations 2 >>> >>> nusediter_vv 37 >>> nusediter_pp 37 >>> >>> So the value of nusediter_pp is indeed 37, but for nusediter_vv >>> it should be 66. Any idea what went wrong? >>> >>> Chris >>> >>> >>> >>> dr. ir. Christiaan Klaij | CFD Researcher | Research & Development >>> MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl >>> >>> MARIN news: http://www.marin.nl/web/News/News-items/MARIN-wishes-you-a-challenging-inspiring-2017.htm >>> >>> ________________________________________ >>> From: Barry Smith >>> Sent: Saturday, April 11, 2015 12:27 AM >>> To: Klaij, Christiaan >>> Cc: petsc-users at mcs.anl.gov >>> Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 >>> >>> Chris, >>> >>> I have added KSPGetTotalIterations() to the branch barry/add-ksp-total-iterations/master and next. After tests it will go into master >>> >>> Barry >>> >>>> On Apr 10, 2015, at 8:07 AM, Klaij, Christiaan wrote: >>>> >>>> Barry, >>>> >>>> Sure, I can call PCFieldSplitGetSubKSP() to get the fieldsplit_0 >>>> ksp and then KSPGetIterationNumber, but what does this number >>>> mean? 
>>>> >>>> It appears to be the number of iterations of the last time that >>>> the subsystem was solved, right? If so, this corresponds to the >>>> last iteration of the coupled system; what about all the previous >>>> iterations? >>>> >>>> Chris >>>> ________________________________________ >>>> From: Barry Smith >>>> Sent: Friday, April 10, 2015 2:48 PM >>>> To: Klaij, Christiaan >>>> Cc: petsc-users at mcs.anl.gov >>>> Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 >>>> >>>> Chris, >>>> >>>> It appears you should call PCFieldSplitGetSubKSP() and then get the information you want out of the individual KSPs. If this doesn't work please let us know. >>>> >>>> Barry >>>> >>>>> On Apr 10, 2015, at 6:48 AM, Klaij, Christiaan wrote: >>>>> >>>>> A question when using PCFieldSplit: for each linear iteration of >>>>> the system, how many iterations for fieldsplit 0 and 1? >>>>> >>>>> One way to find out is to run with -ksp_monitor, >>>>> -fieldsplit_0_ksp_monitor and -fieldsplit_1_ksp_monitor. This >>>>> gives the complete convergence history. >>>>> >>>>> Another way, suggested by Matt, is to use -ksp_monitor, >>>>> -fieldsplit_0_ksp_converged_reason and >>>>> -fieldsplit_1_ksp_converged_reason. This gives only the totals >>>>> for fieldsplit 0 and 1 (but without saying for which one). >>>>> >>>>> Both ways require one to somehow process the output, which is a bit >>>>> inconvenient. Could KSPGetResidualHistory perhaps return (some) >>>>> information on the subsystems' convergence for processing inside >>>>> the code? >>>>> >>>>> Chris >>>>> >>>>> >>>>> dr. ir. Christiaan Klaij >>>>> CFD Researcher >>>>> Research & Development >>>>> E mailto:C.Klaij at marin.nl >>>>> T +31 317 49 33 44 >>>>> >>>>> >>>>> MARIN >>>>> 2, Haagsteeg, P.O. 
Box 28, 6700 AA Wageningen, The Netherlands >>>>> T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl >>>>> >>>> >>> >> > From knepley at gmail.com Wed Jan 18 09:13:52 2017 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 18 Jan 2017 09:13:52 -0600 Subject: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 In-Reply-To: <1484736132149.87779@marin.nl> References: <1428666513941.72745@marin.nl> <75D79823-7AE0-47A7-BE9E-15AB81C3581E@mcs.anl.gov> <1428671243078.94@marin.nl> <9EA3A2C1-5372-44A8-B0B7-4ADAF1D89819@mcs.anl.gov> <1484300795996.50804@marin.nl> <1484552853408.13052@marin.nl> <04C3073A-84AB-419F-A143-ACF10D97B2DE@mcs.anl.gov> <1484639126495.83463@marin.nl> <1484728820045.70097@marin.nl> <1110db5d-ddec-d295-2eba-b15182033b69@imperial.ac.uk> <1484736132149.87779@marin.nl> Message-ID: On Wed, Jan 18, 2017 at 4:42 AM, Klaij, Christiaan wrote: > Thanks Lawrence, that nicely explains the unexpected behaviour! > > I guess in general there ought to be getters for the four > ksp(A00)'s that occur in the full factorization. Yes, we will fix it. I think that the default retrieval should get the 00 block, not the inner as well. Matt > > Chris > > > dr. ir. Christiaan Klaij | CFD Researcher | Research & Development > MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | http://www.marin.nl > > MARIN news: http://www.marin.nl/web/News/News-items/Verification-and- > validation-exercises-for-flow-around-KVLCC2-tanker.htm > > ________________________________________ > From: Lawrence Mitchell > Sent: Wednesday, January 18, 2017 10:59 AM > To: petsc-users at mcs.anl.gov > Cc: bsmith at mcs.anl.gov; Klaij, Christiaan > Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 and 1 > > On 18/01/17 08:40, Klaij, Christiaan wrote: > > Barry, > > > > I've managed to replicate the problem with 3.7.4 > > snes/examples/tutorials/ex70.c. 
Basically I've added > > KSPGetTotalIterations to main (file is attached): > > PCFieldSplitGetSubKSP returns, in the Schur case: > > MatSchurComplementGet(pc->schur, &ksp); > > in subksp[0] > > and > > pc->schur in subksp[1] > > In your case, subksp[0] is the (preonly) approximation to A^{-1} *inside* > > S = D - C A_inner^{-1} B > > And subksp[1] is the approximation to S^{-1}. > > Since each application of S to a vector (required in S^{-1}) requires > one application of A^{-1}, because you use 225 iterations in total to > invert S, you also use 225 applications of the KSP on A_inner. > > There doesn't appear to be a way to get the KSP used for A^{-1} if > you've asked for different approximations to A^{-1} in the 0,0 block > and inside S. > > Cheers, > > Lawrence > > > $ diff -u ex70.c.bak ex70.c > > --- ex70.c.bak2017-01-18 09:25:46.286174830 +0100 > > +++ ex70.c2017-01-18 09:03:40.904483434 +0100 > > @@ -669,6 +669,10 @@ > > KSP ksp; > > PetscErrorCode ierr; > > > > + KSP *subksp; > > + PC pc; > > + PetscInt numsplit = 1, nusediter_vv, nusediter_pp; > > + > > ierr = PetscInitialize(&argc, &argv, NULL, help);CHKERRQ(ierr); > > s.nx = 4; > > s.ny = 6; > > @@ -690,6 +694,13 @@ > > ierr = StokesSetupPC(&s, ksp);CHKERRQ(ierr); > > ierr = KSPSolve(ksp, s.b, s.x);CHKERRQ(ierr); > > > > + ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr); > > + ierr = PCFieldSplitGetSubKSP(pc,&numsplit,&subksp); CHKERRQ(ierr); > > + ierr = KSPGetTotalIterations(subksp[0],&nusediter_vv); CHKERRQ(ierr); > > + ierr = KSPGetTotalIterations(subksp[1],&nusediter_pp); CHKERRQ(ierr); > > + ierr = PetscPrintf(PETSC_COMM_WORLD," total u solves = %i\n", > nusediter_vv); CHKERRQ(ierr); > > + ierr = PetscPrintf(PETSC_COMM_WORLD," total p solves = %i\n", > nusediter_pp); CHKERRQ(ierr); > > + > > /* don't trust, verify! 
*/ > > ierr = StokesCalcResidual(&s);CHKERRQ(ierr); > > ierr = StokesCalcError(&s);CHKERRQ(ierr); > > > > Now run as follows: > > > > $ mpirun -n 2 ./ex70 -ksp_type fgmres -pc_type fieldsplit > -pc_fieldsplit_type schur -pc_fieldsplit_schur_fact_type lower > -fieldsplit_0_ksp_type gmres -fieldsplit_0_pc_type bjacobi > -fieldsplit_1_pc_type jacobi -fieldsplit_1_inner_ksp_type preonly > -fieldsplit_1_inner_pc_type jacobi -fieldsplit_1_upper_ksp_type preonly > -fieldsplit_1_upper_pc_type jacobi -fieldsplit_0_ksp_converged_reason > -fieldsplit_1_ksp_converged_reason > > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations > 14 > > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations > 14 > > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations > 16 > > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations > 16 > > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations > 17 > > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations > 18 > > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations > 20 > > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations > 21 > > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations > 23 > > Linear fieldsplit_0_ solve 
converged due to CONVERGED_RTOL iterations 5 > > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations > 22 > > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations > 22 > > Linear fieldsplit_0_ solve converged due to CONVERGED_RTOL iterations 5 > > Linear fieldsplit_1_ solve converged due to CONVERGED_RTOL iterations > 22 > > total u solves = 225 > > total p solves = 225 > > residual u = 9.67257e-06 > > residual p = 5.42082e-07 > > residual [u,p] = 9.68775e-06 > > discretization error u = 0.0106464 > > discretization error p = 1.85907 > > discretization error [u,p] = 1.8591 > > > > So here again the total of 225 is correct for p, but for u it > > should be 60. Hope this helps you find the problem. > > > > Chris > > > > > > > > dr. ir. Christiaan Klaij | CFD Researcher | Research & Development > > MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | > http://www.marin.nl > > > > MARIN news: http://www.marin.nl/web/News/News-items/Few-places-left- > for-Offshore-and-Ship-hydrodynamics-courses.htm > > > > ________________________________________ > > From: Klaij, Christiaan > > Sent: Tuesday, January 17, 2017 8:45 AM > > To: Barry Smith > > Cc: petsc-users at mcs.anl.gov > > Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 > and 1 > > > > Well, that's it, all the rest was hard coded. 
Here's the relevant part > of the code: > > > > CALL PCSetType(pc_system,PCFIELDSPLIT,ierr); CHKERRQ(ierr) > > CALL PCFieldSplitSetType(pc_system,PC_COMPOSITE_SCHUR,ierr); > CHKERRQ(ierr) > > CALL PCFieldSplitSetIS(pc_system,"0",isgs(1),ierr); CHKERRQ(ierr) > > CALL PCFieldSplitSetIS(pc_system,"1",isgs(2),ierr); CHKERRQ(ierr) > > CALL PCFieldSplitSetSchurFactType(pc_system,PC_FIELDSPLIT_SCHUR_ > FACT_FULL,ierr);CHKERRQ(ierr) > > CALL PCFieldSplitSetSchurPre(pc_system,PC_FIELDSPLIT_SCHUR_ > PRE_SELFP,PETSC_NULL_OBJECT,ierr);CHKERRQ(ierr) > > > > CALL KSPSetTolerances(ksp_system,tol,PETSC_DEFAULT_REAL,PETSC_DEFAULT_REAL,maxiter,ierr); > CHKERRQ(ierr) > > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_0_ksp_rtol","0.01",ierr); > CHKERRQ(ierr) > > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_ksp_rtol","0.01",ierr); > CHKERRQ(ierr) > > > > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_0_ksp_pc_side","right",ierr); > CHKERRQ(ierr) > > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_1_ksp_pc_side","right",ierr); > CHKERRQ(ierr) > > > > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_0_ksp_type","gmres",ierr); > CHKERRQ(ierr) > > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_ > 1_upper_ksp_type","preonly",ierr); CHKERRQ(ierr) > > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_ > 1_upper_pc_type","jacobi",ierr); CHKERRQ(ierr) > > > > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_ > 1_inner_ksp_type","preonly",ierr); CHKERRQ(ierr) > > CALL PetscOptionsSetValue(PETSC_NULL_OBJECT,"-sys_fieldsplit_ > 1_inner_pc_type","jacobi",ierr); CHKERRQ(ierr) > > > > ________________________________________ > > From: Barry Smith > > Sent: Monday, January 16, 2017 9:28 PM > > To: Klaij, Christiaan > > Cc: petsc-users at mcs.anl.gov > > Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 > and 1 > > > > Please send all the command line options you use. 
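For reference, the hard-coded settings in the Fortran block above roughly correspond to the following command-line options. This is a hypothetical invocation: "./solver" is a placeholder executable name, <tol> and <maxiter> stand for the unspecified values passed to KSPSetTolerances, and the "sys_" prefix is assumed from the option names used in the thread. The two PCFieldSplitSetIS calls have no command-line counterpart, so the split definition itself must stay in code.

```shell
# Hypothetical command-line equivalent of the hard-coded calls above
# (placeholder executable; <tol> and <maxiter> were not given in the email):
mpirun -n 2 ./solver \
  -sys_ksp_rtol <tol> -sys_ksp_max_it <maxiter> \
  -sys_pc_type fieldsplit \
  -sys_pc_fieldsplit_type schur \
  -sys_pc_fieldsplit_schur_fact_type full \
  -sys_pc_fieldsplit_schur_precondition selfp \
  -sys_fieldsplit_0_ksp_type gmres \
  -sys_fieldsplit_0_ksp_rtol 0.01 \
  -sys_fieldsplit_0_ksp_pc_side right \
  -sys_fieldsplit_1_ksp_rtol 0.01 \
  -sys_fieldsplit_1_ksp_pc_side right \
  -sys_fieldsplit_1_upper_ksp_type preonly \
  -sys_fieldsplit_1_upper_pc_type jacobi \
  -sys_fieldsplit_1_inner_ksp_type preonly \
  -sys_fieldsplit_1_inner_pc_type jacobi
```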
> > > > > >> On Jan 16, 2017, at 1:47 AM, Klaij, Christiaan > wrote: > >> > >> Barry, > >> > >> Sure, here's the output with: > >> > >> -sys_ksp_view -sys_ksp_converged_reason -sys_fieldsplit_0_ksp_converged_reason > -sys_fieldsplit_1_ksp_converged_reason > >> > >> (In my previous email, I rearranged 0 & 1 for easy summing.) > >> > >> Chris > >> > >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 1 > >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL > iterations 22 > >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 1 > >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 2 > >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL > iterations 6 > >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 2 > >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 7 > >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL > iterations 3 > >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 7 > >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 7 > >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL > iterations 2 > >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 7 > >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 8 > >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL > iterations 2 > >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 8 > >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 8 > >> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL > iterations 2 > >> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 8 > >> Linear sys_ solve converged due to CONVERGED_RTOL iterations 6 > >> KSP Object:(sys_) 1 MPI processes > >> type: fgmres > >> GMRES: restart=30, using 
Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > >> GMRES: happy breakdown tolerance 1e-30 > >> maximum iterations=300, initial guess is zero > >> tolerances: relative=0.01, absolute=1e-50, divergence=10000. > >> right preconditioning > >> using UNPRECONDITIONED norm type for convergence test > >> PC Object:(sys_) 1 MPI processes > >> type: fieldsplit > >> FieldSplit with Schur preconditioner, factorization FULL > >> Preconditioner for the Schur complement formed from Sp, an assembled > approximation to S, which uses (lumped, if requested) A00's diagonal's > inverse > >> Split info: > >> Split number 0 Defined by IS > >> Split number 1 Defined by IS > >> KSP solver for A00 block > >> KSP Object: (sys_fieldsplit_0_) 1 MPI processes > >> type: gmres > >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > >> GMRES: happy breakdown tolerance 1e-30 > >> maximum iterations=10000, initial guess is zero > >> tolerances: relative=0.01, absolute=1e-50, divergence=10000. > >> right preconditioning > >> using UNPRECONDITIONED norm type for convergence test > >> PC Object: (sys_fieldsplit_0_) 1 MPI processes > >> type: ilu > >> ILU: out-of-place factorization > >> 0 levels of fill > >> tolerance for zero pivot 2.22045e-14 > >> matrix ordering: natural > >> factor fill ratio given 1., needed 1. 
> >> Factored matrix follows: > >> Mat Object: 1 MPI processes > >> type: seqaij > >> rows=9600, cols=9600 > >> package used to perform factorization: petsc > >> total: nonzeros=47280, allocated nonzeros=47280 > >> total number of mallocs used during MatSetValues calls =0 > >> not using I-node routines > >> linear system matrix = precond matrix: > >> Mat Object: (sys_fieldsplit_0_) 1 MPI processes > >> type: seqaij > >> rows=9600, cols=9600 > >> total: nonzeros=47280, allocated nonzeros=47280 > >> total number of mallocs used during MatSetValues calls =0 > >> not using I-node routines > >> KSP solver for upper A00 in upper triangular factor > >> KSP Object: (sys_fieldsplit_1_upper_) 1 MPI processes > >> type: preonly > >> maximum iterations=10000, initial guess is zero > >> tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > >> left preconditioning > >> using NONE norm type for convergence test > >> PC Object: (sys_fieldsplit_1_upper_) 1 MPI processes > >> type: jacobi > >> linear system matrix = precond matrix: > >> Mat Object: (sys_fieldsplit_0_) 1 MPI processes > >> type: seqaij > >> rows=9600, cols=9600 > >> total: nonzeros=47280, allocated nonzeros=47280 > >> total number of mallocs used during MatSetValues calls =0 > >> not using I-node routines > >> KSP solver for S = A11 - A10 inv(A00) A01 > >> KSP Object: (sys_fieldsplit_1_) 1 MPI processes > >> type: gmres > >> GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > >> GMRES: happy breakdown tolerance 1e-30 > >> maximum iterations=10000, initial guess is zero > >> tolerances: relative=0.01, absolute=1e-50, divergence=10000. 
> >> right preconditioning > >> using UNPRECONDITIONED norm type for convergence test > >> PC Object: (sys_fieldsplit_1_) 1 MPI processes > >> type: ilu > >> ILU: out-of-place factorization > >> 0 levels of fill > >> tolerance for zero pivot 2.22045e-14 > >> matrix ordering: natural > >> factor fill ratio given 1., needed 1. > >> Factored matrix follows: > >> Mat Object: 1 MPI processes > >> type: seqaij > >> rows=3200, cols=3200 > >> package used to perform factorization: petsc > >> total: nonzeros=40404, allocated nonzeros=40404 > >> total number of mallocs used during MatSetValues calls =0 > >> not using I-node routines > >> linear system matrix followed by preconditioner matrix: > >> Mat Object: (sys_fieldsplit_1_) 1 MPI processes > >> type: schurcomplement > >> rows=3200, cols=3200 > >> Schur complement A11 - A10 inv(A00) A01 > >> A11 > >> Mat Object: (sys_fieldsplit_1_) > 1 MPI processes > >> type: seqaij > >> rows=3200, cols=3200 > >> total: nonzeros=40404, allocated nonzeros=40404 > >> total number of mallocs used during MatSetValues calls =0 > >> not using I-node routines > >> A10 > >> Mat Object: 1 MPI processes > >> type: seqaij > >> rows=3200, cols=9600 > >> total: nonzeros=47280, allocated nonzeros=47280 > >> total number of mallocs used during MatSetValues calls =0 > >> not using I-node routines > >> KSP of A00 > >> KSP Object: (sys_fieldsplit_1_inner_) > 1 MPI processes > >> type: preonly > >> maximum iterations=10000, initial guess is zero > >> tolerances: relative=1e-05, absolute=1e-50, > divergence=10000. 
> >> left preconditioning > >> using NONE norm type for convergence test > >> PC Object: (sys_fieldsplit_1_inner_) > 1 MPI processes > >> type: jacobi > >> linear system matrix = precond matrix: > >> Mat Object: (sys_fieldsplit_0_) > 1 MPI processes > >> type: seqaij > >> rows=9600, cols=9600 > >> total: nonzeros=47280, allocated nonzeros=47280 > >> total number of mallocs used during MatSetValues calls > =0 > >> not using I-node routines > >> A01 > >> Mat Object: 1 MPI processes > >> type: seqaij > >> rows=9600, cols=3200 > >> total: nonzeros=47280, allocated nonzeros=47280 > >> total number of mallocs used during MatSetValues calls =0 > >> not using I-node routines > >> Mat Object: 1 MPI processes > >> type: seqaij > >> rows=3200, cols=3200 > >> total: nonzeros=40404, allocated nonzeros=40404 > >> total number of mallocs used during MatSetValues calls =0 > >> not using I-node routines > >> linear system matrix followed by preconditioner matrix: > >> Mat Object: 1 MPI processes > >> type: nest > >> rows=12800, cols=12800 > >> Matrix object: > >> type=nest, rows=2, cols=2 > >> MatNest structure: > >> (0,0) : prefix="mom_", type=seqaij, rows=9600, cols=9600 > >> (0,1) : prefix="grad_", type=seqaij, rows=9600, cols=3200 > >> (1,0) : prefix="div_", type=seqaij, rows=3200, cols=9600 > >> (1,1) : prefix="stab_", type=seqaij, rows=3200, cols=3200 > >> Mat Object: 1 MPI processes > >> type: nest > >> rows=12800, cols=12800 > >> Matrix object: > >> type=nest, rows=2, cols=2 > >> MatNest structure: > >> (0,0) : prefix="sys_fieldsplit_0_", type=seqaij, rows=9600, > cols=9600 > >> (0,1) : type=seqaij, rows=9600, cols=3200 > >> (1,0) : type=seqaij, rows=3200, cols=9600 > >> (1,1) : prefix="sys_fieldsplit_1_", type=seqaij, rows=3200, > cols=3200 > >> nusediter_vv 37 > >> nusediter_pp 37 > >> > >> > >> > >> dr. ir. 
Christiaan Klaij | CFD Researcher | Research & Development > >> MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | > http://www.marin.nl > >> > >> MARIN news: http://www.marin.nl/web/News/News-items/The-Ocean-Cleanup- > testing-continues.htm > >> > >> ________________________________________ > >> From: Barry Smith > >> Sent: Friday, January 13, 2017 7:51 PM > >> To: Klaij, Christiaan > >> Cc: petsc-users at mcs.anl.gov > >> Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 > and 1 > >> > >> Yes, I would have expected this to work. Could you send the output > from -ksp_view in this case? > >> > >> > >>> On Jan 13, 2017, at 3:46 AM, Klaij, Christiaan > wrote: > >>> > >>> Barry, > >>> > >>> It's been a while but I'm finally using this function in > >>> 3.7.4. Is it supposed to work with fieldsplit? Here's why. > >>> > >>> I'm solving a Navier-Stokes system with fieldsplit (pc has one > >>> velocity solve and one pressure solve) and trying to retrieve the > >>> totals like this: > >>> > >>> CALL KSPSolve(ksp_system,rr_system,xx_system,ierr); CHKERRQ(ierr) > >>> CALL PCFieldSplitGetSubKSP(pc_system,numsplit,subksp,ierr); > CHKERRQ(ierr) > >>> CALL KSPGetTotalIterations(subksp(1),nusediter_vv,ierr); CHKERRQ(ierr) > >>> CALL KSPGetTotalIterations(subksp(2),nusediter_pp,ierr); CHKERRQ(ierr) > >>> print *, 'nusediter_vv', nusediter_vv > >>> print *, 'nusediter_pp', nusediter_pp > >>> > >>> Running the code shows this surprise: > >>> > >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 1 > >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 1 > >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 2 > >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 2 > >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 7 > >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 7 > >>> Linear 
sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 7 > >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 7 > >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 8 > >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 8 > >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 8 > >>> Linear sys_fieldsplit_0_ solve converged due to CONVERGED_RTOL > iterations 8 > >>> > >>> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL > iterations 22 > >>> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL > iterations 6 > >>> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL > iterations 3 > >>> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL > iterations 2 > >>> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL > iterations 2 > >>> Linear sys_fieldsplit_1_ solve converged due to CONVERGED_RTOL > iterations 2 > >>> > >>> nusediter_vv 37 > >>> nusediter_pp 37 > >>> > >>> So the value of nusediter_pp is indeed 37, but for nusediter_vv > >>> it should be 66. Any idea what went wrong? > >>> > >>> Chris > >>> > >>> > >>> > >>> dr. ir. Christiaan Klaij | CFD Researcher | Research & Development > >>> MARIN | T +31 317 49 33 44 | mailto:C.Klaij at marin.nl | > http://www.marin.nl > >>> > >>> MARIN news: http://www.marin.nl/web/News/ > News-items/MARIN-wishes-you-a-challenging-inspiring-2017.htm > >>> > >>> ________________________________________ > >>> From: Barry Smith > >>> Sent: Saturday, April 11, 2015 12:27 AM > >>> To: Klaij, Christiaan > >>> Cc: petsc-users at mcs.anl.gov > >>> Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 > and 1 > >>> > >>> Chris, > >>> > >>> I have added KSPGetTotalIterations() to the branch barry/add-ksp-total-iterations/master > and next. 
After tests it will go into master > >>> > >>> Barry > >>> > >>>> On Apr 10, 2015, at 8:07 AM, Klaij, Christiaan > wrote: > >>>> > >>>> Barry, > >>>> > >>>> Sure, I can call PCFieldSplitGetSubKSP() to get the fieldsplit_0 > >>>> ksp and then KSPGetIterationNumber, but what does this number > >>>> mean? > >>>> > >>>> It appears to be the number of iterations of the last time that > >>>> the subsystem was solved, right? If so, this corresponds to the > >>>> last iteration of the coupled system, how about all the previous > >>>> iterations? > >>>> > >>>> Chris > >>>> ________________________________________ > >>>> From: Barry Smith > >>>> Sent: Friday, April 10, 2015 2:48 PM > >>>> To: Klaij, Christiaan > >>>> Cc: petsc-users at mcs.anl.gov > >>>> Subject: Re: [petsc-users] monitoring the convergence of fieldsplit 0 > and 1 > >>>> > >>>> Chris, > >>>> > >>>> It appears you should call PCFieldSplitGetSubKSP() and then get the > information you want out of the individual KSPs. If this doesn't work > please let us know. > >>>> > >>>> Barry > >>>> > >>>>> On Apr 10, 2015, at 6:48 AM, Klaij, Christiaan > wrote: > >>>>> > >>>>> A question when using PCFieldSplit: for each linear iteration of > >>>>> the system, how many iterations for fielsplit 0 and 1? > >>>>> > >>>>> One way to find out is to run with -ksp_monitor, > >>>>> -fieldsplit_0_ksp_monitor and -fieldsplit_0_ksp_monitor. This > >>>>> gives the complete convergence history. > >>>>> > >>>>> Another way, suggested by Matt, is to use -ksp_monitor, > >>>>> -fieldsplit_0_ksp_converged_reason and > >>>>> -fieldsplit_1_ksp_converged_reason. This gives only the totals > >>>>> for fieldsplit 0 and 1 (but without saying for which one). > >>>>> > >>>>> Both ways require to somehow process the output, which is a bit > >>>>> inconvenient. Could KSPGetResidualHistory perhaps return (some) > >>>>> information on the subsystems' convergence for processing inside > >>>>> the code? > >>>>> > >>>>> Chris > >>>>> > >>>>> > >>>>> dr. 
ir. Christiaan Klaij > >>>>> CFD Researcher > >>>>> Research & Development > >>>>> E mailto:C.Klaij at marin.nl > >>>>> T +31 317 49 33 44 > >>>>> > >>>>> > >>>>> MARIN > >>>>> 2, Haagsteeg, P.O. Box 28, 6700 AA Wageningen, The Netherlands > >>>>> T +31 317 49 39 11, F +31 317 49 32 45, I www.marin.nl > >>>>> > >>>> > >>> > >> > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From chih-hao.chen2 at mail.mcgill.ca Wed Jan 18 14:22:11 2017 From: chih-hao.chen2 at mail.mcgill.ca (Chih-Hao Chen) Date: Wed, 18 Jan 2017 20:22:11 +0000 Subject: [petsc-users] About GMRES Solver in PETSc Message-ID: Hello, Sorry for this bother. I've checked that the GMRES solver in your library supports parallel computing on your website: http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html. As we all know, the key algorithm inside the GMRES solver is the Gram-Schmidt process, so I guess that if users specify their matrix A and right-hand-side vector b in a linear equation (Ax=b), the Gram-Schmidt process will be executed in parallel, including matrix-vector multiplications and orthogonalization between the basis vectors. Am I correct? Thanks very much. Best, Chih-Hao -------------- next part -------------- An HTML attachment was scrubbed... URL: From hzhang at mcs.anl.gov Wed Jan 18 14:24:41 2017 From: hzhang at mcs.anl.gov (Hong) Date: Wed, 18 Jan 2017 14:24:41 -0600 Subject: [petsc-users] About GMRES Solver in PETSc In-Reply-To: References: Message-ID: Yes. Hong On Wed, Jan 18, 2017 at 2:22 PM, Chih-Hao Chen < chih-hao.chen2 at mail.mcgill.ca> wrote: > Hello, > > > Sorry for this bother.
> > I"ve checked GMRES solver in your libary supports parallel computing in > your website: > > http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html. > > > Summary of Sparse Linear Solvers Available from PETSc > > www.mcs.anl.gov > Summary of Sparse Linear Solvers Available from PETSc Requests and > contributions welcome > As we all know the key algorithm inside GMRES solver is Gram-Schmidt > process, > > so I guess if users specify their matrix A and right-hand-side vector b in > a linear equation (Ax=b), > > the Gram-Schmidt process will be executed in parallel, > > including matrix-vector multiplications and orthogonalization between the > basis vectors. > > Am I correct? > > Thanks very much. > > > Best, > > Chih-Hao > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From chih-hao.chen2 at mail.mcgill.ca Wed Jan 18 14:32:30 2017 From: chih-hao.chen2 at mail.mcgill.ca (Chih-Hao Chen) Date: Wed, 18 Jan 2017 20:32:30 +0000 Subject: [petsc-users] About GMRES Solver in PETSc In-Reply-To: References: , Message-ID: Hello, Thanks for the quick reply. So if I specify matrix A and RHS vector b in parallel forms, ie. having them distributed across several cores, when I using the functions like MatMult, VecAYPX and etc., all the operations would be executed in parallel too? Thanks very much. Best, Chih-Hao ________________________________ From: Hong Sent: 18 January 2017 15:24:41 To: Chih-Hao Chen Cc: petsc-users at mcs.anl.gov Subject: Re: [petsc-users] About GMRES Solver in PETSc Yes. Hong On Wed, Jan 18, 2017 at 2:22 PM, Chih-Hao Chen > wrote: Hello, Sorry for this bother. I"ve checked GMRES solver in your libary supports parallel computing in your website: http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html. 
Summary of Sparse Linear Solvers Available from PETSc www.mcs.anl.gov Summary of Sparse Linear Solvers Available from PETSc Requests and contributions welcome As we all know the key algorithm inside GMRES solver is Gram-Schmidt process, so I guess if users specify their matrix A and right-hand-side vector b in a linear equation (Ax=b), the Gram-Schmidt process will be executed in parallel, including matrix-vector multiplications and orthogonalization between the basis vectors. Am I correct? Thanks very much. Best, Chih-Hao -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Wed Jan 18 14:41:08 2017 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 18 Jan 2017 14:41:08 -0600 Subject: [petsc-users] About GMRES Solver in PETSc In-Reply-To: References: Message-ID: On Wed, Jan 18, 2017 at 2:32 PM, Chih-Hao Chen < chih-hao.chen2 at mail.mcgill.ca> wrote: > Hello, > > > Thanks for the quick reply. > > So if I specify matrix A and RHS vector b in parallel forms, > > ie. having them distributed across several cores, > > when I using the functions like MatMult, VecAYPX and etc., > > all the operations would be executed in parallel too? > > Yes Matt > Thanks very much. > > > > Best, > > Chih-Hao > ------------------------------ > *From:* Hong > *Sent:* 18 January 2017 15:24:41 > *To:* Chih-Hao Chen > *Cc:* petsc-users at mcs.anl.gov > *Subject:* Re: [petsc-users] About GMRES Solver in PETSc > > Yes. > Hong > > On Wed, Jan 18, 2017 at 2:22 PM, Chih-Hao Chen < > chih-hao.chen2 at mail.mcgill.ca> wrote: > >> Hello, >> >> >> Sorry for this bother. >> >> I"ve checked GMRES solver in your libary supports parallel computing in >> your website: >> >> http://www.mcs.anl.gov/petsc/documentation/linearsolvertable.html. 
>> As we all know the key algorithm inside GMRES solver is Gram-Schmidt >> process, >> >> so I guess if users specify their matrix A and right-hand-side vector b >> in a linear equation (Ax=b), >> >> the Gram-Schmidt process will be executed in parallel, >> >> including matrix-vector multiplications and orthogonalization between the >> basis vectors. >> >> Am I correct? >> >> Thanks very much. >> >> >> Best, >> >> Chih-Hao >> >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From juan.petscmaillist at yahoo.com Wed Jan 18 17:33:17 2017 From: juan.petscmaillist at yahoo.com (juan.petscmaillist at yahoo.com) Date: Wed, 18 Jan 2017 23:33:17 +0000 (UTC) Subject: [petsc-users] Questions about Gmsh and petsc (absolute beginner level) References: <883157705.16856.1484782397397.ref@mail.yahoo.com> Message-ID: <883157705.16856.1484782397397@mail.yahoo.com> Dear All, I am absolutely new to mesh generation, the finite element method and petsc, but I wish to master this platform. I am trying to use gmsh to generate a mesh file of a sphere and use the interface, DMPlexCreateGmshFromFile, to create the mesh in petsc. The mesh file (sphere.msh, see the end of the email), created by gmsh, contains 58 nodes and 220 elements of two types: a) type 2, triangles with 3 vertices, and b) type 4, tetrahedra with 4 vertices. I have absolutely no idea whether my setup is correct and could not find examples of this problem to compare with. So could anyone tell me whether the following code is correct or not? This code runs properly on petsc 3.7.5 with -@$(MPIEXEC) -n $(N_CPU) ./a.out -snes_monitor_short, where N_CPU can be any number.
This code runs properly on petsc 3.7.5 with -@$(MPIEXEC) -n $(N_CPU) ./a.out -snes_monitor_short, where N_CPU can be any number. Q1: Corresponding to the given mesh file, are these variables correct? #define MAX_ELEM????? 220? /* Maximum number of elements */ #define MAX_VERT????? 58? /* Maximum number of vertices */ #define MAX_VERT_ELEM?? 4? /* Vertices per element?????? */ Q2: The data structure AppCtx is copied from src/snes/examples/tutorials/ex10d/ex10.c. Is this structure suitable for this problem? Q3: Before and after calling DMPlexCreateGmshFromFile, did I miss any important steps? Q4: Also could anyone let me know what the proper way to check the mesh setup by petsc is. I just started playing around with is package. I found it is really amazing powerful, but also quite hard to get the accurate understanding. I appreciate if someone can give me some further suggestions. Thank you very much in advance. Juan //------------------------------------------------ #include #include #include #include #include ? DM???????????? dm;?????????????????? /* problem definition */ PetscMPIInt??? ranks, sizes, ierr; PetscViewer??? viewer; PetscBool????? interpolate; #define MAX_ELEM????? 220? /* Maximum number of elements */ #define MAX_VERT????? 58? /* Maximum number of vertices */ #define MAX_VERT_ELEM?? 4? /* Vertices per element?????? */ typedef struct { ? PetscInt?? Nvglobal,Nvlocal;????????????? /* global and local number of vertices */ ? PetscInt?? Neglobal,Nelocal;????????????? /* global and local number of vertices */ ? PetscInt?? AdjM[MAX_VERT][50];??????????? /* adjacency list of a vertex */ ? PetscInt?? itot[MAX_VERT];??????????????? /* total number of neighbors for a vertex */ ? PetscInt?? icv[MAX_ELEM][MAX_VERT_ELEM];? /* vertices belonging to an element */ ? PetscInt?? v2p[MAX_VERT];???????????????? /* processor number for a vertex */ ? PetscInt?? *locInd,*gloInd;?????????????? /* local and global orderings for a node */ ? Vec??????? 
localX,localF;???????????????? /* local solution (u) and f(u) vectors */ ? PetscReal? non_lin_param;???????????????? /* nonlinear parameter for the PDE */ ? PetscReal? lin_param;???????????????????? /* linear parameter for the PDE */ ? VecScatter scatter;?????????????????????? /* scatter context for the local and ?????????????????????????????????????????????? distributed vectors */ } AppCtx; #undef __FUNCT__ #define __FUNCT__ "main" int main(int argc,char **argv) { ? static char help[] = "Define Vector"; ? static char filename[] = "./data/sphere.msh"; ? PetscInitialize(&argc,&argv,(char*)0,help); ? ierr? = MPI_Comm_rank(PETSC_COMM_WORLD,&ranks);CHKERRQ(ierr); ? ierr? = MPI_Comm_size(PETSC_COMM_WORLD,&sizes);CHKERRQ(ierr); ? ierr=DMPlexCreateGmshFromFile(PETSC_COMM_WORLD, filename, interpolate, &dm);? ? PetscFinalize(); ? return 0; }; -------------- Mesh file ---------- $MeshFormat 2.2 0 8 $EndMeshFormat $Nodes 58 1 0 1 0 2 -1 0 0 3 0 0 1 4 -0.4999999999988097 0.8660254037851258 0 5 -0.8660254037838017 0.5000000000011033 0 6 0 0.4999999999988097 0.8660254037851258 7 0 0.8660254037838017 0.5000000000011033 8 -0.4999999999988097 0 0.8660254037851258 9 -0.8660254037838017 0 0.5000000000011033 10 -0.6339745962178445 0.4428909829273233 0.633974596172505 11 -0.4822994153869866 0.7506004041478795 0.4516484331982243 12 -0.3493897428766589 0.3546301510880328 0.8672740417606545 13 -0.8378218522221577 0.4012601404060589 0.3701956829304511 14 0 -1 0 15 -0.8660254037851258 -0.4999999999988097 0 16 -0.5000000000011033 -0.8660254037838017 0 17 0 -0.8660254037851258 0.4999999999988097 18 0 -0.5000000000011033 0.8660254037838017 19 -0.4428909828654145 -0.6339745962336311 0.6339745961999673 20 -0.7198886675198282 -0.4906852283660827 0.4909056050197959 21 -0.3594838238878847 -0.383951782616808 0.8505012692455668 22 -0.358692803375615 -0.8508569143149735 0.3839036131230655 23 1 0 0 24 0.8660254037851258 0.4999999999988097 0 25 0.5000000000011033 0.8660254037838017 0 26 
0.8660254037851258 0 0.4999999999988097 27 0.5000000000011033 0 0.8660254037838017 28 0.5108386809150948 0.5108386789440059 0.6914388520872029 29 0.8481866372316714 0.374552685492926 0.3745526854952001 30 0.4487154446450166 0.8265947249299238 0.3396992941633065 31 0 0 -1 32 0 0.8660254037851258 -0.4999999999988097 33 0 0.5000000000011033 -0.8660254037838017 34 -0.8660254037851258 0 -0.4999999999988097 35 -0.5000000000011033 0 -0.8660254037838017 36 -0.6339745962178445 0.633974596172505 -0.4428909829273234 37 -0.5097746217363811 0.4676776035922022 -0.7220855171873929 38 -0.3545204312296736 0.868050272320611 -0.3475686817951805 39 -0.8398086783581998 0.3742446117737876 -0.3932713494682021 40 0.4999999999988097 0 -0.8660254037851258 41 0.8660254037838017 0 -0.5000000000011033 42 0.5314152505108283 0.5314152505101392 -0.6596936130127075 43 0.3745526855885761 0.8481866371475475 -0.3745526855900513 44 0.8166239656466456 0.4593780018327371 -0.3494240263114541 45 0 -0.4999999999988097 -0.8660254037851258 46 0 -0.8660254037838017 -0.5000000000011033 47 -0.693625367376466 -0.4647476149317986 -0.5503576147806688 48 -0.3745526897803909 -0.3745526897775885 -0.8481866334472902 49 -0.4411747191846568 -0.7694084839919415 -0.4619258078020342 50 0.4999999999988097 -0.8660254037851258 0 51 0.8660254037838017 -0.5000000000011033 0 52 0.4835428142572408 -0.7296387419233323 0.4835428140968744 53 0.8051246928456837 -0.4546444607248049 0.38088402867854 54 0.4212799396792105 -0.4355246816538718 0.7955133336998031 55 0.4835428133498502 -0.7296387419932698 -0.483542814898733 56 0.3808840285417322 -0.4546444603984627 -0.8051246930946855 57 0.7955133341574981 -0.4355246806022939 -0.4212799399020684 58 -1.110223024625157e-15 1.110223024625157e-15 -6.106226635438361e-16 $EndNodes $Elements 220 1 2 2 1 14 1 4 11 2 2 2 1 14 1 11 7 3 2 2 1 14 2 13 5 4 2 2 1 14 2 9 13 5 2 2 1 14 3 6 12 6 2 2 1 14 3 12 8 7 2 2 1 14 4 5 11 8 2 2 1 14 5 13 11 9 2 2 1 14 6 7 11 10 2 2 1 14 6 11 12 11 2 2 1 14 8 10 9 12 
2 2 1 14 8 12 10 13 2 2 1 14 9 10 13 14 2 2 1 14 10 12 11 15 2 2 1 14 10 11 13 16 2 2 1 16 15 20 2 17 2 2 1 16 20 9 2 18 2 2 1 16 22 16 14 19 2 2 1 16 17 22 14 20 2 2 1 16 21 18 3 21 2 2 1 16 8 21 3 22 2 2 1 16 16 20 15 23 2 2 1 16 22 20 16 24 2 2 1 16 18 19 17 25 2 2 1 16 19 22 17 26 2 2 1 16 21 19 18 27 2 2 1 16 9 20 8 28 2 2 1 16 20 21 8 29 2 2 1 16 21 20 19 30 2 2 1 16 20 22 19 31 2 2 1 18 24 29 23 32 2 2 1 18 29 26 23 33 2 2 1 18 30 25 1 34 2 2 1 18 7 30 1 35 2 2 1 18 3 27 6 36 2 2 1 18 24 25 30 37 2 2 1 18 24 30 29 38 2 2 1 18 28 7 6 39 2 2 1 18 27 28 6 40 2 2 1 18 28 30 7 41 2 2 1 18 28 27 26 42 2 2 1 18 29 28 26 43 2 2 1 18 29 30 28 44 2 2 1 20 1 38 4 45 2 2 1 20 1 32 38 46 2 2 1 20 2 5 39 47 2 2 1 20 2 39 34 48 2 2 1 20 31 37 33 49 2 2 1 20 31 35 37 50 2 2 1 20 4 36 5 51 2 2 1 20 4 38 36 52 2 2 1 20 5 36 39 53 2 2 1 20 32 33 37 54 2 2 1 20 32 37 38 55 2 2 1 20 34 37 35 56 2 2 1 20 34 39 37 57 2 2 1 20 36 38 37 58 2 2 1 20 36 37 39 59 2 2 1 22 23 44 24 60 2 2 1 22 23 41 44 61 2 2 1 22 1 25 43 62 2 2 1 22 1 43 32 63 2 2 1 22 31 33 40 64 2 2 1 22 24 44 25 65 2 2 1 22 44 43 25 66 2 2 1 22 32 42 33 67 2 2 1 22 32 43 42 68 2 2 1 22 33 42 40 69 2 2 1 22 40 42 41 70 2 2 1 22 41 42 44 71 2 2 1 22 42 43 44 72 2 2 1 24 2 34 15 73 2 2 1 24 16 49 14 74 2 2 1 24 49 46 14 75 2 2 1 24 45 48 31 76 2 2 1 24 48 35 31 77 2 2 1 24 16 15 49 78 2 2 1 24 34 47 15 79 2 2 1 24 49 15 47 80 2 2 1 24 45 46 49 81 2 2 1 24 45 49 48 82 2 2 1 24 35 47 34 83 2 2 1 24 48 47 35 84 2 2 1 24 48 49 47 85 2 2 1 26 23 53 51 86 2 2 1 26 23 26 53 87 2 2 1 26 17 14 50 88 2 2 1 26 3 18 54 89 2 2 1 26 3 54 27 90 2 2 1 26 53 50 51 91 2 2 1 26 50 52 17 92 2 2 1 26 52 50 53 93 2 2 1 26 54 18 17 94 2 2 1 26 52 54 17 95 2 2 1 26 26 27 54 96 2 2 1 26 53 26 54 97 2 2 1 26 52 53 54 98 2 2 1 28 51 57 23 99 2 2 1 28 57 41 23 100 2 2 1 28 14 46 50 101 2 2 1 28 56 45 31 102 2 2 1 28 40 56 31 103 2 2 1 28 51 50 57 104 2 2 1 28 46 55 50 105 2 2 1 28 57 50 55 106 2 2 1 28 56 46 45 107 2 2 1 28 55 46 56 108 2 2 1 28 
41 57 40 109 2 2 1 28 40 57 56 110 2 2 1 28 56 57 55 111 4 2 2 30 21 18 19 58 112 4 2 2 30 13 10 9 58 113 4 2 2 30 36 38 4 58 114 4 2 2 30 36 4 5 58 115 4 2 2 30 10 8 9 58 116 4 2 2 30 21 3 18 58 117 4 2 2 30 19 18 17 58 118 4 2 2 30 47 48 35 58 119 4 2 2 30 36 5 39 58 120 4 2 2 30 23 51 53 58 121 4 2 2 30 10 12 8 58 122 4 2 2 30 22 19 17 58 123 4 2 2 30 38 1 4 58 124 4 2 2 30 12 10 11 58 125 4 2 2 30 43 44 25 58 126 4 2 2 30 23 57 51 58 127 4 2 2 30 18 3 54 58 128 4 2 2 30 11 10 13 58 129 4 2 2 30 52 53 50 58 130 4 2 2 30 56 46 55 58 131 4 2 2 30 13 9 2 58 132 4 2 2 30 36 37 38 58 133 4 2 2 30 20 19 22 58 134 4 2 2 30 29 30 24 58 135 4 2 2 30 19 20 21 58 136 4 2 2 30 52 54 53 58 137 4 2 2 30 57 56 55 58 138 4 2 2 30 5 4 11 58 139 4 2 2 30 53 51 50 58 140 4 2 2 30 49 48 47 58 141 4 2 2 30 8 3 21 58 142 4 2 2 30 43 42 44 58 143 4 2 2 30 24 25 44 58 144 4 2 2 30 57 55 50 58 145 4 2 2 30 52 17 54 58 146 4 2 2 30 52 50 17 58 147 4 2 2 30 46 50 55 58 148 4 2 2 30 30 25 24 58 149 4 2 2 30 48 49 45 58 150 4 2 2 30 36 39 37 58 151 4 2 2 30 28 30 29 58 152 4 2 2 30 11 4 1 58 153 4 2 2 30 57 50 51 58 154 4 2 2 30 18 54 17 58 155 4 2 2 30 29 24 23 58 156 4 2 2 30 34 47 35 58 157 4 2 2 30 24 44 23 58 158 4 2 2 30 22 14 16 58 159 4 2 2 30 20 9 8 58 160 4 2 2 30 32 42 43 58 161 4 2 2 30 44 41 23 58 162 4 2 2 30 44 42 41 58 163 4 2 2 30 5 11 13 58 164 4 2 2 30 12 3 8 58 165 4 2 2 30 48 31 35 58 166 4 2 2 30 28 7 30 58 167 4 2 2 30 28 29 26 58 168 4 2 2 30 16 20 22 58 169 4 2 2 30 20 8 21 58 170 4 2 2 30 43 25 1 58 171 4 2 2 30 2 9 20 58 172 4 2 2 30 29 23 26 58 173 4 2 2 30 22 17 14 58 174 4 2 2 30 47 15 49 58 175 4 2 2 30 53 26 23 58 176 4 2 2 30 47 34 15 58 177 4 2 2 30 12 6 3 58 178 4 2 2 30 41 57 23 58 179 4 2 2 30 37 32 38 58 180 4 2 2 30 30 1 25 58 181 4 2 2 30 39 5 2 58 182 4 2 2 30 12 11 6 58 183 4 2 2 30 42 40 41 58 184 4 2 2 30 13 2 5 58 185 4 2 2 30 48 45 31 58 186 4 2 2 30 53 54 26 58 187 4 2 2 30 6 7 28 58 188 4 2 2 30 38 32 1 58 189 4 2 2 30 30 7 1 58 190 4 2 2 30 
16 14 49 58 191 4 2 2 30 56 45 46 58 192 4 2 2 30 57 40 56 58 193 4 2 2 30 43 1 32 58 194 4 2 2 30 33 40 42 58 195 4 2 2 30 27 3 6 58 196 4 2 2 30 41 40 57 58 197 4 2 2 30 33 42 32 58 198 4 2 2 30 37 39 34 58 199 4 2 2 30 49 15 16 58 200 4 2 2 30 28 27 6 58 201 4 2 2 30 31 37 35 58 202 4 2 2 30 1 7 11 58 203 4 2 2 30 28 26 27 58 204 4 2 2 30 54 3 27 58 205 4 2 2 30 49 46 45 58 206 4 2 2 30 11 7 6 58 207 4 2 2 30 34 35 37 58 208 4 2 2 30 31 45 56 58 209 4 2 2 30 39 2 34 58 210 4 2 2 30 49 14 46 58 211 4 2 2 30 50 14 17 58 212 4 2 2 30 46 14 50 58 213 4 2 2 30 54 27 26 58 214 4 2 2 30 40 31 56 58 215 4 2 2 30 16 15 20 58 216 4 2 2 30 33 32 37 58 217 4 2 2 30 33 31 40 58 218 4 2 2 30 34 2 15 58 219 4 2 2 30 33 37 31 58 220 4 2 2 30 2 20 15 58 $EndElements
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From juan.petscmaillist at yahoo.com Wed Jan 18 17:33:17 2017
From: juan.petscmaillist at yahoo.com (juan.petscmaillist at yahoo.com)
Date: Wed, 18 Jan 2017 23:33:17 +0000 (UTC)
Subject: [petsc-users] Questions about Gmsh and petsc (absolute beginner level)
References: <883157705.16856.1484782397397.ref@mail.yahoo.com>
Message-ID: <883157705.16856.1484782397397@mail.yahoo.com>

Dear All,

I am absolutely new to mesh generation, the finite element method, and PETSc, but I wish to master this platform. I am trying to use Gmsh to generate a mesh file of a sphere and use the interface DMPlexCreateGmshFromFile to create the mesh in PETSc. The mesh file (sphere.msh, see the end of this email), created by Gmsh, contains 58 nodes and 220 elements of two types: a) type 2, a triangle with 3 vertices, and b) type 4, a tetrahedron with 4 vertices. I have no idea whether my setup is correct and could not find examples of this problem to compare with, so could anyone tell me whether the following code is correct or not? The code runs properly on PETSc 3.7.5 with -@$(MPIEXEC) -n $(N_CPU) ./a.out -snes_monitor_short, where N_CPU can be any number.
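[For context on the constants asked about below: each record in the MSH 2.2 $Elements section quoted at the end of this message is laid out as elm-number, elm-type, number-of-tags, the tags themselves, then the node ids; type 2 is a 3-node triangle and type 4 a 4-node tetrahedron. A minimal Python sketch of the decoding — illustrative only, the function and dictionary names are not from this thread:]

```python
# Decode one record of a Gmsh MSH 2.2 "$Elements" section.
# Record layout: elm-number elm-type number-of-tags <tags...> <node-ids...>
# (names here are illustrative, not from the original message)
NODES_PER_TYPE = {2: 3,   # type 2: 3-node triangle
                  4: 4}   # type 4: 4-node tetrahedron

def parse_element(tokens):
    """Split one integer token list into (number, type, tags, node ids)."""
    num, etype, ntags = tokens[0], tokens[1], tokens[2]
    tags  = tokens[3:3 + ntags]
    nodes = tokens[3 + ntags:3 + ntags + NODES_PER_TYPE[etype]]
    return num, etype, tags, nodes

# First record of sphere.msh: "1 2 2 1 14 1 4 11"
print(parse_element([1, 2, 2, 1, 14, 1, 4, 11]))         # (1, 2, [1, 14], [1, 4, 11])
# Last record of sphere.msh: "220 4 2 2 30 2 20 15 58"
print(parse_element([220, 4, 2, 2, 30, 2, 20, 15, 58]))  # (220, 4, [2, 30], [2, 20, 15, 58])
```

So MAX_ELEM = 220 and MAX_VERT = 58 match the $Elements/$Nodes headers of the file, and MAX_VERT_ELEM = 4 covers the tetrahedra; note, though, that the 110 type-2 records are surface triangles with only 3 nodes each.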
Q1: Corresponding to the given mesh file, are these variables correct?

#define MAX_ELEM      220  /* Maximum number of elements */
#define MAX_VERT       58  /* Maximum number of vertices */
#define MAX_VERT_ELEM   4  /* Vertices per element       */

Q2: The data structure AppCtx is copied from src/snes/examples/tutorials/ex10d/ex10.c. Is this structure suitable for this problem?

Q3: Before and after calling DMPlexCreateGmshFromFile, did I miss any important steps?

Q4: Could anyone also let me know the proper way to check the mesh set up by PETSc?

I just started playing around with this package. I found it really amazingly powerful, but also quite hard to get an accurate understanding of. I would appreciate it if someone could give me some further suggestions. Thank you very much in advance.

Juan

//------------------------------------------------
/* (header names were lost when the list archiver scrubbed the HTML attachment) */
#include
#include
#include
#include
#include

DM             dm;                    /* problem definition */
PetscMPIInt    ranks, sizes, ierr;
PetscViewer    viewer;
PetscBool      interpolate;

#define MAX_ELEM      220  /* Maximum number of elements */
#define MAX_VERT       58  /* Maximum number of vertices */
#define MAX_VERT_ELEM   4  /* Vertices per element       */

typedef struct {
  PetscInt   Nvglobal,Nvlocal;              /* global and local number of vertices */
  PetscInt   Neglobal,Nelocal;              /* global and local number of elements */
  PetscInt   AdjM[MAX_VERT][50];            /* adjacency list of a vertex */
  PetscInt   itot[MAX_VERT];                /* total number of neighbors for a vertex */
  PetscInt   icv[MAX_ELEM][MAX_VERT_ELEM];  /* vertices belonging to an element */
  PetscInt   v2p[MAX_VERT];                 /* processor number for a vertex */
  PetscInt   *locInd,*gloInd;               /* local and global orderings for a node */
  Vec        localX,localF;                 /* local solution (u) and f(u) vectors */
  PetscReal  non_lin_param;
                                            /* nonlinear parameter for the PDE */
  PetscReal  lin_param;                     /* linear parameter for the PDE */
  VecScatter scatter;                       /* scatter context for the local and
                                               distributed vectors */
} AppCtx;

#undef __FUNCT__
#define __FUNCT__ "main"
int main(int argc,char **argv)
{
  static char help[]     = "Define Vector";
  static char filename[] = "./data/sphere.msh";

  PetscInitialize(&argc,&argv,(char*)0,help);
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD,&ranks);CHKERRQ(ierr);
  ierr = MPI_Comm_size(PETSC_COMM_WORLD,&sizes);CHKERRQ(ierr);

  ierr = DMPlexCreateGmshFromFile(PETSC_COMM_WORLD, filename, interpolate, &dm);CHKERRQ(ierr);

  PetscFinalize();
  return 0;
}

-------------- Mesh file ----------
$MeshFormat 2.2 0 8 $EndMeshFormat $Nodes 58 1 0 1 0 2 -1 0 0 3 0 0 1 4 -0.4999999999988097 0.8660254037851258 0 5 -0.8660254037838017 0.5000000000011033 0 6 0 0.4999999999988097 0.8660254037851258 7 0 0.8660254037838017 0.5000000000011033 8 -0.4999999999988097 0 0.8660254037851258 9 -0.8660254037838017 0 0.5000000000011033 10 -0.6339745962178445 0.4428909829273233 0.633974596172505 11 -0.4822994153869866 0.7506004041478795 0.4516484331982243 12 -0.3493897428766589 0.3546301510880328 0.8672740417606545 13 -0.8378218522221577 0.4012601404060589 0.3701956829304511 14 0 -1 0 15 -0.8660254037851258 -0.4999999999988097 0 16 -0.5000000000011033 -0.8660254037838017 0 17 0 -0.8660254037851258 0.4999999999988097 18 0 -0.5000000000011033 0.8660254037838017 19 -0.4428909828654145 -0.6339745962336311 0.6339745961999673 20 -0.7198886675198282 -0.4906852283660827 0.4909056050197959 21 -0.3594838238878847 -0.383951782616808 0.8505012692455668 22 -0.358692803375615 -0.8508569143149735 0.3839036131230655 23 1 0 0 24 0.8660254037851258 0.4999999999988097 0 25 0.5000000000011033 0.8660254037838017 0 26 0.8660254037851258 0 0.4999999999988097 27 0.5000000000011033 0 0.8660254037838017 28 0.5108386809150948 0.5108386789440059
0.6914388520872029 29 0.8481866372316714 0.374552685492926 0.3745526854952001 30 0.4487154446450166 0.8265947249299238 0.3396992941633065 31 0 0 -1 32 0 0.8660254037851258 -0.4999999999988097 33 0 0.5000000000011033 -0.8660254037838017 34 -0.8660254037851258 0 -0.4999999999988097 35 -0.5000000000011033 0 -0.8660254037838017 36 -0.6339745962178445 0.633974596172505 -0.4428909829273234 37 -0.5097746217363811 0.4676776035922022 -0.7220855171873929 38 -0.3545204312296736 0.868050272320611 -0.3475686817951805 39 -0.8398086783581998 0.3742446117737876 -0.3932713494682021 40 0.4999999999988097 0 -0.8660254037851258 41 0.8660254037838017 0 -0.5000000000011033 42 0.5314152505108283 0.5314152505101392 -0.6596936130127075 43 0.3745526855885761 0.8481866371475475 -0.3745526855900513 44 0.8166239656466456 0.4593780018327371 -0.3494240263114541 45 0 -0.4999999999988097 -0.8660254037851258 46 0 -0.8660254037838017 -0.5000000000011033 47 -0.693625367376466 -0.4647476149317986 -0.5503576147806688 48 -0.3745526897803909 -0.3745526897775885 -0.8481866334472902 49 -0.4411747191846568 -0.7694084839919415 -0.4619258078020342 50 0.4999999999988097 -0.8660254037851258 0 51 0.8660254037838017 -0.5000000000011033 0 52 0.4835428142572408 -0.7296387419233323 0.4835428140968744 53 0.8051246928456837 -0.4546444607248049 0.38088402867854 54 0.4212799396792105 -0.4355246816538718 0.7955133336998031 55 0.4835428133498502 -0.7296387419932698 -0.483542814898733 56 0.3808840285417322 -0.4546444603984627 -0.8051246930946855 57 0.7955133341574981 -0.4355246806022939 -0.4212799399020684 58 -1.110223024625157e-15 1.110223024625157e-15 -6.106226635438361e-16 $EndNodes $Elements 220 1 2 2 1 14 1 4 11 2 2 2 1 14 1 11 7 3 2 2 1 14 2 13 5 4 2 2 1 14 2 9 13 5 2 2 1 14 3 6 12 6 2 2 1 14 3 12 8 7 2 2 1 14 4 5 11 8 2 2 1 14 5 13 11 9 2 2 1 14 6 7 11 10 2 2 1 14 6 11 12 11 2 2 1 14 8 10 9 12 2 2 1 14 8 12 10 13 2 2 1 14 9 10 13 14 2 2 1 14 10 12 11 15 2 2 1 14 10 11 13 16 2 2 1 16 15 20 2 17 2 2 1 16 20 9 2 18 2 2 
1 16 22 16 14 19 2 2 1 16 17 22 14 20 2 2 1 16 21 18 3 21 2 2 1 16 8 21 3 22 2 2 1 16 16 20 15 23 2 2 1 16 22 20 16 24 2 2 1 16 18 19 17 25 2 2 1 16 19 22 17 26 2 2 1 16 21 19 18 27 2 2 1 16 9 20 8 28 2 2 1 16 20 21 8 29 2 2 1 16 21 20 19 30 2 2 1 16 20 22 19 31 2 2 1 18 24 29 23 32 2 2 1 18 29 26 23 33 2 2 1 18 30 25 1 34 2 2 1 18 7 30 1 35 2 2 1 18 3 27 6 36 2 2 1 18 24 25 30 37 2 2 1 18 24 30 29 38 2 2 1 18 28 7 6 39 2 2 1 18 27 28 6 40 2 2 1 18 28 30 7 41 2 2 1 18 28 27 26 42 2 2 1 18 29 28 26 43 2 2 1 18 29 30 28 44 2 2 1 20 1 38 4 45 2 2 1 20 1 32 38 46 2 2 1 20 2 5 39 47 2 2 1 20 2 39 34 48 2 2 1 20 31 37 33 49 2 2 1 20 31 35 37 50 2 2 1 20 4 36 5 51 2 2 1 20 4 38 36 52 2 2 1 20 5 36 39 53 2 2 1 20 32 33 37 54 2 2 1 20 32 37 38 55 2 2 1 20 34 37 35 56 2 2 1 20 34 39 37 57 2 2 1 20 36 38 37 58 2 2 1 20 36 37 39 59 2 2 1 22 23 44 24 60 2 2 1 22 23 41 44 61 2 2 1 22 1 25 43 62 2 2 1 22 1 43 32 63 2 2 1 22 31 33 40 64 2 2 1 22 24 44 25 65 2 2 1 22 44 43 25 66 2 2 1 22 32 42 33 67 2 2 1 22 32 43 42 68 2 2 1 22 33 42 40 69 2 2 1 22 40 42 41 70 2 2 1 22 41 42 44 71 2 2 1 22 42 43 44 72 2 2 1 24 2 34 15 73 2 2 1 24 16 49 14 74 2 2 1 24 49 46 14 75 2 2 1 24 45 48 31 76 2 2 1 24 48 35 31 77 2 2 1 24 16 15 49 78 2 2 1 24 34 47 15 79 2 2 1 24 49 15 47 80 2 2 1 24 45 46 49 81 2 2 1 24 45 49 48 82 2 2 1 24 35 47 34 83 2 2 1 24 48 47 35 84 2 2 1 24 48 49 47 85 2 2 1 26 23 53 51 86 2 2 1 26 23 26 53 87 2 2 1 26 17 14 50 88 2 2 1 26 3 18 54 89 2 2 1 26 3 54 27 90 2 2 1 26 53 50 51 91 2 2 1 26 50 52 17 92 2 2 1 26 52 50 53 93 2 2 1 26 54 18 17 94 2 2 1 26 52 54 17 95 2 2 1 26 26 27 54 96 2 2 1 26 53 26 54 97 2 2 1 26 52 53 54 98 2 2 1 28 51 57 23 99 2 2 1 28 57 41 23 100 2 2 1 28 14 46 50 101 2 2 1 28 56 45 31 102 2 2 1 28 40 56 31 103 2 2 1 28 51 50 57 104 2 2 1 28 46 55 50 105 2 2 1 28 57 50 55 106 2 2 1 28 56 46 45 107 2 2 1 28 55 46 56 108 2 2 1 28 41 57 40 109 2 2 1 28 40 57 56 110 2 2 1 28 56 57 55 111 4 2 2 30 21 18 19 58 112 4 2 2 30 13 10 9 58 113 4 2 2 30 36 38 4 58 
114 4 2 2 30 36 4 5 58 115 4 2 2 30 10 8 9 58 116 4 2 2 30 21 3 18 58 117 4 2 2 30 19 18 17 58 118 4 2 2 30 47 48 35 58 119 4 2 2 30 36 5 39 58 120 4 2 2 30 23 51 53 58 121 4 2 2 30 10 12 8 58 122 4 2 2 30 22 19 17 58 123 4 2 2 30 38 1 4 58 124 4 2 2 30 12 10 11 58 125 4 2 2 30 43 44 25 58 126 4 2 2 30 23 57 51 58 127 4 2 2 30 18 3 54 58 128 4 2 2 30 11 10 13 58 129 4 2 2 30 52 53 50 58 130 4 2 2 30 56 46 55 58 131 4 2 2 30 13 9 2 58 132 4 2 2 30 36 37 38 58 133 4 2 2 30 20 19 22 58 134 4 2 2 30 29 30 24 58 135 4 2 2 30 19 20 21 58 136 4 2 2 30 52 54 53 58 137 4 2 2 30 57 56 55 58 138 4 2 2 30 5 4 11 58 139 4 2 2 30 53 51 50 58 140 4 2 2 30 49 48 47 58 141 4 2 2 30 8 3 21 58 142 4 2 2 30 43 42 44 58 143 4 2 2 30 24 25 44 58 144 4 2 2 30 57 55 50 58 145 4 2 2 30 52 17 54 58 146 4 2 2 30 52 50 17 58 147 4 2 2 30 46 50 55 58 148 4 2 2 30 30 25 24 58 149 4 2 2 30 48 49 45 58 150 4 2 2 30 36 39 37 58 151 4 2 2 30 28 30 29 58 152 4 2 2 30 11 4 1 58 153 4 2 2 30 57 50 51 58 154 4 2 2 30 18 54 17 58 155 4 2 2 30 29 24 23 58 156 4 2 2 30 34 47 35 58 157 4 2 2 30 24 44 23 58 158 4 2 2 30 22 14 16 58 159 4 2 2 30 20 9 8 58 160 4 2 2 30 32 42 43 58 161 4 2 2 30 44 41 23 58 162 4 2 2 30 44 42 41 58 163 4 2 2 30 5 11 13 58 164 4 2 2 30 12 3 8 58 165 4 2 2 30 48 31 35 58 166 4 2 2 30 28 7 30 58 167 4 2 2 30 28 29 26 58 168 4 2 2 30 16 20 22 58 169 4 2 2 30 20 8 21 58 170 4 2 2 30 43 25 1 58 171 4 2 2 30 2 9 20 58 172 4 2 2 30 29 23 26 58 173 4 2 2 30 22 17 14 58 174 4 2 2 30 47 15 49 58 175 4 2 2 30 53 26 23 58 176 4 2 2 30 47 34 15 58 177 4 2 2 30 12 6 3 58 178 4 2 2 30 41 57 23 58 179 4 2 2 30 37 32 38 58 180 4 2 2 30 30 1 25 58 181 4 2 2 30 39 5 2 58 182 4 2 2 30 12 11 6 58 183 4 2 2 30 42 40 41 58 184 4 2 2 30 13 2 5 58 185 4 2 2 30 48 45 31 58 186 4 2 2 30 53 54 26 58 187 4 2 2 30 6 7 28 58 188 4 2 2 30 38 32 1 58 189 4 2 2 30 30 7 1 58 190 4 2 2 30 16 14 49 58 191 4 2 2 30 56 45 46 58 192 4 2 2 30 57 40 56 58 193 4 2 2 30 43 1 32 58 194 4 2 2 30 33 40 42 58 195 4 2 2 30 27 
3 6 58 196 4 2 2 30 41 40 57 58 197 4 2 2 30 33 42 32 58 198 4 2 2 30 37 39 34 58 199 4 2 2 30 49 15 16 58 200 4 2 2 30 28 27 6 58 201 4 2 2 30 31 37 35 58 202 4 2 2 30 1 7 11 58 203 4 2 2 30 28 26 27 58 204 4 2 2 30 54 3 27 58 205 4 2 2 30 49 46 45 58 206 4 2 2 30 11 7 6 58 207 4 2 2 30 34 35 37 58 208 4 2 2 30 31 45 56 58 209 4 2 2 30 39 2 34 58 210 4 2 2 30 49 14 46 58 211 4 2 2 30 50 14 17 58 212 4 2 2 30 46 14 50 58 213 4 2 2 30 54 27 26 58 214 4 2 2 30 40 31 56 58 215 4 2 2 30 16 15 20 58 216 4 2 2 30 33 32 37 58 217 4 2 2 30 33 31 40 58 218 4 2 2 30 34 2 15 58 219 4 2 2 30 33 37 31 58 220 4 2 2 30 2 20 15 58 $EndElements
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From cyrill.von.planta at usi.ch Thu Jan 19 07:14:55 2017
From: cyrill.von.planta at usi.ch (Cyrill Vonplanta)
Date: Thu, 19 Jan 2017 13:14:55 +0000
Subject: [petsc-users] MatMatMult causes crash
Message-ID: <0766E217-C116-4F0C-B984-4DEEAAE92181@usi.ch>

Dear PETSc Users,

I have a problem with a solver running on a Cray machine: it crashes at the call to MatMatMult (see error message below). When I run the same solver on my own machine, in serial or parallel, it runs through, and inspecting it with -malloc_debug shows no issues either.

Does someone have a clue what the cause of this failure could be?

Best
Cyrill
--

The line that causes the crash is this:

ierr = MatMatMult(_O, _interpolations[0], MAT_INITIAL_MATRIX, PETSC_DEFAULT, &mmg->interpolations[mg_levels-2]); CHKERRQ(ierr);

The error message:

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Out of memory. This could be due to allocating
[0]PETSC ERROR: too large an object or bleeding by not properly
[0]PETSC ERROR: destroying unneeded objects.
[0]PETSC ERROR: Memory allocated 0 Memory used by process 61852
[0]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info.
[0]PETSC ERROR: Memory requested 18446744068029169664 [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016 [0]PETSC ERROR: /scratch/snx3000/studi/./moose-passo-opt on a haswell named nid01137 by studi Thu Jan 19 14:03:27 2017 [0]PETSC ERROR: Configure options --known-has-attribute-aligned=1 --known-mpi-int64_t=0 --known-bits-per-byte=8 --known-sdot-returns-double=0 --known-snrm2-returns-double=0 --known-level1-dcache-assoc=0 --known-level1-dcache-linesize=32 --known-level1-dcache-size=32768 --known-memcmp-ok=1 --known-mpi-c-double-complex=1 --known-mpi-long-double=1 --known-mpi-shared-libraries=0 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-sizeof-char=1 --known-sizeof-double=8 --known-sizeof-float=4 --known-sizeof-int=4 --known-sizeof-long-long=8 --known-sizeof-long=8 --known-sizeof-short=2 --known-sizeof-size_t=8 --known-sizeof-void-p=8 --with-ar=ar --with-batch=1 --with-cc=cc --with-clib-autodetect=0 --with-cxx=CC --with-cxxlib-autodetect=0 --with-debugging=0 --with-dependencies=0 --with-fc=ftn --with-fortran-datatypes=0 --with-fortran-interfaces=0 --with-fortranlib-autodetect=0 --with-ranlib=ranlib --with-scalar-type=real --with-shared-ld=ar --with-etags=0 --with-dependencies=0 --with-x=0 --with-ssl=0 --with-shared-libraries=0 --with-dependencies=0 --with-mpi-lib="[]" --with-mpi-include="[]" --with-blas-lapack-lib="-L/opt/cray/libsci/13.2.0/GNU/5.1/x86_64/lib -lsci_gnu_mp" --with-superlu=1 --with-superlu-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-superlu-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lsuperlu" --with-superlu_dist=1 --with-superlu_dist-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-superlu_dist-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lsuperlu_dist" --with-parmetis=1 --with-parmetis-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include 
--with-parmetis-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lparmetis" --with-metis=1 --with-metis-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-metis-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lmetis" --with-ptscotch=1 --with-ptscotch-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-ptscotch-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lptscotch -lscotch -lptscotcherr -lscotcherr" --with-scalapack=1 --with-scalapack-include=/opt/cray/libsci/13.2.0/GNU/5.1/x86_64/include --with-scalapack-lib="-L/opt/cray/libsci/13.2.0/GNU/5.1/x86_64/lib -lsci_gnu_mpi_mp -lsci_gnu_mp" --with-mumps=1 --with-mumps-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-mumps-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lcmumps -ldmumps -lesmumps -lsmumps -lzmumps -lmumps_common -lptesmumps -lpord" --with-hdf5=1 --with-hdf5-include=/opt/cray/hdf5-parallel/1.8.16/GNU/5.1/include --with-hdf5-lib="-L/opt/cray/hdf5-parallel/1.8.16/GNU/5.1/lib -lhdf5_parallel -lz -ldl" --CFLAGS="-march=haswell -fopenmp -O3 -ffast-math -fPIC" --CPPFLAGS= --CXXFLAGS="-march=haswell -fopenmp -O3 -ffast-math -fPIC" --FFLAGS="-march=haswell -fopenmp -O3 -ffast-math -fPIC" --LIBS= --CXX_LINKER_FLAGS= --PETSC_ARCH=haswell --prefix=/opt/cray/pe/petsc/3.7.2.1/real/GNU/5.1/haswell --with-hypre=1 --with-hypre-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-hypre-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lHYPRE" --with-sundials=1 --with-sundials-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-sundials-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lsundials_cvode -lsundials_cvodes -lsundials_ida -lsundials_idas -lsundials_kinsol -lsundials_nvecparallel -lsundials_nvecserial" [0]PETSC ERROR: #1 MatGetBrowsOfAoCols_MPIAIJ() line 4815 in src/mat/impls/aij/mpi/mpiaij.c [0]PETSC ERROR: #2 MatGetBrowsOfAoCols_MPIAIJ() line 4815 in src/mat/impls/aij/mpi/mpiaij.c [0]PETSC ERROR: #3 
MatMatMultSymbolic_MPIAIJ_MPIAIJ_nonscalable() line 198 in src/mat/impls/aij/mpi/mpimatmatmult.c [0]PETSC ERROR: #4 MatMatMult_MPIAIJ_MPIAIJ() line 34 in src/mat/impls/aij/mpi/mpimatmatmult.c [0]PETSC ERROR: MMG Setup 30.868420 ms. #5 MatMatMult() line 9517 in src/mat/interface/matrix.c [0]PETSC ERROR: #6 MMGSetup() line 85 in /users/studi/src/moose-passo/src/passo/monotone_mg.C [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: Arguments are incompatible [0]PETSC ERROR: Incompatible vector local lengths 666 != 10922 [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016 [0]PETSC ERROR: /scratch/snx3000/studi/./moose-passo-opt on a haswell named nid01137 by studi Thu Jan 19 14:03:27 2017 [0]PETSC ERROR: Configure options --known-has-attribute-aligned=1 --known-mpi-int64_t=0 --known-bits-per-byte=8 --known-sdot-returns-double=0 --known-snrm2-returns-double=0 --known-level1-dcache-assoc=0 --known-level1-dcache-linesize=32 --known-level1-dcache-size=32768 --known-memcmp-ok=1 --known-mpi-c-double-complex=1 --known-mpi-long-double=1 --known-mpi-shared-libraries=0 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-sizeof-char=1 --known-sizeof-double=8 --known-sizeof-float=4 --known-sizeof-int=4 --known-sizeof-long-long=8 --known-sizeof-long=8 --known-sizeof-short=2 --known-sizeof-size_t=8 --known-sizeof-void-p=8 --with-ar=ar --with-batch=1 --with-cc=cc --with-clib-autodetect=0 --with-cxx=CC --with-cxxlib-autodetect=0 --with-debugging=0 --with-dependencies=0 --with-fc=ftn --with-fortran-datatypes=0 --with-fortran-interfaces=0 --with-fortranlib-autodetect=0 --with-ranlib=ranlib --with-scalar-type=real --with-shared-ld=ar --with-etags=0 --with-dependencies=0 --with-x=0 --with-ssl=0 --with-shared-libraries=0 --with-dependencies=0 --with-mpi-lib="[]" --with-mpi-include="[]" 
--with-blas-lapack-lib="-L/opt/cray/libsci/13.2.0/GNU/5.1/x86_64/lib -lsci_gnu_mp" --with-superlu=1 --with-superlu-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-superlu-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lsuperlu" --with-superlu_dist=1 --with-superlu_dist-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-superlu_dist-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lsuperlu_dist" --with-parmetis=1 --with-parmetis-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-parmetis-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lparmetis" --with-metis=1 --with-metis-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-metis-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lmetis" --with-ptscotch=1 --with-ptscotch-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-ptscotch-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lptscotch -lscotch -lptscotcherr -lscotcherr" --with-scalapack=1 --with-scalapack-include=/opt/cray/libsci/13.2.0/GNU/5.1/x86_64/include --with-scalapack-lib="-L/opt/cray/libsci/13.2.0/GNU/5.1/x86_64/lib -lsci_gnu_mpi_mp -lsci_gnu_mp" --with-mumps=1 --with-mumps-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-mumps-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lcmumps -ldmumps -lesmumps -lsmumps -lzmumps -lmumps_common -lptesmumps -lpord" --with-hdf5=1 --with-hdf5-include=/opt/cray/hdf5-parallel/1.8.16/GNU/5.1/include --with-hdf5-lib="-L/opt/cray/hdf5-parallel/1.8.16/GNU/5.1/lib -lhdf5_parallel -lz -ldl" --CFLAGS="-march=haswell -fopenmp -O3 -ffast-math -fPIC" --CPPFLAGS= --CXXFLAGS="-march=haswell -fopenmp -O3 -ffast-math -fPIC" --FFLAGS="-march=haswell -fopenmp -O3 -ffast-math -fPIC" --LIBS= --CXX_LINKER_FLAGS= --PETSC_ARCH=haswell --prefix=/opt/cray/pe/petsc/3.7.2.1/real/GNU/5.1/haswell --with-hypre=1 --with-hypre-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-hypre-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lHYPRE" 
--with-sundials=1 --with-sundials-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-sundials-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lsundials_cvode -lsundials_cvodes -lsundials_ida -lsundials_idas -lsundials_kinsol -lsundials_nvecparallel -lsundials_nvecserial"
[0]PETSC ERROR: #7 VecCopy() line 1639 in src/vec/vec/interface/vector.c
Level 1, Presmoothing step 0 ... srun: error: nid01137: task 0: Trace/breakpoint trap
srun: Terminating job step 349949.1
slurmstepd: error: *** STEP 349949.1 ON nid01137 CANCELLED AT 2017-01-19T14:03:32 ***
srun: Job step aborted: Waiting up to 32 seconds for job step to finish.
srun: error: nid01137: task 1: Killed

From bsmith at mcs.anl.gov Thu Jan 19 10:03:30 2017
From: bsmith at mcs.anl.gov (Barry Smith)
Date: Thu, 19 Jan 2017 10:03:30 -0600
Subject: [petsc-users] MatMatMult causes crash
In-Reply-To: <0766E217-C116-4F0C-B984-4DEEAAE92181@usi.ch>
References: <0766E217-C116-4F0C-B984-4DEEAAE92181@usi.ch>
Message-ID: 

An absurd memory request such as "Memory requested 18446744068029169664" usually means that 32-bit integers are not large enough for the problem. Try configuring on the Cray with --with-64-bit-indices

   Barry

> On Jan 19, 2017, at 7:14 AM, Cyrill Vonplanta wrote:
> 
> Dear PETSc Users,
> 
> I have a problem with a solver running on a Cray machine that crashes at the call to MatMatMult (see error message below). When I run the same solver on my machine in serial or parallel it runs through; with -malloc_debug there don't seem to be any issues either.
> 
> Does someone have a clue what the cause of this failure could be?
> 
> Best Cyrill
> --
> 
> The line that causes the crash is this:
> 
> ierr = MatMatMult(_O, _interpolations[0], MAT_INITIAL_MATRIX, PETSC_DEFAULT, &mmg->interpolations[mg_levels-2]); CHKERRQ(ierr);
> 
> The error message:
> 
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Out of memory.
This could be due to allocating > [0]PETSC ERROR: too large an object or bleeding by not properly > [0]PETSC ERROR: destroying unneeded objects. > [0]PETSC ERROR: Memory allocated 0 Memory used by process 61852 > [0]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info. > [0]PETSC ERROR: Memory requested 18446744068029169664 > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016 > [0]PETSC ERROR: /scratch/snx3000/studi/./moose-passo-opt on a haswell named nid01137 by studi Thu Jan 19 14:03:27 2017 > [0]PETSC ERROR: Configure options --known-has-attribute-aligned=1 --known-mpi-int64_t=0 --known-bits-per-byte=8 --known-sdot-returns-double=0 --known-snrm2-returns-double=0 --known-level1-dcache-assoc=0 --known-level1-dcache-linesize=32 --known-level1-dcache-size=32768 --known-memcmp-ok=1 --known-mpi-c-double-complex=1 --known-mpi-long-double=1 --known-mpi-shared-libraries=0 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-sizeof-char=1 --known-sizeof-double=8 --known-sizeof-float=4 --known-sizeof-int=4 --known-sizeof-long-long=8 --known-sizeof-long=8 --known-sizeof-short=2 --known-sizeof-size_t=8 --known-sizeof-void-p=8 --with-ar=ar --with-batch=1 --with-cc=cc --with-clib-autodetect=0 --with-cxx=CC --with-cxxlib-autodetect=0 --with-debugging=0 --with-dependencies=0 --with-fc=ftn --with-fortran-datatypes=0 --with-fortran-interfaces=0 --with-fortranlib-autodetect=0 --with-ranlib=ranlib --with-scalar-type=real --with-shared-ld=ar --with-etags=0 --with-dependencies=0 --with-x=0 --with-ssl=0 --with-shared-libraries=0 --with-dependencies=0 --with-mpi-lib="[]" --with-mpi-include="[]" --with-blas-lapack-lib="-L/opt/cray/libsci/13.2.0/GNU/5.1/x86_64/lib -lsci_gnu_mp" --with-superlu=1 --with-superlu-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-superlu-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lsuperlu" --with-superlu_dist=1 
--with-superlu_dist-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-superlu_dist-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lsuperlu_dist" --with-parmetis=1 --with-parmetis-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-parmetis-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lparmetis" --with-metis=1 --with-metis-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-metis-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lmetis" --with-ptscotch=1 --with-ptscotch-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-ptscotch-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lptscotch -lscotch -lptscotcherr -lscotcherr" --with-scalapack=1 --with-scalapack-include=/opt/cray/libsci/13.2.0/GNU/5.1/x86_64/include --with-scalapack-lib="-L/opt/cray/libsci/13.2.0/GNU/5.1/x86_64/lib -lsci_gnu_mpi_mp -lsci_gnu_mp" --with-mumps=1 --with-mumps-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-mumps-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lcmumps -ldmumps -lesmumps -lsmumps -lzmumps -lmumps_common -lptesmumps -lpord" --with-hdf5=1 --with-hdf5-include=/opt/cray/hdf5-parallel/1.8.16/GNU/5.1/include --with-hdf5-lib="-L/opt/cray/hdf5-parallel/1.8.16/GNU/5.1/lib -lhdf5_parallel -lz -ldl" --CFLAGS="-march=haswell -fopenmp -O3 -ffast-math -fPIC" --CPPFLAGS= --CXXFLAGS="-march=haswell -fopenmp -O3 -ffast-math -fPIC" --FFLAGS="-march=haswell -fopenmp -O3 -ffast-math -fPIC" --LIBS= --CXX_LINKER_FLAGS= --PETSC_ARCH=haswell --prefix=/opt/cray/pe/petsc/3.7.2.1/real/GNU/5.1/haswell --with-hypre=1 --with-hypre-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-hypre-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lHYPRE" --with-sundials=1 --with-sundials-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-sundials-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lsundials_cvode -lsundials_cvodes -lsundials_ida -lsundials_idas -lsundials_kinsol -lsundials_nvecparallel 
-lsundials_nvecserial" > [0]PETSC ERROR: #1 MatGetBrowsOfAoCols_MPIAIJ() line 4815 in src/mat/impls/aij/mpi/mpiaij.c > [0]PETSC ERROR: #2 MatGetBrowsOfAoCols_MPIAIJ() line 4815 in src/mat/impls/aij/mpi/mpiaij.c > [0]PETSC ERROR: #3 MatMatMultSymbolic_MPIAIJ_MPIAIJ_nonscalable() line 198 in src/mat/impls/aij/mpi/mpimatmatmult.c > [0]PETSC ERROR: #4 MatMatMult_MPIAIJ_MPIAIJ() line 34 in src/mat/impls/aij/mpi/mpimatmatmult.c > [0]PETSC ERROR: MMG Setup 30.868420 ms. > #5 MatMatMult() line 9517 in src/mat/interface/matrix.c > [0]PETSC ERROR: #6 MMGSetup() line 85 in /users/studi/src/moose-passo/src/passo/monotone_mg.C > [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [0]PETSC ERROR: Arguments are incompatible > [0]PETSC ERROR: Incompatible vector local lengths 666 != 10922 > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016 > [0]PETSC ERROR: /scratch/snx3000/studi/./moose-passo-opt on a haswell named nid01137 by studi Thu Jan 19 14:03:27 2017 > [0]PETSC ERROR: Configure options --known-has-attribute-aligned=1 --known-mpi-int64_t=0 --known-bits-per-byte=8 --known-sdot-returns-double=0 --known-snrm2-returns-double=0 --known-level1-dcache-assoc=0 --known-level1-dcache-linesize=32 --known-level1-dcache-size=32768 --known-memcmp-ok=1 --known-mpi-c-double-complex=1 --known-mpi-long-double=1 --known-mpi-shared-libraries=0 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-sizeof-char=1 --known-sizeof-double=8 --known-sizeof-float=4 --known-sizeof-int=4 --known-sizeof-long-long=8 --known-sizeof-long=8 --known-sizeof-short=2 --known-sizeof-size_t=8 --known-sizeof-void-p=8 --with-ar=ar --with-batch=1 --with-cc=cc --with-clib-autodetect=0 --with-cxx=CC --with-cxxlib-autodetect=0 --with-debugging=0 --with-dependencies=0 --with-fc=ftn --with-fortran-datatypes=0 --with-fortran-interfaces=0 
--with-fortranlib-autodetect=0 --with-ranlib=ranlib --with-scalar-type=real --with-shared-ld=ar --with-etags=0 --with-dependencies=0 --with-x=0 --with-ssl=0 --with-shared-libraries=0 --with-dependencies=0 --with-mpi-lib="[]" --with-mpi-include="[]" --with-blas-lapack-lib="-L/opt/cray/libsci/13.2.0/GNU/5.1/x86_64/lib -lsci_gnu_mp" --with-superlu=1 --with-superlu-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-superlu-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lsuperlu" --with-superlu_dist=1 --with-superlu_dist-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-superlu_dist-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lsuperlu_dist" --with-parmetis=1 --with-parmetis-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-parmetis-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lparmetis" --with-metis=1 --with-metis-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-metis-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lmetis" --with-ptscotch=1 --with-ptscotch-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-ptscotch-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lptscotch -lscotch -lptscotcherr -lscotcherr" --with-scalapack=1 --with-scalapack-include=/opt/cray/libsci/13.2.0/GNU/5.1/x86_64/include --with-scalapack-lib="-L/opt/cray/libsci/13.2.0/GNU/5.1/x86_64/lib -lsci_gnu_mpi_mp -lsci_gnu_mp" --with-mumps=1 --with-mumps-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-mumps-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lcmumps -ldmumps -lesmumps -lsmumps -lzmumps -lmumps_common -lptesmumps -lpord" --with-hdf5=1 --with-hdf5-include=/opt/cray/hdf5-parallel/1.8.16/GNU/5.1/include --with-hdf5-lib="-L/opt/cray/hdf5-parallel/1.8.16/GNU/5.1/lib -lhdf5_parallel -lz -ldl" --CFLAGS="-march=haswell -fopenmp -O3 -ffast-math -fPIC" --CPPFLAGS= --CXXFLAGS="-march=haswell -fopenmp -O3 -ffast-math -fPIC" --FFLAGS="-march=haswell -fopenmp -O3 -ffast-math -fPIC" --LIBS= --CXX_LINKER_FLAGS= 
--PETSC_ARCH=haswell --prefix=/opt/cray/pe/petsc/3.7.2.1/real/GNU/5.1/haswell --with-hypre=1 --with-hypre-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-hypre-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lHYPRE" --with-sundials=1 --with-sundials-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-sundials-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lsundials_cvode -lsundials_cvodes -lsundials_ida -lsundials_idas -lsundials_kinsol -lsundials_nvecparallel -lsundials_nvecserial" > [0]PETSC ERROR: #7 VecCopy() line 1639 in src/vec/vec/interface/vector.c > Level 1, Presmoothing step 0 ... srun: error: nid01137: task 0: Trace/breakpoint trap > srun: Terminating job step 349949.1 > slurmstepd: error: *** STEP 349949.1 ON nid01137 CANCELLED AT 2017-01-19T14:03:32 *** > srun: Job step aborted: Waiting up to 32 seconds for job step to finish. > srun: error: nid01137: task 1: Killed > > From cyrill.von.planta at usi.ch Thu Jan 19 10:54:34 2017 From: cyrill.von.planta at usi.ch (Cyrill Vonplanta) Date: Thu, 19 Jan 2017 16:54:34 +0000 Subject: [petsc-users] MatMatMult causes crash In-Reply-To: References: <0766E217-C116-4F0C-B984-4DEEAAE92181@usi.ch> Message-ID: Thanks for the answer. I believe that bit shortage is not the problem, as the problem size is still very small (I printed out the matrix sizes of the operands in MatMatMult on the cray and on my machine below). In addition, by commenting code in and out, I found that the matrix _O (it encodes an orthogonal 3D transformation and contains only 3x3 blocks on the diagonal) causes this. This seems strange to me, as the matrix is set up and well behaved. When I write it out to MATLAB, _O is of full rank and the eigenvalues are nice. Is there a way to further diagnose this matrix in PETSc, or do I maybe have to allocate something else than PETSC_DEFAULT in MatMatMult(...)?
Cyrill -- On the cray machine _O: (Matrix) Type: mpiaij, rank 0| Global row size: 1107, global column size: 1107, local row size: 666, local column size: 666, blocksize: 1 (Matrix) Type: mpiaij, rank 1| Global row size: 1107, global column size: 1107, local row size: 441, local column size: 441, blocksize: 1 _interpolations[0]: (Matrix) Type: mpiaij, rank 0| Global row size: 1107, global column size: 195, local row size: 666, local column size: 132, blocksize: 1 (Matrix) Type: mpiaij, rank 1| Global row size: 1107, global column size: 195, local row size: 441, local column size: 63, blocksize: 1 On my Desktop: _O: (Matrix) Type: mpiaij, rank 0| Global row size: 1107, global column size: 1107, local row size: 645, local column size: 645, blocksize: 1 (Matrix) Type: mpiaij, rank 1| Global row size: 1107, global column size: 1107, local row size: 462, local column size: 462, blocksize: 1 _interpolations[0]: (Matrix) Type: mpiaij, rank 0| Global row size: 1107, global column size: 195, local row size: 645, local column size: 126, blocksize: 1 (Matrix) Type: mpiaij, rank 1| Global row size: 1107, global column size: 195, local row size: 462, local column size: 69, blocksize: 1 ******* Cyrill von Planta Institute of Computational Science University of Lugano ** Switzerland Via Giuseppe Buffi 13 ** 6900 Lugano Tel.: +41 (0)58 666 49 73 ** Fax.: +41 (0)58 666 45 36 http://ics.usi.ch/ ** cyrill.von.planta at usi.ch On 19 Jan 2017, at 17:03, Barry Smith > wrote: Absurd memory requests "Memory requested 18446744068029169664" usually means that 32-bit integers are not large enough for the problem. Try configuring on the cray with --with-64-bit-indices Barry On Jan 19, 2017, at 7:14 AM, Cyrill Vonplanta > wrote: Dear PETSc Users, I have a problem with a solver running on a cray machine that crashes at the command "MatMatMult" (see error message below).
When I run the same solver on my machine in serial or parallel it runs through; also, when I look at it with -malloc_debug there doesn't seem to be any issues. Does someone have a clue what the cause of this failure could be? Best Cyrill -- The line that causes the crash is this: ierr = MatMatMult(_O, _interpolations[0], MAT_INITIAL_MATRIX, PETSC_DEFAULT, &mmg->interpolations[mg_levels-2]); CHKERRQ(ierr); The error message: [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: Out of memory. This could be due to allocating [0]PETSC ERROR: too large an object or bleeding by not properly [0]PETSC ERROR: destroying unneeded objects. [0]PETSC ERROR: Memory allocated 0 Memory used by process 61852 [0]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info. [0]PETSC ERROR: Memory requested 18446744068029169664 [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016 [0]PETSC ERROR: /scratch/snx3000/studi/./moose-passo-opt on a haswell named nid01137 by studi Thu Jan 19 14:03:27 2017 [0]PETSC ERROR: Configure options --known-has-attribute-aligned=1 --known-mpi-int64_t=0 --known-bits-per-byte=8 --known-sdot-returns-double=0 --known-snrm2-returns-double=0 --known-level1-dcache-assoc=0 --known-level1-dcache-linesize=32 --known-level1-dcache-size=32768 --known-memcmp-ok=1 --known-mpi-c-double-complex=1 --known-mpi-long-double=1 --known-mpi-shared-libraries=0 --known-sizeof-MPI_Comm=4 --known-sizeof-MPI_Fint=4 --known-sizeof-char=1 --known-sizeof-double=8 --known-sizeof-float=4 --known-sizeof-int=4 --known-sizeof-long-long=8 --known-sizeof-long=8 --known-sizeof-short=2 --known-sizeof-size_t=8 --known-sizeof-void-p=8 --with-ar=ar --with-batch=1 --with-cc=cc --with-clib-autodetect=0 --with-cxx=CC --with-cxxlib-autodetect=0 --with-debugging=0 --with-dependencies=0 --with-fc=ftn
--with-fortran-datatypes=0 --with-fortran-interfaces=0 --with-fortranlib-autodetect=0 --with-ranlib=ranlib --with-scalar-type=real --with-shared-ld=ar --with-etags=0 --with-dependencies=0 --with-x=0 --with-ssl=0 --with-shared-libraries=0 --with-dependencies=0 --with-mpi-lib="[]" --with-mpi-include="[]" --with-blas-lapack-lib="-L/opt/cray/libsci/13.2.0/GNU/5.1/x86_64/lib -lsci_gnu_mp" --with-superlu=1 --with-superlu-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-superlu-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lsuperlu" --with-superlu_dist=1 --with-superlu_dist-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-superlu_dist-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lsuperlu_dist" --with-parmetis=1 --with-parmetis-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-parmetis-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lparmetis" --with-metis=1 --with-metis-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-metis-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lmetis" --with-ptscotch=1 --with-ptscotch-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-ptscotch-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lptscotch -lscotch -lptscotcherr -lscotcherr" --with-scalapack=1 --with-scalapack-include=/opt/cray/libsci/13.2.0/GNU/5.1/x86_64/include --with-scalapack-lib="-L/opt/cray/libsci/13.2.0/GNU/5.1/x86_64/lib -lsci_gnu_mpi_mp -lsci_gnu_mp" --with-mumps=1 --with-mumps-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-mumps-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lcmumps -ldmumps -lesmumps -lsmumps -lzmumps -lmumps_common -lptesmumps -lpord" --with-hdf5=1 --with-hdf5-include=/opt/cray/hdf5-parallel/1.8.16/GNU/5.1/include --with-hdf5-lib="-L/opt/cray/hdf5-parallel/1.8.16/GNU/5.1/lib -lhdf5_parallel -lz -ldl" --CFLAGS="-march=haswell -fopenmp -O3 -ffast-math -fPIC" --CPPFLAGS= --CXXFLAGS="-march=haswell -fopenmp -O3 -ffast-math -fPIC" --FFLAGS="-march=haswell 
-fopenmp -O3 -ffast-math -fPIC" --LIBS= --CXX_LINKER_FLAGS= --PETSC_ARCH=haswell --prefix=/opt/cray/pe/petsc/3.7.2.1/real/GNU/5.1/haswell --with-hypre=1 --with-hypre-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-hypre-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lHYPRE" --with-sundials=1 --with-sundials-include=/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/include --with-sundials-lib="-L/opt/cray/tpsl/16.07.1/GNU/5.1/haswell/lib -lsundials_cvode -lsundials_cvodes -lsundials_ida -lsundials_idas -lsundials_kinsol -lsundials_nvecparallel -lsundials_nvecserial" [0]PETSC ERROR: #1 MatGetBrowsOfAoCols_MPIAIJ() line 4815 in src/mat/impls/aij/mpi/mpiaij.c [0]PETSC ERROR: #2 MatGetBrowsOfAoCols_MPIAIJ() line 4815 in src/mat/impls/aij/mpi/mpiaij.c [0]PETSC ERROR: #3 MatMatMultSymbolic_MPIAIJ_MPIAIJ_nonscalable() line 198 in src/mat/impls/aij/mpi/mpimatmatmult.c [0]PETSC ERROR: #4 MatMatMult_MPIAIJ_MPIAIJ() line 34 in src/mat/impls/aij/mpi/mpimatmatmult.c [0]PETSC ERROR: MMG Setup 30.868420 ms. #5 MatMatMult() line 9517 in src/mat/interface/matrix.c [0]PETSC ERROR: #6 MMGSetup() line 85 in /users/studi/src/moose-passo/src/passo/monotone_mg.C [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- [0]PETSC ERROR: Arguments are incompatible [0]PETSC ERROR: Incompatible vector local lengths 666 != 10922 [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. 
[0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016 [0]PETSC ERROR: /scratch/snx3000/studi/./moose-passo-opt on a haswell named nid01137 by studi Thu Jan 19 14:03:27 2017 [0]PETSC ERROR: #7 VecCopy() line 1639 in src/vec/vec/interface/vector.c Level 1, Presmoothing step 0 ... srun: error: nid01137: task 0: Trace/breakpoint trap srun: Terminating job step 349949.1 slurmstepd: error: *** STEP 349949.1 ON nid01137 CANCELLED AT 2017-01-19T14:03:32 *** srun: Job step aborted: Waiting up to 32 seconds for job step to finish.
srun: error: nid01137: task 1: Killed From knepley at gmail.com Thu Jan 19 10:59:34 2017 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 19 Jan 2017 10:59:34 -0600 Subject: [petsc-users] MatMatMult causes crash In-Reply-To: References: <0766E217-C116-4F0C-B984-4DEEAAE92181@usi.ch> Message-ID: On Thu, Jan 19, 2017 at 10:54 AM, Cyrill Vonplanta wrote: > Thanks for the answer. I believe that bit shortage is not the problem as > the problem size is still very small (I printed out the matrix sizes of the > operands in MatMatMult on the cray and my machine below). > Then the next step is to run under valgrind, or give us something that reproduces this error. It does not occur in our tests yet. Thanks, Matt > In addition by commenting in and out of code I found that the matrix _O > (codes an orthogonal 3D transformation and contains only 3x3 blocks on the > diagonal) causes this. This seems strange to me as the matrix is set up and > well behaved. When I write it out to MATLAB, _O is of full rank and the > eigenvalues are nice. Is there a way to further diagnose this matrix in > PETSc or maybe do I have to allocate something else than PETSC_DEFAULT in > MatMatMult(...)?
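Barry's diagnosis elsewhere in the thread — that an absurd "Memory requested 18446744068029169664" signals 32-bit integer overflow — is easy to reproduce in miniature. A sketch of the mechanism (all sizes are hypothetical, and Python's ctypes stands in for PETSc's C integer arithmetic):

```python
import ctypes

# Hypothetical sizes: large enough that an element count no longer
# fits in a signed 32-bit integer (2^31 - 1 = 2147483647).
nrows = 200_000
nnz_per_row = 20_000
total_entries = nrows * nnz_per_row            # 4,000,000,000

# If an intermediate count is held in a 32-bit int, it wraps negative...
wrapped = ctypes.c_int32(total_entries).value  # -294967296

# ...and when that negative value is later widened to an unsigned 64-bit
# byte count for allocation, it becomes an absurdly large request of the
# same flavor as the one in the trace above (close to 2^64).
requested = ctypes.c_uint64(wrapped).value     # 18446744073414584320
```

Building PETSc with --with-64-bit-indices makes PetscInt 64 bits wide, so the intermediate count never wraps in the first place.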
> > Cyrill > -- > On the cray machine > > _O: > (Matrix) Type: mpiaij, rank 0| Global row size: 1107, global column size: > 1107, local row size: 666, local column size: 666, blocksize: 1 > (Matrix) Type: mpiaij, rank 1| Global row size: 1107, global column size: > 1107, local row size: 441, local column size: 441, blocksize: 1 > > _interpolations[0]: > (Matrix) Type: mpiaij, rank 0| Global row size: 1107, global column size: > 195, local row size: 666, local column size: 132, blocksize: 1 > (Matrix) Type: mpiaij, rank 1| Global row size: 1107, global column size: > 195, local row size: 441, local column size: 63, blocksize: 1 > > > > On my Desktop: > > _O: > (Matrix) Type: mpiaij, rank 0| Global row size: 1107, global column size: > 1107, local row size: 645, local column size: 645, blocksize: 1 > (Matrix) Type: mpiaij, rank 1| Global row size: 1107, global column size: > 1107, local row size: 462, local column size: 462, blocksize: 1 > > _interpolations[0]: > (Matrix) Type: mpiaij, rank 0| Global row size: 1107, global column size: > 195, local row size: 645, local column size: 126, blocksize: 1 > (Matrix) Type: mpiaij, rank 1| Global row size: 1107, global column size: > 195, local row size: 462, local column size: 69, blocksize: 1 > > > > > > > ******* > Cyrill von Planta > > Institute of Computational Science > University of Lugano ** Switzerland > Via Giuseppe Buffi 13 ** 6900 Lugano > Tel.: +41 (0)58 666 49 73 ** Fax.: +41 (0)58 666 45 36 > http://ics.usi.ch/ ** cyrill.von.planta at usi.ch cyrill.von.planta at usi.ch> > > On 19 Jan 2017, at 17:03, Barry Smith ith at mcs.anl.gov>> wrote: > > > Absurd memory requests "Memory requested 18446744068029169664" usually > means that 32 bit integers are not large enough for the problem. 
Try > configuring on the cray with --with-64-bit-indices > > Barry > > > On Jan 19, 2017, at 7:14 AM, Cyrill Vonplanta mailto:cyrill.von.planta at usi.ch>> wrote: > > Dear PETSc Users, > > > I have a problem with a solver running on a cray machine that crashes at > the command "MatMatMult" (see error message below). When I run the same > solver on my machine in serial or parallel it runs through, also when I > look at it with -malloc_debug there doesn't seem to be any issues. > > Does someone have a clue what the cause of this failure could be? > > Best Cyrill > -- > > The line that causes the crash is this: > > ierr = MatMatMult(_O, _interpolations[0], MAT_INITIAL_MATRIX, > PETSC_DEFAULT, &mmg->interpolations[mg_levels-2]); CHKERRQ(ierr); > > The error message: > > > [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > [0]PETSC ERROR: Out of memory. This could be due to allocating > [0]PETSC ERROR: too large an object or bleeding by not properly > [0]PETSC ERROR: destroying unneeded objects. > [0]PETSC ERROR: Memory allocated 0 Memory used by process 61852 > [0]PETSC ERROR: Try running with -malloc_dump or -malloc_log for info. > [0]PETSC ERROR: Memory requested 18446744068029169664 > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html > for trouble shooting.
> [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016 > [0]PETSC ERROR: /scratch/snx3000/studi/./moose-passo-opt on a haswell > named nid01137 by studi Thu Jan 19 14:03:27 2017 > [0]PETSC ERROR: #1 MatGetBrowsOfAoCols_MPIAIJ() line 4815 in > src/mat/impls/aij/mpi/mpiaij.c > [0]PETSC ERROR: #2 MatGetBrowsOfAoCols_MPIAIJ() line 4815 in > src/mat/impls/aij/mpi/mpiaij.c > [0]PETSC ERROR: #3 MatMatMultSymbolic_MPIAIJ_MPIAIJ_nonscalable() line > 198 in src/mat/impls/aij/mpi/mpimatmatmult.c >
[0]PETSC ERROR: #4 MatMatMult_MPIAIJ_MPIAIJ() line 34 in > src/mat/impls/aij/mpi/mpimatmatmult.c > [0]PETSC ERROR: MMG Setup 30.868420 ms. > #5 MatMatMult() line 9517 in src/mat/interface/matrix.c > [0]PETSC ERROR: #6 MMGSetup() line 85 in /users/studi/src/moose-passo/ > src/passo/monotone_mg.C > [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > [0]PETSC ERROR: Arguments are incompatible > [0]PETSC ERROR: Incompatible vector local lengths 666 != 10922 > [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html > for trouble shooting. > [0]PETSC ERROR: Petsc Release Version 3.7.2, Jun, 05, 2016 > [0]PETSC ERROR: /scratch/snx3000/studi/./moose-passo-opt on a haswell > named nid01137 by studi Thu Jan 19 14:03:27 2017 > [0]PETSC ERROR: #7 VecCopy() line 1639 in src/vec/vec/interface/vector.c > Level 1, Presmoothing step 0 ... srun: error: nid01137: task 0: > Trace/breakpoint trap > srun: Terminating job step 349949.1 > slurmstepd: error: *** STEP 349949.1 ON nid01137 CANCELLED AT > 2017-01-19T14:03:32 *** > srun: Job step aborted: Waiting up to 32 seconds for job step to finish. > srun: error: nid01137: task 1: Killed > > > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From mvalera at mail.sdsu.edu Thu Jan 19 16:01:37 2017 From: mvalera at mail.sdsu.edu (Manuel Valera) Date: Thu, 19 Jan 2017 14:01:37 -0800 Subject: [petsc-users] DMDA objects while distributing 3d arrays Message-ID: Hello all, I'm currently pushing forward on the parallelization of my model; the next step is to parallelize all the grids (pressure, temperature, velocities, and such), which are stored as 3d arrays in Fortran. I'm following ex11f90.f, and it is a good start. I have a couple of questions from it: 1. In the example a dummy vector g is made and the array values are loaded into it; are the dimensions of this vector variable? The same dummy vector is used for 1d, 2d, and 3d, so I guess they are. I was planning to use matrix objects for 3d arrays, but I guess a vector of this kind would be better suited? 2.
I also notice that a stride is used from the corners of the DMDA; I'm looking for a way to operate over the global indices of the array instead. Can this be done? Any good example to follow on this? This would save us lots of effort if we can just extend the actual operations from global indices into the DMDA objects. 3. Next, I'm concerned about the degrees of freedom: how can I know how many dof my model has? We are following an Arakawa C-type grid. Same for the type of stencil, which I guess is star type in my case; we use a 9-point stencil. That is it for now, thanks for your time, Manuel Valera -------------- next part -------------- An HTML attachment was scrubbed... URL: From fangbowa at buffalo.edu Thu Jan 19 16:10:49 2017 From: fangbowa at buffalo.edu (Fangbo Wang) Date: Thu, 19 Jan 2017 17:10:49 -0500 Subject: [petsc-users] Need advice to do blockwise matrix-vector multiplication Message-ID: Hi, *Background:* I am using stochastic finite elements to solve a solid mechanics problem with random material properties. At the end of the day, I get a linear system of equations Ax=b to solve. The matrix A is very large, 1.3 million by 1.3 million, and storing it needs more than 100 GB of memory. Fortunately, matrix A has a nice feature: it is a block matrix, most of the blocks inside it are similar, and each block is 10,000 by 10,000. Hence, I only need to store some of the blocks (45 in my case). Most of the computation in my iterative solver is matrix-vector multiplication; that is why I want to do it using block matrices. [figure: a symmetric block-Toeplitz matrix in which each block is also symmetric] *Current:* I tried to parallelize all my 45 block matrices over all the processors, and all the corresponding 45 block vectors over all the processors. However, the computation seems to be very slow, with no scalability at all. I am thinking of using small groups of processors to separate the computation, like using intra-communicators and inter-communicators. Maybe this will help to reduce the communication.
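The block-reuse scheme described above can be sketched in miniature: store each distinct block once and address it through an index table, so the matrix-vector product never materializes the full matrix. The layout and sizes below are invented for illustration (a 2x2 grid of 2x2 blocks; this is only the idea, not PETSc's MatNest):

```python
# Distinct blocks, stored once.
B0 = [[2.0, 0.0], [0.0, 2.0]]
B1 = [[0.0, 1.0], [1.0, 0.0]]
blocks = [B0, B1]

# layout[i][j] = index of the distinct block at block position (i, j).
layout = [[0, 1],
          [1, 0]]
bs = 2  # block size

def block_matvec(layout, blocks, x, bs):
    """y = A x where A is given implicitly by (layout, blocks)."""
    n = len(layout) * bs
    y = [0.0] * n
    for bi, row in enumerate(layout):
        for bj, bid in enumerate(row):
            B = blocks[bid]  # shared storage: the same block may appear many times
            for r in range(bs):
                y[bi * bs + r] += sum(B[r][c] * x[bj * bs + c] for c in range(bs))
    return y

x = [1.0, 2.0, 3.0, 4.0]
y = block_matvec(layout, blocks, x, bs)  # [6.0, 7.0, 8.0, 9.0]
```

The same idea is what Barry's MatNest suggestion gives you inside PETSc: the nest holds pointers to the 45 distinct blocks rather than copies.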
Anyone have experience with this? Is there any PETSc function to do these jobs? I am open to any suggestions. Thank you very much! Fangbo Wang -- Fangbo Wang, PhD student Stochastic Geomechanics Research Group Department of Civil, Structural and Environmental Engineering University at Buffalo Email: *fangbowa at buffalo.edu * -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: FIG-2-Color-online-A-symmetric-block-Toeplitz-matrix-Each-block-is-also-a-symmetric.png Type: image/png Size: 26270 bytes Desc: not available URL: From bsmith at mcs.anl.gov Thu Jan 19 17:19:26 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 19 Jan 2017 17:19:26 -0600 Subject: [petsc-users] Need advice to do blockwise matrix-vector multiplication In-Reply-To: References: Message-ID: <3CB0FC87-F0B1-4699-B96F-074E31486FD1@mcs.anl.gov> > On Jan 19, 2017, at 4:10 PM, Fangbo Wang wrote: > > Hi, > > Background: > > I am using stochastic finite element to solve a solid mechanics problem with random material properties. At the end of the day, I get a linear system of equations Ax=b to solve. > > The matrix A is very large with size of 1.3million by 1.3 million, and to save this matrix needs more than 100 G memory. Fortunately, matrix A has some nice features that it is a block matrix, most of the blocks inside the matrix are similar, each block is 10,000 by 10,000. > > > Hence, I only need to save some blocks (in my case 45). Most of the computation in my iterative solver is matrix-vec multiplication, that's why I want to do it using block matrices. > > > > ? > > Current: > I tried to parallelize all my 45 block matrices in all the processors, and all the corresponding 45 block vectors in all the processors. However, the computation seems to be very slow, and no scalability at all.
> I am thinking of using small groups of processors to separate the computation, like using intra-communicators and inter-communicators. Maybe this will help to reduce the communication. No, that would just make things excessively complex. > > Any one have some experiences on this? Is there any Petsc function to do these jobs? I am open to any suggestions. Based on your picture it looks like if the matrix were explicitly formed it would be dense? Or are your 45 "small matrices" sparse? Are there any "empty" block matrices in your diagram, or are they all one of the 45 small ones? There are two ways to order your unknowns: one with all unknowns for one "block" then all unknowns for the next block ... or interlacing the unknowns between blocks. Depending on the structure of the problem, one or the other way can be significantly better. The MatNest construct may be the way to go; it will behave like forming the full matrix, but for each block in the matrix you would just have a pointer to the correct small matrix, so you don't store the individual matrices more than once. Also, if you get no speedup you need to verify that it is not due to the hardware or badly configured software, so run the streams benchmark and make sure you have a good MPI binding: http://www.mcs.anl.gov/petsc/documentation/faq.html#computers Barry > > Thank you very much! > > > > Fangbo Wang > > -- > Fangbo Wang, PhD student > Stochastic Geomechanics Research Group > Department of Civil, Structural and Environmental Engineering > University at Buffalo > Email: fangbowa at buffalo.edu From mvalera at mail.sdsu.edu Thu Jan 19 18:56:30 2017 From: mvalera at mail.sdsu.edu (Manuel Valera) Date: Thu, 19 Jan 2017 16:56:30 -0800 Subject: [petsc-users] DMDA objects while distributing 3d arrays In-Reply-To: References: Message-ID: I've read some more, and ex13f90aux from the dm examples seems very similar to what I'm looking for; it says: ! ! The following 4 subroutines handle the mapping of coordinates.
I'll explain ! this in detail: ! PETSc gives you local arrays which are indexed using the global indices. ! This is probably handy in some cases, but when you are re-writing an ! existing serial code and want to use DMDAs, you have tons of loops going ! from 1 to imax etc. that you don't want to change. ! These subroutines re-map the arrays so that all the local arrays go from ! 1 to the (local) imax. ! Could someone explain a little bit more about these functions? petsc_to_local() and local_to_petsc(), and especially why transform_petsc_us() and transform_us_petsc() are used? Thanks, Manuel On Thu, Jan 19, 2017 at 2:01 PM, Manuel Valera wrote: > Hello all, > > I'm currently pushing forward on the parallelization of my model, next > step would be to parallelize all the grids (pressure, temperature, > velocities, and such), and they are stored as 3d arrays in fortran. > > I'm following ex11f90.f and is a good start, i have a couple questions > from it: > > 1. in the example a dummy vector g is made and the array values are > loaded into it, the dimensions of this vector are variable? the same dummy > vector is used for 1d,2d,3d so i guess it is. i was planning to use matrix > objects for 3d arrays but i guess a vector of this kind would be better > suited? > 2. I notice also that a stride is used from the corners of the DMDA, > im looking for a way to operate over the global indices of the array > instead, can this be done? any good example to follow on this? this would > save us lots of effort if we can just extend the actual operations from > global indices into the DMDA objects. > 3. next, im concerned about the degrees of freedom, how can i know how > many dof my model has? we are following an arakawa c-type grid. Same for > the type of stencil which i guess is star type in my case, we use a 9 point > stencil. > > > that is it for now, thanks for your time, > > Manuel Valera > -------------- next part -------------- An HTML attachment was scrubbed...
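The remapping those comments describe, local arrays re-indexed to run from 1 to the local imax, can be illustrated without PETSc at all. A toy NumPy sketch (the owned range and values are made up, and Fortran's 1-based indexing is emulated by leaving slot 0 unused):

```python
import numpy as np

# A rank owns global indices xs..xe-1 of a 1-D field, but a legacy loop
# wants to run over 1..imax_local. (Hypothetical owned range, no MPI.)
xs, xe = 5, 10
u_global_indexed = {i: float(i) for i in range(xs, xe)}  # globally indexed storage

imax_local = xe - xs
u_local = np.empty(imax_local + 1)   # index 0 unused, emulating Fortran 1-based arrays
for i in range(1, imax_local + 1):   # unchanged legacy-style loop, 1..imax_local
    u_local[i] = u_global_indexed[xs + i - 1]   # the remap: local index -> global index

assert u_local[1] == float(xs) and u_local[imax_local] == float(xe - 1)
```

The subroutines in ex13f90aux do essentially this shift of the index origin, so that existing serial loops keep their 1..imax bounds.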
URL: From elbueler at alaska.edu Thu Jan 19 19:13:22 2017 From: elbueler at alaska.edu (Ed Bueler) Date: Thu, 19 Jan 2017 16:13:22 -0900 Subject: [petsc-users] meaning of constants in classical iterations Message-ID: Dear PETSc -- In my humble opinion, the answers to the following rather basic questions are missing from the petsc users manual. At least in part, I am not certain what the answers are. Question 1: What formula does the preconditioned Richardson iteration (-ksp_type richardson -pc_type X) satisfy and where does the scale (-ksp_richardson_scale alpha) go? Answer?: In the left-preconditioned form I would guess it is (*) x_{k+1} = x_k + alpha M_L^{-1} (b - A x_k), where M_L is the matrix implied by -pc_type X. If so this makes option combination -ksp_type richardson -pc_type jacobi [-ksp_richardson_scale 1] into the classical Jacobi iteration x_{k+1} = x_k + D^{-1} (b - A x_k) = D^{-1} (b - (L+U)x_k) where A = D+L+U and D is the diagonal of A. I am pretty sure this answer is right, but equation (*) should probably be somewhere in the manual! Question 2. What formula is the (non-symmetric) SOR satisfying, and where do the constants -pc_sor_omega and -pc_sor_diagonal_shift delta go? Answer?: My understanding is first that the classical Jacobi, Gauss-Seidel, and SOR iterations are all regarded by PETSc developers as left-preconditioned Richardson iterations, so names "jacobi" and "sor" describe PC objects and these classical iterations are all KSP type "richardson". That is, they are all instances of (*). (RANT: There is no clear statement of this excellent philosophy in the users manual, and it would help a lot of students of PETSc! While some references share it, reading standard literature, including relevant wiki pages, is not healthful on this topic. Because the literature does not necessarily parallel PETSc thinking, it requires constant mental translation. This is bad when it comes to setting parameters at runtime!) 
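Equation (*) is easy to sanity-check outside PETSc. A minimal NumPy sketch (the 3x3 system is an arbitrary diagonally dominant example, not from any PETSc source) of left-preconditioned Richardson with M_L = D, i.e. the classical Jacobi iteration:

```python
import numpy as np

# Arbitrary small diagonally dominant system, made up for illustration.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])

alpha = 1.0               # plays the role of -ksp_richardson_scale
D_inv = 1.0 / np.diag(A)  # -pc_type jacobi: M_L^{-1} = D^{-1}

x = np.zeros_like(b)
for _ in range(200):
    x = x + alpha * D_inv * (b - A @ x)   # equation (*)

# The iteration converges to the solution of Ax = b.
assert np.allclose(A @ x, b)
```

With alpha = 1 this is exactly x_{k+1} = D^{-1}(b - (L+U)x_k); any other alpha gives damped Jacobi.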
Supposing the above philosophy, and that A = D+L+U with the usual meanings, then I guess SOR is (**) x_{k+1} = x_k + alpha (omega^{-1} (D+delta) + L)^{-1} (b - A x_k), In case alpha=1, omega=1, delta=0 then (**) becomes Gauss-Seidel: -ksp_type richardson -pc_type sor [-ksp_richardson_scale 1 -pc_sor_omega 1 -pc_sor_diagonal_shift 0] Especially because it involves three user-controlled constants, equation (**) could usefully be stated somewhere in the manual ... supposing it is correct. Question 3. What is the probability that a PETSc developer will address the above two questions by telling me that classical iterations are lame? Answer: Not the point! These iterations are used blockwise (e.g. ASM) and as smoothers in many proper PETSc applications. Clarity is helpful! Thanks for the great tool, as usual! Ed -- Ed Bueler Dept of Math and Stat and Geophysical Institute University of Alaska Fairbanks Fairbanks, AK 99775-6660 301C Chapman and 410D Elvey 907 474-7693 <(907)%20474-7693> and 907 474-7199 <(907)%20474-7199> (fax 907 474-5394 <(907)%20474-5394>) -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Jan 19 21:54:57 2017 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 19 Jan 2017 21:54:57 -0600 Subject: [petsc-users] DMDA objects while distributing 3d arrays In-Reply-To: References: Message-ID: On Thu, Jan 19, 2017 at 6:56 PM, Manuel Valera wrote: > I've read some more and from the ex13f90aux from the dm examples, it seems > is very similar what im looking for, it says: > > ! > ! The following 4 subroutines handle the mapping of coordinates. I'll > explain > ! this in detail: > ! PETSc gives you local arrays which are indexed using the global > indices. > ! This is probably handy in some cases, but when you are re-writing an > ! existing serial code and want to use DMDAs, you have tons of loops > going > ! from 1 to imax etc. that you don't want to change. > ! 
These subroutines re-map the arrays so that all the local arrays go > from > ! 1 to the (local) imax. > ! > > Could someone explain a little bit more about these functions? > petsc_to_local(), local_to_petsc(), and specially why are used > transform_petsc_us() and transform_us_petsc() ? > This is one way to do things, which I do not necessarily agree with. The larger point is that a scalable strategy is one where you only compute over patches rather than the whole grid. This is usually trivial since the global bounds just become local bounds, and you are done. With DMDA, we always use global indices, so there is no problem translating code written with global indices. However, in parallel you should note that you can only refer to values on your owned patch of the grid. I hope this answers the question. If not, can you try and explain more about what is not clear? Thanks, Matt > Thanks, > > Manuel > > On Thu, Jan 19, 2017 at 2:01 PM, Manuel Valera > wrote: > >> Hello all, >> >> I'm currently pushing forward on the parallelization of my model, next >> step would be to parallelize all the grids (pressure, temperature, >> velocities, and such), and they are stored as 3d arrays in fortran. >> >> I'm following ex11f90.f and is a good start, i have a couple questions >> from it: >> >> 1. in the example a dummy vector g is made and the array values are >> loaded into it, the dimensions of this vector are variable? the same dummy >> vector is used for 1d,2d,3d so i guess it is. i was planning to use matrix >> objects for 3d arrays but i guess a vector of this kind would be better >> suited? >> 2. I notice also that a stride is used from the corners of the DMDA, >> im looking for a way to operate over the global indices of the array >> instead, can this be done? any good example to follow on this? this would >> save us lots of effort if we can just extend the actual operations from >> global indices into the DMDA objects. >> 3.
next, im concerned about the degrees of freedom, how can i know >> how many dof my model has? we are following an arakawa c-type grid. Same >> for the type of stencil which i guess is star type in my case, we use a 9 >> point stencil. >> >> >> that is it for now, thanks for your time, >> >> Manuel Valera >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- An HTML attachment was scrubbed...
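Matt's point, keep global indices but restrict the loop bounds to the owned patch, can be sketched without MPI or DMDA calls (a toy 1-D grid of 10 points split between two hypothetical ranks):

```python
import numpy as np

# Toy stand-in for a DMDA decomposition: serial NumPy only, for illustration.
imax = 10
u_global = np.arange(imax, dtype=float)

ranges = [(0, 5), (5, 10)]   # (start, end) of each rank's owned patch
result = np.zeros(imax)

for xs, xe in ranges:                    # what each rank would do independently
    for i in range(xs, xe):              # loop bounds are only the *local* patch,
        result[i] = 2.0 * u_global[i]    # but the indices stay *global*

assert np.allclose(result, 2.0 * u_global)
```

The global loop `for i in range(0, imax)` becomes `for i in range(xs, xe)` on each rank, and nothing inside the loop body has to change, which is the "global bounds just become local bounds" remark above.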
URL: From patrick.sanan at gmail.com Fri Jan 20 03:56:34 2017 From: patrick.sanan at gmail.com (Patrick Sanan) Date: Fri, 20 Jan 2017 10:56:34 +0100 Subject: [petsc-users] meaning of constants in classical iterations In-Reply-To: References: Message-ID: On Fri, Jan 20, 2017 at 2:13 AM, Ed Bueler wrote: > Dear PETSc -- > > In my humble opinion, the answers to the following rather basic questions > are missing from the petsc users manual. At least in part, I am not certain > what the answers are. > > > Question 1: What formula does the preconditioned Richardson iteration > (-ksp_type richardson -pc_type X) satisfy and where does the scale > (-ksp_richardson_scale alpha) go? > > Answer?: In the left-preconditioned form I would guess it is > > (*) x_{k+1} = x_k + alpha M_L^{-1} (b - A x_k), > > where M_L is the matrix implied by -pc_type X. If so this makes option > combination > > -ksp_type richardson -pc_type jacobi [-ksp_richardson_scale 1] > > into the classical Jacobi iteration > > x_{k+1} = x_k + D^{-1} (b - A x_k) = D^{-1} (b - (L+U)x_k) > > where A = D+L+U and D is the diagonal of A. > > I am pretty sure this answer is right, but equation (*) should probably be > somewhere in the manual! It is at least on the KSPRICHARDSON man page (which I have added to my list of things to eventually clean up), albeit without explicit mention of the left preconditioning. > > > Question 2. What formula is the (non-symmetric) SOR satisfying, and where > do the constants -pc_sor_omega and -pc_sor_diagonal_shift delta go? > > Answer?: My understanding is first that the classical Jacobi, Gauss-Seidel, > and SOR iterations are all regarded by PETSc developers as > left-preconditioned Richardson iterations, so names "jacobi" and "sor" > describe PC objects and these classical iterations are all KSP type > "richardson". That is, they are all instances of (*). 
> > (RANT: There is no clear statement of this excellent philosophy in the > users manual, and it would help a lot of students of PETSc! While some > references share it, reading standard literature, including relevant wiki > pages, is not healthful on this topic. Because the literature does not > necessarily parallel PETSc thinking, it requires constant mental > translation. This is bad when it comes to setting parameters at runtime!) This would be very good for one of the proposed introductory tutorials, about KSP/PC, helping people translate algorithms into PETSc options. I've added some notes on this to my todo list as well. > > Supposing the above philosophy, and that A = D+L+U with the usual meanings, > then I guess SOR is > > (**) x_{k+1} = x_k + alpha (omega^{-1} (D+delta) + L)^{-1} (b - A x_k), > > In case alpha=1, omega=1, delta=0 then (**) becomes Gauss-Seidel: > > -ksp_type richardson -pc_type sor [-ksp_richardson_scale 1 -pc_sor_omega 1 > -pc_sor_diagonal_shift 0] > > Especially because it involves three user-controlled constants, equation > (**) could usefully be stated somewhere in the manual ... supposing it is > correct. Would you have any objection to this sort of thing being on the man pages (and/or in tutorials), instead of in the manual? Especially with the current state of affairs, with the man pages google-able and the manual not, the man pages might be a better place for these specifics - hopefully it's clear to readers of the manual that they can click links to be taken to man pages (if they have web access), which can allow for finer details without interrupting the flow of the manual too much or making it even longer. > > > Question 3. What is the probability that a PETSc developer will address the > above two questions by telling me that classical iterations are lame? > > Answer: Not the point! These iterations are used blockwise (e.g. ASM) and > as smoothers in many proper PETSc applications. Clarity is helpful! 
> > > Thanks for the great tool, as usual! > > Ed > > > -- > Ed Bueler > Dept of Math and Stat and Geophysical Institute > University of Alaska Fairbanks > Fairbanks, AK 99775-6660 > 301C Chapman and 410D Elvey > 907 474-7693 and 907 474-7199 (fax 907 474-5394) From elbueler at alaska.edu Fri Jan 20 09:49:37 2017 From: elbueler at alaska.edu (Ed Bueler) Date: Fri, 20 Jan 2017 06:49:37 -0900 Subject: [petsc-users] meaning of constants in classical iterations In-Reply-To: References: Message-ID: Patrick -- > Would you have any objection to this sort of thing being on the man > pages (and/or in tutorials), instead of in the manual? Especially with > the current state of affairs, with the man pages google-able and the > manual not, the man pages might be a better place for these specifics ... Yes, better the relevant man pages than nowhere. But the main idea of how classical iterations are factored into KSP and PC is pretty pervasive as a way of understanding PETSc smoothers and block methods, and so I think (*) should also be in the PDF. ... - hopefully it's clear to readers of the manual that they can click > links to be taken to man pages (if they have web access), which can > allow for finer details without interrupting the flow of the manual > too much or making it even longer. Good point. Ed On Fri, Jan 20, 2017 at 12:56 AM, Patrick Sanan wrote: > On Fri, Jan 20, 2017 at 2:13 AM, Ed Bueler wrote: > > Dear PETSc -- > > > > In my humble opinion, the answers to the following rather basic questions > > are missing from the petsc users manual. At least in part, I am not > certain > > what the answers are. > > > > > > Question 1: What formula does the preconditioned Richardson iteration > > (-ksp_type richardson -pc_type X) satisfy and where does the scale > > (-ksp_richardson_scale alpha) go? 
> > > > Answer?: In the left-preconditioned form I would guess it is > > > > (*) x_{k+1} = x_k + alpha M_L^{-1} (b - A x_k), > > > > where M_L is the matrix implied by -pc_type X. If so this makes option > > combination > > > > -ksp_type richardson -pc_type jacobi [-ksp_richardson_scale 1] > > > > into the classical Jacobi iteration > > > > x_{k+1} = x_k + D^{-1} (b - A x_k) = D^{-1} (b - (L+U)x_k) > > > > where A = D+L+U and D is the diagonal of A. > > > > I am pretty sure this answer is right, but equation (*) should probably > be > > somewhere in the manual! > It is at least on the KSPRICHARDSON man page (which I have added to my > list of things to eventually clean up), albeit without explicit > mention of the left preconditioning. > > > > > > Question 2. What formula is the (non-symmetric) SOR satisfying, and > where > > do the constants -pc_sor_omega and -pc_sor_diagonal_shift delta go? > > > > Answer?: My understanding is first that the classical Jacobi, > Gauss-Seidel, > > and SOR iterations are all regarded by PETSc developers as > > left-preconditioned Richardson iterations, so names "jacobi" and "sor" > > describe PC objects and these classical iterations are all KSP type > > "richardson". That is, they are all instances of (*). > > > > (RANT: There is no clear statement of this excellent philosophy in the > > users manual, and it would help a lot of students of PETSc! While some > > references share it, reading standard literature, including relevant wiki > > pages, is not healthful on this topic. Because the literature does not > > necessarily parallel PETSc thinking, it requires constant mental > > translation. This is bad when it comes to setting parameters at > runtime!) > This would be very good for one of the proposed introductory > tutorials, about KSP/PC, helping people translate algorithms into > PETSc options. I've added some notes on this to my todo list as well. 
> > > > Supposing the above philosophy, and that A = D+L+U with the usual > meanings, > > then I guess SOR is > > > > (**) x_{k+1} = x_k + alpha (omega^{-1} (D+delta) + L)^{-1} (b - A > x_k), > > > > In case alpha=1, omega=1, delta=0 then (**) becomes Gauss-Seidel: > > > > -ksp_type richardson -pc_type sor [-ksp_richardson_scale 1 -pc_sor_omega > 1 > > -pc_sor_diagonal_shift 0] > > > > Especially because it involves three user-controlled constants, equation > > (**) could usefully be stated somewhere in the manual ... supposing it is > > correct. > > Would you have any objection to this sort of thing being on the man > pages (and/or in tutorials), instead of in the manual? Especially with > the current state of affairs, with the man pages google-able and the > manual not, the man pages might be a better place for these specifics > - hopefully it's clear to readers of the manual that they can click > links to be taken to man pages (if they have web access), which can > allow for finer details without interrupting the flow of the manual > too much or making it even longer. > > > > > > Question 3. What is the probability that a PETSc developer will address > the > > above two questions by telling me that classical iterations are lame? > > > > Answer: Not the point! These iterations are used blockwise (e.g. ASM) > and > > as smoothers in many proper PETSc applications. Clarity is helpful! > > > > > > Thanks for the great tool, as usual! > > > > Ed > > > > > > -- > > Ed Bueler > > Dept of Math and Stat and Geophysical Institute > > University of Alaska Fairbanks > > Fairbanks, AK 99775-6660 > > 301C Chapman and 410D Elvey > > 907 474-7693 and 907 474-7199 (fax 907 474-5394) > -- Ed Bueler Dept of Math and Stat and Geophysical Institute University of Alaska Fairbanks Fairbanks, AK 99775-6660 301C Chapman and 410D Elvey 907 474-7693 and 907 474-7199 (fax 907 474-5394) -------------- next part -------------- An HTML attachment was scrubbed... 
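The guessed SOR formula (**) can be checked numerically as well. A NumPy sketch (a made-up 3x3 diagonally dominant system; the diagonal shift is assumed to act as D + delta*I) confirming that alpha = 1, omega = 1, delta = 0 reproduces a textbook forward Gauss-Seidel sweep:

```python
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
D = np.diag(np.diag(A))
L = np.tril(A, -1)

alpha, omega, delta = 1.0, 1.0, 0.0       # Richardson scale, SOR omega, diagonal shift
M = (D + delta * np.eye(3)) / omega + L   # the guessed preconditioner in (**)

x = np.zeros_like(b)
for _ in range(100):
    x = x + alpha * np.linalg.solve(M, b - A @ x)   # equation (**)

# Textbook forward Gauss-Seidel sweep, written component by component.
y = np.zeros_like(b)
for _ in range(100):
    for i in range(3):
        y[i] = (b[i] - A[i, :i] @ y[:i] - A[i, i+1:] @ y[i+1:]) / A[i, i]

assert np.allclose(x, y)        # (**) with alpha=omega=1, delta=0 is Gauss-Seidel
assert np.allclose(A @ x, b)
```

Varying omega in M turns the same loop into (forward) SOR, and alpha is again just the Richardson scale on the update.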
URL: From fangbowa at buffalo.edu Fri Jan 20 13:18:12 2017 From: fangbowa at buffalo.edu (Fangbo Wang) Date: Fri, 20 Jan 2017 14:18:12 -0500 Subject: [petsc-users] Need advice to do blockwise matrix-vector multiplication In-Reply-To: <3CB0FC87-F0B1-4699-B96F-074E31486FD1@mcs.anl.gov> References: <3CB0FC87-F0B1-4699-B96F-074E31486FD1@mcs.anl.gov> Message-ID: Thank you very much for your reply, Barry. The block matrix size is 1.35 million by 1.35 million, and the matrix can be divided into 45 blocks along each direction, so there are 45X45=2025 blocks in the matrix (each block is 30,000 by 30,000). Fortunately, there are only around 700 non-zero blocks, which are themselves sparse. Correspondingly, the unknown vector and right-hand-side vector (both with size 1.35 million by 1) can also be divided into 45 block vectors. Most of the blocks are similar: assume I have a block *A*, then there are dozens of blocks similar to *A* up to a scalar multiple (*2A*, *3A* ...). Finally, I only need to save 45 different blocks and a few scalars for this matrix to save memory usage (the 45 distinct blocks arise from the physics of the problem). Does this explain my problem clearly? Thank you very much! Best regards, Fangbo On Thu, Jan 19, 2017 at 6:19 PM, Barry Smith wrote: > > > On Jan 19, 2017, at 4:10 PM, Fangbo Wang wrote: > > Hi, > > Background: > > I am using stochastic finite element to solve a solid mechanics problem > with random material properties. At the end of the day, I get a linear > system of equations Ax=b to solve. > > > > The matrix A is very large with size of 1.3million by 1.3 million, and > to save this matrix needs more than 100 G memory. Fortunately, matrix A has > some nice features that it is a block matrix, most of the blocks inside the > matrix are similar, each block is 10,000 by 10,000. > > > > > > Hence, I only need to save some blocks (in my case 45).
Most of the > computation in my iterative solver is matrix-vec multiplication, that's > why I want to do it using block matrices. > > matrix-Each-block-is-also-a-symmetric.png> > > > > > > ? > > > > Current: > > I tried to parallelize all my 45 block matrices in all the processors, > and all the corresponding 45 block vectors in all the processors. However, > the computation seems to be very slow, and no scalability at all. > > I am thinking of using small groups of processors to separate the > computation, like using intra-communicators and inter-communicators. Maybe > this will help to reduce the communication. > > No, just make things excessively complex. > > > > > Any one have some experiences on this? Is there any Petsc function to do > these jobs? I am open to any suggestions. > > Based on your picture it looks like if the matrix was explicitly formed > it would be dense? Or are your 45 "small matrices" sparse? Are there any > "empty" block matrices in your diagram or are they all one of the 45 small > ones? > > There are two ways to order your unknowns; one with all unknowns for > one "block" then all unknowns for the next block ... or interlacing the > unknowns between blocks. Depending on the structure of the problem one or > the other way can be significently better. > > The MatNest construct may be the way to go; it will behave like forming > the full matrix but for each block in the matrix you would just have a > pointer to the correct small matrix so you don't store the individual > matrices more than once. > > Also if you get no speed up you need to verify that it is not due to > the hardware or badly configured software so run the streams benchmark and > make sure you have a good MPI binding http://www.mcs.anl.gov/petsc/ > documentation/faq.html#computers > > > > > Barry > > > > > Thank you very much! 
> > > > > > Fangbo Wang > > > > -- > > Fangbo Wang, PhD student > > Stochastic Geomechanics Research Group > > Department of Civil, Structural and Environmental Engineering > > University at Buffalo > > Email: fangbowa at buffalo.edu > > -- Fangbo Wang, PhD student Stochastic Geomechanics Research Group Department of Civil, Structural and Environmental Engineering University at Buffalo Email: *fangbowa at buffalo.edu * -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: IMG_2888.JPG Type: image/jpeg Size: 990967 bytes Desc: not available URL: From bsmith at mcs.anl.gov Fri Jan 20 13:58:18 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 20 Jan 2017 13:58:18 -0600 Subject: [petsc-users] Need advice to do blockwise matrix-vector multiplication In-Reply-To: References: <3CB0FC87-F0B1-4699-B96F-074E31486FD1@mcs.anl.gov> Message-ID: Fangbo, Thanks, your explanation is clear. Jed, It doesn't look like MatCreateNest() is appropriate, since MATNEST assumes each matrix is parallel? What should he do in this case? Barry > On Jan 20, 2017, at 1:18 PM, Fangbo Wang wrote: > > Thank you very much for your reply, Barry. > > The block matrix size is 1.35 million by 1.35 million, and the matrix can be divided into 45 blocks along each direction, so there are 45X45=2025 blocks in the matrix (size of each block is 30,000 by 30,000) . Fortunately there are only around 700 non-zeros blocks which are also sparse. Correspondingly, the unknown vector and right-hand-side vector (both with size 1.35million by 1) can also be divided into 45 block vectors. > > Most of the blocks are similar, assume I have a block A, there are dozens of blocks similar to A by a scalar multiplication (2A, 3A ...). Finally, I only need to save 45 different blocks and a few scalars for this matrix to save memory usage(the reason why 45 different blocks comes from physical concept).
> > > > Is this clear to explain my problem? > > Thank you very much! > > Best regards, > > Fangbo > ? > > > > > > > On Thu, Jan 19, 2017 at 6:19 PM, Barry Smith wrote: > > > On Jan 19, 2017, at 4:10 PM, Fangbo Wang wrote: > > > > Hi, > > > > Background: > > > > I am using stochastic finite element to solve a solid mechanics problem with random material properties. At the end of the day, I get a linear system of equations Ax=b to solve. > > > > The matrix A is very large with size of 1.3million by 1.3 million, and to save this matrix needs more than 100 G memory. Fortunately, matrix A has some nice features that it is a block matrix, most of the blocks inside the matrix are similar, each block is 10,000 by 10,000. > > > > > > Hence, I only need to save some blocks (in my case 45). Most of the computation in my iterative solver is matrix-vec multiplication, that's why I want to do it using block matrices. > > > > > > > > ? > > > > Current: > > I tried to parallelize all my 45 block matrices in all the processors, and all the corresponding 45 block vectors in all the processors. However, the computation seems to be very slow, and no scalability at all. > > I am thinking of using small groups of processors to separate the computation, like using intra-communicators and inter-communicators. Maybe this will help to reduce the communication. > > No, just make things excessively complex. > > > > > Any one have some experiences on this? Is there any Petsc function to do these jobs? I am open to any suggestions. > > Based on your picture it looks like if the matrix was explicitly formed it would be dense? Or are your 45 "small matrices" sparse? Are there any "empty" block matrices in your diagram or are they all one of the 45 small ones? > > There are two ways to order your unknowns; one with all unknowns for one "block" then all unknowns for the next block ... or interlacing the unknowns between blocks. 
Depending on the structure of the problem one or the other way can be significently better. > > The MatNest construct may be the way to go; it will behave like forming the full matrix but for each block in the matrix you would just have a pointer to the correct small matrix so you don't store the individual matrices more than once. > > Also if you get no speed up you need to verify that it is not due to the hardware or badly configured software so run the streams benchmark and make sure you have a good MPI binding http://www.mcs.anl.gov/petsc/documentation/faq.html#computers > > > > > Barry > > > > > Thank you very much! > > > > > > > > Fangbo Wang > > > > -- > > Fangbo Wang, PhD student > > Stochastic Geomechanics Research Group > > Department of Civil, Structural and Environmental Engineering > > University at Buffalo > > Email: fangbowa at buffalo.edu > > > > > -- > Fangbo Wang, PhD student > Stochastic Geomechanics Research Group > Department of Civil, Structural and Environmental Engineering > University at Buffalo > Email: fangbowa at buffalo.edu From jed at jedbrown.org Fri Jan 20 14:18:57 2017 From: jed at jedbrown.org (Jed Brown) Date: Fri, 20 Jan 2017 14:18:57 -0600 Subject: [petsc-users] Need advice to do blockwise matrix-vector multiplication In-Reply-To: References: <3CB0FC87-F0B1-4699-B96F-074E31486FD1@mcs.anl.gov> Message-ID: <87mvelcwni.fsf@jedbrown.org> Barry Smith writes: > Fangbo, > > Thanks, your explanation is clear > > Jed, > > It doesn't look like MatCreateNest() is a appropriate since MATNEST assumes each matrix is parallel? Well, there's nothing stopping you from making a parallel matrix with no entries on most (all but one or two) processes. But doing this manually is likely to be a major headache as the number of processes is scaled. 
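The structure Fangbo describes, a few stored blocks reused up to scalar factors, is what a shell matrix would wrap. A serial NumPy sketch of that block-wise matvec (toy sizes and a made-up nonzero-block pattern, standing in for the 45 unique 30,000 by 30,000 blocks):

```python
import numpy as np

rng = np.random.default_rng(0)
nb, bs = 4, 3          # toy sizes: a 4x4 grid of 3x3 blocks
unique = [rng.standard_normal((bs, bs)) for _ in range(2)]   # the stored unique blocks

# Nonzero-block structure: (block_row, block_col, index_into_unique, scalar).
blocks = [(0, 0, 0, 1.0), (1, 1, 0, 2.0), (2, 2, 1, 1.0),
          (3, 3, 1, 3.0), (0, 1, 1, 1.0), (1, 0, 1, 1.0)]

def blockwise_matvec(x):
    """y = A x without ever assembling A: reuse the shared blocks times scalars."""
    y = np.zeros(nb * bs)
    for bi, bj, k, c in blocks:
        y[bi*bs:(bi+1)*bs] += c * (unique[k] @ x[bj*bs:(bj+1)*bs])
    return y

# Check against the explicitly assembled matrix.
A = np.zeros((nb * bs, nb * bs))
for bi, bj, k, c in blocks:
    A[bi*bs:(bi+1)*bs, bj*bs:(bj+1)*bs] += c * unique[k]
x = rng.standard_normal(nb * bs)
assert np.allclose(blockwise_matvec(x), A @ x)
```

Only the list of (row, column, block index, scalar) tuples and the unique blocks are stored, which is the memory saving in question; a MatShell's MatMult callback would do exactly this loop with PETSc Vec scatters.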
The description here makes it sound like this particular structure might be nicely represented as a tensor contraction, for which there are libraries (e.g., https://github.com/solomonik/ctf) that could be put within a MatShell to use with PETSc. Also, the size here isn't that big since the blocks are themselves sparse. Depending on what preconditioner is being used, the naive sparse matrix representation might be required anyway and optimization of the MatMult to reduce its memory (/bandwidth) footprint might be premature. > What should he do in this case. > > Barry > >> On Jan 20, 2017, at 1:18 PM, Fangbo Wang wrote: >> >> Thank you very much for your reply, Barry. >> >> The block matrix size is 1.35 million by 1.35 million, and the matrix can be divided into 45 blocks along each direction, so there are 45X45=2025 blocks in the matrix (size of each block is 30,000 by 30,000) . Fortunately there are only around 700 non-zeros blocks which are also sparse. Correspondingly, the unknown vector and right-hand-side vector (both with size 1.35million by 1) can also be divided into 45 block vectors. >> >> Most of the blocks are similar, assume I have a block A, there are dozens of blocks similar to A by a scalar multiplication (2A, 3A ...). Finally, I only need to save 45 different blocks and a few scalars for this matrix to save memory usage(the reason why 45 different blocks comes from physical concept). >> >> >> >> Is this clear to explain my problem? >> >> Thank you very much! >> >> Best regards, >> >> Fangbo >> ? >> >> >> >> >> >> >> On Thu, Jan 19, 2017 at 6:19 PM, Barry Smith wrote: >> >> > On Jan 19, 2017, at 4:10 PM, Fangbo Wang wrote: >> > >> > Hi, >> > >> > Background: >> > >> > I am using stochastic finite element to solve a solid mechanics problem with random material properties. At the end of the day, I get a linear system of equations Ax=b to solve. 
>> > >> > The matrix A is very large with size of 1.3million by 1.3 million, and to save this matrix needs more than 100 G memory. Fortunately, matrix A has some nice features that it is a block matrix, most of the blocks inside the matrix are similar, each block is 10,000 by 10,000. >> > >> > >> > Hence, I only need to save some blocks (in my case 45). Most of the computation in my iterative solver is matrix-vec multiplication, that's why I want to do it using block matrices. >> > >> > >> > >> > ? >> > >> > Current: >> > I tried to parallelize all my 45 block matrices in all the processors, and all the corresponding 45 block vectors in all the processors. However, the computation seems to be very slow, and no scalability at all. >> > I am thinking of using small groups of processors to separate the computation, like using intra-communicators and inter-communicators. Maybe this will help to reduce the communication. >> >> No, just make things excessively complex. >> >> > >> > Any one have some experiences on this? Is there any Petsc function to do these jobs? I am open to any suggestions. >> >> Based on your picture it looks like if the matrix was explicitly formed it would be dense? Or are your 45 "small matrices" sparse? Are there any "empty" block matrices in your diagram or are they all one of the 45 small ones? >> >> There are two ways to order your unknowns; one with all unknowns for one "block" then all unknowns for the next block ... or interlacing the unknowns between blocks. Depending on the structure of the problem one or the other way can be significently better. >> >> The MatNest construct may be the way to go; it will behave like forming the full matrix but for each block in the matrix you would just have a pointer to the correct small matrix so you don't store the individual matrices more than once. 
>> >> Also, if you get no speedup, you need to verify that it is not due to the hardware or badly configured software, so run the streams benchmark and make sure you have a good MPI binding: http://www.mcs.anl.gov/petsc/documentation/faq.html#computers >> >> >> >> >> Barry >> >> > >> > Thank you very much! >> > >> > >> > >> > Fangbo Wang >> > >> > -- >> > Fangbo Wang, PhD student >> > Stochastic Geomechanics Research Group >> > Department of Civil, Structural and Environmental Engineering >> > University at Buffalo >> > Email: fangbowa at buffalo.edu >> >> >> >> >> -- >> Fangbo Wang, PhD student >> Stochastic Geomechanics Research Group >> Department of Civil, Structural and Environmental Engineering >> University at Buffalo >> Email: fangbowa at buffalo.edu -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 800 bytes Desc: not available URL: From bsmith at mcs.anl.gov Fri Jan 20 14:51:15 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 20 Jan 2017 14:51:15 -0600 Subject: [petsc-users] Need advice to do blockwise matrix-vector multiplication In-Reply-To: <87mvelcwni.fsf@jedbrown.org> References: <3CB0FC87-F0B1-4699-B96F-074E31486FD1@mcs.anl.gov> <87mvelcwni.fsf@jedbrown.org> Message-ID: <06AF84AE-EB49-48FE-83DF-4D55633349CF@mcs.anl.gov> > On Jan 20, 2017, at 2:18 PM, Jed Brown wrote: > > Barry Smith writes: > >> Fangbo, >> >> Thanks, your explanation is clear. >> >> Jed, >> >> It doesn't look like MatCreateNest() is appropriate, since MATNEST assumes each matrix is parallel? > > Well, there's nothing stopping you from making a parallel matrix with no > entries on most (all but one or two) processes. But doing this manually > is likely to be a major headache as the number of processes is scaled.
> The description here makes it sound like this particular structure might > be nicely represented as a tensor contraction, for which there are > libraries (e.g., https://github.com/solomonik/ctf) Should we have a matrix class based on this software; rather than custom MatShell for each person? If we did this what are the best tensor contraction libraries to use? What are the abstractions? Especially when it is sparse matrices in sparse matrices? Barry > that could be put > within a MatShell to use with PETSc. Also, the size here isn't that big > since the blocks are themselves sparse. Depending on what > preconditioner is being used, the naive sparse matrix representation > might be required anyway and optimization of the MatMult to reduce its > memory (/bandwidth) footprint might be premature. > >> What should he do in this case. >> >> Barry >> >>> On Jan 20, 2017, at 1:18 PM, Fangbo Wang wrote: >>> >>> Thank you very much for your reply, Barry. >>> >>> The block matrix size is 1.35 million by 1.35 million, and the matrix can be divided into 45 blocks along each direction, so there are 45X45=2025 blocks in the matrix (size of each block is 30,000 by 30,000) . Fortunately there are only around 700 non-zeros blocks which are also sparse. Correspondingly, the unknown vector and right-hand-side vector (both with size 1.35million by 1) can also be divided into 45 block vectors. >>> >>> Most of the blocks are similar, assume I have a block A, there are dozens of blocks similar to A by a scalar multiplication (2A, 3A ...). Finally, I only need to save 45 different blocks and a few scalars for this matrix to save memory usage(the reason why 45 different blocks comes from physical concept). >>> >>> >>> >>> Is this clear to explain my problem? >>> >>> Thank you very much! >>> >>> Best regards, >>> >>> Fangbo >>> ? 
>>> >>> >>> >>> >>> >>> >>> On Thu, Jan 19, 2017 at 6:19 PM, Barry Smith wrote: >>> >>>> On Jan 19, 2017, at 4:10 PM, Fangbo Wang wrote: >>>> >>>> Hi, >>>> >>>> Background: >>>> >>>> I am using stochastic finite element to solve a solid mechanics problem with random material properties. At the end of the day, I get a linear system of equations Ax=b to solve. >>>> >>>> The matrix A is very large with size of 1.3million by 1.3 million, and to save this matrix needs more than 100 G memory. Fortunately, matrix A has some nice features that it is a block matrix, most of the blocks inside the matrix are similar, each block is 10,000 by 10,000. >>>> >>>> >>>> Hence, I only need to save some blocks (in my case 45). Most of the computation in my iterative solver is matrix-vec multiplication, that's why I want to do it using block matrices. >>>> >>>> >>>> >>>> ? >>>> >>>> Current: >>>> I tried to parallelize all my 45 block matrices in all the processors, and all the corresponding 45 block vectors in all the processors. However, the computation seems to be very slow, and no scalability at all. >>>> I am thinking of using small groups of processors to separate the computation, like using intra-communicators and inter-communicators. Maybe this will help to reduce the communication. >>> >>> No, just make things excessively complex. >>> >>>> >>>> Any one have some experiences on this? Is there any Petsc function to do these jobs? I am open to any suggestions. >>> >>> Based on your picture it looks like if the matrix was explicitly formed it would be dense? Or are your 45 "small matrices" sparse? Are there any "empty" block matrices in your diagram or are they all one of the 45 small ones? >>> >>> There are two ways to order your unknowns; one with all unknowns for one "block" then all unknowns for the next block ... or interlacing the unknowns between blocks. Depending on the structure of the problem one or the other way can be significently better. 
>>> >>> The MatNest construct may be the way to go; it will behave like forming the full matrix but for each block in the matrix you would just have a pointer to the correct small matrix so you don't store the individual matrices more than once. >>> >>> Also if you get no speed up you need to verify that it is not due to the hardware or badly configured software so run the streams benchmark and make sure you have a good MPI binding http://www.mcs.anl.gov/petsc/documentation/faq.html#computers >>> >>> >>> >>> >>> Barry >>> >>>> >>>> Thank you very much! >>>> >>>> >>>> >>>> Fangbo Wang >>>> >>>> -- >>>> Fangbo Wang, PhD student >>>> Stochastic Geomechanics Research Group >>>> Department of Civil, Structural and Environmental Engineering >>>> University at Buffalo >>>> Email: fangbowa at buffalo.edu >>> >>> >>> >>> >>> -- >>> Fangbo Wang, PhD student >>> Stochastic Geomechanics Research Group >>> Department of Civil, Structural and Environmental Engineering >>> University at Buffalo >>> Email: fangbowa at buffalo.edu From jed at jedbrown.org Fri Jan 20 15:07:16 2017 From: jed at jedbrown.org (Jed Brown) Date: Fri, 20 Jan 2017 15:07:16 -0600 Subject: [petsc-users] Need advice to do blockwise matrix-vector multiplication In-Reply-To: <06AF84AE-EB49-48FE-83DF-4D55633349CF@mcs.anl.gov> References: <3CB0FC87-F0B1-4699-B96F-074E31486FD1@mcs.anl.gov> <87mvelcwni.fsf@jedbrown.org> <06AF84AE-EB49-48FE-83DF-4D55633349CF@mcs.anl.gov> Message-ID: <87d1fhcuez.fsf@jedbrown.org> Barry Smith writes: > Should we have a matrix class based on this software; rather than custom MatShell for each person? If we did this what are the best tensor contraction libraries to use? What are the abstractions? Especially when it is sparse matrices in sparse matrices? CTF is the best I know for what it does. But I think it might be a bit like MatFFTW in that a supported Mat impl would be a maintenance burden of marginal value. 
(I would argue that the most common use of FFT in scientific computing is in the form of a convolution, for which the abstraction leaks and needs to be composed anyway.) I could be wrong, and if someone is using CTF and sees a good way to make a useful MatCTF and has the energy to do so, I think it would be welcome. -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 800 bytes Desc: not available URL: From fdkong.jd at gmail.com Sat Jan 21 22:38:05 2017 From: fdkong.jd at gmail.com (Fande Kong) Date: Sat, 21 Jan 2017 21:38:05 -0700 Subject: [petsc-users] DMPlexCreateExodusFromFile() does not work on macOS Sierra Message-ID: Hi All, I upgraded the OS to macOS Sierra and observed that PETSc cannot read the exodus file any more. The same code runs fine on macOS El Capitan. I also tested the function DMPlexCreateExodusFromFile() against different versions of the GCC compiler, such as GCC-5.4 and GCC-6, and neither of them works.
I guess this issue is related to the external package exodus, and PETSc might not be picking up the right environment variables for exodus. This issue can be reproduced with the following simple code:

static char help[] = " create mesh from exodus.\n\n";

#include
#include

#undef __FUNCT__
#define __FUNCT__ "main"
int main(int argc,char **argv)
{
  char fineMeshFileName[2048];
  DM dm;
  MPI_Comm comm;
  PetscBool flg;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc,&argv,(char *)0,help);CHKERRQ(ierr);
  comm = PETSC_COMM_WORLD;
  ierr = PetscOptionsGetString(NULL,NULL,"-file",fineMeshFileName,sizeof(fineMeshFileName),&flg);CHKERRQ(ierr);
  if (!flg) {
    SETERRQ(comm,PETSC_ERR_ARG_NULL,"please specify a fine mesh file \n");
  }
  ierr = DMPlexCreateExodusFromFile(comm,fineMeshFileName,PETSC_FALSE,&dm);CHKERRQ(ierr);
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscFinalize();CHKERRQ(ierr);
}

LiviadeMacBook-Pro:partition livia$ ./DMPlexCreateExodusFromFile -file Tri3.exo
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
[0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
[0]PETSC ERROR: likely location of problem given in stack below
[0]PETSC ERROR: --------------------- Stack Frames ------------------------------------
[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available,
[0]PETSC ERROR: INSTEAD the line number of the start of the function
[0]PETSC ERROR: is given.
[0]PETSC ERROR: [0] DMPlexCreateExodusFromFile line 38 /Users/livia/math/petsc/src/dm/impls/plex/plexexodusii.c
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.7.5, unknown
[0]PETSC ERROR: ./DMPlexCreateExodusFromFile on a arch-darwin-cxx-debug named LiviadeMacBook-Pro.local by livia Sat Jan 21 21:04:22 2017
[0]PETSC ERROR: Configure options --with-clanguage=cxx --with-shared-libraries=1 --download-fblaslapack=1 --with-mpi=1 --download-parmetis=1 --download-metis=1 --download-netcdf=1 --download-exodusii=1 --download-hdf5=1 --with-debugging=yes --with-c2html=0 --download-hypre=1 --with-64-bit-indices=1 --download-superlu_dist=1 PETSC_ARCH=arch-darwin-cxx-debug
[0]PETSC ERROR: #1 User provided function() line 0 in unknown file
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=59
:
system msg for write_line failure : Bad file descriptor

The log files of make and configure are also attached. If you have any idea about this issue, please let me know!

Fande Kong,

-------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: make.log.zip Type: application/zip Size: 11767 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.log.zip Type: application/zip Size: 1189476 bytes Desc: not available URL: From knepley at gmail.com Sat Jan 21 23:47:18 2017 From: knepley at gmail.com (Matthew Knepley) Date: Sat, 21 Jan 2017 23:47:18 -0600 Subject: [petsc-users] DMPlexCreateExodusFromFile() does not work on macOS Sierra In-Reply-To: References: Message-ID: On Sat, Jan 21, 2017 at 10:38 PM, Fande Kong wrote: > Hi All, > > I upgraded the OS system to macOS Sierra, and observed that PETSc can not > read the exodus file any more.
The same code runs fine on macOS Capitan. I > also tested the function DMPlexCreateExodusFromFile() against different > versions of the GCC compiler such as GCC-5.4 and GCC-6, and neither of them > work. I guess this issue is related to the external package *exodus*, and > PETSc might not pick up the right enveriment variables for the *exodus.* > > This issue can be reproduced using the following simple code: > 1) This is just a standard check. Have you reconfigured so that you know ExodusII was built with the same compilers and system libraries? 2) If so, can you get a stack trace with gdb or lldb? Matt > *static char help[] = " create mesh from exodus.\n\n";* > > *#include * > *#include * > > *#undef __FUNCT__* > *#define __FUNCT__ "main"* > *int main(int argc,char **argv)* > *{* > * char fineMeshFileName[2048];* > * DM dm;* > * MPI_Comm comm;* > * PetscBool flg;* > > * PetscErrorCode ierr;* > > * ierr = PetscInitialize(&argc,&argv,(char *)0,help);CHKERRQ(ierr);* > * comm = PETSC_COMM_WORLD;* > * ierr = > PetscOptionsGetString(NULL,NULL,"-file",fineMeshFileName,sizeof(fineMeshFileName),&flg);CHKERRQ(ierr);* > * if(!flg){* > * SETERRQ(comm,PETSC_ERR_ARG_NULL,"please specify a fine mesh file \n");* > * }* > * ierr = DMPlexCreateExodusFromFile( comm,fineMeshFileName, PETSC_FALSE, > &dm);CHKERRQ(ierr);* > * ierr = DMDestroy(&dm);CHKERRQ(ierr);* > * ierr = PetscFinalize();CHKERRQ(ierr);* > *}* > > > *LiviadeMacBook-Pro:partition livia$ ./DMPlexCreateExodusFromFile -file > Tri3.exo * > *[0]PETSC ERROR: > ------------------------------------------------------------------------* > *[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, > probably memory access out of range* > *[0]PETSC ERROR: Try option -start_in_debugger or > -on_error_attach_debugger* > *[0]PETSC ERROR: or see > http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind > * > *[0]PETSC ERROR: or try http://valgrind.org on > GNU/linux and Apple Mac OS X to find memory corruption errors* 
> *[0]PETSC ERROR: likely location of problem given in stack below* > *[0]PETSC ERROR: --------------------- Stack Frames > ------------------------------------* > *[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not > available,* > *[0]PETSC ERROR: INSTEAD the line number of the start of the > function* > *[0]PETSC ERROR: is given.* > *[0]PETSC ERROR: [0] DMPlexCreateExodusFromFile line 38 > /Users/livia/math/petsc/src/dm/impls/plex/plexexodusii.c* > *[0]PETSC ERROR: --------------------- Error Message > --------------------------------------------------------------* > *[0]PETSC ERROR: Signal received* > *[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html > for trouble shooting.* > *[0]PETSC ERROR: Petsc Release Version 3.7.5, unknown * > *[0]PETSC ERROR: ./DMPlexCreateExodusFromFile on a arch-darwin-cxx-debug > named LiviadeMacBook-Pro.local by livia Sat Jan 21 21:04:22 2017* > *[0]PETSC ERROR: Configure options --with-clanguage=cxx > --with-shared-libraries=1 --download-fblaslapack=1 --with-mpi=1 > --download-parmetis=1 --download-metis=1 --download-netcdf=1 > --download-exodusii=1 --download-hdf5=1 --with-debugging=yes > --with-c2html=0 --download-hypre=1 --with-64-bit-indices=1 > --download-superlu_dist=1 PETSC_ARCH=arch-darwin-cxx-debug* > *[0]PETSC ERROR: #1 User provided function() line 0 in unknown file* > *application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0* > *[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=59* > *:* > *system msg for write_line failure : Bad file descriptor* > > > The log files of make and configuration are also attached. If you have > any idea on this issue, please let me know! > > Fande Kong, > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From fdkong.jd at gmail.com Sun Jan 22 12:41:39 2017 From: fdkong.jd at gmail.com (Fande Kong) Date: Sun, 22 Jan 2017 11:41:39 -0700 Subject: [petsc-users] DMPlexCreateExodusFromFile() does not work on macOS Sierra In-Reply-To: References: Message-ID: On Sat, Jan 21, 2017 at 10:47 PM, Matthew Knepley wrote: > On Sat, Jan 21, 2017 at 10:38 PM, Fande Kong wrote: > >> Hi All, >> >> I upgraded the OS system to macOS Sierra, and observed that PETSc can not >> read the exodus file any more. The same code runs fine on macOS Capitan. I >> also tested the function DMPlexCreateExodusFromFile() against different >> versions of the GCC compiler such as GCC-5.4 and GCC-6, and neither of them >> work. I guess this issue is related to the external package *exodus*, >> and PETSc might not pick up the right enveriment variables for the >> *exodus.* >> >> This issue can be reproduced using the following simple code: >> > > 1) This is just a standard check. Have you reconfigured so that you know > ExodusII was built with the same compilers and system libraries? > > 2) If so, can you get a stack trace with gdb or lldb? > 0 libsystem_kernel.dylib 0x00007fffad8b8dda __pthread_kill + 10 1 libsystem_pthread.dylib 0x00007fffad9a4787 pthread_kill + 90 2 libsystem_c.dylib 0x00007fffad81e420 abort + 129 3 libpetsc.3.7.dylib 0x00000001100eb9ee PetscAbortErrorHandler + 506 (errstop.c:40) 4 libpetsc.3.7.dylib 0x00000001100e631d PetscError + 916 (err.c:379) 5 libpetsc.3.7.dylib 0x00000001100ed830 PetscSignalHandlerDefault + 1927 (signal.c:160) 6 libpetsc.3.7.dylib 0x00000001100ed088 PetscSignalHandler_Private(int) + 630 (signal.c:49) 7 libsystem_platform.dylib 0x00007fffad997bba _sigtramp + 26 8 ??? 
0x000000011ea09370 initialPoolContent + 19008 9 libnetcdf.7.dylib 0x000000011228fc62 utf8proc_map + 210 (dutf8proc.c:543) 10 libnetcdf.7.dylib 0x000000011228fd0f utf8proc_NFC + 38 (dutf8proc.c:568) 11 libnetcdf.7.dylib 0x00000001122a7928 NC_findattr + 110 (attr.c:341) 12 libnetcdf.7.dylib 0x00000001122a7a4e NC_lookupattr + 119 (attr.c:384) 13 libnetcdf.7.dylib 0x00000001122a93ef NC3_get_att + 47 (attr.c:1138) 14 libnetcdf.7.dylib 0x0000000112286126 nc_get_att_float + 90 (dattget.c:192) 15 libpetsc.3.7.dylib 0x00000001117f3a5b ex_open_int + 171 (ex_open.c:259) 16 libpetsc.3.7.dylib 0x0000000110c36609 DMPlexCreateExodusFromFile + 781 (plexexodusii.c:43) 17 DMPlexCreateExodusFromFile 0x000000010fed4cfc main + 397 (DMPlexCreateExodusFromFile.cpp:24) 18 libdyld.dylib 0x00007fffad78a255 start + 1 > > Matt > > >> *static char help[] = " create mesh from exodus.\n\n";* >> >> *#include * >> *#include * >> >> *#undef __FUNCT__* >> *#define __FUNCT__ "main"* >> *int main(int argc,char **argv)* >> *{* >> * char fineMeshFileName[2048];* >> * DM dm;* >> * MPI_Comm comm;* >> * PetscBool flg;* >> >> * PetscErrorCode ierr;* >> >> * ierr = PetscInitialize(&argc,&argv,(char *)0,help);CHKERRQ(ierr);* >> * comm = PETSC_COMM_WORLD;* >> * ierr = >> PetscOptionsGetString(NULL,NULL,"-file",fineMeshFileName,sizeof(fineMeshFileName),&flg);CHKERRQ(ierr);* >> * if(!flg){* >> * SETERRQ(comm,PETSC_ERR_ARG_NULL,"please specify a fine mesh file \n");* >> * }* >> * ierr = DMPlexCreateExodusFromFile( comm,fineMeshFileName, PETSC_FALSE, >> &dm);CHKERRQ(ierr);* >> * ierr = DMDestroy(&dm);CHKERRQ(ierr);* >> * ierr = PetscFinalize();CHKERRQ(ierr);* >> *}* >> >> >> *LiviadeMacBook-Pro:partition livia$ ./DMPlexCreateExodusFromFile -file >> Tri3.exo * >> *[0]PETSC ERROR: >> ------------------------------------------------------------------------* >> *[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, >> probably memory access out of range* >> *[0]PETSC ERROR: Try option 
-start_in_debugger or >> -on_error_attach_debugger* >> *[0]PETSC ERROR: or see >> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind >> * >> *[0]PETSC ERROR: or try http://valgrind.org on >> GNU/linux and Apple Mac OS X to find memory corruption errors* >> *[0]PETSC ERROR: likely location of problem given in stack below* >> *[0]PETSC ERROR: --------------------- Stack Frames >> ------------------------------------* >> *[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >> available,* >> *[0]PETSC ERROR: INSTEAD the line number of the start of the >> function* >> *[0]PETSC ERROR: is given.* >> *[0]PETSC ERROR: [0] DMPlexCreateExodusFromFile line 38 >> /Users/livia/math/petsc/src/dm/impls/plex/plexexodusii.c* >> *[0]PETSC ERROR: --------------------- Error Message >> --------------------------------------------------------------* >> *[0]PETSC ERROR: Signal received* >> *[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html >> for trouble shooting.* >> *[0]PETSC ERROR: Petsc Release Version 3.7.5, unknown * >> *[0]PETSC ERROR: ./DMPlexCreateExodusFromFile on a arch-darwin-cxx-debug >> named LiviadeMacBook-Pro.local by livia Sat Jan 21 21:04:22 2017* >> *[0]PETSC ERROR: Configure options --with-clanguage=cxx >> --with-shared-libraries=1 --download-fblaslapack=1 --with-mpi=1 >> --download-parmetis=1 --download-metis=1 --download-netcdf=1 >> --download-exodusii=1 --download-hdf5=1 --with-debugging=yes >> --with-c2html=0 --download-hypre=1 --with-64-bit-indices=1 >> --download-superlu_dist=1 PETSC_ARCH=arch-darwin-cxx-debug* >> *[0]PETSC ERROR: #1 User provided function() line 0 in unknown file* >> *application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0* >> *[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=59* >> *:* >> *system msg for write_line failure : Bad file descriptor* >> >> >> The log files of make and configuration are also attached. If you have >> any idea on this issue, please let me know! 
>> >> Fande Kong, >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sun Jan 22 13:35:11 2017 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 22 Jan 2017 13:35:11 -0600 Subject: [petsc-users] DMPlexCreateExodusFromFile() does not work on macOS Sierra In-Reply-To: References: Message-ID: On Sun, Jan 22, 2017 at 12:41 PM, Fande Kong wrote: > On Sat, Jan 21, 2017 at 10:47 PM, Matthew Knepley > wrote: > >> On Sat, Jan 21, 2017 at 10:38 PM, Fande Kong wrote: >> >>> Hi All, >>> >>> I upgraded the OS system to macOS Sierra, and observed that PETSc can >>> not read the exodus file any more. The same code runs fine on macOS >>> Capitan. I also tested the function DMPlexCreateExodusFromFile() against >>> different versions of the GCC compiler such as GCC-5.4 and GCC-6, and >>> neither of them work. I guess this issue is related to the external package >>> *exodus*, and PETSc might not pick up the right enveriment variables >>> for the *exodus.* >>> >>> This issue can be reproduced using the following simple code: >>> >> >> 1) This is just a standard check. Have you reconfigured so that you know >> ExodusII was built with the same compilers and system libraries? >> >> 2) If so, can you get a stack trace with gdb or lldb? 
>> > > 0 libsystem_kernel.dylib 0x00007fffad8b8dda __pthread_kill + 10 > 1 libsystem_pthread.dylib 0x00007fffad9a4787 pthread_kill + 90 > 2 libsystem_c.dylib 0x00007fffad81e420 abort + 129 > 3 libpetsc.3.7.dylib 0x00000001100eb9ee > PetscAbortErrorHandler + 506 (errstop.c:40) > 4 libpetsc.3.7.dylib 0x00000001100e631d PetscError + 916 > (err.c:379) > 5 libpetsc.3.7.dylib 0x00000001100ed830 > PetscSignalHandlerDefault + 1927 (signal.c:160) > 6 libpetsc.3.7.dylib 0x00000001100ed088 > PetscSignalHandler_Private(int) + 630 (signal.c:49) > 7 libsystem_platform.dylib 0x00007fffad997bba _sigtramp + 26 > 8 ??? 0x000000011ea09370 initialPoolContent + > 19008 > 9 libnetcdf.7.dylib 0x000000011228fc62 utf8proc_map + 210 > (dutf8proc.c:543) > 10 libnetcdf.7.dylib 0x000000011228fd0f utf8proc_NFC + 38 > (dutf8proc.c:568) > 11 libnetcdf.7.dylib 0x00000001122a7928 NC_findattr + 110 > (attr.c:341) > 12 libnetcdf.7.dylib 0x00000001122a7a4e NC_lookupattr + 119 > (attr.c:384) > 13 libnetcdf.7.dylib 0x00000001122a93ef NC3_get_att + 47 > (attr.c:1138) > 14 libnetcdf.7.dylib 0x0000000112286126 nc_get_att_float + > 90 (dattget.c:192) > 15 libpetsc.3.7.dylib 0x00000001117f3a5b ex_open_int + 171 > (ex_open.c:259) > 16 libpetsc.3.7.dylib 0x0000000110c36609 > DMPlexCreateExodusFromFile + 781 (plexexodusii.c:43) > 17 DMPlexCreateExodusFromFile 0x000000010fed4cfc main + 397 > (DMPlexCreateExodusFromFile.cpp:24) > 18 libdyld.dylib 0x00007fffad78a255 start + 1 > This is a NetCDF error on ex_open_int(). My guess is that your NetCDF build is old and when it calls the system DLL you bomb. Can you do a completely new build, meaning either reclone PETSc somewhere else, or delete the whole $PETSC_DIR/$PETSC_ARCH/externalpackage directory and reconfigure/build? 
Thanks, Matt > > >> Matt >> >> >>> *static char help[] = " create mesh from exodus.\n\n";* >>> >>> *#include * >>> *#include * >>> >>> *#undef __FUNCT__* >>> *#define __FUNCT__ "main"* >>> *int main(int argc,char **argv)* >>> *{* >>> * char fineMeshFileName[2048];* >>> * DM dm;* >>> * MPI_Comm comm;* >>> * PetscBool flg;* >>> >>> * PetscErrorCode ierr;* >>> >>> * ierr = PetscInitialize(&argc,&argv,(char *)0,help);CHKERRQ(ierr);* >>> * comm = PETSC_COMM_WORLD;* >>> * ierr = >>> PetscOptionsGetString(NULL,NULL,"-file",fineMeshFileName,sizeof(fineMeshFileName),&flg);CHKERRQ(ierr);* >>> * if(!flg){* >>> * SETERRQ(comm,PETSC_ERR_ARG_NULL,"please specify a fine mesh file \n");* >>> * }* >>> * ierr = DMPlexCreateExodusFromFile( comm,fineMeshFileName, >>> PETSC_FALSE, &dm);CHKERRQ(ierr);* >>> * ierr = DMDestroy(&dm);CHKERRQ(ierr);* >>> * ierr = PetscFinalize();CHKERRQ(ierr);* >>> *}* >>> >>> >>> *LiviadeMacBook-Pro:partition livia$ ./DMPlexCreateExodusFromFile -file >>> Tri3.exo * >>> *[0]PETSC ERROR: >>> ------------------------------------------------------------------------* >>> *[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, >>> probably memory access out of range* >>> *[0]PETSC ERROR: Try option -start_in_debugger or >>> -on_error_attach_debugger* >>> *[0]PETSC ERROR: or see >>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind >>> * >>> *[0]PETSC ERROR: or try http://valgrind.org on >>> GNU/linux and Apple Mac OS X to find memory corruption errors* >>> *[0]PETSC ERROR: likely location of problem given in stack below* >>> *[0]PETSC ERROR: --------------------- Stack Frames >>> ------------------------------------* >>> *[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>> available,* >>> *[0]PETSC ERROR: INSTEAD the line number of the start of the >>> function* >>> *[0]PETSC ERROR: is given.* >>> *[0]PETSC ERROR: [0] DMPlexCreateExodusFromFile line 38 >>> /Users/livia/math/petsc/src/dm/impls/plex/plexexodusii.c* >>> 
*[0]PETSC ERROR: --------------------- Error Message >>> --------------------------------------------------------------* >>> *[0]PETSC ERROR: Signal received* >>> *[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html >>> for trouble shooting.* >>> *[0]PETSC ERROR: Petsc Release Version 3.7.5, unknown * >>> *[0]PETSC ERROR: ./DMPlexCreateExodusFromFile on a arch-darwin-cxx-debug >>> named LiviadeMacBook-Pro.local by livia Sat Jan 21 21:04:22 2017* >>> *[0]PETSC ERROR: Configure options --with-clanguage=cxx >>> --with-shared-libraries=1 --download-fblaslapack=1 --with-mpi=1 >>> --download-parmetis=1 --download-metis=1 --download-netcdf=1 >>> --download-exodusii=1 --download-hdf5=1 --with-debugging=yes >>> --with-c2html=0 --download-hypre=1 --with-64-bit-indices=1 >>> --download-superlu_dist=1 PETSC_ARCH=arch-darwin-cxx-debug* >>> *[0]PETSC ERROR: #1 User provided function() line 0 in unknown file* >>> *application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0* >>> *[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=59* >>> *:* >>> *system msg for write_line failure : Bad file descriptor* >>> >>> >>> The log files of make and configuration are also attached. If you have >>> any idea on this issue, please let me know! >>> >>> Fande Kong, >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From fdkong.jd at gmail.com Sun Jan 22 17:28:34 2017 From: fdkong.jd at gmail.com (Fande Kong) Date: Sun, 22 Jan 2017 16:28:34 -0700 Subject: [petsc-users] DMPlexCreateExodusFromFile() does not work on macOS Sierra In-Reply-To: References: Message-ID: On Sun, Jan 22, 2017 at 12:35 PM, Matthew Knepley wrote: > On Sun, Jan 22, 2017 at 12:41 PM, Fande Kong wrote: > >> On Sat, Jan 21, 2017 at 10:47 PM, Matthew Knepley >> wrote: >> >>> On Sat, Jan 21, 2017 at 10:38 PM, Fande Kong >>> wrote: >>> >>>> Hi All, >>>> >>>> I upgraded the OS system to macOS Sierra, and observed that PETSc can >>>> not read the exodus file any more. The same code runs fine on macOS >>>> Capitan. I also tested the function DMPlexCreateExodusFromFile() against >>>> different versions of the GCC compiler such as GCC-5.4 and GCC-6, and >>>> neither of them work. I guess this issue is related to the external package >>>> *exodus*, and PETSc might not pick up the right enveriment variables >>>> for the *exodus.* >>>> >>>> This issue can be reproduced using the following simple code: >>>> >>> >>> 1) This is just a standard check. Have you reconfigured so that you know >>> ExodusII was built with the same compilers and system libraries? >>> >>> 2) If so, can you get a stack trace with gdb or lldb? >>> >> >> 0 libsystem_kernel.dylib 0x00007fffad8b8dda __pthread_kill + 10 >> 1 libsystem_pthread.dylib 0x00007fffad9a4787 pthread_kill + 90 >> 2 libsystem_c.dylib 0x00007fffad81e420 abort + 129 >> 3 libpetsc.3.7.dylib 0x00000001100eb9ee >> PetscAbortErrorHandler + 506 (errstop.c:40) >> 4 libpetsc.3.7.dylib 0x00000001100e631d PetscError + 916 >> (err.c:379) >> 5 libpetsc.3.7.dylib 0x00000001100ed830 >> PetscSignalHandlerDefault + 1927 (signal.c:160) >> 6 libpetsc.3.7.dylib 0x00000001100ed088 >> PetscSignalHandler_Private(int) + 630 (signal.c:49) >> 7 libsystem_platform.dylib 0x00007fffad997bba _sigtramp + 26 >> 8 ??? 
0x000000011ea09370 initialPoolContent >> + 19008 >> 9 libnetcdf.7.dylib 0x000000011228fc62 utf8proc_map + 210 >> (dutf8proc.c:543) >> 10 libnetcdf.7.dylib 0x000000011228fd0f utf8proc_NFC + 38 >> (dutf8proc.c:568) >> 11 libnetcdf.7.dylib 0x00000001122a7928 NC_findattr + 110 >> (attr.c:341) >> 12 libnetcdf.7.dylib 0x00000001122a7a4e NC_lookupattr + 119 >> (attr.c:384) >> 13 libnetcdf.7.dylib 0x00000001122a93ef NC3_get_att + 47 >> (attr.c:1138) >> 14 libnetcdf.7.dylib 0x0000000112286126 nc_get_att_float + >> 90 (dattget.c:192) >> 15 libpetsc.3.7.dylib 0x00000001117f3a5b ex_open_int + 171 >> (ex_open.c:259) >> 16 libpetsc.3.7.dylib 0x0000000110c36609 >> DMPlexCreateExodusFromFile + 781 (plexexodusii.c:43) >> 17 DMPlexCreateExodusFromFile 0x000000010fed4cfc main + 397 >> (DMPlexCreateExodusFromFile.cpp:24) >> 18 libdyld.dylib 0x00007fffad78a255 start + 1 >> > > This is a NetCDF error on ex_open_int(). My guess is that your NetCDF > build is old and when it calls the system DLL > you bomb. Can you do a completely new build, meaning either reclone PETSc > somewhere else, or delete the whole > $PETSC_DIR/$PETSC_ARCH/externalpackage directory and reconfigure/build? > > Thanks, > > Matt > > Hi Matt, Thanks for the reply. I recloned PETSc (the old petsc folder was deleted completely) and reconfigured, and it still has the same issue. I also checked whether the binary is compiled against any other netcdf. The binary is actually compiled against the right netcdf installed through PETSc.
*LiviadeMacBook-Pro:partition livia$ otool -L DMPlexCreateExodusFromFile* *DMPlexCreateExodusFromFile:* * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libpetsc.3.7.dylib (compatibility version 3.7.0, current version 3.7.5)* * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libsuperlu_dist.5.dylib (compatibility version 5.0.0, current version 5.1.3)* * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libparmetis.dylib (compatibility version 0.0.0, current version 0.0.0)* * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libmetis.dylib (compatibility version 0.0.0, current version 0.0.0)* * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libnetcdf.7.dylib (compatibility version 10.0.0, current version 10.0.0)* * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libhdf5_hl.8.dylib (compatibility version 9.0.0, current version 9.1.0)* * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libhdf5.8.dylib (compatibility version 9.0.0, current version 9.1.0)* * /opt/X11/lib/libX11.6.dylib (compatibility version 10.0.0, current version 10.0.0)* * /Users/livia/math/mpich-3.2_install/lib/libmpifort.12.dylib (compatibility version 14.0.0, current version 14.0.0)* * /usr/local/opt/gcc at 5/lib/gcc/5/libgfortran.3.dylib (compatibility version 4.0.0, current version 4.0.0)* * /usr/local/opt/gcc at 5/lib/gcc/5/libquadmath.0.dylib (compatibility version 1.0.0, current version 1.0.0)* * /Users/livia/math/mpich-3.2_install/lib/libmpicxx.12.dylib (compatibility version 14.0.0, current version 14.0.0)* * /usr/local/opt/gcc at 5/lib/gcc/5/libstdc++.6.dylib (compatibility version 7.0.0, current version 7.21.0)* * /usr/local/opt/gcc at 5/lib/gcc/5/libgcc_s.1.dylib (compatibility version 1.0.0, current version 1.0.0)* * /Users/livia/math/mpich-3.2_install/lib/libmpi.12.dylib (compatibility version 14.0.0, current version 14.0.0)* * /Users/livia/math/mpich-3.2_install/lib/libpmpi.12.dylib (compatibility version 14.0.0, current version 14.0.0)* * /usr/lib/libSystem.B.dylib 
(compatibility version 1.0.0, current version 1238.0.0)* * /usr/local/lib/gcc/5/libgcc_s.1.dylib (compatibility version 1.0.0, current version 1.0.0)* > >> > >>> Matt >>> >>> >>>> *static char help[] = " create mesh from exodus.\n\n";* >>>> >>>> *#include * >>>> *#include * >>>> >>>> *#undef __FUNCT__* >>>> *#define __FUNCT__ "main"* >>>> *int main(int argc,char **argv)* >>>> *{* >>>> * char fineMeshFileName[2048];* >>>> * DM dm;* >>>> * MPI_Comm comm;* >>>> * PetscBool flg;* >>>> >>>> * PetscErrorCode ierr;* >>>> >>>> * ierr = PetscInitialize(&argc,&argv,(char *)0,help);CHKERRQ(ierr);* >>>> * comm = PETSC_COMM_WORLD;* >>>> * ierr = >>>> PetscOptionsGetString(NULL,NULL,"-file",fineMeshFileName,sizeof(fineMeshFileName),&flg);CHKERRQ(ierr);* >>>> * if(!flg){* >>>> * SETERRQ(comm,PETSC_ERR_ARG_NULL,"please specify a fine mesh file >>>> \n");* >>>> * }* >>>> * ierr = DMPlexCreateExodusFromFile( comm,fineMeshFileName, >>>> PETSC_FALSE, &dm);CHKERRQ(ierr);* >>>> * ierr = DMDestroy(&dm);CHKERRQ(ierr);* >>>> * ierr = PetscFinalize();CHKERRQ(ierr);* >>>> *}* >>>> >>>> >>>> *LiviadeMacBook-Pro:partition livia$ ./DMPlexCreateExodusFromFile -file >>>> Tri3.exo * >>>> *[0]PETSC ERROR: >>>> ------------------------------------------------------------------------* >>>> *[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, >>>> probably memory access out of range* >>>> *[0]PETSC ERROR: Try option -start_in_debugger or >>>> -on_error_attach_debugger* >>>> *[0]PETSC ERROR: or see >>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind >>>> * >>>> *[0]PETSC ERROR: or try http://valgrind.org on >>>> GNU/linux and Apple Mac OS X to find memory corruption errors* >>>> *[0]PETSC ERROR: likely location of problem given in stack below* >>>> *[0]PETSC ERROR: --------------------- Stack Frames >>>> ------------------------------------* >>>> *[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>> available,* >>>> *[0]PETSC ERROR: INSTEAD the line 
number of the start of the >>>> function* >>>> *[0]PETSC ERROR: is given.* >>>> *[0]PETSC ERROR: [0] DMPlexCreateExodusFromFile line 38 >>>> /Users/livia/math/petsc/src/dm/impls/plex/plexexodusii.c* >>>> *[0]PETSC ERROR: --------------------- Error Message >>>> --------------------------------------------------------------* >>>> *[0]PETSC ERROR: Signal received* >>>> *[0]PETSC ERROR: See >>>> http://www.mcs.anl.gov/petsc/documentation/faq.html >>>> for trouble shooting.* >>>> *[0]PETSC ERROR: Petsc Release Version 3.7.5, unknown * >>>> *[0]PETSC ERROR: ./DMPlexCreateExodusFromFile on a >>>> arch-darwin-cxx-debug named LiviadeMacBook-Pro.local by livia Sat Jan 21 >>>> 21:04:22 2017* >>>> *[0]PETSC ERROR: Configure options --with-clanguage=cxx >>>> --with-shared-libraries=1 --download-fblaslapack=1 --with-mpi=1 >>>> --download-parmetis=1 --download-metis=1 --download-netcdf=1 >>>> --download-exodusii=1 --download-hdf5=1 --with-debugging=yes >>>> --with-c2html=0 --download-hypre=1 --with-64-bit-indices=1 >>>> --download-superlu_dist=1 PETSC_ARCH=arch-darwin-cxx-debug* >>>> *[0]PETSC ERROR: #1 User provided function() line 0 in unknown file* >>>> *application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0* >>>> *[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=59* >>>> *:* >>>> *system msg for write_line failure : Bad file descriptor* >>>> >>>> >>>> The log files of make and configuration are also attached. If you have >>>> any idea on this issue, please let me know! >>>> >>>> Fande Kong, >>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. 
> -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: configure.log.zip Type: application/zip Size: 1179555 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: make.log.zip Type: application/zip Size: 11766 bytes Desc: not available URL: From knepley at gmail.com Sun Jan 22 19:50:26 2017 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 22 Jan 2017 19:50:26 -0600 Subject: [petsc-users] DMPlexCreateExodusFromFile() does not work on macOS Sierra In-Reply-To: References: Message-ID: On Sun, Jan 22, 2017 at 5:28 PM, Fande Kong wrote: > On Sun, Jan 22, 2017 at 12:35 PM, Matthew Knepley > wrote: > >> On Sun, Jan 22, 2017 at 12:41 PM, Fande Kong wrote: >> >>> On Sat, Jan 21, 2017 at 10:47 PM, Matthew Knepley >>> wrote: >>> >>>> On Sat, Jan 21, 2017 at 10:38 PM, Fande Kong >>>> wrote: >>>> >>>>> Hi All, >>>>> >>>>> I upgraded the OS system to macOS Sierra, and observed that PETSc can >>>>> not read the exodus file any more. The same code runs fine on macOS >>>>> Capitan. I also tested the function DMPlexCreateExodusFromFile() against >>>>> different versions of the GCC compiler such as GCC-5.4 and GCC-6, and >>>>> neither of them work. I guess this issue is related to the external package >>>>> *exodus*, and PETSc might not pick up the right enveriment variables >>>>> for the *exodus.* >>>>> >>>>> This issue can be reproduced using the following simple code: >>>>> >>>> >>>> 1) This is just a standard check. Have you reconfigured so that you >>>> know ExodusII was built with the same compilers and system libraries? >>>> >>>> 2) If so, can you get a stack trace with gdb or lldb? 
>>>> >>> >>> 0 libsystem_kernel.dylib 0x00007fffad8b8dda __pthread_kill + >>> 10 >>> 1 libsystem_pthread.dylib 0x00007fffad9a4787 pthread_kill + 90 >>> 2 libsystem_c.dylib 0x00007fffad81e420 abort + 129 >>> 3 libpetsc.3.7.dylib 0x00000001100eb9ee >>> PetscAbortErrorHandler + 506 (errstop.c:40) >>> 4 libpetsc.3.7.dylib 0x00000001100e631d PetscError + 916 >>> (err.c:379) >>> 5 libpetsc.3.7.dylib 0x00000001100ed830 >>> PetscSignalHandlerDefault + 1927 (signal.c:160) >>> 6 libpetsc.3.7.dylib 0x00000001100ed088 >>> PetscSignalHandler_Private(int) + 630 (signal.c:49) >>> 7 libsystem_platform.dylib 0x00007fffad997bba _sigtramp + 26 >>> 8 ??? 0x000000011ea09370 initialPoolContent >>> + 19008 >>> 9 libnetcdf.7.dylib 0x000000011228fc62 utf8proc_map + 210 >>> (dutf8proc.c:543) >>> 10 libnetcdf.7.dylib 0x000000011228fd0f utf8proc_NFC + 38 >>> (dutf8proc.c:568) >>> 11 libnetcdf.7.dylib 0x00000001122a7928 NC_findattr + 110 >>> (attr.c:341) >>> 12 libnetcdf.7.dylib 0x00000001122a7a4e NC_lookupattr + >>> 119 (attr.c:384) >>> 13 libnetcdf.7.dylib 0x00000001122a93ef NC3_get_att + 47 >>> (attr.c:1138) >>> 14 libnetcdf.7.dylib 0x0000000112286126 nc_get_att_float + >>> 90 (dattget.c:192) >>> 15 libpetsc.3.7.dylib 0x00000001117f3a5b ex_open_int + 171 >>> (ex_open.c:259) >>> 16 libpetsc.3.7.dylib 0x0000000110c36609 >>> DMPlexCreateExodusFromFile + 781 (plexexodusii.c:43) >>> 17 DMPlexCreateExodusFromFile 0x000000010fed4cfc main + 397 >>> (DMPlexCreateExodusFromFile.cpp:24) >>> 18 libdyld.dylib 0x00007fffad78a255 start + 1 >>> >> >> This is a NetCDF error on ex_open_int(). My guess is that your NetCDF >> build is old and when it calls the system DLL >> you bomb. Can you do a completely new build, meaning either reclone PETSc >> somewhere else, or delete the whole >> $PETSC_DIR/$PETSC_ARCH/externalpackage directory and reconfigure/build? >> >> Thanks, >> >> Matt >> >> > > Hi Matt, > > Thanks for reply. I recloned PETSc (the old petsc folder is deleted > completely) and reconfigure. 
And still has the same issue. I also checked > if the binary is complied against any other netcdf. The binary is actually > complied against the right netcdf installed through PETSc. > You can see that this crash happens on the call to int CPU_word_size = 0, IO_word_size = 0, exoid = -1; float version; exoid = ex_open(filename, EX_READ, &CPU_word_size, &IO_word_size, &version); which means the fault is not in PETSc, but rather in ExodusII for your machine. We could definitely confirm this if you made a 5 line program that only called this, but I don't see why it should be different. I am not sure what to do, since I am not in control of anything about ExodusII. Can you report this to their dev team? It is strange since Blaise has not reported this, and I know he uses it all the time. Thanks, Matt > *LiviadeMacBook-Pro:partition livia$ otool -L DMPlexCreateExodusFromFile* > *DMPlexCreateExodusFromFile:* > * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libpetsc.3.7.dylib > (compatibility version 3.7.0, current version 3.7.5)* > * > /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libsuperlu_dist.5.dylib > (compatibility version 5.0.0, current version 5.1.3)* > * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libparmetis.dylib > (compatibility version 0.0.0, current version 0.0.0)* > * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libmetis.dylib > (compatibility version 0.0.0, current version 0.0.0)* > * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libnetcdf.7.dylib > (compatibility version 10.0.0, current version 10.0.0)* > * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libhdf5_hl.8.dylib > (compatibility version 9.0.0, current version 9.1.0)* > * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libhdf5.8.dylib > (compatibility version 9.0.0, current version 9.1.0)* > * /opt/X11/lib/libX11.6.dylib (compatibility version 10.0.0, current > version 10.0.0)* > * /Users/livia/math/mpich-3.2_install/lib/libmpifort.12.dylib > (compatibility version 
14.0.0, current version 14.0.0)* > * /usr/local/opt/gcc at 5/lib/gcc/5/libgfortran.3.dylib (compatibility > version 4.0.0, current version 4.0.0)* > * /usr/local/opt/gcc at 5/lib/gcc/5/libquadmath.0.dylib (compatibility > version 1.0.0, current version 1.0.0)* > * /Users/livia/math/mpich-3.2_install/lib/libmpicxx.12.dylib > (compatibility version 14.0.0, current version 14.0.0)* > * /usr/local/opt/gcc at 5/lib/gcc/5/libstdc++.6.dylib (compatibility version > 7.0.0, current version 7.21.0)* > * /usr/local/opt/gcc at 5/lib/gcc/5/libgcc_s.1.dylib (compatibility version > 1.0.0, current version 1.0.0)* > * /Users/livia/math/mpich-3.2_install/lib/libmpi.12.dylib (compatibility > version 14.0.0, current version 14.0.0)* > * /Users/livia/math/mpich-3.2_install/lib/libpmpi.12.dylib (compatibility > version 14.0.0, current version 14.0.0)* > * /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version > 1238.0.0)* > * /usr/local/lib/gcc/5/libgcc_s.1.dylib (compatibility version 1.0.0, > current version 1.0.0)* > > > > > >> >>> >> >>>> Matt >>>> >>>> >>>>> *static char help[] = " create mesh from exodus.\n\n";* >>>>> >>>>> *#include * >>>>> *#include * >>>>> >>>>> *#undef __FUNCT__* >>>>> *#define __FUNCT__ "main"* >>>>> *int main(int argc,char **argv)* >>>>> *{* >>>>> * char fineMeshFileName[2048];* >>>>> * DM dm;* >>>>> * MPI_Comm comm;* >>>>> * PetscBool flg;* >>>>> >>>>> * PetscErrorCode ierr;* >>>>> >>>>> * ierr = PetscInitialize(&argc,&argv,(char *)0,help);CHKERRQ(ierr);* >>>>> * comm = PETSC_COMM_WORLD;* >>>>> * ierr = >>>>> PetscOptionsGetString(NULL,NULL,"-file",fineMeshFileName,sizeof(fineMeshFileName),&flg);CHKERRQ(ierr);* >>>>> * if(!flg){* >>>>> * SETERRQ(comm,PETSC_ERR_ARG_NULL,"please specify a fine mesh file >>>>> \n");* >>>>> * }* >>>>> * ierr = DMPlexCreateExodusFromFile( comm,fineMeshFileName, >>>>> PETSC_FALSE, &dm);CHKERRQ(ierr);* >>>>> * ierr = DMDestroy(&dm);CHKERRQ(ierr);* >>>>> * ierr = PetscFinalize();CHKERRQ(ierr);* >>>>> *}* >>>>> 
>>>>> >>>>> *LiviadeMacBook-Pro:partition livia$ ./DMPlexCreateExodusFromFile >>>>> -file Tri3.exo * >>>>> *[0]PETSC ERROR: >>>>> ------------------------------------------------------------------------* >>>>> *[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, >>>>> probably memory access out of range* >>>>> *[0]PETSC ERROR: Try option -start_in_debugger or >>>>> -on_error_attach_debugger* >>>>> *[0]PETSC ERROR: or see >>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind >>>>> * >>>>> *[0]PETSC ERROR: or try http://valgrind.org on >>>>> GNU/linux and Apple Mac OS X to find memory corruption errors* >>>>> *[0]PETSC ERROR: likely location of problem given in stack below* >>>>> *[0]PETSC ERROR: --------------------- Stack Frames >>>>> ------------------------------------* >>>>> *[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>>> available,* >>>>> *[0]PETSC ERROR: INSTEAD the line number of the start of the >>>>> function* >>>>> *[0]PETSC ERROR: is given.* >>>>> *[0]PETSC ERROR: [0] DMPlexCreateExodusFromFile line 38 >>>>> /Users/livia/math/petsc/src/dm/impls/plex/plexexodusii.c* >>>>> *[0]PETSC ERROR: --------------------- Error Message >>>>> --------------------------------------------------------------* >>>>> *[0]PETSC ERROR: Signal received* >>>>> *[0]PETSC ERROR: See >>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html >>>>> for trouble shooting.* >>>>> *[0]PETSC ERROR: Petsc Release Version 3.7.5, unknown * >>>>> *[0]PETSC ERROR: ./DMPlexCreateExodusFromFile on a >>>>> arch-darwin-cxx-debug named LiviadeMacBook-Pro.local by livia Sat Jan 21 >>>>> 21:04:22 2017* >>>>> *[0]PETSC ERROR: Configure options --with-clanguage=cxx >>>>> --with-shared-libraries=1 --download-fblaslapack=1 --with-mpi=1 >>>>> --download-parmetis=1 --download-metis=1 --download-netcdf=1 >>>>> --download-exodusii=1 --download-hdf5=1 --with-debugging=yes >>>>> --with-c2html=0 --download-hypre=1 --with-64-bit-indices=1 >>>>> 
--download-superlu_dist=1 PETSC_ARCH=arch-darwin-cxx-debug* >>>>> *[0]PETSC ERROR: #1 User provided function() line 0 in unknown file* >>>>> *application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0* >>>>> *[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=59* >>>>> *:* >>>>> *system msg for write_line failure : Bad file descriptor* >>>>> >>>>> >>>>> The log files of make and configuration are also attached. If you >>>>> have any idea on this issue, please let me know! >>>>> >>>>> Fande Kong, >>>>> >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From cpraveen at gmail.com Sun Jan 22 20:15:05 2017 From: cpraveen at gmail.com (Praveen C) Date: Mon, 23 Jan 2017 07:45:05 +0530 Subject: [petsc-users] Type of Vec in rhs function of TS Message-ID: Dear all If we do TSSetSolution(ts, x); TSSolve(ts, x); Is the type of Vec passed to the rhs function the same as "x"? If "x" is a ghosted vector, does the vector passed to the rhs function also have ghost values? Thanks praveen -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Sun Jan 22 20:21:41 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Sun, 22 Jan 2017 20:21:41 -0600 Subject: [petsc-users] Type of Vec in rhs function of TS In-Reply-To: References: Message-ID: You would never pass in, as x, a "ghosted" (i.e. local) vector.
You would always pass in a "global" vector for x and the vector passed to the rhs function would be global. There is no way to call solvers with "ghosted" i.e. local vectors. It is the responsibility of the rhs function to communicate into a ghosted vector in order to do the function evaluation. Barry > On Jan 22, 2017, at 8:15 PM, Praveen C wrote: > > Dear all > > If we do > > TSSetSolution(ts, x); > > TSSolve(ts, x); > > Is the type of Vec passed to rhs function same as "x" ? > > If "x" is ghosted vector, is the vector passed to rhs function also having ghost values ? > > Thanks > praveen From fdkong.jd at gmail.com Sun Jan 22 20:40:15 2017 From: fdkong.jd at gmail.com (Fande Kong) Date: Sun, 22 Jan 2017 19:40:15 -0700 Subject: [petsc-users] DMPlexCreateExodusFromFile() does not work on macOS Sierra In-Reply-To: References: Message-ID: Thanks, Matt. It is a weird bug. Do we have an alternative solution to this? I was wondering whether it is possible to read the ".exo" files without using the ExodusII. For example, can we read the ".exo" files using the netcdf only? Fande Kong, On Sun, Jan 22, 2017 at 6:50 PM, Matthew Knepley wrote: > On Sun, Jan 22, 2017 at 5:28 PM, Fande Kong wrote: > >> On Sun, Jan 22, 2017 at 12:35 PM, Matthew Knepley >> wrote: >> >>> On Sun, Jan 22, 2017 at 12:41 PM, Fande Kong >>> wrote: >>> >>>> On Sat, Jan 21, 2017 at 10:47 PM, Matthew Knepley >>>> wrote: >>>> >>>>> On Sat, Jan 21, 2017 at 10:38 PM, Fande Kong >>>>> wrote: >>>>> >>>>>> Hi All, >>>>>> >>>>>> I upgraded the OS system to macOS Sierra, and observed that PETSc can >>>>>> not read the exodus file any more. The same code runs fine on macOS >>>>>> Capitan. I also tested the function DMPlexCreateExodusFromFile() against >>>>>> different versions of the GCC compiler such as GCC-5.4 and GCC-6, and >>>>>> neither of them work. 
I guess this issue is related to the external package >>>>>> *exodus*, and PETSc might not pick up the right enveriment variables >>>>>> for the *exodus.* >>>>>> >>>>>> This issue can be reproduced using the following simple code: >>>>>> >>>>> >>>>> 1) This is just a standard check. Have you reconfigured so that you >>>>> know ExodusII was built with the same compilers and system libraries? >>>>> >>>>> 2) If so, can you get a stack trace with gdb or lldb? >>>>> >>>> >>>> 0 libsystem_kernel.dylib 0x00007fffad8b8dda __pthread_kill + >>>> 10 >>>> 1 libsystem_pthread.dylib 0x00007fffad9a4787 pthread_kill + 90 >>>> 2 libsystem_c.dylib 0x00007fffad81e420 abort + 129 >>>> 3 libpetsc.3.7.dylib 0x00000001100eb9ee >>>> PetscAbortErrorHandler + 506 (errstop.c:40) >>>> 4 libpetsc.3.7.dylib 0x00000001100e631d PetscError + 916 >>>> (err.c:379) >>>> 5 libpetsc.3.7.dylib 0x00000001100ed830 >>>> PetscSignalHandlerDefault + 1927 (signal.c:160) >>>> 6 libpetsc.3.7.dylib 0x00000001100ed088 >>>> PetscSignalHandler_Private(int) + 630 (signal.c:49) >>>> 7 libsystem_platform.dylib 0x00007fffad997bba _sigtramp + 26 >>>> 8 ??? 
0x000000011ea09370 >>>> initialPoolContent + 19008 >>>> 9 libnetcdf.7.dylib 0x000000011228fc62 utf8proc_map + >>>> 210 (dutf8proc.c:543) >>>> 10 libnetcdf.7.dylib 0x000000011228fd0f utf8proc_NFC + 38 >>>> (dutf8proc.c:568) >>>> 11 libnetcdf.7.dylib 0x00000001122a7928 NC_findattr + 110 >>>> (attr.c:341) >>>> 12 libnetcdf.7.dylib 0x00000001122a7a4e NC_lookupattr + >>>> 119 (attr.c:384) >>>> 13 libnetcdf.7.dylib 0x00000001122a93ef NC3_get_att + 47 >>>> (attr.c:1138) >>>> 14 libnetcdf.7.dylib 0x0000000112286126 nc_get_att_float >>>> + 90 (dattget.c:192) >>>> 15 libpetsc.3.7.dylib 0x00000001117f3a5b ex_open_int + >>>> 171 (ex_open.c:259) >>>> 16 libpetsc.3.7.dylib 0x0000000110c36609 >>>> DMPlexCreateExodusFromFile + 781 (plexexodusii.c:43) >>>> 17 DMPlexCreateExodusFromFile 0x000000010fed4cfc main + 397 >>>> (DMPlexCreateExodusFromFile.cpp:24) >>>> 18 libdyld.dylib 0x00007fffad78a255 start + 1 >>>> >>> >>> This is a NetCDF error on ex_open_int(). My guess is that your NetCDF >>> build is old and when it calls the system DLL >>> you bomb. Can you do a completely new build, meaning either reclone >>> PETSc somewhere else, or delete the whole >>> $PETSC_DIR/$PETSC_ARCH/externalpackage directory and reconfigure/build? >>> >>> Thanks, >>> >>> Matt >>> >>> >> >> Hi Matt, >> >> Thanks for reply. I recloned PETSc (the old petsc folder is deleted >> completely) and reconfigure. And still has the same issue. I also checked >> if the binary is complied against any other netcdf. The binary is actually >> complied against the right netcdf installed through PETSc. >> > > You can see that this crash happens on the call to > > int CPU_word_size = 0, IO_word_size = 0, exoid = -1; > float version; > > exoid = ex_open(filename, EX_READ, &CPU_word_size, &IO_word_size, > &version); > > which means the fault is not in PETSc, but rather in ExodusII for your > machine. 
We could definitely > confirm this if you made a 5 line program that only called this, but I > don't see why it should be different. > I am not sure what to do, since I am not in control of anything about > ExodusII. Can you report this to > their dev team? It is strange since Blaise has not reported this, and I > know he uses it all the time. > > Thanks, > > Matt > > >> *LiviadeMacBook-Pro:partition livia$ otool -L DMPlexCreateExodusFromFile* >> *DMPlexCreateExodusFromFile:* >> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libpetsc.3.7.dylib >> (compatibility version 3.7.0, current version 3.7.5)* >> * >> /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libsuperlu_dist.5.dylib >> (compatibility version 5.0.0, current version 5.1.3)* >> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libparmetis.dylib >> (compatibility version 0.0.0, current version 0.0.0)* >> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libmetis.dylib >> (compatibility version 0.0.0, current version 0.0.0)* >> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libnetcdf.7.dylib >> (compatibility version 10.0.0, current version 10.0.0)* >> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libhdf5_hl.8.dylib >> (compatibility version 9.0.0, current version 9.1.0)* >> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libhdf5.8.dylib >> (compatibility version 9.0.0, current version 9.1.0)* >> * /opt/X11/lib/libX11.6.dylib (compatibility version 10.0.0, current >> version 10.0.0)* >> * /Users/livia/math/mpich-3.2_install/lib/libmpifort.12.dylib >> (compatibility version 14.0.0, current version 14.0.0)* >> * /usr/local/opt/gcc at 5/lib/gcc/5/libgfortran.3.dylib (compatibility >> version 4.0.0, current version 4.0.0)* >> * /usr/local/opt/gcc at 5/lib/gcc/5/libquadmath.0.dylib (compatibility >> version 1.0.0, current version 1.0.0)* >> * /Users/livia/math/mpich-3.2_install/lib/libmpicxx.12.dylib >> (compatibility version 14.0.0, current version 14.0.0)* >> * /usr/local/opt/gcc at 
5/lib/gcc/5/libstdc++.6.dylib (compatibility version >> 7.0.0, current version 7.21.0)* >> * /usr/local/opt/gcc at 5/lib/gcc/5/libgcc_s.1.dylib (compatibility version >> 1.0.0, current version 1.0.0)* >> * /Users/livia/math/mpich-3.2_install/lib/libmpi.12.dylib (compatibility >> version 14.0.0, current version 14.0.0)* >> * /Users/livia/math/mpich-3.2_install/lib/libpmpi.12.dylib (compatibility >> version 14.0.0, current version 14.0.0)* >> * /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current >> version 1238.0.0)* >> * /usr/local/lib/gcc/5/libgcc_s.1.dylib (compatibility version 1.0.0, >> current version 1.0.0)* >> >> >> >> >> >>> >>>> >>> >>>>> Matt >>>>> >>>>> >>>>>> *static char help[] = " create mesh from exodus.\n\n";* >>>>>> >>>>>> *#include * >>>>>> *#include * >>>>>> >>>>>> *#undef __FUNCT__* >>>>>> *#define __FUNCT__ "main"* >>>>>> *int main(int argc,char **argv)* >>>>>> *{* >>>>>> * char fineMeshFileName[2048];* >>>>>> * DM dm;* >>>>>> * MPI_Comm comm;* >>>>>> * PetscBool flg;* >>>>>> >>>>>> * PetscErrorCode ierr;* >>>>>> >>>>>> * ierr = PetscInitialize(&argc,&argv,(char *)0,help);CHKERRQ(ierr);* >>>>>> * comm = PETSC_COMM_WORLD;* >>>>>> * ierr = >>>>>> PetscOptionsGetString(NULL,NULL,"-file",fineMeshFileName,sizeof(fineMeshFileName),&flg);CHKERRQ(ierr);* >>>>>> * if(!flg){* >>>>>> * SETERRQ(comm,PETSC_ERR_ARG_NULL,"please specify a fine mesh file >>>>>> \n");* >>>>>> * }* >>>>>> * ierr = DMPlexCreateExodusFromFile( comm,fineMeshFileName, >>>>>> PETSC_FALSE, &dm);CHKERRQ(ierr);* >>>>>> * ierr = DMDestroy(&dm);CHKERRQ(ierr);* >>>>>> * ierr = PetscFinalize();CHKERRQ(ierr);* >>>>>> *}* >>>>>> >>>>>> >>>>>> *LiviadeMacBook-Pro:partition livia$ ./DMPlexCreateExodusFromFile >>>>>> -file Tri3.exo * >>>>>> *[0]PETSC ERROR: >>>>>> ------------------------------------------------------------------------* >>>>>> *[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>> Violation, probably memory access out of range* >>>>>> *[0]PETSC ERROR: Try 
option -start_in_debugger or >>>>>> -on_error_attach_debugger* >>>>>> *[0]PETSC ERROR: or see >>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind >>>>>> * >>>>>> *[0]PETSC ERROR: or try http://valgrind.org on >>>>>> GNU/linux and Apple Mac OS X to find memory corruption errors* >>>>>> *[0]PETSC ERROR: likely location of problem given in stack below* >>>>>> *[0]PETSC ERROR: --------------------- Stack Frames >>>>>> ------------------------------------* >>>>>> *[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>>>> available,* >>>>>> *[0]PETSC ERROR: INSTEAD the line number of the start of the >>>>>> function* >>>>>> *[0]PETSC ERROR: is given.* >>>>>> *[0]PETSC ERROR: [0] DMPlexCreateExodusFromFile line 38 >>>>>> /Users/livia/math/petsc/src/dm/impls/plex/plexexodusii.c* >>>>>> *[0]PETSC ERROR: --------------------- Error Message >>>>>> --------------------------------------------------------------* >>>>>> *[0]PETSC ERROR: Signal received* >>>>>> *[0]PETSC ERROR: See >>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html >>>>>> for trouble shooting.* >>>>>> *[0]PETSC ERROR: Petsc Release Version 3.7.5, unknown * >>>>>> *[0]PETSC ERROR: ./DMPlexCreateExodusFromFile on a >>>>>> arch-darwin-cxx-debug named LiviadeMacBook-Pro.local by livia Sat Jan 21 >>>>>> 21:04:22 2017* >>>>>> *[0]PETSC ERROR: Configure options --with-clanguage=cxx >>>>>> --with-shared-libraries=1 --download-fblaslapack=1 --with-mpi=1 >>>>>> --download-parmetis=1 --download-metis=1 --download-netcdf=1 >>>>>> --download-exodusii=1 --download-hdf5=1 --with-debugging=yes >>>>>> --with-c2html=0 --download-hypre=1 --with-64-bit-indices=1 >>>>>> --download-superlu_dist=1 PETSC_ARCH=arch-darwin-cxx-debug* >>>>>> *[0]PETSC ERROR: #1 User provided function() line 0 in unknown file* >>>>>> *application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0* >>>>>> *[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=59* >>>>>> *:* >>>>>> *system msg for write_line failure : 
Bad file descriptor* >>>>>> >>>>>> >>>>>> The log files of make and configuration are also attached. If you >>>>>> have any idea on this issue, please let me know! >>>>>> >>>>>> Fande Kong, >>>>>> >>>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sun Jan 22 21:03:44 2017 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 22 Jan 2017 21:03:44 -0600 Subject: [petsc-users] DMPlexCreateExodusFromFile() does not work on macOS Sierra In-Reply-To: References: Message-ID: On Sun, Jan 22, 2017 at 8:40 PM, Fande Kong wrote: > Thanks, Matt. > > It is a weird bug. > > Do we have an alternative solution to this? I was wondering whether it is > possible to read the ".exo" files without using the ExodusII. For example, > can we read the ".exo" files using the netcdf only? > Well, ExodusII is only a thin layer on NetCDF, just like other wrappers are thin layers on HDF5. It is really NetCDF that is failing. Can you switch compilers and see if that helps?
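[Editor's note: since an ExodusII file is an ordinary NetCDF file, one way to see whether the NetCDF layer itself is at fault, independent of the ExodusII wrapper, is to open the file with the raw NetCDF C API. Below is a minimal sketch, assuming a working netcdf install (compile with `cc nc_check.c -lnetcdf`) and the Tri3.exo file mentioned earlier in this thread.]

```c
/* Minimal NetCDF-only open test for an ExodusII file.
 * Sketch under assumptions: libnetcdf is installed, and "Tri3.exo"
 * is the mesh file from this thread (any .exo file works, since
 * .exo files are plain NetCDF underneath). */
#include <stdio.h>
#include <netcdf.h>

int main(int argc, char **argv)
{
  const char *path = (argc > 1) ? argv[1] : "Tri3.exo";
  int ncid, ndims, nvars, natts, unlimdim, err;

  /* nc_open() is the same layer that ex_open() sits on top of */
  err = nc_open(path, NC_NOWRITE, &ncid);
  if (err != NC_NOERR) {
    fprintf(stderr, "nc_open(%s) failed: %s\n", path, nc_strerror(err));
    return 1;
  }
  /* Query the file header: dimensions, variables, global attributes */
  err = nc_inq(ncid, &ndims, &nvars, &natts, &unlimdim);
  if (err == NC_NOERR)
    printf("%s: %d dims, %d vars, %d global atts\n", path, ndims, nvars, natts);
  nc_close(ncid);
  return 0;
}
```

If this sketch also crashes, the bug is in the NetCDF build rather than in ExodusII; `ncdump -h Tri3.exo` from the netcdf command-line tools is an equivalent quick check. Note this only exercises nc_open()/nc_inq(): reading the actual mesh (element blocks, coordinates, connectivity) would mean reimplementing the Exodus conventions on top of NetCDF, so this is a diagnostic, not a replacement for ExodusII.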
Matt > Fande Kong, > > > > On Sun, Jan 22, 2017 at 6:50 PM, Matthew Knepley > wrote: > >> On Sun, Jan 22, 2017 at 5:28 PM, Fande Kong wrote: >> >>> On Sun, Jan 22, 2017 at 12:35 PM, Matthew Knepley >>> wrote: >>> >>>> On Sun, Jan 22, 2017 at 12:41 PM, Fande Kong >>>> wrote: >>>> >>>>> On Sat, Jan 21, 2017 at 10:47 PM, Matthew Knepley >>>>> wrote: >>>>> >>>>>> On Sat, Jan 21, 2017 at 10:38 PM, Fande Kong >>>>>> wrote: >>>>>> >>>>>>> Hi All, >>>>>>> >>>>>>> I upgraded the OS system to macOS Sierra, and observed that PETSc >>>>>>> can not read the exodus file any more. The same code runs fine on macOS >>>>>>> Capitan. I also tested the function DMPlexCreateExodusFromFile() against >>>>>>> different versions of the GCC compiler such as GCC-5.4 and GCC-6, and >>>>>>> neither of them work. I guess this issue is related to the external package >>>>>>> *exodus*, and PETSc might not pick up the right enveriment >>>>>>> variables for the *exodus.* >>>>>>> >>>>>>> This issue can be reproduced using the following simple code: >>>>>>> >>>>>> >>>>>> 1) This is just a standard check. Have you reconfigured so that you >>>>>> know ExodusII was built with the same compilers and system libraries? >>>>>> >>>>>> 2) If so, can you get a stack trace with gdb or lldb? >>>>>> >>>>> >>>>> 0 libsystem_kernel.dylib 0x00007fffad8b8dda __pthread_kill >>>>> + 10 >>>>> 1 libsystem_pthread.dylib 0x00007fffad9a4787 pthread_kill + 90 >>>>> 2 libsystem_c.dylib 0x00007fffad81e420 abort + 129 >>>>> 3 libpetsc.3.7.dylib 0x00000001100eb9ee >>>>> PetscAbortErrorHandler + 506 (errstop.c:40) >>>>> 4 libpetsc.3.7.dylib 0x00000001100e631d PetscError + >>>>> 916 (err.c:379) >>>>> 5 libpetsc.3.7.dylib 0x00000001100ed830 >>>>> PetscSignalHandlerDefault + 1927 (signal.c:160) >>>>> 6 libpetsc.3.7.dylib 0x00000001100ed088 >>>>> PetscSignalHandler_Private(int) + 630 (signal.c:49) >>>>> 7 libsystem_platform.dylib 0x00007fffad997bba _sigtramp + 26 >>>>> 8 ??? 
0x000000011ea09370 >>>>> initialPoolContent + 19008 >>>>> 9 libnetcdf.7.dylib 0x000000011228fc62 utf8proc_map + >>>>> 210 (dutf8proc.c:543) >>>>> 10 libnetcdf.7.dylib 0x000000011228fd0f utf8proc_NFC + >>>>> 38 (dutf8proc.c:568) >>>>> 11 libnetcdf.7.dylib 0x00000001122a7928 NC_findattr + >>>>> 110 (attr.c:341) >>>>> 12 libnetcdf.7.dylib 0x00000001122a7a4e NC_lookupattr + >>>>> 119 (attr.c:384) >>>>> 13 libnetcdf.7.dylib 0x00000001122a93ef NC3_get_att + 47 >>>>> (attr.c:1138) >>>>> 14 libnetcdf.7.dylib 0x0000000112286126 nc_get_att_float >>>>> + 90 (dattget.c:192) >>>>> 15 libpetsc.3.7.dylib 0x00000001117f3a5b ex_open_int + >>>>> 171 (ex_open.c:259) >>>>> 16 libpetsc.3.7.dylib 0x0000000110c36609 >>>>> DMPlexCreateExodusFromFile + 781 (plexexodusii.c:43) >>>>> 17 DMPlexCreateExodusFromFile 0x000000010fed4cfc main + 397 >>>>> (DMPlexCreateExodusFromFile.cpp:24) >>>>> 18 libdyld.dylib 0x00007fffad78a255 start + 1 >>>>> >>>> >>>> This is a NetCDF error on ex_open_int(). My guess is that your NetCDF >>>> build is old and when it calls the system DLL >>>> you bomb. Can you do a completely new build, meaning either reclone >>>> PETSc somewhere else, or delete the whole >>>> $PETSC_DIR/$PETSC_ARCH/externalpackage directory and reconfigure/build? >>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>> >>> Hi Matt, >>> >>> Thanks for reply. I recloned PETSc (the old petsc folder is deleted >>> completely) and reconfigure. And still has the same issue. I also checked >>> if the binary is complied against any other netcdf. The binary is actually >>> complied against the right netcdf installed through PETSc. >>> >> >> You can see that this crash happens on the call to >> >> int CPU_word_size = 0, IO_word_size = 0, exoid = -1; >> float version; >> >> exoid = ex_open(filename, EX_READ, &CPU_word_size, &IO_word_size, >> &version); >> >> which means the fault is not in PETSc, but rather in ExodusII for your >> machine. 
We could definitely >> confirm this if you made a 5 line program that only called this, but I >> don't see why it should be different. >> I am not sure what to do, since I am not in control of anything about >> ExodusII. Can you report this to >> their dev team? It is strange since Blaise has not reported this, and I >> know he uses it all the time. >> >> Thanks, >> >> Matt >> >> >>> *LiviadeMacBook-Pro:partition livia$ otool -L >>> DMPlexCreateExodusFromFile* >>> *DMPlexCreateExodusFromFile:* >>> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libpetsc.3.7.dylib >>> (compatibility version 3.7.0, current version 3.7.5)* >>> * >>> /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libsuperlu_dist.5.dylib >>> (compatibility version 5.0.0, current version 5.1.3)* >>> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libparmetis.dylib >>> (compatibility version 0.0.0, current version 0.0.0)* >>> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libmetis.dylib >>> (compatibility version 0.0.0, current version 0.0.0)* >>> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libnetcdf.7.dylib >>> (compatibility version 10.0.0, current version 10.0.0)* >>> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libhdf5_hl.8.dylib >>> (compatibility version 9.0.0, current version 9.1.0)* >>> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libhdf5.8.dylib >>> (compatibility version 9.0.0, current version 9.1.0)* >>> * /opt/X11/lib/libX11.6.dylib (compatibility version 10.0.0, current >>> version 10.0.0)* >>> * /Users/livia/math/mpich-3.2_install/lib/libmpifort.12.dylib >>> (compatibility version 14.0.0, current version 14.0.0)* >>> * /usr/local/opt/gcc at 5/lib/gcc/5/libgfortran.3.dylib (compatibility >>> version 4.0.0, current version 4.0.0)* >>> * /usr/local/opt/gcc at 5/lib/gcc/5/libquadmath.0.dylib (compatibility >>> version 1.0.0, current version 1.0.0)* >>> * /Users/livia/math/mpich-3.2_install/lib/libmpicxx.12.dylib >>> (compatibility version 14.0.0, current 
version 14.0.0)* >>> * /usr/local/opt/gcc at 5/lib/gcc/5/libstdc++.6.dylib (compatibility >>> version 7.0.0, current version 7.21.0)* >>> * /usr/local/opt/gcc at 5/lib/gcc/5/libgcc_s.1.dylib (compatibility version >>> 1.0.0, current version 1.0.0)* >>> * /Users/livia/math/mpich-3.2_install/lib/libmpi.12.dylib (compatibility >>> version 14.0.0, current version 14.0.0)* >>> * /Users/livia/math/mpich-3.2_install/lib/libpmpi.12.dylib >>> (compatibility version 14.0.0, current version 14.0.0)* >>> * /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current >>> version 1238.0.0)* >>> * /usr/local/lib/gcc/5/libgcc_s.1.dylib (compatibility version 1.0.0, >>> current version 1.0.0)* >>> >>> >>> >>> >>> >>>> >>>>> >>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> *static char help[] = " create mesh from exodus.\n\n";* >>>>>>> >>>>>>> *#include * >>>>>>> *#include * >>>>>>> >>>>>>> *#undef __FUNCT__* >>>>>>> *#define __FUNCT__ "main"* >>>>>>> *int main(int argc,char **argv)* >>>>>>> *{* >>>>>>> * char fineMeshFileName[2048];* >>>>>>> * DM dm;* >>>>>>> * MPI_Comm comm;* >>>>>>> * PetscBool flg;* >>>>>>> >>>>>>> * PetscErrorCode ierr;* >>>>>>> >>>>>>> * ierr = PetscInitialize(&argc,&argv,(char *)0,help);CHKERRQ(ierr);* >>>>>>> * comm = PETSC_COMM_WORLD;* >>>>>>> * ierr = >>>>>>> PetscOptionsGetString(NULL,NULL,"-file",fineMeshFileName,sizeof(fineMeshFileName),&flg);CHKERRQ(ierr);* >>>>>>> * if(!flg){* >>>>>>> * SETERRQ(comm,PETSC_ERR_ARG_NULL,"please specify a fine mesh file >>>>>>> \n");* >>>>>>> * }* >>>>>>> * ierr = DMPlexCreateExodusFromFile( comm,fineMeshFileName, >>>>>>> PETSC_FALSE, &dm);CHKERRQ(ierr);* >>>>>>> * ierr = DMDestroy(&dm);CHKERRQ(ierr);* >>>>>>> * ierr = PetscFinalize();CHKERRQ(ierr);* >>>>>>> *}* >>>>>>> >>>>>>> >>>>>>> *LiviadeMacBook-Pro:partition livia$ ./DMPlexCreateExodusFromFile >>>>>>> -file Tri3.exo * >>>>>>> *[0]PETSC ERROR: >>>>>>> ------------------------------------------------------------------------* >>>>>>> *[0]PETSC ERROR: Caught signal number 
11 SEGV: Segmentation >>>>>>> Violation, probably memory access out of range* >>>>>>> *[0]PETSC ERROR: Try option -start_in_debugger or >>>>>>> -on_error_attach_debugger* >>>>>>> *[0]PETSC ERROR: or see >>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind >>>>>>> * >>>>>>> *[0]PETSC ERROR: or try http://valgrind.org on >>>>>>> GNU/linux and Apple Mac OS X to find memory corruption errors* >>>>>>> *[0]PETSC ERROR: likely location of problem given in stack below* >>>>>>> *[0]PETSC ERROR: --------------------- Stack Frames >>>>>>> ------------------------------------* >>>>>>> *[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>>>>> available,* >>>>>>> *[0]PETSC ERROR: INSTEAD the line number of the start of the >>>>>>> function* >>>>>>> *[0]PETSC ERROR: is given.* >>>>>>> *[0]PETSC ERROR: [0] DMPlexCreateExodusFromFile line 38 >>>>>>> /Users/livia/math/petsc/src/dm/impls/plex/plexexodusii.c* >>>>>>> *[0]PETSC ERROR: --------------------- Error Message >>>>>>> --------------------------------------------------------------* >>>>>>> *[0]PETSC ERROR: Signal received* >>>>>>> *[0]PETSC ERROR: See >>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html >>>>>>> for trouble shooting.* >>>>>>> *[0]PETSC ERROR: Petsc Release Version 3.7.5, unknown * >>>>>>> *[0]PETSC ERROR: ./DMPlexCreateExodusFromFile on a >>>>>>> arch-darwin-cxx-debug named LiviadeMacBook-Pro.local by livia Sat Jan 21 >>>>>>> 21:04:22 2017* >>>>>>> *[0]PETSC ERROR: Configure options --with-clanguage=cxx >>>>>>> --with-shared-libraries=1 --download-fblaslapack=1 --with-mpi=1 >>>>>>> --download-parmetis=1 --download-metis=1 --download-netcdf=1 >>>>>>> --download-exodusii=1 --download-hdf5=1 --with-debugging=yes >>>>>>> --with-c2html=0 --download-hypre=1 --with-64-bit-indices=1 >>>>>>> --download-superlu_dist=1 PETSC_ARCH=arch-darwin-cxx-debug* >>>>>>> *[0]PETSC ERROR: #1 User provided function() line 0 in unknown file* >>>>>>> *application called MPI_Abort(MPI_COMM_WORLD, 
59) - process 0* >>>>>>> *[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=59* >>>>>>> *:* >>>>>>> *system msg for write_line failure : Bad file descriptor* >>>>>>> >>>>>>> >>>>>>> The log files of make and configuration are also attached. If you >>>>>>> have any idea on this issue, please let me know! >>>>>>> >>>>>>> Fande Kong, >>>>>>> >>>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> -- Norbert Wiener >>>>>> >>>>> >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From cpraveen at gmail.com Sun Jan 22 23:16:53 2017 From: cpraveen at gmail.com (Praveen C) Date: Mon, 23 Jan 2017 10:46:53 +0530 Subject: [petsc-users] Application context in fortran Message-ID: Hello With snes/ts, we use an "application context to contain data needed by the application-provided call-back routines, FormJacobian() and FormFunction()". This can be a struct in the C examples. What can I use in case of fortran ? Can I use a module to pass the data needed by the call-back routines ? Thanks praveen -------------- next part -------------- An HTML attachment was scrubbed... 
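For context on the question above: in the C examples, the "application context" works because the library stores an opaque `void *` and passes it back, unchanged, to the user callback. Below is a minimal PETSc-free sketch of that pattern (the names `AppCtx`, `form_function`, and `call_with_ctx` are hypothetical stand-ins; in real PETSc the context is the final argument of calls such as `SNESSetFunction()`):

```c
#include <assert.h>

/* Callback shape: the library hands the context back as void*.
 * This mimics how FormFunction() receives its user context. */
typedef int (*FormFunc)(double x, double *f, void *ctx);

typedef struct {
    double shift;   /* application data the callback needs */
} AppCtx;

static int form_function(double x, double *f, void *ctx)
{
    AppCtx *user = (AppCtx *)ctx;   /* recover the typed context */
    *f = x + user->shift;
    return 0;
}

/* Stands in for the solver: it only ever sees a void*, never AppCtx. */
static int call_with_ctx(FormFunc fn, double x, double *f, void *ctx)
{
    return fn(x, f, ctx);
}
```

On the Fortran side the usual analogue is to pass a user-defined derived type (or rely on module data) in place of the C struct; the list's eventual reply is not in this excerpt, so treat that as the general pattern rather than the definitive answer.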
URL: From fdkong.jd at gmail.com Sun Jan 22 23:18:32 2017 From: fdkong.jd at gmail.com (Fande Kong) Date: Sun, 22 Jan 2017 22:18:32 -0700 Subject: [petsc-users] DMPlexCreateExodusFromFile() does not work on macOS Sierra In-Reply-To: References: Message-ID: Thanks, Matt, Clang does not have this issue. The code runs fine with clang. Fande, On Sun, Jan 22, 2017 at 8:03 PM, Matthew Knepley wrote: > On Sun, Jan 22, 2017 at 8:40 PM, Fande Kong wrote: > >> Thanks, Matt. >> >> It is a weird bug. >> >> Do we have an alternative solution to this? I was wondering whether it is >> possible to read the ".exo" files without using the ExodusII. For example, >> can we read the ".exo" files using the netcdf only? >> > > Well, ExodusII is only a think layer on NetCDF, just like other wrappers > are thin layers on HDF5. It is > really NetCDF that is failing. Can you switch compilers and see if that > helps? > > Matt > > >> Fande Kong, >> >> >> >> On Sun, Jan 22, 2017 at 6:50 PM, Matthew Knepley >> wrote: >> >>> On Sun, Jan 22, 2017 at 5:28 PM, Fande Kong wrote: >>> >>>> On Sun, Jan 22, 2017 at 12:35 PM, Matthew Knepley >>>> wrote: >>>> >>>>> On Sun, Jan 22, 2017 at 12:41 PM, Fande Kong >>>>> wrote: >>>>> >>>>>> On Sat, Jan 21, 2017 at 10:47 PM, Matthew Knepley >>>>>> wrote: >>>>>> >>>>>>> On Sat, Jan 21, 2017 at 10:38 PM, Fande Kong >>>>>>> wrote: >>>>>>> >>>>>>>> Hi All, >>>>>>>> >>>>>>>> I upgraded the OS system to macOS Sierra, and observed that PETSc >>>>>>>> can not read the exodus file any more. The same code runs fine on macOS >>>>>>>> Capitan. I also tested the function DMPlexCreateExodusFromFile() against >>>>>>>> different versions of the GCC compiler such as GCC-5.4 and GCC-6, and >>>>>>>> neither of them work. 
I guess this issue is related to the external package >>>>>>>> *exodus*, and PETSc might not pick up the right enveriment >>>>>>>> variables for the *exodus.* >>>>>>>> >>>>>>>> This issue can be reproduced using the following simple code: >>>>>>>> >>>>>>> >>>>>>> 1) This is just a standard check. Have you reconfigured so that you >>>>>>> know ExodusII was built with the same compilers and system libraries? >>>>>>> >>>>>>> 2) If so, can you get a stack trace with gdb or lldb? >>>>>>> >>>>>> >>>>>> 0 libsystem_kernel.dylib 0x00007fffad8b8dda __pthread_kill >>>>>> + 10 >>>>>> 1 libsystem_pthread.dylib 0x00007fffad9a4787 pthread_kill + >>>>>> 90 >>>>>> 2 libsystem_c.dylib 0x00007fffad81e420 abort + 129 >>>>>> 3 libpetsc.3.7.dylib 0x00000001100eb9ee >>>>>> PetscAbortErrorHandler + 506 (errstop.c:40) >>>>>> 4 libpetsc.3.7.dylib 0x00000001100e631d PetscError + >>>>>> 916 (err.c:379) >>>>>> 5 libpetsc.3.7.dylib 0x00000001100ed830 >>>>>> PetscSignalHandlerDefault + 1927 (signal.c:160) >>>>>> 6 libpetsc.3.7.dylib 0x00000001100ed088 >>>>>> PetscSignalHandler_Private(int) + 630 (signal.c:49) >>>>>> 7 libsystem_platform.dylib 0x00007fffad997bba _sigtramp + 26 >>>>>> 8 ??? 
0x000000011ea09370 >>>>>> initialPoolContent + 19008 >>>>>> 9 libnetcdf.7.dylib 0x000000011228fc62 utf8proc_map + >>>>>> 210 (dutf8proc.c:543) >>>>>> 10 libnetcdf.7.dylib 0x000000011228fd0f utf8proc_NFC + >>>>>> 38 (dutf8proc.c:568) >>>>>> 11 libnetcdf.7.dylib 0x00000001122a7928 NC_findattr + >>>>>> 110 (attr.c:341) >>>>>> 12 libnetcdf.7.dylib 0x00000001122a7a4e NC_lookupattr + >>>>>> 119 (attr.c:384) >>>>>> 13 libnetcdf.7.dylib 0x00000001122a93ef NC3_get_att + >>>>>> 47 (attr.c:1138) >>>>>> 14 libnetcdf.7.dylib 0x0000000112286126 >>>>>> nc_get_att_float + 90 (dattget.c:192) >>>>>> 15 libpetsc.3.7.dylib 0x00000001117f3a5b ex_open_int + >>>>>> 171 (ex_open.c:259) >>>>>> 16 libpetsc.3.7.dylib 0x0000000110c36609 >>>>>> DMPlexCreateExodusFromFile + 781 (plexexodusii.c:43) >>>>>> 17 DMPlexCreateExodusFromFile 0x000000010fed4cfc main + 397 >>>>>> (DMPlexCreateExodusFromFile.cpp:24) >>>>>> 18 libdyld.dylib 0x00007fffad78a255 start + 1 >>>>>> >>>>> >>>>> This is a NetCDF error on ex_open_int(). My guess is that your NetCDF >>>>> build is old and when it calls the system DLL >>>>> you bomb. Can you do a completely new build, meaning either reclone >>>>> PETSc somewhere else, or delete the whole >>>>> $PETSC_DIR/$PETSC_ARCH/externalpackage directory and >>>>> reconfigure/build? >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>> >>>> Hi Matt, >>>> >>>> Thanks for reply. I recloned PETSc (the old petsc folder is deleted >>>> completely) and reconfigure. And still has the same issue. I also checked >>>> if the binary is complied against any other netcdf. The binary is actually >>>> complied against the right netcdf installed through PETSc. >>>> >>> >>> You can see that this crash happens on the call to >>> >>> int CPU_word_size = 0, IO_word_size = 0, exoid = -1; >>> float version; >>> >>> exoid = ex_open(filename, EX_READ, &CPU_word_size, &IO_word_size, >>> &version); >>> >>> which means the fault is not in PETSc, but rather in ExodusII for your >>> machine. 
We could definitely >>> confirm this if you made a 5 line program that only called this, but I >>> don't see why it should be different. >>> I am not sure what to do, since I am not in control of anything about >>> ExodusII. Can you report this to >>> their dev team? It is strange since Blaise has not reported this, and I >>> know he uses it all the time. >>> >>> Thanks, >>> >>> Matt >>> >>> >>>> *LiviadeMacBook-Pro:partition livia$ otool -L >>>> DMPlexCreateExodusFromFile* >>>> *DMPlexCreateExodusFromFile:* >>>> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libpetsc.3.7.dylib >>>> (compatibility version 3.7.0, current version 3.7.5)* >>>> * >>>> /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libsuperlu_dist.5.dylib >>>> (compatibility version 5.0.0, current version 5.1.3)* >>>> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libparmetis.dylib >>>> (compatibility version 0.0.0, current version 0.0.0)* >>>> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libmetis.dylib >>>> (compatibility version 0.0.0, current version 0.0.0)* >>>> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libnetcdf.7.dylib >>>> (compatibility version 10.0.0, current version 10.0.0)* >>>> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libhdf5_hl.8.dylib >>>> (compatibility version 9.0.0, current version 9.1.0)* >>>> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libhdf5.8.dylib >>>> (compatibility version 9.0.0, current version 9.1.0)* >>>> * /opt/X11/lib/libX11.6.dylib (compatibility version 10.0.0, current >>>> version 10.0.0)* >>>> * /Users/livia/math/mpich-3.2_install/lib/libmpifort.12.dylib >>>> (compatibility version 14.0.0, current version 14.0.0)* >>>> * /usr/local/opt/gcc at 5/lib/gcc/5/libgfortran.3.dylib (compatibility >>>> version 4.0.0, current version 4.0.0)* >>>> * /usr/local/opt/gcc at 5/lib/gcc/5/libquadmath.0.dylib (compatibility >>>> version 1.0.0, current version 1.0.0)* >>>> * /Users/livia/math/mpich-3.2_install/lib/libmpicxx.12.dylib >>>> 
(compatibility version 14.0.0, current version 14.0.0)* >>>> * /usr/local/opt/gcc at 5/lib/gcc/5/libstdc++.6.dylib (compatibility >>>> version 7.0.0, current version 7.21.0)* >>>> * /usr/local/opt/gcc at 5/lib/gcc/5/libgcc_s.1.dylib (compatibility >>>> version 1.0.0, current version 1.0.0)* >>>> * /Users/livia/math/mpich-3.2_install/lib/libmpi.12.dylib >>>> (compatibility version 14.0.0, current version 14.0.0)* >>>> * /Users/livia/math/mpich-3.2_install/lib/libpmpi.12.dylib >>>> (compatibility version 14.0.0, current version 14.0.0)* >>>> * /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current >>>> version 1238.0.0)* >>>> * /usr/local/lib/gcc/5/libgcc_s.1.dylib (compatibility version 1.0.0, >>>> current version 1.0.0)* >>>> >>>> >>>> >>>> >>>> >>>>> >>>>>> >>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>>>> *static char help[] = " create mesh from exodus.\n\n";* >>>>>>>> >>>>>>>> *#include * >>>>>>>> *#include * >>>>>>>> >>>>>>>> *#undef __FUNCT__* >>>>>>>> *#define __FUNCT__ "main"* >>>>>>>> *int main(int argc,char **argv)* >>>>>>>> *{* >>>>>>>> * char fineMeshFileName[2048];* >>>>>>>> * DM dm;* >>>>>>>> * MPI_Comm comm;* >>>>>>>> * PetscBool flg;* >>>>>>>> >>>>>>>> * PetscErrorCode ierr;* >>>>>>>> >>>>>>>> * ierr = PetscInitialize(&argc,&argv,(char >>>>>>>> *)0,help);CHKERRQ(ierr);* >>>>>>>> * comm = PETSC_COMM_WORLD;* >>>>>>>> * ierr = >>>>>>>> PetscOptionsGetString(NULL,NULL,"-file",fineMeshFileName,sizeof(fineMeshFileName),&flg);CHKERRQ(ierr);* >>>>>>>> * if(!flg){* >>>>>>>> * SETERRQ(comm,PETSC_ERR_ARG_NULL,"please specify a fine mesh file >>>>>>>> \n");* >>>>>>>> * }* >>>>>>>> * ierr = DMPlexCreateExodusFromFile( comm,fineMeshFileName, >>>>>>>> PETSC_FALSE, &dm);CHKERRQ(ierr);* >>>>>>>> * ierr = DMDestroy(&dm);CHKERRQ(ierr);* >>>>>>>> * ierr = PetscFinalize();CHKERRQ(ierr);* >>>>>>>> *}* >>>>>>>> >>>>>>>> >>>>>>>> *LiviadeMacBook-Pro:partition livia$ ./DMPlexCreateExodusFromFile >>>>>>>> -file Tri3.exo * >>>>>>>> *[0]PETSC ERROR: >>>>>>>> 
------------------------------------------------------------------------* >>>>>>>> *[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>>> Violation, probably memory access out of range* >>>>>>>> *[0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>> -on_error_attach_debugger* >>>>>>>> *[0]PETSC ERROR: or see >>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind >>>>>>>> * >>>>>>>> *[0]PETSC ERROR: or try http://valgrind.org >>>>>>>> on GNU/linux and Apple Mac OS X to find memory corruption errors* >>>>>>>> *[0]PETSC ERROR: likely location of problem given in stack below* >>>>>>>> *[0]PETSC ERROR: --------------------- Stack Frames >>>>>>>> ------------------------------------* >>>>>>>> *[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>>>>>> available,* >>>>>>>> *[0]PETSC ERROR: INSTEAD the line number of the start of the >>>>>>>> function* >>>>>>>> *[0]PETSC ERROR: is given.* >>>>>>>> *[0]PETSC ERROR: [0] DMPlexCreateExodusFromFile line 38 >>>>>>>> /Users/livia/math/petsc/src/dm/impls/plex/plexexodusii.c* >>>>>>>> *[0]PETSC ERROR: --------------------- Error Message >>>>>>>> --------------------------------------------------------------* >>>>>>>> *[0]PETSC ERROR: Signal received* >>>>>>>> *[0]PETSC ERROR: See >>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html >>>>>>>> for trouble shooting.* >>>>>>>> *[0]PETSC ERROR: Petsc Release Version 3.7.5, unknown * >>>>>>>> *[0]PETSC ERROR: ./DMPlexCreateExodusFromFile on a >>>>>>>> arch-darwin-cxx-debug named LiviadeMacBook-Pro.local by livia Sat Jan 21 >>>>>>>> 21:04:22 2017* >>>>>>>> *[0]PETSC ERROR: Configure options --with-clanguage=cxx >>>>>>>> --with-shared-libraries=1 --download-fblaslapack=1 --with-mpi=1 >>>>>>>> --download-parmetis=1 --download-metis=1 --download-netcdf=1 >>>>>>>> --download-exodusii=1 --download-hdf5=1 --with-debugging=yes >>>>>>>> --with-c2html=0 --download-hypre=1 --with-64-bit-indices=1 >>>>>>>> --download-superlu_dist=1 
PETSC_ARCH=arch-darwin-cxx-debug* >>>>>>>> *[0]PETSC ERROR: #1 User provided function() line 0 in unknown >>>>>>>> file* >>>>>>>> *application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0* >>>>>>>> *[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=59* >>>>>>>> *:* >>>>>>>> *system msg for write_line failure : Bad file descriptor* >>>>>>>> >>>>>>>> >>>>>>>> The log files of make and configuration are also attached. If you >>>>>>>> have any idea on this issue, please let me know! >>>>>>>> >>>>>>>> Fande Kong, >>>>>>>> >>>>>>>> >>>>>>> >>>>>>> >>>>>>> -- >>>>>>> What most experimenters take for granted before they begin their >>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>> experiments lead. >>>>>>> -- Norbert Wiener >>>>>>> >>>>>> >>>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From cpraveen at gmail.com Sun Jan 22 23:23:08 2017 From: cpraveen at gmail.com (Praveen C) Date: Mon, 23 Jan 2017 10:53:08 +0530 Subject: [petsc-users] Application context in fortran In-Reply-To: References: Message-ID: Can I pass an object of fortran "type" as the application context ? 
Thanks praveen On Mon, Jan 23, 2017 at 10:46 AM, Praveen C wrote: > Hello > > With snes/ts, we use an "application context to contain data needed by > the application-provided call-back routines, FormJacobian() > and FormFunction()". This can be a struct in the C examples. What can I use > in case of Fortran? Can I use a module to pass the data needed by the > call-back routines? > > Thanks > praveen > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Sun Jan 22 23:36:53 2017 From: knepley at gmail.com (Matthew Knepley) Date: Sun, 22 Jan 2017 23:36:53 -0600 Subject: [petsc-users] DMPlexCreateExodusFromFile() does not work on macOS Sierra In-Reply-To: References: Message-ID: On Sun, Jan 22, 2017 at 11:18 PM, Fande Kong wrote: > Thanks, Matt, > > Clang does not have this issue. The code runs fine with clang. > Okay, it sounds like a gcc bug on macOS Sierra (10.12), or at least in the version you have. Matt > Fande, > > On Sun, Jan 22, 2017 at 8:03 PM, Matthew Knepley > wrote: > >> On Sun, Jan 22, 2017 at 8:40 PM, Fande Kong wrote: >> >>> Thanks, Matt. >>> >>> It is a weird bug. >>> >>> Do we have an alternative solution to this? I was wondering whether it >>> is possible to read the ".exo" files without using the ExodusII. For >>> example, can we read the ".exo" files using the netcdf only? >>> >> >> Well, ExodusII is only a thin layer on NetCDF, just like other wrappers >> are thin layers on HDF5. It is >> really NetCDF that is failing. Can you switch compilers and see if that >> helps?
>> >> Matt >> >> >>> Fande Kong, >>> >>> >>> >>> On Sun, Jan 22, 2017 at 6:50 PM, Matthew Knepley >>> wrote: >>> >>>> On Sun, Jan 22, 2017 at 5:28 PM, Fande Kong >>>> wrote: >>>> >>>>> On Sun, Jan 22, 2017 at 12:35 PM, Matthew Knepley >>>>> wrote: >>>>> >>>>>> On Sun, Jan 22, 2017 at 12:41 PM, Fande Kong >>>>>> wrote: >>>>>> >>>>>>> On Sat, Jan 21, 2017 at 10:47 PM, Matthew Knepley >>>>>> > wrote: >>>>>>> >>>>>>>> On Sat, Jan 21, 2017 at 10:38 PM, Fande Kong >>>>>>>> wrote: >>>>>>>> >>>>>>>>> Hi All, >>>>>>>>> >>>>>>>>> I upgraded the OS system to macOS Sierra, and observed that PETSc >>>>>>>>> can not read the exodus file any more. The same code runs fine on macOS >>>>>>>>> Capitan. I also tested the function DMPlexCreateExodusFromFile() against >>>>>>>>> different versions of the GCC compiler such as GCC-5.4 and GCC-6, and >>>>>>>>> neither of them work. I guess this issue is related to the external package >>>>>>>>> *exodus*, and PETSc might not pick up the right enveriment >>>>>>>>> variables for the *exodus.* >>>>>>>>> >>>>>>>>> This issue can be reproduced using the following simple code: >>>>>>>>> >>>>>>>> >>>>>>>> 1) This is just a standard check. Have you reconfigured so that you >>>>>>>> know ExodusII was built with the same compilers and system libraries? >>>>>>>> >>>>>>>> 2) If so, can you get a stack trace with gdb or lldb? 
>>>>>>>> >>>>>>> >>>>>>> 0 libsystem_kernel.dylib 0x00007fffad8b8dda >>>>>>> __pthread_kill + 10 >>>>>>> 1 libsystem_pthread.dylib 0x00007fffad9a4787 pthread_kill + >>>>>>> 90 >>>>>>> 2 libsystem_c.dylib 0x00007fffad81e420 abort + 129 >>>>>>> 3 libpetsc.3.7.dylib 0x00000001100eb9ee >>>>>>> PetscAbortErrorHandler + 506 (errstop.c:40) >>>>>>> 4 libpetsc.3.7.dylib 0x00000001100e631d PetscError + >>>>>>> 916 (err.c:379) >>>>>>> 5 libpetsc.3.7.dylib 0x00000001100ed830 >>>>>>> PetscSignalHandlerDefault + 1927 (signal.c:160) >>>>>>> 6 libpetsc.3.7.dylib 0x00000001100ed088 >>>>>>> PetscSignalHandler_Private(int) + 630 (signal.c:49) >>>>>>> 7 libsystem_platform.dylib 0x00007fffad997bba _sigtramp + 26 >>>>>>> 8 ??? 0x000000011ea09370 >>>>>>> initialPoolContent + 19008 >>>>>>> 9 libnetcdf.7.dylib 0x000000011228fc62 utf8proc_map + >>>>>>> 210 (dutf8proc.c:543) >>>>>>> 10 libnetcdf.7.dylib 0x000000011228fd0f utf8proc_NFC + >>>>>>> 38 (dutf8proc.c:568) >>>>>>> 11 libnetcdf.7.dylib 0x00000001122a7928 NC_findattr + >>>>>>> 110 (attr.c:341) >>>>>>> 12 libnetcdf.7.dylib 0x00000001122a7a4e NC_lookupattr >>>>>>> + 119 (attr.c:384) >>>>>>> 13 libnetcdf.7.dylib 0x00000001122a93ef NC3_get_att + >>>>>>> 47 (attr.c:1138) >>>>>>> 14 libnetcdf.7.dylib 0x0000000112286126 >>>>>>> nc_get_att_float + 90 (dattget.c:192) >>>>>>> 15 libpetsc.3.7.dylib 0x00000001117f3a5b ex_open_int + >>>>>>> 171 (ex_open.c:259) >>>>>>> 16 libpetsc.3.7.dylib 0x0000000110c36609 >>>>>>> DMPlexCreateExodusFromFile + 781 (plexexodusii.c:43) >>>>>>> 17 DMPlexCreateExodusFromFile 0x000000010fed4cfc main + 397 >>>>>>> (DMPlexCreateExodusFromFile.cpp:24) >>>>>>> 18 libdyld.dylib 0x00007fffad78a255 start + 1 >>>>>>> >>>>>> >>>>>> This is a NetCDF error on ex_open_int(). My guess is that your NetCDF >>>>>> build is old and when it calls the system DLL >>>>>> you bomb. 
Can you do a completely new build, meaning either reclone >>>>>> PETSc somewhere else, or delete the whole >>>>>> $PETSC_DIR/$PETSC_ARCH/externalpackage directory and >>>>>> reconfigure/build? >>>>>> >>>>>> Thanks, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>> >>>>> Hi Matt, >>>>> >>>>> Thanks for reply. I recloned PETSc (the old petsc folder is deleted >>>>> completely) and reconfigure. And still has the same issue. I also checked >>>>> if the binary is complied against any other netcdf. The binary is actually >>>>> complied against the right netcdf installed through PETSc. >>>>> >>>> >>>> You can see that this crash happens on the call to >>>> >>>> int CPU_word_size = 0, IO_word_size = 0, exoid = -1; >>>> float version; >>>> >>>> exoid = ex_open(filename, EX_READ, &CPU_word_size, &IO_word_size, >>>> &version); >>>> >>>> which means the fault is not in PETSc, but rather in ExodusII for your >>>> machine. We could definitely >>>> confirm this if you made a 5 line program that only called this, but I >>>> don't see why it should be different. >>>> I am not sure what to do, since I am not in control of anything about >>>> ExodusII. Can you report this to >>>> their dev team? It is strange since Blaise has not reported this, and I >>>> know he uses it all the time. 
>>>> >>>> Thanks, >>>> >>>> Matt >>>> >>>> >>>>> *LiviadeMacBook-Pro:partition livia$ otool -L >>>>> DMPlexCreateExodusFromFile* >>>>> *DMPlexCreateExodusFromFile:* >>>>> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libpetsc.3.7.dylib >>>>> (compatibility version 3.7.0, current version 3.7.5)* >>>>> * >>>>> /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libsuperlu_dist.5.dylib >>>>> (compatibility version 5.0.0, current version 5.1.3)* >>>>> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libparmetis.dylib >>>>> (compatibility version 0.0.0, current version 0.0.0)* >>>>> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libmetis.dylib >>>>> (compatibility version 0.0.0, current version 0.0.0)* >>>>> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libnetcdf.7.dylib >>>>> (compatibility version 10.0.0, current version 10.0.0)* >>>>> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libhdf5_hl.8.dylib >>>>> (compatibility version 9.0.0, current version 9.1.0)* >>>>> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libhdf5.8.dylib >>>>> (compatibility version 9.0.0, current version 9.1.0)* >>>>> * /opt/X11/lib/libX11.6.dylib (compatibility version 10.0.0, current >>>>> version 10.0.0)* >>>>> * /Users/livia/math/mpich-3.2_install/lib/libmpifort.12.dylib >>>>> (compatibility version 14.0.0, current version 14.0.0)* >>>>> * /usr/local/opt/gcc at 5/lib/gcc/5/libgfortran.3.dylib (compatibility >>>>> version 4.0.0, current version 4.0.0)* >>>>> * /usr/local/opt/gcc at 5/lib/gcc/5/libquadmath.0.dylib (compatibility >>>>> version 1.0.0, current version 1.0.0)* >>>>> * /Users/livia/math/mpich-3.2_install/lib/libmpicxx.12.dylib >>>>> (compatibility version 14.0.0, current version 14.0.0)* >>>>> * /usr/local/opt/gcc at 5/lib/gcc/5/libstdc++.6.dylib (compatibility >>>>> version 7.0.0, current version 7.21.0)* >>>>> * /usr/local/opt/gcc at 5/lib/gcc/5/libgcc_s.1.dylib (compatibility >>>>> version 1.0.0, current version 1.0.0)* >>>>> * 
/Users/livia/math/mpich-3.2_install/lib/libmpi.12.dylib >>>>> (compatibility version 14.0.0, current version 14.0.0)* >>>>> * /Users/livia/math/mpich-3.2_install/lib/libpmpi.12.dylib >>>>> (compatibility version 14.0.0, current version 14.0.0)* >>>>> * /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current >>>>> version 1238.0.0)* >>>>> * /usr/local/lib/gcc/5/libgcc_s.1.dylib (compatibility version 1.0.0, >>>>> current version 1.0.0)* >>>>> >>>>> >>>>> >>>>> >>>>> >>>>>> >>>>>>> >>>>>> >>>>>>>> Matt >>>>>>>> >>>>>>>> >>>>>>>>> *static char help[] = " create mesh from exodus.\n\n";* >>>>>>>>> >>>>>>>>> *#include * >>>>>>>>> *#include * >>>>>>>>> >>>>>>>>> *#undef __FUNCT__* >>>>>>>>> *#define __FUNCT__ "main"* >>>>>>>>> *int main(int argc,char **argv)* >>>>>>>>> *{* >>>>>>>>> * char fineMeshFileName[2048];* >>>>>>>>> * DM dm;* >>>>>>>>> * MPI_Comm comm;* >>>>>>>>> * PetscBool flg;* >>>>>>>>> >>>>>>>>> * PetscErrorCode ierr;* >>>>>>>>> >>>>>>>>> * ierr = PetscInitialize(&argc,&argv,(char >>>>>>>>> *)0,help);CHKERRQ(ierr);* >>>>>>>>> * comm = PETSC_COMM_WORLD;* >>>>>>>>> * ierr = >>>>>>>>> PetscOptionsGetString(NULL,NULL,"-file",fineMeshFileName,sizeof(fineMeshFileName),&flg);CHKERRQ(ierr);* >>>>>>>>> * if(!flg){* >>>>>>>>> * SETERRQ(comm,PETSC_ERR_ARG_NULL,"please specify a fine mesh file >>>>>>>>> \n");* >>>>>>>>> * }* >>>>>>>>> * ierr = DMPlexCreateExodusFromFile( comm,fineMeshFileName, >>>>>>>>> PETSC_FALSE, &dm);CHKERRQ(ierr);* >>>>>>>>> * ierr = DMDestroy(&dm);CHKERRQ(ierr);* >>>>>>>>> * ierr = PetscFinalize();CHKERRQ(ierr);* >>>>>>>>> *}* >>>>>>>>> >>>>>>>>> >>>>>>>>> *LiviadeMacBook-Pro:partition livia$ ./DMPlexCreateExodusFromFile >>>>>>>>> -file Tri3.exo * >>>>>>>>> *[0]PETSC ERROR: >>>>>>>>> ------------------------------------------------------------------------* >>>>>>>>> *[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>>>> Violation, probably memory access out of range* >>>>>>>>> *[0]PETSC ERROR: Try option -start_in_debugger 
or >>>>>>>>> -on_error_attach_debugger* >>>>>>>>> *[0]PETSC ERROR: or see >>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind >>>>>>>>> * >>>>>>>>> *[0]PETSC ERROR: or try http://valgrind.org >>>>>>>>> on GNU/linux and Apple Mac OS X to find memory corruption errors* >>>>>>>>> *[0]PETSC ERROR: likely location of problem given in stack below* >>>>>>>>> *[0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>> ------------------------------------* >>>>>>>>> *[0]PETSC ERROR: Note: The EXACT line numbers in the stack are not >>>>>>>>> available,* >>>>>>>>> *[0]PETSC ERROR: INSTEAD the line number of the start of the >>>>>>>>> function* >>>>>>>>> *[0]PETSC ERROR: is given.* >>>>>>>>> *[0]PETSC ERROR: [0] DMPlexCreateExodusFromFile line 38 >>>>>>>>> /Users/livia/math/petsc/src/dm/impls/plex/plexexodusii.c* >>>>>>>>> *[0]PETSC ERROR: --------------------- Error Message >>>>>>>>> --------------------------------------------------------------* >>>>>>>>> *[0]PETSC ERROR: Signal received* >>>>>>>>> *[0]PETSC ERROR: See >>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html >>>>>>>>> for trouble shooting.* >>>>>>>>> *[0]PETSC ERROR: Petsc Release Version 3.7.5, unknown * >>>>>>>>> *[0]PETSC ERROR: ./DMPlexCreateExodusFromFile on a >>>>>>>>> arch-darwin-cxx-debug named LiviadeMacBook-Pro.local by livia Sat Jan 21 >>>>>>>>> 21:04:22 2017* >>>>>>>>> *[0]PETSC ERROR: Configure options --with-clanguage=cxx >>>>>>>>> --with-shared-libraries=1 --download-fblaslapack=1 --with-mpi=1 >>>>>>>>> --download-parmetis=1 --download-metis=1 --download-netcdf=1 >>>>>>>>> --download-exodusii=1 --download-hdf5=1 --with-debugging=yes >>>>>>>>> --with-c2html=0 --download-hypre=1 --with-64-bit-indices=1 >>>>>>>>> --download-superlu_dist=1 PETSC_ARCH=arch-darwin-cxx-debug* >>>>>>>>> *[0]PETSC ERROR: #1 User provided function() line 0 in unknown >>>>>>>>> file* >>>>>>>>> *application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0* >>>>>>>>> *[unset]: write_line error; 
fd=-1 buf=:cmd=abort exitcode=59* >>>>>>>>> *:* >>>>>>>>> *system msg for write_line failure : Bad file descriptor* >>>>>>>>> >>>>>>>>> >>>>>>>>> The log files of make and configuration are also attached. If you >>>>>>>>> have any idea on this issue, please let me know! >>>>>>>>> >>>>>>>>> Fande Kong, >>>>>>>>> >>>>>>>> >>>>>>> >>>>>> >>>>> >>>> >>> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From mailinglists at xgm.de Mon Jan 23 06:38:12 2017 From: mailinglists at xgm.de (Florian Lindner) Date: Mon, 23 Jan 2017 13:38:12 +0100 Subject: [petsc-users] Building petsc4py / mpi4py Message-ID: <67c1ebe7-5c06-2467-8cc5-d143d8037def@xgm.de> Hello, I am trying to build PETSc from the maint branch together with petsc4py and mpi4py: python2 configure --download-petsc4py=yes --download-mpi4py=yes --with-mpi4py=yes --with-petsc4py=yes --with-debugging=1 make works without errors, and so does make test. % echo $PYTHONPATH /home/florian/software/petsc/arch-linux2-c-debug/lib % ls $PYTHONPATH libpetsc.so libpetsc.so.3.7 libpetsc.so.3.7.5 mpi4py mpi4py-1.3.1-py3.6.egg-info petsc petsc4py petsc4py-3.7.0-py3.6.egg-info pkgconfig but: % python2 RBF_Load.py Traceback (most recent call last): File "RBF_Load.py", line 10, in petsc4py.init(sys.argv) File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/__init__.py", line 42, in init PETSc = petsc4py.lib.ImportPETSc(arch) File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/lib/__init__.py", line 29, in ImportPETSc return Import('petsc4py', 'PETSc', path, arch) File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/lib/__init__.py", line 63, in Import fo, fn, stuff = imp.find_module(name, pathlist) ImportError: No module named PETSc Does anyone have an idea what the problem could be here? 
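[Editorial aside, not part of the original thread: one possible culprit visible in the `ls $PYTHONPATH` listing above, offered as a guess rather than a confirmed diagnosis, is that the egg-info directories are tagged py3.6 while the script is invoked with python2. A minimal sketch of that version check; the egg-info names are hardcoded from the listing, and the `built_for` helper is hypothetical, not part of the petsc4py API:]

```python
import sys

# Package directory names copied from the `ls $PYTHONPATH` output above.
egg_infos = ["mpi4py-1.3.1-py3.6.egg-info", "petsc4py-3.7.0-py3.6.egg-info"]

def built_for(egg_info):
    """Return the pyX.Y tag an egg-info directory name was built against."""
    tag = egg_info.rsplit("-py", 1)[1]   # e.g. "3.6.egg-info"
    return tag.split(".egg-info")[0]     # e.g. "3.6"

# Compare against the interpreter actually running the script.
running = "%d.%d" % sys.version_info[:2]
for name in egg_infos:
    want = built_for(name)
    if want.split(".")[0] != running.split(".")[0]:
        print("%s targets Python %s, but this interpreter is %s"
              % (name, want, running))
```

If the major versions disagree, one thing worth trying might be rerunning configure under the interpreter the modules should serve (e.g. `python3 configure ...`), so that the built petsc4py matches the Python used at runtime.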
I have also attached the configure.log Best, Florian -------------- next part -------------- Pushing language C Popping language C Pushing language CUDA Popping language CUDA Pushing language Cxx Popping language Cxx Pushing language FC Popping language FC ================================================================================ ================================================================================ Starting Configure Run at Mon Jan 23 12:51:05 2017 Configure Options: --configModules=PETSc.Configure --optionsModule=config.compilerOptions --download-petsc4py=yes --download-mpi4py=yes --with-mpi4py=yes --with-petsc4py=yes --with-debugging=1 Working directory: /home/florian/software/petsc Machine platform: ('Linux', 'asaru', '4.8.13-1-ARCH', '#1 SMP PREEMPT Fri Dec 9 07:24:34 CET 2016', 'x86_64', '') Python version: 2.7.13 (default, Dec 21 2016, 07:16:46) [GCC 6.2.1 20160830] ================================================================================ Pushing language C Popping language C Pushing language CUDA Popping language CUDA Pushing language Cxx Popping language Cxx Pushing language FC Popping language FC ================================================================================ TEST configureExternalPackagesDir from config.framework(/home/florian/software/petsc/config/BuildSystem/config/framework.py:834) TESTING: configureExternalPackagesDir from config.framework(config/BuildSystem/config/framework.py:834) ================================================================================ TEST configureDebuggers from config.utilities.debuggers(/home/florian/software/petsc/config/BuildSystem/config/utilities/debuggers.py:22) TESTING: configureDebuggers from config.utilities.debuggers(config/BuildSystem/config/utilities/debuggers.py:22) Find a default debugger and determine its arguments Checking for program /home/florian/software/bin/gdb...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/gdb...not found Checking for 
program /usr/local/sbin/gdb...not found Checking for program /usr/local/bin/gdb...not found Checking for program /usr/bin/gdb...found Defined make macro "GDB" to "/usr/bin/gdb" Checking for program /home/florian/software/bin/dbx...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/dbx...not found Checking for program /usr/local/sbin/dbx...not found Checking for program /usr/local/bin/dbx...not found Checking for program /usr/bin/dbx...not found Checking for program /usr/lib/jvm/default/bin/dbx...not found Checking for program /opt/paraview/bin/dbx...not found Checking for program /usr/bin/site_perl/dbx...not found Checking for program /usr/bin/vendor_perl/dbx...not found Checking for program /usr/bin/core_perl/dbx...not found Checking for program /home/florian/dbx...not found Checking for program /home/florian/software/bin/xdb...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/xdb...not found Checking for program /usr/local/sbin/xdb...not found Checking for program /usr/local/bin/xdb...not found Checking for program /usr/bin/xdb...not found Checking for program /usr/lib/jvm/default/bin/xdb...not found Checking for program /opt/paraview/bin/xdb...not found Checking for program /usr/bin/site_perl/xdb...not found Checking for program /usr/bin/vendor_perl/xdb...not found Checking for program /usr/bin/core_perl/xdb...not found Checking for program /home/florian/xdb...not found Checking for program /home/florian/software/bin/dsymutil...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/dsymutil...not found Checking for program /usr/local/sbin/dsymutil...not found Checking for program /usr/local/bin/dsymutil...not found Checking for program /usr/bin/dsymutil...not found Checking for program /usr/lib/jvm/default/bin/dsymutil...not found Checking for program /opt/paraview/bin/dsymutil...not found Checking for program /usr/bin/site_perl/dsymutil...not found Checking for program /usr/bin/vendor_perl/dsymutil...not found Checking for 
program /usr/bin/core_perl/dsymutil...not found Checking for program /home/florian/dsymutil...not found Defined make macro "DSYMUTIL" to "true" Defined "USE_GDB_DEBUGGER" to "1" ================================================================================ TEST configureGit from config.sourceControl(/home/florian/software/petsc/config/BuildSystem/config/sourceControl.py:24) TESTING: configureGit from config.sourceControl(config/BuildSystem/config/sourceControl.py:24) Find the Git executable Checking for program /home/florian/software/bin/git...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/git...not found Checking for program /usr/local/sbin/git...not found Checking for program /usr/local/bin/git...not found Checking for program /usr/bin/git...found Defined make macro "GIT" to "git" Executing: git --version stdout: git version 2.11.0 ================================================================================ TEST configureMercurial from config.sourceControl(/home/florian/software/petsc/config/BuildSystem/config/sourceControl.py:35) TESTING: configureMercurial from config.sourceControl(config/BuildSystem/config/sourceControl.py:35) Find the Mercurial executable Checking for program /home/florian/software/bin/hg...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/hg...not found Checking for program /usr/local/sbin/hg...not found Checking for program /usr/local/bin/hg...not found Checking for program /usr/bin/hg...found Defined make macro "HG" to "hg" Executing: hg version -q stdout: Mercurial Distributed SCM (version 4.0.1) ================================================================================ TEST configureCLanguage from PETSc.options.languages(/home/florian/software/petsc/config/PETSc/options/languages.py:27) TESTING: configureCLanguage from PETSc.options.languages(config/PETSc/options/languages.py:27) Choose whether to compile the PETSc library using a C or C++ compiler C language is C Defined "CLANGUAGE_C" to "1" 
================================================================================ TEST configureDirectories from PETSc.options.petscdir(/home/florian/software/petsc/config/PETSc/options/petscdir.py:23) TESTING: configureDirectories from PETSc.options.petscdir(config/PETSc/options/petscdir.py:23) Checks PETSC_DIR and sets if not set Version Information: #define PETSC_VERSION_RELEASE 1 #define PETSC_VERSION_MAJOR 3 #define PETSC_VERSION_MINOR 7 #define PETSC_VERSION_SUBMINOR 5 #define PETSC_VERSION_PATCH 0 #define PETSC_VERSION_DATE "unknown" #define PETSC_VERSION_GIT "unknown" #define PETSC_VERSION_DATE_GIT "unknown" #define PETSC_VERSION_(MAJOR,MINOR,SUBMINOR) \ #define PETSC_VERSION_LT(MAJOR,MINOR,SUBMINOR) \ #define PETSC_VERSION_LE(MAJOR,MINOR,SUBMINOR) \ #define PETSC_VERSION_GT(MAJOR,MINOR,SUBMINOR) \ #define PETSC_VERSION_GE(MAJOR,MINOR,SUBMINOR) \ Defined make macro "DIR" to "/home/florian/software/petsc" ================================================================================ TEST getDatafilespath from PETSc.options.dataFilesPath(/home/florian/software/petsc/config/PETSc/options/dataFilesPath.py:29) TESTING: getDatafilespath from PETSc.options.dataFilesPath(config/PETSc/options/dataFilesPath.py:29) Checks what DATAFILESPATH should be ================================================================================ TEST configureInstallationMethod from PETSc.options.petscclone(/home/florian/software/petsc/config/PETSc/options/petscclone.py:20) TESTING: configureInstallationMethod from PETSc.options.petscclone(config/PETSc/options/petscclone.py:20) bin/maint exists. 
This appears to be a repository clone .git directory exists Executing: cd /home/florian/software/petsc && git describe stdout: v3.7.5-10-ga4629e9613 Executing: cd /home/florian/software/petsc && git log -1 --pretty=format:%H stdout: a4629e9613d49a420aa9124c29752e2ac7decb6f Executing: cd /home/florian/software/petsc && git log -1 --pretty=format:%ci stdout: 2017-01-19 08:56:29 -0600 Executing: cd /home/florian/software/petsc && git branch stdout: * maint Defined "VERSION_GIT" to ""v3.7.5-10-ga4629e9613"" Defined "VERSION_DATE_GIT" to ""2017-01-19 08:56:29 -0600"" Defined "VERSION_BRANCH_GIT" to ""maint"" ================================================================================ TEST configureArchitecture from PETSc.options.arch(/home/florian/software/petsc/config/PETSc/options/arch.py:25) TESTING: configureArchitecture from PETSc.options.arch(config/PETSc/options/arch.py:25) Checks PETSC_ARCH and sets if not set Defined "ARCH" to ""arch-linux2-c-debug"" ================================================================================ TEST setInstallDir from PETSc.options.installDir(/home/florian/software/petsc/config/PETSc/options/installDir.py:35) TESTING: setInstallDir from PETSc.options.installDir(config/PETSc/options/installDir.py:35) setup installDir to either prefix or if that is not set to PETSC_DIR/PETSC_ARCH ================================================================================ TEST saveReconfigure from PETSc.options.installDir(/home/florian/software/petsc/config/PETSc/options/installDir.py:74) TESTING: saveReconfigure from PETSc.options.installDir(config/PETSc/options/installDir.py:74) ================================================================================ TEST cleanInstallDir from PETSc.options.installDir(/home/florian/software/petsc/config/PETSc/options/installDir.py:67) TESTING: cleanInstallDir from PETSc.options.installDir(config/PETSc/options/installDir.py:67) 
================================================================================ TEST configureInstallDir from PETSc.options.installDir(/home/florian/software/petsc/config/PETSc/options/installDir.py:51) TESTING: configureInstallDir from PETSc.options.installDir(config/PETSc/options/installDir.py:51) Makes installDir subdirectories if it does not exist for both prefix install location and PETSc work install location Changed persistence directory to /home/florian/software/petsc/arch-linux2-c-debug/lib/petsc/conf ================================================================================ TEST restoreReconfigure from PETSc.options.installDir(/home/florian/software/petsc/config/PETSc/options/installDir.py:87) TESTING: restoreReconfigure from PETSc.options.installDir(config/PETSc/options/installDir.py:87) ================================================================================ TEST setExternalPackagesDir from PETSc.options.externalpackagesdir(/home/florian/software/petsc/config/PETSc/options/externalpackagesdir.py:15) TESTING: setExternalPackagesDir from PETSc.options.externalpackagesdir(config/PETSc/options/externalpackagesdir.py:15) ================================================================================ TEST cleanExternalpackagesDir from PETSc.options.externalpackagesdir(/home/florian/software/petsc/config/PETSc/options/externalpackagesdir.py:22) TESTING: cleanExternalpackagesDir from PETSc.options.externalpackagesdir(config/PETSc/options/externalpackagesdir.py:22) ================================================================================ TEST printEnvVariables from config.setCompilers(/home/florian/software/petsc/config/BuildSystem/config/setCompilers.py:1589) TESTING: printEnvVariables from config.setCompilers(config/BuildSystem/config/setCompilers.py:1589) **** printenv **** USER=florian LESS_TERMCAP_md= LESS_TERMCAP_mb= DBUS_SESSION_BUS_ADDRESS=unix:path=/run/user/1000/bus XDG_CURRENT_DESKTOP=i3 XDG_SESSION_TYPE=x11 
CPLUS_INCLUDE_PATH=/usr/include/eigen3:/home/florian/precice/src:/usr/include/eigen3:/home/florian/precice/src: CPATH=/home/florian/software/petsc/include:/home/florian/software/petsc/arch-linux2-c-debug/include:/home/florian/software/petsc/include:/home/florian/software/petsc/arch-linux2-c-debug/include: LOGNAME=florian QT_LOGGING_RULES=*.debug=false PATH=/home/florian/software/bin:/home/florian/.gem/ruby/2.4.0/bin:/usr/local/sbin:/usr/local/bin:/usr/bin:/usr/lib/jvm/default/bin:/opt/paraview/bin:/usr/bin/site_perl:/usr/bin/vendor_perl:/usr/bin/core_perl XDG_VTNR=1 HOME=/home/florian LD_LIBRARY_PATH=/home/florian/software/petsc/arch-linux2-c-debug/lib:/home/florian/software/lib:/home/florian/software/petsc/arch-linux2-c-debug/lib:/home/florian/software/lib::/home/florian/precice/build/last:/home/florian/precice/build/last XDG_RUNTIME_DIR=/run/user/1000 XDG_SESSION_DESKTOP=i3 PROFILEHOME= PRECICE_ROOT=/home/florian/precice SHELL=/bin/zsh XDG_SESSION_PATH=/org/freedesktop/DisplayManager/Session0 XAUTHORITY=/home/florian/.Xauthority LANGUAGE= SHLVL=2 CUPS_USER=lindnefn QT_QPA_PLATFORMTHEME=qt5ct KONSOLE_PROFILE_NAME=Shell HG=/usr/bin/hg KONSOLE_DBUS_WINDOW=/Windows/1 WINDOWID=79691782 PETSC_DIR=/home/florian/software/petsc EDITOR=emacsclient -t --alternate-editor=emacs KONSOLE_DBUS_SESSION=/Sessions/1 XDG_SESSION_CLASS=user LANG=en_US.UTF-8 LESS_TERMCAP_me= PETSC_ARCH=arch-linux2-c-debug PYTHONPATH=/home/florian/software/petsc/arch-linux2-c-debug/lib:/home/florian/software/petsc/arch-linux2-c-debug/lib: KONSOLE_DBUS_SERVICE=:1.195 TERM=xterm-256color LESS_TERMCAP_ue= XDG_SEAT_PATH=/org/freedesktop/DisplayManager/Seat0 COLORFGBG=15;0 LESSOPEN=|/usr/bin/lesspipe.sh %s XDG_SESSION_ID=c1 SHELL_SESSION_ID=52bfc94f68f1499ab743ba79c21e36c3 _=/usr/bin/python2 LIBRARY_PATH=/home/florian/software/petsc/arch-linux2-c-debug/lib: DESKTOP_SESSION=/usr/share/xsessions/i3 DISPLAY=:0 MOZ_PLUGIN_PATH=/usr/lib/mozilla/plugins OLDPWD=/home/florian/software LESS_TERMCAP_se= 
GTK_MODULES=canberra-gtk-module TERMINAL=konsole PWD=/home/florian/software/petsc LESS_TERMCAP_us= COLORTERM=yes MAIL=/var/spool/mail/florian LESS_TERMCAP_so= LS_COLORS=rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=00:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arc=01;31:*.arj=01;31:*.taz=01;31:*.lha=01;31:*.lz4=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.tzo=01;31:*.t7z=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lrz=01;31:*.lz=01;31:*.lzo=01;31:*.xz=01;31:*.zst=01;31:*.tzst=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.alz=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.cab=01;31:*.jpg=01;35:*.jpeg=01;35:*.mjpg=01;35:*.mjpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=00;36:*.au=00;36:*.flac=00;36:*.m4a=00;36:*.mid=00;36:*.midi=00;36:*.mka=00;36:*.mp3=00;36:*.mpc=00;36:*.ogg=00;36:*.ra=00;36:*.wav=00;36:*.oga=00;36:*.opus=00;36:*.spx=00;36:*.xspf=00;36: PAGER=less PYTHONDOCS=/usr/share/doc/python/html/library XDG_SEAT=seat0 ================================================================================ TEST resetEnvCompilers from config.setCompilers(/home/florian/software/petsc/config/BuildSystem/config/setCompilers.py:1596) TESTING: resetEnvCompilers from 
config.setCompilers(config/BuildSystem/config/setCompilers.py:1596) ================================================================================ TEST checkEnvCompilers from config.setCompilers(/home/florian/software/petsc/config/BuildSystem/config/setCompilers.py:1626) TESTING: checkEnvCompilers from config.setCompilers(config/BuildSystem/config/setCompilers.py:1626) ================================================================================ TEST checkMPICompilerOverride from config.setCompilers(/home/florian/software/petsc/config/BuildSystem/config/setCompilers.py:1561) TESTING: checkMPICompilerOverride from config.setCompilers(config/BuildSystem/config/setCompilers.py:1561) Check if --with-mpi-dir is used along with CC CXX or FC compiler options. This usually prevents mpi compilers from being used - so issue a warning ================================================================================ TEST requireMpiLdPath from config.setCompilers(/home/florian/software/petsc/config/BuildSystem/config/setCompilers.py:1580) TESTING: requireMpiLdPath from config.setCompilers(config/BuildSystem/config/setCompilers.py:1580) OpenMPI wrappers require LD_LIBRARY_PATH set ================================================================================ TEST checkVendor from config.setCompilers(/home/florian/software/petsc/config/BuildSystem/config/setCompilers.py:417) TESTING: checkVendor from config.setCompilers(config/BuildSystem/config/setCompilers.py:417) Determine the compiler vendor Compiler vendor is "" ================================================================================ TEST checkInitialFlags from config.setCompilers(/home/florian/software/petsc/config/BuildSystem/config/setCompilers.py:427) TESTING: checkInitialFlags from config.setCompilers(config/BuildSystem/config/setCompilers.py:427) Initialize the compiler and linker flags Pushing language C Initialized CFLAGS to Initialized CFLAGS to Initialized LDFLAGS to Popping language C Pushing 
language CUDA Initialized CUDAFLAGS to Initialized CUDAFLAGS to Initialized LDFLAGS to Popping language CUDA Pushing language Cxx Initialized CXXFLAGS to Initialized CXX_CXXFLAGS to Initialized LDFLAGS to Popping language Cxx Pushing language FC Initialized FFLAGS to Initialized FFLAGS to Initialized LDFLAGS to Popping language FC Initialized CPPFLAGS to Initialized CUDAPPFLAGS to Initialized CXXCPPFLAGS to Initialized CC_LINKER_FLAGS to [] Initialized CXX_LINKER_FLAGS to [] Initialized FC_LINKER_FLAGS to [] Initialized CUDAC_LINKER_FLAGS to [] Initialized sharedLibraryFlags to [] Initialized dynamicLibraryFlags to [] ================================================================================ TEST checkCCompiler from config.setCompilers(/home/florian/software/petsc/config/BuildSystem/config/setCompilers.py:553) TESTING: checkCCompiler from config.setCompilers(config/BuildSystem/config/setCompilers.py:553) Locate a functional C compiler Executing: mpicc --help stdout: Usage: gcc [options] file... Options: -pass-exit-codes Exit with highest error code from a phase. --help Display this information. --target-help Display target specific command line options. --help={common|optimizers|params|target|warnings|[^]{joined|separate|undocumented}}[,...]. Display specific types of command line options. (Use '-v --help' to display command line options of sub-processes). --version Display compiler version information. -dumpspecs Display all of the built in spec strings. -dumpversion Display the version of the compiler. -dumpmachine Display the compiler's target processor. -print-search-dirs Display the directories in the compiler's search path. -print-libgcc-file-name Display the name of the compiler's companion library. -print-file-name= Display the full path to library . -print-prog-name= Display the full path to compiler component . -print-multiarch Display the target's normalized GNU triplet, used as a component in the library path. 
-print-multi-directory Display the root directory for versions of libgcc. -print-multi-lib Display the mapping between command line options and multiple library search directories. -print-multi-os-directory Display the relative path to OS libraries. -print-sysroot Display the target libraries directory. -print-sysroot-headers-suffix Display the sysroot suffix used to find headers. -Wa, Pass comma-separated on to the assembler. -Wp, Pass comma-separated on to the preprocessor. -Wl, Pass comma-separated on to the linker. -Xassembler Pass on to the assembler. -Xpreprocessor Pass on to the preprocessor. -Xlinker Pass on to the linker. -save-temps Do not delete intermediate files. -save-temps= Do not delete intermediate files. -no-canonical-prefixes Do not canonicalize paths when building relative prefixes to other gcc components. -pipe Use pipes rather than intermediate files. -time Time the execution of each subprocess. -specs= Override built-in specs with the contents of . -std= Assume that the input sources are for . --sysroot= Use as the root directory for headers and libraries. -B Add to the compiler's search paths. -v Display the programs invoked by the compiler. -### Like -v but options quoted and commands not executed. -E Preprocess only; do not compile, assemble or link. -S Compile only; do not assemble or link. -c Compile and assemble, but do not link. -o Place the output into . -pie Create a position independent executable. -shared Create a shared library. -x Specify the language of the following input files. Permissible languages include: c c++ assembler none 'none' means revert to the default behavior of guessing the language based on the file's extension. Options starting with -g, -f, -m, -O, -W, or --param are automatically passed on to the various sub-processes invoked by gcc. In order to pass other options on to these processes the -W options must be used. For bug reporting instructions, please see: . 
Checking for program /home/florian/software/bin/mpicc...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/mpicc...not found Checking for program /usr/local/sbin/mpicc...not found Checking for program /usr/local/bin/mpicc...not found Checking for program /usr/bin/mpicc...found Defined make macro "CC" to "mpicc" Pushing language C All intermediate test results are stored in /tmp/petsc-KvGRNM All intermediate test results are stored in /tmp/petsc-KvGRNM/config.setCompilers Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C Pushing language CUDA Popping language CUDA Pushing language Cxx Popping language Cxx Pushing language FC Popping language FC Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/conftest /tmp/petsc-KvGRNM/config.setCompilers/conftest.o Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/conftest /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -lpetsc-ufod4vtr9mqHvKIQiVAm Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lpetsc-ufod4vtr9mqHvKIQiVAm collect2: error: ld returned 1 exit status Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o 
-I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/conftest /tmp/petsc-KvGRNM/config.setCompilers/conftest.o Testing executable /tmp/petsc-KvGRNM/config.setCompilers/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.setCompilers/conftest Executing: /tmp/petsc-KvGRNM/config.setCompilers/conftest Popping language C ================================================================================ TEST checkCPreprocessor from config.setCompilers(/home/florian/software/petsc/config/BuildSystem/config/setCompilers.py:586) TESTING: checkCPreprocessor from config.setCompilers(config/BuildSystem/config/setCompilers.py:586) Locate a functional C preprocessor Checking for program /home/florian/software/bin/mpicc...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/mpicc...not found Checking for program /usr/local/sbin/mpicc...not found Checking for program /usr/local/bin/mpicc...not found Checking for program /usr/bin/mpicc...found Defined make macro "CPP" to "mpicc -E" Pushing language C Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.setCompilers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.setCompilers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.setCompilers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.setCompilers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.setCompilers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.setCompilers/conftest.c" 2 # 1 "/usr/include/stdlib.h" 1 3 4 # 24 "/usr/include/stdlib.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 
"/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 25 "/usr/include/stdlib.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 328 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef int wchar_t; # 33 "/usr/include/stdlib.h" 2 3 4 # 1 "/usr/include/bits/waitflags.h" 1 3 4 # 42 "/usr/include/stdlib.h" 2 3 4 # 1 "/usr/include/bits/waitstatus.h" 1 3 4 # 43 "/usr/include/stdlib.h" 2 3 4 # 56 "/usr/include/stdlib.h" 3 4 typedef struct { int quot; int rem; } div_t; typedef struct { long int quot; long int rem; } ldiv_t; __extension__ typedef struct { long long int quot; long long int rem; } lldiv_t; # 100 "/usr/include/stdlib.h" 3 4 extern size_t __ctype_get_mb_cur_max (void) __attribute__ ((__nothrow__ , __leaf__)) ; extern double atof (const char *__nptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; extern int atoi (const char *__nptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; extern long int atol (const char *__nptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; __extension__ extern long long int atoll (const char *__nptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; extern double strtod (const char *__restrict __nptr, char **__restrict __endptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern float 
strtof (const char *__restrict __nptr, char **__restrict __endptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern long double strtold (const char *__restrict __nptr, char **__restrict __endptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern long int strtol (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern unsigned long int strtoul (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); __extension__ extern long long int strtoq (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); __extension__ extern unsigned long long int strtouq (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); __extension__ extern long long int strtoll (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); __extension__ extern unsigned long long int strtoull (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 266 "/usr/include/stdlib.h" 3 4 extern char *l64a (long int __n) __attribute__ ((__nothrow__ , __leaf__)) ; extern long int a64l (const char *__s) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; # 1 "/usr/include/sys/types.h" 1 3 4 # 27 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef 
unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 30 "/usr/include/sys/types.h" 2 3 4 typedef __u_char u_char; typedef __u_short u_short; typedef __u_int u_int; typedef __u_long u_long; typedef __quad_t quad_t; typedef __u_quad_t u_quad_t; typedef __fsid_t fsid_t; typedef __loff_t loff_t; typedef __ino_t ino_t; # 60 
"/usr/include/sys/types.h" 3 4 typedef __dev_t dev_t; typedef __gid_t gid_t; typedef __mode_t mode_t; typedef __nlink_t nlink_t; typedef __uid_t uid_t; typedef __off_t off_t; # 98 "/usr/include/sys/types.h" 3 4 typedef __pid_t pid_t; typedef __id_t id_t; typedef __ssize_t ssize_t; typedef __daddr_t daddr_t; typedef __caddr_t caddr_t; typedef __key_t key_t; # 132 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/time.h" 1 3 4 # 57 "/usr/include/time.h" 3 4 typedef __clock_t clock_t; # 73 "/usr/include/time.h" 3 4 typedef __time_t time_t; # 91 "/usr/include/time.h" 3 4 typedef __clockid_t clockid_t; # 103 "/usr/include/time.h" 3 4 typedef __timer_t timer_t; # 133 "/usr/include/sys/types.h" 2 3 4 # 146 "/usr/include/sys/types.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 147 "/usr/include/sys/types.h" 2 3 4 typedef unsigned long int ulong; typedef unsigned short int ushort; typedef unsigned int uint; # 194 "/usr/include/sys/types.h" 3 4 typedef int int8_t __attribute__ ((__mode__ (__QI__))); typedef int int16_t __attribute__ ((__mode__ (__HI__))); typedef int int32_t __attribute__ ((__mode__ (__SI__))); typedef int int64_t __attribute__ ((__mode__ (__DI__))); typedef unsigned int u_int8_t __attribute__ ((__mode__ (__QI__))); typedef unsigned int u_int16_t __attribute__ ((__mode__ (__HI__))); typedef unsigned int u_int32_t __attribute__ ((__mode__ (__SI__))); typedef unsigned int u_int64_t __attribute__ ((__mode__ (__DI__))); typedef int register_t __attribute__ ((__mode__ (__word__))); # 216 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/endian.h" 1 3 4 # 36 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/endian.h" 1 3 4 # 37 "/usr/include/endian.h" 2 3 4 # 60 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/byteswap.h" 1 3 4 # 28 "/usr/include/bits/byteswap.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 29 "/usr/include/bits/byteswap.h" 2 3 4 # 1 "/usr/include/bits/byteswap-16.h" 1 3 4 # 36 "/usr/include/bits/byteswap.h" 2 3 4 
# 44 "/usr/include/bits/byteswap.h" 3 4 static __inline unsigned int __bswap_32 (unsigned int __bsx) { return __builtin_bswap32 (__bsx); } # 108 "/usr/include/bits/byteswap.h" 3 4 static __inline __uint64_t __bswap_64 (__uint64_t __bsx) { return __builtin_bswap64 (__bsx); } # 61 "/usr/include/endian.h" 2 3 4 # 217 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/select.h" 1 3 4 # 30 "/usr/include/sys/select.h" 3 4 # 1 "/usr/include/bits/select.h" 1 3 4 # 22 "/usr/include/bits/select.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 23 "/usr/include/bits/select.h" 2 3 4 # 31 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/sigset.h" 1 3 4 # 22 "/usr/include/bits/sigset.h" 3 4 typedef int __sig_atomic_t; typedef struct { unsigned long int __val[(1024 / (8 * sizeof (unsigned long int)))]; } __sigset_t; # 34 "/usr/include/sys/select.h" 2 3 4 typedef __sigset_t sigset_t; # 1 "/usr/include/time.h" 1 3 4 # 120 "/usr/include/time.h" 3 4 struct timespec { __time_t tv_sec; __syscall_slong_t tv_nsec; }; # 46 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 30 "/usr/include/bits/time.h" 3 4 struct timeval { __time_t tv_sec; __suseconds_t tv_usec; }; # 48 "/usr/include/sys/select.h" 2 3 4 typedef __suseconds_t suseconds_t; typedef long int __fd_mask; # 66 "/usr/include/sys/select.h" 3 4 typedef struct { __fd_mask __fds_bits[1024 / (8 * (int) sizeof (__fd_mask))]; } fd_set; typedef __fd_mask fd_mask; # 98 "/usr/include/sys/select.h" 3 4 # 108 "/usr/include/sys/select.h" 3 4 extern int select (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, struct timeval *__restrict __timeout); # 120 "/usr/include/sys/select.h" 3 4 extern int pselect (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, const struct timespec *__restrict __timeout, const __sigset_t *__restrict __sigmask); # 133 "/usr/include/sys/select.h" 3 4 # 220 
"/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/sysmacros.h" 1 3 4 # 24 "/usr/include/sys/sysmacros.h" 3 4 __extension__ extern unsigned int gnu_dev_major (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned int gnu_dev_minor (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned long long int gnu_dev_makedev (unsigned int __major, unsigned int __minor) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 58 "/usr/include/sys/sysmacros.h" 3 4 # 223 "/usr/include/sys/types.h" 2 3 4 typedef __blksize_t blksize_t; typedef __blkcnt_t blkcnt_t; typedef __fsblkcnt_t fsblkcnt_t; typedef __fsfilcnt_t fsfilcnt_t; # 270 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/bits/pthreadtypes.h" 1 3 4 # 21 "/usr/include/bits/pthreadtypes.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 22 "/usr/include/bits/pthreadtypes.h" 2 3 4 # 60 "/usr/include/bits/pthreadtypes.h" 3 4 typedef unsigned long int pthread_t; union pthread_attr_t { char __size[56]; long int __align; }; typedef union pthread_attr_t pthread_attr_t; typedef struct __pthread_internal_list { struct __pthread_internal_list *__prev; struct __pthread_internal_list *__next; } __pthread_list_t; # 90 "/usr/include/bits/pthreadtypes.h" 3 4 typedef union { struct __pthread_mutex_s { int __lock; unsigned int __count; int __owner; unsigned int __nusers; int __kind; short __spins; short __elision; __pthread_list_t __list; # 125 "/usr/include/bits/pthreadtypes.h" 3 4 } __data; char __size[40]; long int __align; } pthread_mutex_t; typedef union { char __size[4]; int __align; } pthread_mutexattr_t; typedef union { struct { int __lock; unsigned int __futex; __extension__ unsigned long long int __total_seq; __extension__ unsigned long long int __wakeup_seq; __extension__ unsigned long long int __woken_seq; void *__mutex; unsigned int __nwaiters; unsigned int 
__broadcast_seq; } __data; char __size[48]; __extension__ long long int __align; } pthread_cond_t; typedef union { char __size[4]; int __align; } pthread_condattr_t; typedef unsigned int pthread_key_t; typedef int pthread_once_t; typedef union { struct { int __lock; unsigned int __nr_readers; unsigned int __readers_wakeup; unsigned int __writer_wakeup; unsigned int __nr_readers_queued; unsigned int __nr_writers_queued; int __writer; int __shared; signed char __rwelision; unsigned char __pad1[7]; unsigned long int __pad2; unsigned int __flags; } __data; # 220 "/usr/include/bits/pthreadtypes.h" 3 4 char __size[56]; long int __align; } pthread_rwlock_t; typedef union { char __size[8]; long int __align; } pthread_rwlockattr_t; typedef volatile int pthread_spinlock_t; typedef union { char __size[32]; long int __align; } pthread_barrier_t; typedef union { char __size[4]; int __align; } pthread_barrierattr_t; # 271 "/usr/include/sys/types.h" 2 3 4 # 276 "/usr/include/stdlib.h" 2 3 4 extern long int random (void) __attribute__ ((__nothrow__ , __leaf__)); extern void srandom (unsigned int __seed) __attribute__ ((__nothrow__ , __leaf__)); extern char *initstate (unsigned int __seed, char *__statebuf, size_t __statelen) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern char *setstate (char *__statebuf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); struct random_data { int32_t *fptr; int32_t *rptr; int32_t *state; int rand_type; int rand_deg; int rand_sep; int32_t *end_ptr; }; extern int random_r (struct random_data *__restrict __buf, int32_t *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int srandom_r (unsigned int __seed, struct random_data *__buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern int initstate_r (unsigned int __seed, char *__restrict __statebuf, size_t __statelen, struct random_data *__restrict __buf) 
__attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 4))); extern int setstate_r (char *__restrict __statebuf, struct random_data *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int rand (void) __attribute__ ((__nothrow__ , __leaf__)); extern void srand (unsigned int __seed) __attribute__ ((__nothrow__ , __leaf__)); extern int rand_r (unsigned int *__seed) __attribute__ ((__nothrow__ , __leaf__)); extern double drand48 (void) __attribute__ ((__nothrow__ , __leaf__)); extern double erand48 (unsigned short int __xsubi[3]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern long int lrand48 (void) __attribute__ ((__nothrow__ , __leaf__)); extern long int nrand48 (unsigned short int __xsubi[3]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern long int mrand48 (void) __attribute__ ((__nothrow__ , __leaf__)); extern long int jrand48 (unsigned short int __xsubi[3]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern void srand48 (long int __seedval) __attribute__ ((__nothrow__ , __leaf__)); extern unsigned short int *seed48 (unsigned short int __seed16v[3]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern void lcong48 (unsigned short int __param[7]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); struct drand48_data { unsigned short int __x[3]; unsigned short int __old_x[3]; unsigned short int __c; unsigned short int __init; __extension__ unsigned long long int __a; }; extern int drand48_r (struct drand48_data *__restrict __buffer, double *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int erand48_r (unsigned short int __xsubi[3], struct drand48_data *__restrict __buffer, double *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); 
extern int lrand48_r (struct drand48_data *__restrict __buffer, long int *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int nrand48_r (unsigned short int __xsubi[3], struct drand48_data *__restrict __buffer, long int *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int mrand48_r (struct drand48_data *__restrict __buffer, long int *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int jrand48_r (unsigned short int __xsubi[3], struct drand48_data *__restrict __buffer, long int *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int srand48_r (long int __seedval, struct drand48_data *__buffer) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern int seed48_r (unsigned short int __seed16v[3], struct drand48_data *__buffer) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int lcong48_r (unsigned short int __param[7], struct drand48_data *__buffer) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern void *malloc (size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) ; extern void *calloc (size_t __nmemb, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) ; extern void *realloc (void *__ptr, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__warn_unused_result__)); extern void free (void *__ptr) __attribute__ ((__nothrow__ , __leaf__)); extern void cfree (void *__ptr) __attribute__ ((__nothrow__ , __leaf__)); # 1 "/usr/include/alloca.h" 1 3 4 # 24 "/usr/include/alloca.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 25 "/usr/include/alloca.h" 2 3 4 extern void *alloca (size_t __size) __attribute__ ((__nothrow__ , __leaf__)); # 454 
"/usr/include/stdlib.h" 2 3 4 extern void *valloc (size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) ; extern int posix_memalign (void **__memptr, size_t __alignment, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; extern void *aligned_alloc (size_t __alignment, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) __attribute__ ((__alloc_size__ (2))) ; extern void abort (void) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__noreturn__)); extern int atexit (void (*__func) (void)) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int at_quick_exit (void (*__func) (void)) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int on_exit (void (*__func) (int __status, void *__arg), void *__arg) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern void exit (int __status) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__noreturn__)); extern void quick_exit (int __status) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__noreturn__)); extern void _Exit (int __status) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__noreturn__)); extern char *getenv (const char *__name) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; # 539 "/usr/include/stdlib.h" 3 4 extern int putenv (char *__string) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int setenv (const char *__name, const char *__value, int __replace) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern int unsetenv (const char *__name) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int clearenv (void) __attribute__ ((__nothrow__ , __leaf__)); # 567 "/usr/include/stdlib.h" 3 4 extern char *mktemp (char *__template) __attribute__ ((__nothrow__ , __leaf__)) 
__attribute__ ((__nonnull__ (1))); # 580 "/usr/include/stdlib.h" 3 4 extern int mkstemp (char *__template) __attribute__ ((__nonnull__ (1))) ; # 602 "/usr/include/stdlib.h" 3 4 extern int mkstemps (char *__template, int __suffixlen) __attribute__ ((__nonnull__ (1))) ; # 623 "/usr/include/stdlib.h" 3 4 extern char *mkdtemp (char *__template) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; # 672 "/usr/include/stdlib.h" 3 4 extern int system (const char *__command) ; # 694 "/usr/include/stdlib.h" 3 4 extern char *realpath (const char *__restrict __name, char *__restrict __resolved) __attribute__ ((__nothrow__ , __leaf__)) ; typedef int (*__compar_fn_t) (const void *, const void *); # 712 "/usr/include/stdlib.h" 3 4 extern void *bsearch (const void *__key, const void *__base, size_t __nmemb, size_t __size, __compar_fn_t __compar) __attribute__ ((__nonnull__ (1, 2, 5))) ; extern void qsort (void *__base, size_t __nmemb, size_t __size, __compar_fn_t __compar) __attribute__ ((__nonnull__ (1, 4))); # 735 "/usr/include/stdlib.h" 3 4 extern int abs (int __x) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; extern long int labs (long int __x) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; __extension__ extern long long int llabs (long long int __x) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; extern div_t div (int __numer, int __denom) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; extern ldiv_t ldiv (long int __numer, long int __denom) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; __extension__ extern lldiv_t lldiv (long long int __numer, long long int __denom) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; # 772 "/usr/include/stdlib.h" 3 4 extern char *ecvt (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ 
(3, 4))) ; extern char *fcvt (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4))) ; extern char *gcvt (double __value, int __ndigit, char *__buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3))) ; extern char *qecvt (long double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4))) ; extern char *qfcvt (long double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4))) ; extern char *qgcvt (long double __value, int __ndigit, char *__buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3))) ; extern int ecvt_r (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4, 5))); extern int fcvt_r (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4, 5))); extern int qecvt_r (long double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4, 5))); extern int qfcvt_r (long double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4, 5))); extern int mblen (const char *__s, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); extern int mbtowc (wchar_t *__restrict __pwc, const char *__restrict __s, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); extern int wctomb (char *__s, wchar_t __wchar) 
__attribute__ ((__nothrow__ , __leaf__)); extern size_t mbstowcs (wchar_t *__restrict __pwcs, const char *__restrict __s, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); extern size_t wcstombs (char *__restrict __s, const wchar_t *__restrict __pwcs, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); extern int rpmatch (const char *__response) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; # 859 "/usr/include/stdlib.h" 3 4 extern int getsubopt (char **__restrict __optionp, char *const *__restrict __tokens, char **__restrict __valuep) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2, 3))) ; # 911 "/usr/include/stdlib.h" 3 4 extern int getloadavg (double __loadavg[], int __nelem) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 921 "/usr/include/stdlib.h" 3 4 # 1 "/usr/include/bits/stdlib-float.h" 1 3 4 # 922 "/usr/include/stdlib.h" 2 3 4 # 934 "/usr/include/stdlib.h" 3 4 # 3 "/tmp/petsc-KvGRNM/config.setCompilers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Popping language C ================================================================================ TEST checkCUDACompiler from config.setCompilers(/home/florian/software/petsc/config/BuildSystem/config/setCompilers.py:627) TESTING: checkCUDACompiler from config.setCompilers(config/BuildSystem/config/setCompilers.py:627) Locate a functional CUDA compiler ================================================================================ TEST checkCUDAPreprocessor from config.setCompilers(/home/florian/software/petsc/config/BuildSystem/config/setCompilers.py:667) TESTING: checkCUDAPreprocessor from config.setCompilers(config/BuildSystem/config/setCompilers.py:667) Locate a functional CUDA preprocessor ================================================================================ TEST checkCxxCompiler from 
config.setCompilers(/home/florian/software/petsc/config/BuildSystem/config/setCompilers.py:779) TESTING: checkCxxCompiler from config.setCompilers(config/BuildSystem/config/setCompilers.py:779) Locate a functional Cxx compiler Executing: mpicxx --help stdout: Usage: g++ [options] file... Options: -pass-exit-codes Exit with highest error code from a phase. --help Display this information. --target-help Display target specific command line options. --help={common|optimizers|params|target|warnings|[^]{joined|separate|undocumented}}[,...]. Display specific types of command line options. (Use '-v --help' to display command line options of sub-processes). --version Display compiler version information. -dumpspecs Display all of the built in spec strings. -dumpversion Display the version of the compiler. -dumpmachine Display the compiler's target processor. -print-search-dirs Display the directories in the compiler's search path. -print-libgcc-file-name Display the name of the compiler's companion library. -print-file-name= Display the full path to library . -print-prog-name= Display the full path to compiler component . -print-multiarch Display the target's normalized GNU triplet, used as a component in the library path. -print-multi-directory Display the root directory for versions of libgcc. -print-multi-lib Display the mapping between command line options and multiple library search directories. -print-multi-os-directory Display the relative path to OS libraries. -print-sysroot Display the target libraries directory. -print-sysroot-headers-suffix Display the sysroot suffix used to find headers. -Wa, Pass comma-separated on to the assembler. -Wp, Pass comma-separated on to the preprocessor. -Wl, Pass comma-separated on to the linker. -Xassembler Pass on to the assembler. -Xpreprocessor Pass on to the preprocessor. -Xlinker Pass on to the linker. -save-temps Do not delete intermediate files. -save-temps= Do not delete intermediate files. 
-no-canonical-prefixes Do not canonicalize paths when building relative prefixes to other gcc components. -pipe Use pipes rather than intermediate files. -time Time the execution of each subprocess. -specs= Override built-in specs with the contents of . -std= Assume that the input sources are for . --sysroot= Use as the root directory for headers and libraries. -B Add to the compiler's search paths. -v Display the programs invoked by the compiler. -### Like -v but options quoted and commands not executed. -E Preprocess only; do not compile, assemble or link. -S Compile only; do not assemble or link. -c Compile and assemble, but do not link. -o Place the output into . -pie Create a position independent executable. -shared Create a shared library. -x Specify the language of the following input files. Permissible languages include: c c++ assembler none 'none' means revert to the default behavior of guessing the language based on the file's extension. Options starting with -g, -f, -m, -O, -W, or --param are automatically passed on to the various sub-processes invoked by g++. In order to pass other options on to these processes the -W options must be used. For bug reporting instructions, please see: . 
Checking for program /home/florian/software/bin/mpicxx...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/mpicxx...not found Checking for program /usr/local/sbin/mpicxx...not found Checking for program /usr/local/bin/mpicxx...not found Checking for program /usr/bin/mpicxx...found Defined make macro "CXX" to "mpicxx" Pushing language Cxx Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C Pushing language CUDA Popping language CUDA Pushing language Cxx Popping language Cxx Pushing language FC Popping language FC Pushing language CXX Popping language CXX Executing: mpicxx -o /tmp/petsc-KvGRNM/config.setCompilers/conftest /tmp/petsc-KvGRNM/config.setCompilers/conftest.o Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language CXX Popping language CXX Executing: mpicxx -o /tmp/petsc-KvGRNM/config.setCompilers/conftest /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -lpetsc-ufod4vtr9mqHvKIQiVAm Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lpetsc-ufod4vtr9mqHvKIQiVAm collect2: error: ld returned 1 exit status Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc Successful compile: Source: 
#include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language CXX Popping language CXX Executing: mpicxx -o /tmp/petsc-KvGRNM/config.setCompilers/conftest /tmp/petsc-KvGRNM/config.setCompilers/conftest.o Testing executable /tmp/petsc-KvGRNM/config.setCompilers/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.setCompilers/conftest Executing: /tmp/petsc-KvGRNM/config.setCompilers/conftest Popping language Cxx ================================================================================ TEST checkCxxPreprocessor from config.setCompilers(/home/florian/software/petsc/config/BuildSystem/config/setCompilers.py:817) TESTING: checkCxxPreprocessor from config.setCompilers(config/BuildSystem/config/setCompilers.py:817) Locate a functional Cxx preprocessor Checking for program /home/florian/software/bin/mpicxx...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/mpicxx...not found Checking for program /usr/local/sbin/mpicxx...not found Checking for program /usr/local/bin/mpicxx...not found Checking for program /usr/bin/mpicxx...found Defined make macro "CXXCPP" to "mpicxx -E" Pushing language Cxx Executing: mpicxx -E -I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc stdout: # 1 "/tmp/petsc-KvGRNM/config.setCompilers/conftest.cc" # 1 "" # 1 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 1 "" 2 # 1 "/tmp/petsc-KvGRNM/config.setCompilers/conftest.cc" # 1 "/tmp/petsc-KvGRNM/config.setCompilers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.setCompilers/conftest.cc" 2 # 1 "/tmp/petsc-KvGRNM/config.setCompilers/conffix.h" 1 extern "C" { } # 3 "/tmp/petsc-KvGRNM/config.setCompilers/conftest.cc" 2 # 1 "/usr/include/c++/6.3.1/cstdlib" 1 3 # 39 "/usr/include/c++/6.3.1/cstdlib" 3 # 40 "/usr/include/c++/6.3.1/cstdlib" 3 # 1 "/usr/include/c++/6.3.1/x86_64-pc-linux-gnu/bits/c++config.h" 1 3 # 199 "/usr/include/c++/6.3.1/x86_64-pc-linux-gnu/bits/c++config.h" 3 # 199 
"/usr/include/c++/6.3.1/x86_64-pc-linux-gnu/bits/c++config.h" 3 namespace std { typedef long unsigned int size_t; typedef long int ptrdiff_t; typedef decltype(nullptr) nullptr_t; } # 221 "/usr/include/c++/6.3.1/x86_64-pc-linux-gnu/bits/c++config.h" 3 namespace std { inline namespace __cxx11 __attribute__((__abi_tag__ ("cxx11"))) { } } namespace __gnu_cxx { inline namespace __cxx11 __attribute__((__abi_tag__ ("cxx11"))) { } } # 507 "/usr/include/c++/6.3.1/x86_64-pc-linux-gnu/bits/c++config.h" 3 # 1 "/usr/include/c++/6.3.1/x86_64-pc-linux-gnu/bits/os_defines.h" 1 3 # 39 "/usr/include/c++/6.3.1/x86_64-pc-linux-gnu/bits/os_defines.h" 3 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 40 "/usr/include/c++/6.3.1/x86_64-pc-linux-gnu/bits/os_defines.h" 2 3 # 508 "/usr/include/c++/6.3.1/x86_64-pc-linux-gnu/bits/c++config.h" 2 3 # 1 "/usr/include/c++/6.3.1/x86_64-pc-linux-gnu/bits/cpu_defines.h" 1 3 # 511 "/usr/include/c++/6.3.1/x86_64-pc-linux-gnu/bits/c++config.h" 2 3 # 42 "/usr/include/c++/6.3.1/cstdlib" 2 3 # 75 "/usr/include/c++/6.3.1/cstdlib" 3 # 1 "/usr/include/stdlib.h" 1 3 4 # 32 "/usr/include/stdlib.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 33 "/usr/include/stdlib.h" 2 3 4 extern "C" { # 1 "/usr/include/bits/waitflags.h" 1 3 4 # 42 "/usr/include/stdlib.h" 2 3 4 # 1 "/usr/include/bits/waitstatus.h" 1 3 4 # 43 "/usr/include/stdlib.h" 2 3 4 # 56 "/usr/include/stdlib.h" 3 4 typedef struct { int quot; 
int rem; } div_t; typedef struct { long int quot; long int rem; } ldiv_t; __extension__ typedef struct { long long int quot; long long int rem; } lldiv_t; # 100 "/usr/include/stdlib.h" 3 4 extern size_t __ctype_get_mb_cur_max (void) throw () ; extern double atof (const char *__nptr) throw () __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; extern int atoi (const char *__nptr) throw () __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; extern long int atol (const char *__nptr) throw () __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; __extension__ extern long long int atoll (const char *__nptr) throw () __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; extern double strtod (const char *__restrict __nptr, char **__restrict __endptr) throw () __attribute__ ((__nonnull__ (1))); extern float strtof (const char *__restrict __nptr, char **__restrict __endptr) throw () __attribute__ ((__nonnull__ (1))); extern long double strtold (const char *__restrict __nptr, char **__restrict __endptr) throw () __attribute__ ((__nonnull__ (1))); extern long int strtol (const char *__restrict __nptr, char **__restrict __endptr, int __base) throw () __attribute__ ((__nonnull__ (1))); extern unsigned long int strtoul (const char *__restrict __nptr, char **__restrict __endptr, int __base) throw () __attribute__ ((__nonnull__ (1))); __extension__ extern long long int strtoq (const char *__restrict __nptr, char **__restrict __endptr, int __base) throw () __attribute__ ((__nonnull__ (1))); __extension__ extern unsigned long long int strtouq (const char *__restrict __nptr, char **__restrict __endptr, int __base) throw () __attribute__ ((__nonnull__ (1))); __extension__ extern long long int strtoll (const char *__restrict __nptr, char **__restrict __endptr, int __base) throw () __attribute__ ((__nonnull__ (1))); __extension__ extern unsigned long long int strtoull (const char *__restrict __nptr, char **__restrict __endptr, int __base) throw () 
__attribute__ ((__nonnull__ (1))); # 196 "/usr/include/stdlib.h" 3 4 # 1 "/usr/include/xlocale.h" 1 3 4 # 27 "/usr/include/xlocale.h" 3 4 typedef struct __locale_struct { struct __locale_data *__locales[13]; const unsigned short int *__ctype_b; const int *__ctype_tolower; const int *__ctype_toupper; const char *__names[13]; } *__locale_t; typedef __locale_t locale_t; # 197 "/usr/include/stdlib.h" 2 3 4 extern long int strtol_l (const char *__restrict __nptr, char **__restrict __endptr, int __base, __locale_t __loc) throw () __attribute__ ((__nonnull__ (1, 4))); extern unsigned long int strtoul_l (const char *__restrict __nptr, char **__restrict __endptr, int __base, __locale_t __loc) throw () __attribute__ ((__nonnull__ (1, 4))); __extension__ extern long long int strtoll_l (const char *__restrict __nptr, char **__restrict __endptr, int __base, __locale_t __loc) throw () __attribute__ ((__nonnull__ (1, 4))); __extension__ extern unsigned long long int strtoull_l (const char *__restrict __nptr, char **__restrict __endptr, int __base, __locale_t __loc) throw () __attribute__ ((__nonnull__ (1, 4))); extern double strtod_l (const char *__restrict __nptr, char **__restrict __endptr, __locale_t __loc) throw () __attribute__ ((__nonnull__ (1, 3))); extern float strtof_l (const char *__restrict __nptr, char **__restrict __endptr, __locale_t __loc) throw () __attribute__ ((__nonnull__ (1, 3))); extern long double strtold_l (const char *__restrict __nptr, char **__restrict __endptr, __locale_t __loc) throw () __attribute__ ((__nonnull__ (1, 3))); # 266 "/usr/include/stdlib.h" 3 4 extern char *l64a (long int __n) throw () ; extern long int a64l (const char *__s) throw () __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; # 1 "/usr/include/sys/types.h" 1 3 4 # 27 "/usr/include/sys/types.h" 3 4 extern "C" { # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 
typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 30 "/usr/include/sys/types.h" 2 3 4 typedef __u_char u_char; typedef __u_short u_short; typedef __u_int u_int; typedef __u_long u_long; typedef __quad_t quad_t; typedef __u_quad_t u_quad_t; 
typedef __fsid_t fsid_t; typedef __loff_t loff_t; typedef __ino_t ino_t; typedef __ino64_t ino64_t; typedef __dev_t dev_t; typedef __gid_t gid_t; typedef __mode_t mode_t; typedef __nlink_t nlink_t; typedef __uid_t uid_t; typedef __off_t off_t; typedef __off64_t off64_t; typedef __pid_t pid_t; typedef __id_t id_t; typedef __ssize_t ssize_t; typedef __daddr_t daddr_t; typedef __caddr_t caddr_t; typedef __key_t key_t; # 132 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/time.h" 1 3 4 # 57 "/usr/include/time.h" 3 4 typedef __clock_t clock_t; # 73 "/usr/include/time.h" 3 4 typedef __time_t time_t; # 91 "/usr/include/time.h" 3 4 typedef __clockid_t clockid_t; # 103 "/usr/include/time.h" 3 4 typedef __timer_t timer_t; # 133 "/usr/include/sys/types.h" 2 3 4 typedef __useconds_t useconds_t; typedef __suseconds_t suseconds_t; # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 147 "/usr/include/sys/types.h" 2 3 4 typedef unsigned long int ulong; typedef unsigned short int ushort; typedef unsigned int uint; # 194 "/usr/include/sys/types.h" 3 4 typedef int int8_t __attribute__ ((__mode__ (__QI__))); typedef int int16_t __attribute__ ((__mode__ (__HI__))); typedef int int32_t __attribute__ ((__mode__ (__SI__))); typedef int int64_t __attribute__ ((__mode__ (__DI__))); typedef unsigned int u_int8_t __attribute__ ((__mode__ (__QI__))); typedef unsigned int u_int16_t __attribute__ ((__mode__ (__HI__))); typedef unsigned int u_int32_t __attribute__ ((__mode__ (__SI__))); typedef unsigned int u_int64_t __attribute__ ((__mode__ (__DI__))); typedef int register_t __attribute__ ((__mode__ (__word__))); # 216 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/endian.h" 1 3 4 # 36 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/endian.h" 1 3 4 # 37 "/usr/include/endian.h" 2 3 4 # 60 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/byteswap.h" 1 3 4 # 28 "/usr/include/bits/byteswap.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 29 "/usr/include/bits/byteswap.h" 2 
3 4 # 1 "/usr/include/bits/byteswap-16.h" 1 3 4 # 36 "/usr/include/bits/byteswap.h" 2 3 4 # 44 "/usr/include/bits/byteswap.h" 3 4 static __inline unsigned int __bswap_32 (unsigned int __bsx) { return __builtin_bswap32 (__bsx); } # 108 "/usr/include/bits/byteswap.h" 3 4 static __inline __uint64_t __bswap_64 (__uint64_t __bsx) { return __builtin_bswap64 (__bsx); } # 61 "/usr/include/endian.h" 2 3 4 # 217 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/select.h" 1 3 4 # 30 "/usr/include/sys/select.h" 3 4 # 1 "/usr/include/bits/select.h" 1 3 4 # 22 "/usr/include/bits/select.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 23 "/usr/include/bits/select.h" 2 3 4 # 31 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/sigset.h" 1 3 4 # 22 "/usr/include/bits/sigset.h" 3 4 typedef int __sig_atomic_t; typedef struct { unsigned long int __val[(1024 / (8 * sizeof (unsigned long int)))]; } __sigset_t; # 34 "/usr/include/sys/select.h" 2 3 4 typedef __sigset_t sigset_t; # 1 "/usr/include/time.h" 1 3 4 # 120 "/usr/include/time.h" 3 4 struct timespec { __time_t tv_sec; __syscall_slong_t tv_nsec; }; # 46 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 30 "/usr/include/bits/time.h" 3 4 struct timeval { __time_t tv_sec; __suseconds_t tv_usec; }; # 48 "/usr/include/sys/select.h" 2 3 4 # 56 "/usr/include/sys/select.h" 3 4 typedef long int __fd_mask; # 66 "/usr/include/sys/select.h" 3 4 typedef struct { __fd_mask fds_bits[1024 / (8 * (int) sizeof (__fd_mask))]; } fd_set; typedef __fd_mask fd_mask; # 98 "/usr/include/sys/select.h" 3 4 extern "C" { # 108 "/usr/include/sys/select.h" 3 4 extern int select (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, struct timeval *__restrict __timeout); # 120 "/usr/include/sys/select.h" 3 4 extern int pselect (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, const struct timespec *__restrict __timeout, 
const __sigset_t *__restrict __sigmask); # 133 "/usr/include/sys/select.h" 3 4 } # 220 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/sysmacros.h" 1 3 4 # 24 "/usr/include/sys/sysmacros.h" 3 4 extern "C" { __extension__ extern unsigned int gnu_dev_major (unsigned long long int __dev) throw () __attribute__ ((__const__)); __extension__ extern unsigned int gnu_dev_minor (unsigned long long int __dev) throw () __attribute__ ((__const__)); __extension__ extern unsigned long long int gnu_dev_makedev (unsigned int __major, unsigned int __minor) throw () __attribute__ ((__const__)); # 58 "/usr/include/sys/sysmacros.h" 3 4 } # 223 "/usr/include/sys/types.h" 2 3 4 typedef __blksize_t blksize_t; typedef __blkcnt_t blkcnt_t; typedef __fsblkcnt_t fsblkcnt_t; typedef __fsfilcnt_t fsfilcnt_t; # 262 "/usr/include/sys/types.h" 3 4 typedef __blkcnt64_t blkcnt64_t; typedef __fsblkcnt64_t fsblkcnt64_t; typedef __fsfilcnt64_t fsfilcnt64_t; # 1 "/usr/include/bits/pthreadtypes.h" 1 3 4 # 21 "/usr/include/bits/pthreadtypes.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 22 "/usr/include/bits/pthreadtypes.h" 2 3 4 # 60 "/usr/include/bits/pthreadtypes.h" 3 4 typedef unsigned long int pthread_t; union pthread_attr_t { char __size[56]; long int __align; }; typedef union pthread_attr_t pthread_attr_t; typedef struct __pthread_internal_list { struct __pthread_internal_list *__prev; struct __pthread_internal_list *__next; } __pthread_list_t; # 90 "/usr/include/bits/pthreadtypes.h" 3 4 typedef union { struct __pthread_mutex_s { int __lock; unsigned int __count; int __owner; unsigned int __nusers; int __kind; short __spins; short __elision; __pthread_list_t __list; # 125 "/usr/include/bits/pthreadtypes.h" 3 4 } __data; char __size[40]; long int __align; } pthread_mutex_t; typedef union { char __size[4]; int __align; } pthread_mutexattr_t; typedef union { struct { int __lock; unsigned int __futex; __extension__ unsigned long long int __total_seq; __extension__ unsigned long long int 
__wakeup_seq; __extension__ unsigned long long int __woken_seq; void *__mutex; unsigned int __nwaiters; unsigned int __broadcast_seq; } __data; char __size[48]; __extension__ long long int __align; } pthread_cond_t; typedef union { char __size[4]; int __align; } pthread_condattr_t; typedef unsigned int pthread_key_t; typedef int pthread_once_t; typedef union { struct { int __lock; unsigned int __nr_readers; unsigned int __readers_wakeup; unsigned int __writer_wakeup; unsigned int __nr_readers_queued; unsigned int __nr_writers_queued; int __writer; int __shared; signed char __rwelision; unsigned char __pad1[7]; unsigned long int __pad2; unsigned int __flags; } __data; # 220 "/usr/include/bits/pthreadtypes.h" 3 4 char __size[56]; long int __align; } pthread_rwlock_t; typedef union { char __size[8]; long int __align; } pthread_rwlockattr_t; typedef volatile int pthread_spinlock_t; typedef union { char __size[32]; long int __align; } pthread_barrier_t; typedef union { char __size[4]; int __align; } pthread_barrierattr_t; # 271 "/usr/include/sys/types.h" 2 3 4 } # 276 "/usr/include/stdlib.h" 2 3 4 extern long int random (void) throw (); extern void srandom (unsigned int __seed) throw (); extern char *initstate (unsigned int __seed, char *__statebuf, size_t __statelen) throw () __attribute__ ((__nonnull__ (2))); extern char *setstate (char *__statebuf) throw () __attribute__ ((__nonnull__ (1))); struct random_data { int32_t *fptr; int32_t *rptr; int32_t *state; int rand_type; int rand_deg; int rand_sep; int32_t *end_ptr; }; extern int random_r (struct random_data *__restrict __buf, int32_t *__restrict __result) throw () __attribute__ ((__nonnull__ (1, 2))); extern int srandom_r (unsigned int __seed, struct random_data *__buf) throw () __attribute__ ((__nonnull__ (2))); extern int initstate_r (unsigned int __seed, char *__restrict __statebuf, size_t __statelen, struct random_data *__restrict __buf) throw () __attribute__ ((__nonnull__ (2, 4))); extern int setstate_r (char 
*__restrict __statebuf, struct random_data *__restrict __buf) throw () __attribute__ ((__nonnull__ (1, 2))); extern int rand (void) throw (); extern void srand (unsigned int __seed) throw (); extern int rand_r (unsigned int *__seed) throw (); extern double drand48 (void) throw (); extern double erand48 (unsigned short int __xsubi[3]) throw () __attribute__ ((__nonnull__ (1))); extern long int lrand48 (void) throw (); extern long int nrand48 (unsigned short int __xsubi[3]) throw () __attribute__ ((__nonnull__ (1))); extern long int mrand48 (void) throw (); extern long int jrand48 (unsigned short int __xsubi[3]) throw () __attribute__ ((__nonnull__ (1))); extern void srand48 (long int __seedval) throw (); extern unsigned short int *seed48 (unsigned short int __seed16v[3]) throw () __attribute__ ((__nonnull__ (1))); extern void lcong48 (unsigned short int __param[7]) throw () __attribute__ ((__nonnull__ (1))); struct drand48_data { unsigned short int __x[3]; unsigned short int __old_x[3]; unsigned short int __c; unsigned short int __init; __extension__ unsigned long long int __a; }; extern int drand48_r (struct drand48_data *__restrict __buffer, double *__restrict __result) throw () __attribute__ ((__nonnull__ (1, 2))); extern int erand48_r (unsigned short int __xsubi[3], struct drand48_data *__restrict __buffer, double *__restrict __result) throw () __attribute__ ((__nonnull__ (1, 2))); extern int lrand48_r (struct drand48_data *__restrict __buffer, long int *__restrict __result) throw () __attribute__ ((__nonnull__ (1, 2))); extern int nrand48_r (unsigned short int __xsubi[3], struct drand48_data *__restrict __buffer, long int *__restrict __result) throw () __attribute__ ((__nonnull__ (1, 2))); extern int mrand48_r (struct drand48_data *__restrict __buffer, long int *__restrict __result) throw () __attribute__ ((__nonnull__ (1, 2))); extern int jrand48_r (unsigned short int __xsubi[3], struct drand48_data *__restrict __buffer, long int *__restrict __result) throw () 
__attribute__ ((__nonnull__ (1, 2))); extern int srand48_r (long int __seedval, struct drand48_data *__buffer) throw () __attribute__ ((__nonnull__ (2))); extern int seed48_r (unsigned short int __seed16v[3], struct drand48_data *__buffer) throw () __attribute__ ((__nonnull__ (1, 2))); extern int lcong48_r (unsigned short int __param[7], struct drand48_data *__buffer) throw () __attribute__ ((__nonnull__ (1, 2))); extern void *malloc (size_t __size) throw () __attribute__ ((__malloc__)) ; extern void *calloc (size_t __nmemb, size_t __size) throw () __attribute__ ((__malloc__)) ; extern void *realloc (void *__ptr, size_t __size) throw () __attribute__ ((__warn_unused_result__)); extern void free (void *__ptr) throw (); extern void cfree (void *__ptr) throw (); # 1 "/usr/include/alloca.h" 1 3 4 # 24 "/usr/include/alloca.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 25 "/usr/include/alloca.h" 2 3 4 extern "C" { extern void *alloca (size_t __size) throw (); } # 454 "/usr/include/stdlib.h" 2 3 4 extern void *valloc (size_t __size) throw () __attribute__ ((__malloc__)) ; extern int posix_memalign (void **__memptr, size_t __alignment, size_t __size) throw () __attribute__ ((__nonnull__ (1))) ; extern void *aligned_alloc (size_t __alignment, size_t __size) throw () __attribute__ ((__malloc__)) __attribute__ ((__alloc_size__ (2))) ; extern void abort (void) throw () __attribute__ ((__noreturn__)); extern int atexit (void (*__func) (void)) throw () __attribute__ ((__nonnull__ (1))); extern "C++" int at_quick_exit (void (*__func) (void)) throw () __asm ("at_quick_exit") __attribute__ ((__nonnull__ (1))); extern int on_exit (void (*__func) (int __status, void *__arg), void *__arg) throw () __attribute__ ((__nonnull__ (1))); extern void exit (int __status) throw () __attribute__ ((__noreturn__)); extern void quick_exit (int __status) throw () __attribute__ ((__noreturn__)); extern void _Exit (int __status) throw () __attribute__ ((__noreturn__)); 
extern char *getenv (const char *__name) throw () __attribute__ ((__nonnull__ (1))) ; extern char *secure_getenv (const char *__name) throw () __attribute__ ((__nonnull__ (1))) ; extern int putenv (char *__string) throw () __attribute__ ((__nonnull__ (1))); extern int setenv (const char *__name, const char *__value, int __replace) throw () __attribute__ ((__nonnull__ (2))); extern int unsetenv (const char *__name) throw () __attribute__ ((__nonnull__ (1))); extern int clearenv (void) throw (); # 567 "/usr/include/stdlib.h" 3 4 extern char *mktemp (char *__template) throw () __attribute__ ((__nonnull__ (1))); # 580 "/usr/include/stdlib.h" 3 4 extern int mkstemp (char *__template) __attribute__ ((__nonnull__ (1))) ; # 590 "/usr/include/stdlib.h" 3 4 extern int mkstemp64 (char *__template) __attribute__ ((__nonnull__ (1))) ; # 602 "/usr/include/stdlib.h" 3 4 extern int mkstemps (char *__template, int __suffixlen) __attribute__ ((__nonnull__ (1))) ; # 612 "/usr/include/stdlib.h" 3 4 extern int mkstemps64 (char *__template, int __suffixlen) __attribute__ ((__nonnull__ (1))) ; # 623 "/usr/include/stdlib.h" 3 4 extern char *mkdtemp (char *__template) throw () __attribute__ ((__nonnull__ (1))) ; # 634 "/usr/include/stdlib.h" 3 4 extern int mkostemp (char *__template, int __flags) __attribute__ ((__nonnull__ (1))) ; # 644 "/usr/include/stdlib.h" 3 4 extern int mkostemp64 (char *__template, int __flags) __attribute__ ((__nonnull__ (1))) ; # 654 "/usr/include/stdlib.h" 3 4 extern int mkostemps (char *__template, int __suffixlen, int __flags) __attribute__ ((__nonnull__ (1))) ; # 666 "/usr/include/stdlib.h" 3 4 extern int mkostemps64 (char *__template, int __suffixlen, int __flags) __attribute__ ((__nonnull__ (1))) ; extern int system (const char *__command) ; extern char *canonicalize_file_name (const char *__name) throw () __attribute__ ((__nonnull__ (1))) ; # 694 "/usr/include/stdlib.h" 3 4 extern char *realpath (const char *__restrict __name, char *__restrict __resolved) 
throw () ; typedef int (*__compar_fn_t) (const void *, const void *); typedef __compar_fn_t comparison_fn_t; typedef int (*__compar_d_fn_t) (const void *, const void *, void *); extern void *bsearch (const void *__key, const void *__base, size_t __nmemb, size_t __size, __compar_fn_t __compar) __attribute__ ((__nonnull__ (1, 2, 5))) ; extern void qsort (void *__base, size_t __nmemb, size_t __size, __compar_fn_t __compar) __attribute__ ((__nonnull__ (1, 4))); extern void qsort_r (void *__base, size_t __nmemb, size_t __size, __compar_d_fn_t __compar, void *__arg) __attribute__ ((__nonnull__ (1, 4))); extern int abs (int __x) throw () __attribute__ ((__const__)) ; extern long int labs (long int __x) throw () __attribute__ ((__const__)) ; __extension__ extern long long int llabs (long long int __x) throw () __attribute__ ((__const__)) ; extern div_t div (int __numer, int __denom) throw () __attribute__ ((__const__)) ; extern ldiv_t ldiv (long int __numer, long int __denom) throw () __attribute__ ((__const__)) ; __extension__ extern lldiv_t lldiv (long long int __numer, long long int __denom) throw () __attribute__ ((__const__)) ; # 772 "/usr/include/stdlib.h" 3 4 extern char *ecvt (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) throw () __attribute__ ((__nonnull__ (3, 4))) ; extern char *fcvt (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) throw () __attribute__ ((__nonnull__ (3, 4))) ; extern char *gcvt (double __value, int __ndigit, char *__buf) throw () __attribute__ ((__nonnull__ (3))) ; extern char *qecvt (long double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) throw () __attribute__ ((__nonnull__ (3, 4))) ; extern char *qfcvt (long double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) throw () __attribute__ ((__nonnull__ (3, 4))) ; extern char *qgcvt (long double __value, int __ndigit, char *__buf) throw () __attribute__ ((__nonnull__ (3))) ; extern 
int ecvt_r (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) throw () __attribute__ ((__nonnull__ (3, 4, 5))); extern int fcvt_r (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) throw () __attribute__ ((__nonnull__ (3, 4, 5))); extern int qecvt_r (long double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) throw () __attribute__ ((__nonnull__ (3, 4, 5))); extern int qfcvt_r (long double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) throw () __attribute__ ((__nonnull__ (3, 4, 5))); extern int mblen (const char *__s, size_t __n) throw (); extern int mbtowc (wchar_t *__restrict __pwc, const char *__restrict __s, size_t __n) throw (); extern int wctomb (char *__s, wchar_t __wchar) throw (); extern size_t mbstowcs (wchar_t *__restrict __pwcs, const char *__restrict __s, size_t __n) throw (); extern size_t wcstombs (char *__restrict __s, const wchar_t *__restrict __pwcs, size_t __n) throw (); extern int rpmatch (const char *__response) throw () __attribute__ ((__nonnull__ (1))) ; # 859 "/usr/include/stdlib.h" 3 4 extern int getsubopt (char **__restrict __optionp, char *const *__restrict __tokens, char **__restrict __valuep) throw () __attribute__ ((__nonnull__ (1, 2, 3))) ; extern void setkey (const char *__key) throw () __attribute__ ((__nonnull__ (1))); extern int posix_openpt (int __oflag) ; extern int grantpt (int __fd) throw (); extern int unlockpt (int __fd) throw (); extern char *ptsname (int __fd) throw () ; extern int ptsname_r (int __fd, char *__buf, size_t __buflen) throw () __attribute__ ((__nonnull__ (2))); extern int getpt (void); extern int getloadavg (double __loadavg[], int __nelem) throw () __attribute__ ((__nonnull__ (1))); # 921 "/usr/include/stdlib.h" 3 4 # 1 
"/usr/include/bits/stdlib-float.h" 1 3 4 # 922 "/usr/include/stdlib.h" 2 3 4 # 934 "/usr/include/stdlib.h" 3 4 } # 76 "/usr/include/c++/6.3.1/cstdlib" 2 3 # 118 "/usr/include/c++/6.3.1/cstdlib" 3 extern "C++" { namespace std __attribute__ ((__visibility__ ("default"))) { using ::div_t; using ::ldiv_t; using ::abort; using ::abs; using ::atexit; using ::at_quick_exit; using ::atof; using ::atoi; using ::atol; using ::bsearch; using ::calloc; using ::div; using ::exit; using ::free; using ::getenv; using ::labs; using ::ldiv; using ::malloc; using ::mblen; using ::mbstowcs; using ::mbtowc; using ::qsort; using ::quick_exit; using ::rand; using ::realloc; using ::srand; using ::strtod; using ::strtol; using ::strtoul; using ::system; using ::wcstombs; using ::wctomb; inline long abs(long __i) { return __builtin_labs(__i); } inline ldiv_t div(long __i, long __j) { return ldiv(__i, __j); } inline long long abs(long long __x) { return __builtin_llabs (__x); } inline __int128 abs(__int128 __x) { return __x >= 0 ? 
__x : -__x; } # 201 "/usr/include/c++/6.3.1/cstdlib" 3 } # 215 "/usr/include/c++/6.3.1/cstdlib" 3 namespace __gnu_cxx __attribute__ ((__visibility__ ("default"))) { using ::lldiv_t; using ::_Exit; using ::llabs; inline lldiv_t div(long long __n, long long __d) { lldiv_t __q; __q.quot = __n / __d; __q.rem = __n % __d; return __q; } using ::lldiv; # 247 "/usr/include/c++/6.3.1/cstdlib" 3 using ::atoll; using ::strtoll; using ::strtoull; using ::strtof; using ::strtold; } namespace std { using ::__gnu_cxx::lldiv_t; using ::__gnu_cxx::_Exit; using ::__gnu_cxx::llabs; using ::__gnu_cxx::div; using ::__gnu_cxx::lldiv; using ::__gnu_cxx::atoll; using ::__gnu_cxx::strtof; using ::__gnu_cxx::strtoll; using ::__gnu_cxx::strtoull; using ::__gnu_cxx::strtold; } } # 3 "/tmp/petsc-KvGRNM/config.setCompilers/conftest.cc" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Popping language Cxx ================================================================================ TEST checkFortranCompiler from config.setCompilers(/home/florian/software/petsc/config/BuildSystem/config/setCompilers.py:934) TESTING: checkFortranCompiler from config.setCompilers(config/BuildSystem/config/setCompilers.py:934) Locate a functional Fortran compiler Executing: mpif90 --help stdout: Usage: gfortran [options] file... Options: -pass-exit-codes Exit with highest error code from a phase. --help Display this information. --target-help Display target specific command line options. --help={common|optimizers|params|target|warnings|[^]{joined|separate|undocumented}}[,...]. Display specific types of command line options. (Use '-v --help' to display command line options of sub-processes). --version Display compiler version information. -dumpspecs Display all of the built in spec strings. -dumpversion Display the version of the compiler. -dumpmachine Display the compiler's target processor. -print-search-dirs Display the directories in the compiler's search path. 
-print-libgcc-file-name Display the name of the compiler's companion library. -print-file-name=<lib> Display the full path to library <lib>. -print-prog-name=<prog> Display the full path to compiler component <prog>. -print-multiarch Display the target's normalized GNU triplet, used as a component in the library path. -print-multi-directory Display the root directory for versions of libgcc. -print-multi-lib Display the mapping between command line options and multiple library search directories. -print-multi-os-directory Display the relative path to OS libraries. -print-sysroot Display the target libraries directory. -print-sysroot-headers-suffix Display the sysroot suffix used to find headers. -Wa,<options> Pass comma-separated <options> on to the assembler. -Wp,<options> Pass comma-separated <options> on to the preprocessor. -Wl,<options> Pass comma-separated <options> on to the linker. -Xassembler <arg> Pass <arg> on to the assembler. -Xpreprocessor <arg> Pass <arg> on to the preprocessor. -Xlinker <arg> Pass <arg> on to the linker. -save-temps Do not delete intermediate files. -save-temps=<arg> Do not delete intermediate files. -no-canonical-prefixes Do not canonicalize paths when building relative prefixes to other gcc components. -pipe Use pipes rather than intermediate files. -time Time the execution of each subprocess. -specs=<file> Override built-in specs with the contents of <file>. -std=<standard> Assume that the input sources are for <standard>. --sysroot=<directory> Use <directory> as the root directory for headers and libraries. -B <directory> Add <directory> to the compiler's search paths. -v Display the programs invoked by the compiler. -### Like -v but options quoted and commands not executed. -E Preprocess only; do not compile, assemble or link. -S Compile only; do not assemble or link. -c Compile and assemble, but do not link. -o <file> Place the output into <file>. -pie Create a position independent executable. -shared Create a shared library. -x <language> Specify the language of the following input files. Permissible languages include: c c++ assembler none 'none' means revert to the default behavior of guessing the language based on the file's extension. 
Options starting with -g, -f, -m, -O, -W, or --param are automatically passed on to the various sub-processes invoked by gfortran. In order to pass other options on to these processes the -W options must be used. For bug reporting instructions, please see: . Checking for program /home/florian/software/bin/mpif90...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/mpif90...not found Checking for program /usr/local/sbin/mpif90...not found Checking for program /usr/local/bin/mpif90...not found Checking for program /usr/bin/mpif90...found Defined make macro "FC" to "mpif90" Pushing language FC Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: program main end Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: program main end Pushing language C Popping language C Pushing language CUDA Popping language CUDA Pushing language Cxx Popping language Cxx Pushing language FC Popping language FC Pushing language FC Popping language FC Executing: mpif90 -o /tmp/petsc-KvGRNM/config.setCompilers/conftest /tmp/petsc-KvGRNM/config.setCompilers/conftest.o Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: program main end Pushing language FC Popping language FC Executing: mpif90 -o /tmp/petsc-KvGRNM/config.setCompilers/conftest /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -lpetsc-ufod4vtr9mqHvKIQiVAm Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lpetsc-ufod4vtr9mqHvKIQiVAm collect2: error: ld returned 1 exit status Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o 
-I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: program main end Pushing language FC Popping language FC Executing: mpif90 -o /tmp/petsc-KvGRNM/config.setCompilers/conftest /tmp/petsc-KvGRNM/config.setCompilers/conftest.o Testing executable /tmp/petsc-KvGRNM/config.setCompilers/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.setCompilers/conftest Executing: /tmp/petsc-KvGRNM/config.setCompilers/conftest Popping language FC ================================================================================ TEST checkFortranComments from config.setCompilers(/home/florian/software/petsc/config/BuildSystem/config/setCompilers.py:955) TESTING: checkFortranComments from config.setCompilers(config/BuildSystem/config/setCompilers.py:955) Make sure fortran comment "!" works Pushing language FC Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: ! comment program main end Fortran comments can use ! in column 1 Popping language FC ================================================================================ TEST checkLargeFileIO from config.setCompilers(/home/florian/software/petsc/config/BuildSystem/config/setCompilers.py:1072) TESTING: checkLargeFileIO from config.setCompilers(config/BuildSystem/config/setCompilers.py:1072) ================================================================================ TEST checkArchiver from config.setCompilers(/home/florian/software/petsc/config/BuildSystem/config/setCompilers.py:1171) TESTING: checkArchiver from config.setCompilers(config/BuildSystem/config/setCompilers.py:1171) Check that the archiver exists and can make a library usable by the compiler Pushing language C Executing: ar -V stdout: GNU ar (GNU Binutils) 2.27 Copyright (C) 2016 Free Software Foundation, Inc. 
This program is free software; you may redistribute it under the terms of the GNU General Public License version 3 or (at your option) any later version. This program has absolutely no warranty. Executing: ar -V stdout: GNU ar (GNU Binutils) 2.27 Copyright (C) 2016 Free Software Foundation, Inc. This program is free software; you may redistribute it under the terms of the GNU General Public License version 3 or (at your option) any later version. This program has absolutely no warranty. Defined make macro "FAST_AR_FLAGS" to "Scq" Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int foo(int a) { return a+1; } Checking for program /home/florian/software/bin/ar...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/ar...not found Checking for program /usr/local/sbin/ar...not found Checking for program /usr/local/bin/ar...not found Checking for program /usr/bin/ar...found Defined make macro "AR" to "/usr/bin/ar" Checking for program /home/florian/software/bin/ranlib...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/ranlib...not found Checking for program /usr/local/sbin/ranlib...not found Checking for program /usr/local/bin/ranlib...not found Checking for program /usr/bin/ranlib...found Defined make macro "RANLIB" to "/usr/bin/ranlib -c" Executing: /usr/bin/ar cr /tmp/petsc-KvGRNM/config.setCompilers/libconf1.a /tmp/petsc-KvGRNM/config.setCompilers/conf1.o Executing: /usr/bin/ranlib -c /tmp/petsc-KvGRNM/config.setCompilers/libconf1.a Possible ERROR while running ranlib: stderr: /usr/bin/ranlib: invalid option -- 'c' Ranlib is not functional with your archiver. Try --with-ranlib=true if ranlib is unnecessary. Executing: ar -V stdout: GNU ar (GNU Binutils) 2.27 Copyright (C) 2016 Free Software Foundation, Inc. 
This program is free software; you may redistribute it under the terms of the GNU General Public License version 3 or (at your option) any later version. This program has absolutely no warranty. Executing: ar -V stdout: GNU ar (GNU Binutils) 2.27 Copyright (C) 2016 Free Software Foundation, Inc. This program is free software; you may redistribute it under the terms of the GNU General Public License version 3 or (at your option) any later version. This program has absolutely no warranty. Defined make macro "FAST_AR_FLAGS" to "Scq" Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int foo(int a) { return a+1; } Checking for program /home/florian/software/bin/ar...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/ar...not found Checking for program /usr/local/sbin/ar...not found Checking for program /usr/local/bin/ar...not found Checking for program /usr/bin/ar...found Defined make macro "AR" to "/usr/bin/ar" Checking for program /home/florian/software/bin/ranlib...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/ranlib...not found Checking for program /usr/local/sbin/ranlib...not found Checking for program /usr/local/bin/ranlib...not found Checking for program /usr/bin/ranlib...found Defined make macro "RANLIB" to "/usr/bin/ranlib" Executing: /usr/bin/ar cr /tmp/petsc-KvGRNM/config.setCompilers/libconf1.a /tmp/petsc-KvGRNM/config.setCompilers/conf1.o Executing: /usr/bin/ranlib /tmp/petsc-KvGRNM/config.setCompilers/libconf1.a Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" extern int foo(int); int main() { int b = foo(1); if (b); ; return 0; } Pushing language C Popping language 
C Executing: mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/conftest /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -L/tmp/petsc-KvGRNM/config.setCompilers -lconf1 Defined make macro "AR_FLAGS" to "cr" Defined make macro "AR_LIB_SUFFIX" to "a" Popping language C ================================================================================ TEST checkSharedLinker from config.setCompilers(/home/florian/software/petsc/config/BuildSystem/config/setCompilers.py:1280) TESTING: checkSharedLinker from config.setCompilers(config/BuildSystem/config/setCompilers.py:1280) Check that the linker can produce shared libraries Executing: uname -s stdout: Linux Checking shared linker mpicc using flags ['-shared'] Checking for program /home/florian/software/bin/mpicc...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/mpicc...not found Checking for program /usr/local/sbin/mpicc...not found Checking for program /usr/local/bin/mpicc...not found Checking for program /usr/bin/mpicc...found Defined make macro "LD_SHARED" to "mpicc" Trying C compiler flag Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Added C compiler flag Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/conftest -shared /tmp/petsc-KvGRNM/config.setCompilers/conftest.o Valid C linker flag -shared Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include 
"confdefs.h" #include "conffix.h" #include int foo(void) {fprintf(stdout,"hello"); return 0;} Pushing language C Popping language C Pushing language CUDA Popping language CUDA Pushing language Cxx Popping language Cxx Pushing language FC Popping language FC Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/libconftest.so -shared /tmp/petsc-KvGRNM/config.setCompilers/conftest.o Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: /tmp/petsc-KvGRNM/config.setCompilers/conftest.o: relocation R_X86_64_32 against `.rodata' can not be used when making a shared object; recompile with -fPIC /usr/bin/ld: final link failed: Nonrepresentable section on output collect2: error: ld returned 1 exit status Rejected C compiler flag because it was not compatible with shared linker mpicc using flags ['-shared'] Trying C compiler flag -fPIC Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Added C compiler flag -fPIC Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/conftest -shared -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.o Valid C linker flag -shared Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int foo(void) {fprintf(stdout,"hello"); return 0;} Pushing language C Popping language C Executing: 
mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/libconftest.so -shared -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.o Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int foo(void); int main() { int ret = foo(); if(ret);; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/conftest -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -L/tmp/petsc-KvGRNM/config.setCompilers -lconftest Using shared linker mpicc with flags ['-shared'] and library extension so Executing: uname -s stdout: Linux ================================================================================ TEST checkPIC from config.setCompilers(/home/florian/software/petsc/config/BuildSystem/config/setCompilers.py:1031) TESTING: checkPIC from config.setCompilers(config/BuildSystem/config/setCompilers.py:1031) Determine the PIC option for each compiler Pushing language C Trying C compiler flag Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Added C compiler flag Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <stdio.h> void foo(void){fprintf(stdout,"hello"); return;} void bar(void){foo();} Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/libconftest.so -shared -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.o Accepted C compiler flag Popping language C Pushing language Cxx Trying Cxx compiler flag
Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Added Cxx compiler flag Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <stdio.h> void foo(void){fprintf(stdout,"hello"); return;} void bar(void){foo();} Pushing language C Pushing language C Popping language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/libconftest.so -shared -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.o Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: /tmp/petsc-KvGRNM/config.setCompilers/conftest.o: relocation R_X86_64_32 against `.rodata' can not be used when making a shared object; recompile with -fPIC /usr/bin/ld: final link failed: Nonrepresentable section on output collect2: error: ld returned 1 exit status Rejected Cxx compiler flag because shared linker cannot handle it Trying Cxx compiler flag -fPIC Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Added Cxx compiler flag -fPIC Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <stdio.h> void foo(void){fprintf(stdout,"hello"); return;} void bar(void){foo();} Pushing language C Pushing language C Popping language C Popping language C Executing: mpicc -o
/tmp/petsc-KvGRNM/config.setCompilers/libconftest.so -shared -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.o Accepted Cxx compiler flag -fPIC Popping language Cxx Pushing language FC Trying FC compiler flag Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: program main end Added FC compiler flag Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: function foo(a) real:: a,x,bar common /xx/ x x=a foo = bar(x) end Pushing language C Pushing language C Popping language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/libconftest.so -shared -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.o Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: /tmp/petsc-KvGRNM/config.setCompilers/conftest.o: relocation R_X86_64_32 against undefined symbol `xx_' can not be used when making a shared object; recompile with -fPIC /usr/bin/ld: final link failed: Nonrepresentable section on output collect2: error: ld returned 1 exit status Rejected FC compiler flag because shared linker cannot handle it Trying FC compiler flag -fPIC Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: program main end Added FC compiler flag -fPIC Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: function foo(a) real:: a,x,bar common /xx/ x x=a foo = bar(x) end Pushing language C Pushing language C Popping language C Popping language C Executing: mpicc -o 
/tmp/petsc-KvGRNM/config.setCompilers/libconftest.so -shared -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.o Accepted FC compiler flag -fPIC Popping language FC ================================================================================ TEST checkSharedLinkerPaths from config.setCompilers(/home/florian/software/petsc/config/BuildSystem/config/setCompilers.py:1369) TESTING: checkSharedLinkerPaths from config.setCompilers(config/BuildSystem/config/setCompilers.py:1369) Determine the shared linker path options - IRIX: -rpath - Linux, OSF: -Wl,-rpath, - Solaris: -R - FreeBSD: -Wl,-R, Pushing language C Executing: uname -s stdout: Linux Executing: mpicc -V Trying C linker flag -Wl,-rpath, Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/conftest -Wl,-rpath,/home/florian/software/petsc -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.o Valid C linker flag -Wl,-rpath,/home/florian/software/petsc Popping language C Pushing language Cxx Executing: uname -s stdout: Linux Executing: mpicc -V Trying Cxx linker flag -Wl,-rpath, Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language CXX Popping language CXX Executing: mpicxx -o /tmp/petsc-KvGRNM/config.setCompilers/conftest -Wl,-rpath,/home/florian/software/petsc /tmp/petsc-KvGRNM/config.setCompilers/conftest.o Valid Cxx linker flag -Wl,-rpath,/home/florian/software/petsc Popping language Cxx Pushing language FC Executing: uname -s stdout: Linux Executing: mpicc -V Trying FC linker 
flag -Wl,-rpath, Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: program main end Pushing language FC Popping language FC Executing: mpif90 -o /tmp/petsc-KvGRNM/config.setCompilers/conftest -Wl,-rpath,/home/florian/software/petsc -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.o Valid FC linker flag -Wl,-rpath,/home/florian/software/petsc Popping language FC ================================================================================ TEST checkLibC from config.setCompilers(/home/florian/software/petsc/config/BuildSystem/config/setCompilers.py:1404) TESTING: checkLibC from config.setCompilers(config/BuildSystem/config/setCompilers.py:1404) Test whether we need to explicitly include libc in shared linking - Mac OSX requires an explicit reference to libc for shared linking Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <stdlib.h> int foo(void) {void *chunk = malloc(31); free(chunk); return 0;} Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/libconftest.so -shared -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.o Shared linking does not require an explicit libc reference ================================================================================ TEST checkDynamicLinker from config.setCompilers(/home/florian/software/petsc/config/BuildSystem/config/setCompilers.py:1453) TESTING: checkDynamicLinker from config.setCompilers(config/BuildSystem/config/setCompilers.py:1453) Check that the linker can dynamicaly load shared libraries Checking for header: dlfcn.h All intermediate test results are stored in /tmp/petsc-KvGRNM/config.headers Executing: mpicc -E
-I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/dlfcn.h" 1 3 4 # 22 "/usr/include/dlfcn.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 23 "/usr/include/dlfcn.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 25 "/usr/include/dlfcn.h" 2 3 4 # 1 "/usr/include/bits/dlfcn.h" 1 3 4 # 28 "/usr/include/dlfcn.h" 2 3 4 # 52 "/usr/include/dlfcn.h" 3 4 extern void *dlopen (const char *__file, int __mode) __attribute__ ((__nothrow__)); extern int dlclose (void *__handle) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern void *dlsym (void *__restrict __handle, const char *__restrict __name) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); # 82 "/usr/include/dlfcn.h" 3 4 extern char *dlerror (void) __attribute__ ((__nothrow__ , __leaf__)); # 188 "/usr/include/dlfcn.h" 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2
Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_DLFCN_H" to "1" Checking for functions [dlopen dlsym dlclose] in library ['dl'] [] Pushing language C All intermediate test results are stored in /tmp/petsc-KvGRNM/config.libraries Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.libraries -fPIC /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char dlopen(); static void _check_dlopen() { dlopen(); } char dlsym(); static void _check_dlsym() { dlsym(); } char dlclose(); static void _check_dlclose() { dlclose(); } int main() { _check_dlopen(); _check_dlsym(); _check_dlclose();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC /tmp/petsc-KvGRNM/config.libraries/conftest.o -ldl Defined "HAVE_LIBDL" to "1" Popping language C Adding ['dl'] to LIBS Executing: uname -s stdout: Linux Checking dynamic linker mpicc using flags ['-shared'] Checking for program /home/florian/software/bin/mpicc...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/mpicc...not found Checking for program /usr/local/sbin/mpicc...not found Checking for program /usr/local/bin/mpicc...not found Checking for program /usr/bin/mpicc...found Defined make macro "DYNAMICLINKER" to "mpicc" Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/conftest -shared -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -ldl Valid C linker flag 
-shared Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <stdio.h> int foo(void) {printf("test");return 0;} Pushing language C Popping language C Pushing language CUDA Popping language CUDA Pushing language Cxx Popping language Cxx Pushing language FC Popping language FC Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/libconftest.so -shared -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -ldl Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.setCompilers/conftest.c: In function 'main': /tmp/petsc-KvGRNM/config.setCompilers/conftest.c:11:3: warning: implicit declaration of function 'printf' [-Wimplicit-function-declaration] printf("Could not load symbol\n"); ^~~~~~ /tmp/petsc-KvGRNM/config.setCompilers/conftest.c:11:3: warning: incompatible implicit declaration of built-in function 'printf' /tmp/petsc-KvGRNM/config.setCompilers/conftest.c:11:3: note: include '<stdio.h>' or provide a declaration of 'printf' /tmp/petsc-KvGRNM/config.setCompilers/conftest.c:15:3: warning: incompatible implicit declaration of built-in function 'printf' printf("Invalid return from foo()\n"); ^~~~~~ /tmp/petsc-KvGRNM/config.setCompilers/conftest.c:15:3: note: include '<stdio.h>' or provide a declaration of 'printf' /tmp/petsc-KvGRNM/config.setCompilers/conftest.c:19:3: warning: incompatible implicit declaration of built-in function 'printf' printf("Could not close library\n"); ^~~~~~ /tmp/petsc-KvGRNM/config.setCompilers/conftest.c:19:3: note: include '<stdio.h>' or provide a declaration of 'printf' Source: #include
"confdefs.h" #include "conffix.h" #include int main() { void *handle = dlopen("/tmp/petsc-KvGRNM/config.setCompilers/libconftest.so", 0); int (*foo)(void) = (int (*)(void)) dlsym(handle, "foo"); if (!foo) { printf("Could not load symbol\n"); return -1; } if ((*foo)()) { printf("Invalid return from foo()\n"); return -1; } if (dlclose(handle)) { printf("Could not close library\n"); return -1; } ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/conftest -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -ldl Using dynamic linker mpicc with flags ['-shared'] and library extension so ================================================================================ TEST output from config.setCompilers(/home/florian/software/petsc/config/BuildSystem/config/setCompilers.py:1505) TESTING: output from config.setCompilers(config/BuildSystem/config/setCompilers.py:1505) Output module data as defines and substitutions Substituting "CC" with "mpicc" Substituting "CFLAGS" with " -fPIC " Defined make macro "CC_LINKER_SLFLAG" to "-Wl,-rpath," Substituting "CPP" with "mpicc -E" Substituting "CPPFLAGS" with "" Substituting "CXX" with "mpicxx" Substituting "CXX_CXXFLAGS" with " -fPIC" Substituting "CXXFLAGS" with "" Substituting "CXX_LINKER_SLFLAG" with "-Wl,-rpath," Substituting "CXXCPP" with "mpicxx -E" Substituting "CXXCPPFLAGS" with "" Substituting "FC" with "mpif90" Substituting "FFLAGS" with " -fPIC" Defined make macro "FC_LINKER_SLFLAG" to "-Wl,-rpath," Substituting "LDFLAGS" with "" Substituting "LIBS" with "-ldl " Substituting "SHARED_LIBRARY_FLAG" with "-shared" ================================================================================ TEST configureIndexSize from PETSc.options.indexTypes(/home/florian/software/petsc/config/PETSc/options/indexTypes.py:31) TESTING: configureIndexSize from PETSc.options.indexTypes(config/PETSc/options/indexTypes.py:31) Defined make macro "PETSC_INDEX_SIZE" to "32" 
================================================================================ TEST checkSharedDynamicPicOptions from PETSc.options.sharedLibraries(/home/florian/software/petsc/config/PETSc/options/sharedLibraries.py:37) TESTING: checkSharedDynamicPicOptions from PETSc.options.sharedLibraries(config/PETSc/options/sharedLibraries.py:37) ================================================================================ TEST configureSharedLibraries from PETSc.options.sharedLibraries(/home/florian/software/petsc/config/PETSc/options/sharedLibraries.py:53) TESTING: configureSharedLibraries from PETSc.options.sharedLibraries(config/PETSc/options/sharedLibraries.py:53) Checks whether shared libraries should be used, for which you must - Specify --with-shared-libraries - Have found a working shared linker Defines PETSC_USE_SHARED_LIBRARIES if they are used Defined make rule "shared_arch" with dependencies "shared_linux" and code [] Defined make macro "SONAME_FUNCTION" to "$(1).so.$(2)" Defined make macro "SL_LINKER_FUNCTION" to "-shared -Wl,-soname,$(call SONAME_FUNCTION,$(notdir $(1)),$(2))" Defined make macro "BUILDSHAREDLIB" to "yes" Defined "HAVE_SHARED_LIBRARIES" to "1" Defined "USE_SHARED_LIBRARIES" to "1" ================================================================================ TEST configureDynamicLibraries from PETSc.options.sharedLibraries(/home/florian/software/petsc/config/PETSc/options/sharedLibraries.py:96) TESTING: configureDynamicLibraries from PETSc.options.sharedLibraries(config/PETSc/options/sharedLibraries.py:96) Checks whether dynamic loading is available (with dlfcn.h and libdl) Defined "HAVE_DYNAMIC_LIBRARIES" to "1" ================================================================================ TEST configureSerializedFunctions from PETSc.options.sharedLibraries(/home/florian/software/petsc/config/PETSc/options/sharedLibraries.py:102) TESTING: configureSerializedFunctions from 
PETSc.options.sharedLibraries(config/PETSc/options/sharedLibraries.py:102) Defines PETSC_SERIALIZE_FUNCTIONS if they are used Requires shared libraries ================================================================================ TEST configureCompilerFlags from config.compilerFlags(/home/florian/software/petsc/config/BuildSystem/config/compilerFlags.py:72) TESTING: configureCompilerFlags from config.compilerFlags(config/BuildSystem/config/compilerFlags.py:72) Get the default compiler flags Defined make macro "MPICC_SHOW" to "gcc -pthread -Wl,-rpath -Wl,/usr/lib/openmpi -Wl,--enable-new-dtags -L/usr/lib/openmpi -lmpi" Trying C compiler flag -Wall Trying C compiler flag -Wwrite-strings Trying C compiler flag -Wno-strict-aliasing Trying C compiler flag -Wno-unknown-pragmas Trying C compiler flag -fvisibility=hidden Defined make macro "MPICC_SHOW" to "gcc -pthread -Wl,-rpath -Wl,/usr/lib/openmpi -Wl,--enable-new-dtags -L/usr/lib/openmpi -lmpi" Trying C compiler flag -g3 Defined make macro "MPICXX_SHOW" to "g++ -pthread -Wl,-rpath -Wl,/usr/lib/openmpi -Wl,--enable-new-dtags -L/usr/lib/openmpi -lmpi_cxx -lmpi" Trying Cxx compiler flag -Wall Trying Cxx compiler flag -Wwrite-strings Trying Cxx compiler flag -Wno-strict-aliasing Trying Cxx compiler flag -Wno-unknown-pragmas Trying Cxx compiler flag -fvisibility=hidden Defined make macro "MPICXX_SHOW" to "g++ -pthread -Wl,-rpath -Wl,/usr/lib/openmpi -Wl,--enable-new-dtags -L/usr/lib/openmpi -lmpi_cxx -lmpi" Trying Cxx compiler flag -g Defined make macro "MPIFC_SHOW" to "/usr/bin/gfortran -I/usr/include -pthread -I/usr/lib/openmpi -Wl,-rpath -Wl,/usr/lib/openmpi -Wl,--enable-new-dtags -L/usr/lib/openmpi -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi" Trying FC compiler flag -Wall Trying FC compiler flag -ffree-line-length-0 Trying FC compiler flag -Wno-unused-dummy-argument Defined make macro "MPIFC_SHOW" to "/usr/bin/gfortran -I/usr/include -pthread -I/usr/lib/openmpi -Wl,-rpath -Wl,/usr/lib/openmpi 
-Wl,--enable-new-dtags -L/usr/lib/openmpi -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi" Trying FC compiler flag -g Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC -Wall /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Added C compiler flag -Wall Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC -Wall -Wwrite-strings /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Added C compiler flag -Wwrite-strings Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Added C compiler flag -Wno-strict-aliasing Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Added C compiler flag -Wno-unknown-pragmas Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include 
"conffix.h" int main() { ; return 0; } Added C compiler flag -fvisibility=hidden Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Added C compiler flag -g3 Popping language C Pushing language Cxx Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -Wall -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Added Cxx compiler flag -Wall Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -Wall -Wwrite-strings -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Added Cxx compiler flag -Wwrite-strings Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -Wall -Wwrite-strings -Wno-strict-aliasing -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Added Cxx compiler flag -Wno-strict-aliasing Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Added Cxx compiler flag -Wno-unknown-pragmas Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o 
-I/tmp/petsc-KvGRNM/config.setCompilers -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Added Cxx compiler flag -fvisibility=hidden Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Added Cxx compiler flag -g Popping language Cxx Pushing language FC Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC -Wall /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: program main end Added FC compiler flag -Wall Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC -Wall -ffree-line-length-0 /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: program main end Added FC compiler flag -ffree-line-length-0 Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: program main end Added FC compiler flag -Wno-unused-dummy-argument Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: program main end Added FC compiler flag -g Popping language FC Executing: mpicc --version stdout: gcc (GCC) 6.3.1 20170109 Copyright (C) 2016 Free 
Software Foundation, Inc. This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. getCompilerVersion: mpicc gcc (GCC) 6.3.1 20170109 Executing: mpicc -show stdout: gcc -pthread -Wl,-rpath -Wl,/usr/lib/openmpi -Wl,--enable-new-dtags -L/usr/lib/openmpi -lmpi Executing: gcc --help stdout: Usage: gcc [options] file... Options: -pass-exit-codes Exit with highest error code from a phase. --help Display this information. --target-help Display target specific command line options. --help={common|optimizers|params|target|warnings|[^]{joined|separate|undocumented}}[,...]. Display specific types of command line options. (Use '-v --help' to display command line options of sub-processes). --version Display compiler version information. -dumpspecs Display all of the built in spec strings. -dumpversion Display the version of the compiler. -dumpmachine Display the compiler's target processor. -print-search-dirs Display the directories in the compiler's search path. -print-libgcc-file-name Display the name of the compiler's companion library. -print-file-name=<lib> Display the full path to library <lib>. -print-prog-name=<prog> Display the full path to compiler component <prog>. -print-multiarch Display the target's normalized GNU triplet, used as a component in the library path. -print-multi-directory Display the root directory for versions of libgcc. -print-multi-lib Display the mapping between command line options and multiple library search directories. -print-multi-os-directory Display the relative path to OS libraries. -print-sysroot Display the target libraries directory. -print-sysroot-headers-suffix Display the sysroot suffix used to find headers. -Wa,<options> Pass comma-separated <options> on to the assembler. -Wp,<options> Pass comma-separated <options> on to the preprocessor. -Wl,<options> Pass comma-separated <options> on to the linker. -Xassembler <arg> Pass <arg> on to the assembler. -Xpreprocessor <arg> Pass <arg> on to the preprocessor. -Xlinker <arg> Pass <arg> on to the linker. -save-temps Do not delete intermediate files. -save-temps=<arg> Do not delete intermediate files. -no-canonical-prefixes Do not canonicalize paths when building relative prefixes to other gcc components. -pipe Use pipes rather than intermediate files. -time Time the execution of each subprocess. -specs=<file> Override built-in specs with the contents of <file>. -std=<standard> Assume that the input sources are for <standard>. --sysroot=<directory> Use <directory> as the root directory for headers and libraries. -B <directory> Add <directory> to the compiler's search paths. -v Display the programs invoked by the compiler. -### Like -v but options quoted and commands not executed. -E Preprocess only; do not compile, assemble or link. -S Compile only; do not assemble or link. -c Compile and assemble, but do not link. -o <file> Place the output into <file>. -pie Create a position independent executable. -shared Create a shared library. -x <language> Specify the language of the following input files. Permissible languages include: c c++ assembler none 'none' means revert to the default behavior of guessing the language based on the file's extension. Options starting with -g, -f, -m, -O, -W, or --param are automatically passed on to the various sub-processes invoked by gcc. In order to pass other options on to these processes the -W options must be used. For bug reporting instructions, please see: <https://bugs.archlinux.org/>. Executing: mpicc -show stdout: gcc -pthread -Wl,-rpath -Wl,/usr/lib/openmpi -Wl,--enable-new-dtags -L/usr/lib/openmpi -lmpi Executing: mpicxx --version stdout: g++ (GCC) 6.3.1 20170109 Copyright (C) 2016 Free Software Foundation, Inc. This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. getCompilerVersion: mpicxx g++ (GCC) 6.3.1 20170109 Executing: mpicxx -show stdout: g++ -pthread -Wl,-rpath -Wl,/usr/lib/openmpi -Wl,--enable-new-dtags -L/usr/lib/openmpi -lmpi_cxx -lmpi Executing: g++ --help stdout: Usage: g++ [options] file... Options: -pass-exit-codes Exit with highest error code from a phase. --help Display this information. --target-help Display target specific command line options. --help={common|optimizers|params|target|warnings|[^]{joined|separate|undocumented}}[,...]. Display specific types of command line options. (Use '-v --help' to display command line options of sub-processes). --version Display compiler version information. -dumpspecs Display all of the built in spec strings. -dumpversion Display the version of the compiler. -dumpmachine Display the compiler's target processor. -print-search-dirs Display the directories in the compiler's search path. -print-libgcc-file-name Display the name of the compiler's companion library. -print-file-name=<lib> Display the full path to library <lib>. -print-prog-name=<prog> Display the full path to compiler component <prog>. -print-multiarch Display the target's normalized GNU triplet, used as a component in the library path. -print-multi-directory Display the root directory for versions of libgcc. -print-multi-lib Display the mapping between command line options and multiple library search directories. -print-multi-os-directory Display the relative path to OS libraries. -print-sysroot Display the target libraries directory. -print-sysroot-headers-suffix Display the sysroot suffix used to find headers. -Wa,<options> Pass comma-separated <options> on to the assembler. -Wp,<options> Pass comma-separated <options> on to the preprocessor. -Wl,<options> Pass comma-separated <options> on to the linker. -Xassembler <arg> Pass <arg> on to the assembler. -Xpreprocessor <arg> Pass <arg> on to the preprocessor. -Xlinker <arg> Pass <arg> on to the linker. -save-temps Do not delete intermediate files. -save-temps=<arg> Do not delete intermediate files. -no-canonical-prefixes Do not canonicalize paths when building relative prefixes to other gcc components. -pipe Use pipes rather than intermediate files. -time Time the execution of each subprocess. -specs=<file> Override built-in specs with the contents of <file>. -std=<standard> Assume that the input sources are for <standard>. --sysroot=<directory> Use <directory> as the root directory for headers and libraries. -B <directory> Add <directory> to the compiler's search paths. -v Display the programs invoked by the compiler. -### Like -v but options quoted and commands not executed. -E Preprocess only; do not compile, assemble or link. -S Compile only; do not assemble or link. -c Compile and assemble, but do not link. -o <file> Place the output into <file>. -pie Create a position independent executable. -shared Create a shared library. -x <language> Specify the language of the following input files. Permissible languages include: c c++ assembler none 'none' means revert to the default behavior of guessing the language based on the file's extension. Options starting with -g, -f, -m, -O, -W, or --param are automatically passed on to the various sub-processes invoked by g++. In order to pass other options on to these processes the -W options must be used. For bug reporting instructions, please see: <https://bugs.archlinux.org/>.
Executing: mpicxx -show stdout: g++ -pthread -Wl,-rpath -Wl,/usr/lib/openmpi -Wl,--enable-new-dtags -L/usr/lib/openmpi -lmpi_cxx -lmpi Executing: mpif90 --version stdout: GNU Fortran (GCC) 6.3.1 20170109 Copyright (C) 2016 Free Software Foundation, Inc. This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. getCompilerVersion: mpif90 GNU Fortran (GCC) 6.3.1 20170109 Executing: mpif90 -show stdout: /usr/bin/gfortran -I/usr/include -pthread -I/usr/lib/openmpi -Wl,-rpath -Wl,/usr/lib/openmpi -Wl,--enable-new-dtags -L/usr/lib/openmpi -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi Executing: /usr/bin/gfortran --help stdout: Usage: gfortran [options] file... Options: -pass-exit-codes Exit with highest error code from a phase. --help Display this information. --target-help Display target specific command line options. --help={common|optimizers|params|target|warnings|[^]{joined|separate|undocumented}}[,...]. Display specific types of command line options. (Use '-v --help' to display command line options of sub-processes). --version Display compiler version information. -dumpspecs Display all of the built in spec strings. -dumpversion Display the version of the compiler. -dumpmachine Display the compiler's target processor. -print-search-dirs Display the directories in the compiler's search path. -print-libgcc-file-name Display the name of the compiler's companion library. -print-file-name=<lib> Display the full path to library <lib>. -print-prog-name=<prog> Display the full path to compiler component <prog>. -print-multiarch Display the target's normalized GNU triplet, used as a component in the library path. -print-multi-directory Display the root directory for versions of libgcc. -print-multi-lib Display the mapping between command line options and multiple library search directories. -print-multi-os-directory Display the relative path to OS libraries. -print-sysroot Display the target libraries directory. -print-sysroot-headers-suffix Display the sysroot suffix used to find headers. -Wa,<options> Pass comma-separated <options> on to the assembler. -Wp,<options> Pass comma-separated <options> on to the preprocessor. -Wl,<options> Pass comma-separated <options> on to the linker. -Xassembler <arg> Pass <arg> on to the assembler. -Xpreprocessor <arg> Pass <arg> on to the preprocessor. -Xlinker <arg> Pass <arg> on to the linker. -save-temps Do not delete intermediate files. -save-temps=<arg> Do not delete intermediate files. -no-canonical-prefixes Do not canonicalize paths when building relative prefixes to other gcc components. -pipe Use pipes rather than intermediate files. -time Time the execution of each subprocess. -specs=<file> Override built-in specs with the contents of <file>. -std=<standard> Assume that the input sources are for <standard>. --sysroot=<directory> Use <directory> as the root directory for headers and libraries. -B <directory> Add <directory> to the compiler's search paths. -v Display the programs invoked by the compiler. -### Like -v but options quoted and commands not executed. -E Preprocess only; do not compile, assemble or link. -S Compile only; do not assemble or link. -c Compile and assemble, but do not link. -o <file> Place the output into <file>. -pie Create a position independent executable. -shared Create a shared library. -x <language> Specify the language of the following input files. Permissible languages include: c c++ assembler none 'none' means revert to the default behavior of guessing the language based on the file's extension. Options starting with -g, -f, -m, -O, -W, or --param are automatically passed on to the various sub-processes invoked by gfortran. In order to pass other options on to these processes the -W options must be used. For bug reporting instructions, please see: <https://bugs.archlinux.org/>. Executing: /usr/bin/gfortran --version stdout: GNU Fortran (GCC) 6.3.1 20170109 Copyright (C) 2016 Free Software Foundation, Inc. This is free software; see the source for copying conditions. There is NO warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. Executing: mpif90 -show stdout: /usr/bin/gfortran -I/usr/include -pthread -I/usr/lib/openmpi -Wl,-rpath -Wl,/usr/lib/openmpi -Wl,--enable-new-dtags -L/usr/lib/openmpi -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi ================================================================================ TEST configureDebugging from PETSc.options.debugging(/home/florian/software/petsc/config/PETSc/options/debugging.py:25) TESTING: configureDebugging from PETSc.options.debugging(config/PETSc/options/debugging.py:25) Defined "USE_ERRORCHECKING" to "1" ================================================================================ TEST checkRestrict from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:137) TESTING: checkRestrict from config.compilers(config/BuildSystem/config/compilers.py:137) Check for the C/CXX restrict keyword Executing: mpicc -V Pushing language C All intermediate test results are stored in /tmp/petsc-KvGRNM/config.compilers Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.compilers/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.compilers/conftest.c: In function 'main':
/tmp/petsc-KvGRNM/config.compilers/conftest.c:5:18: warning: unused variable 'x' [-Wunused-variable] float * restrict x;; ^ Source: #include "confdefs.h" #include "conffix.h" int main() { float * restrict x;; return 0; } compilers: Set C restrict keyword to restrict Defined "C_RESTRICT" to "restrict" Popping language C ================================================================================ TEST checkCFormatting from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:321) TESTING: checkCFormatting from config.compilers(config/BuildSystem/config/compilers.py:321) Activate format string checking if using the GNU compilers ================================================================================ TEST checkCStaticInline from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:108) TESTING: checkCStaticInline from config.compilers(config/BuildSystem/config/compilers.py:108) Check for C keyword: static inline Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.compilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" static inline int foo(int a) {return a;} int main() { foo(1);; return 0; } compilers: Set C StaticInline keyword to static inline Popping language C Defined "C_STATIC_INLINE" to "static inline" ================================================================================ TEST checkDynamicLoadFlag from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:332) TESTING: checkDynamicLoadFlag from config.compilers(config/BuildSystem/config/compilers.py:332) Checks that dlopen() takes RTLD_XXX, and defines PETSC_HAVE_RTLD_XXX if it does Executing: 
mpicc -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.compilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <dlfcn.h> char *libname; int main() { dlopen(libname, RTLD_LAZY); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.compilers/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.compilers/conftest.o -ldl Defined "HAVE_RTLD_LAZY" to "1" Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.compilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <dlfcn.h> char *libname; int main() { dlopen(libname, RTLD_NOW); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.compilers/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.compilers/conftest.o -ldl Defined "HAVE_RTLD_NOW" to "1" Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.compilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <dlfcn.h> char *libname; int main() { dlopen(libname, RTLD_LOCAL); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.compilers/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.compilers/conftest.o -ldl Defined "HAVE_RTLD_LOCAL" to "1" Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.compilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <dlfcn.h> char *libname; int main() { dlopen(libname, RTLD_GLOBAL); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.compilers/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.compilers/conftest.o -ldl Defined "HAVE_RTLD_GLOBAL" to "1" ================================================================================ TEST checkCLibraries from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:168) TESTING: checkCLibraries from config.compilers(config/BuildSystem/config/compilers.py:168) Determines the libraries needed to link with C Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.compilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.compilers/conftest -v -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 
/tmp/petsc-KvGRNM/config.compilers/conftest.o -ldl Possible ERROR while running linker: stderr: Using built-in specs. COLLECT_GCC=/usr/bin/gcc COLLECT_LTO_WRAPPER=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/lto-wrapper Target: x86_64-pc-linux-gnu Configured with: /build/gcc/src/gcc/configure --prefix=/usr --libdir=/usr/lib --libexecdir=/usr/lib --mandir=/usr/share/man --infodir=/usr/share/info --with-bugurl=https://bugs.archlinux.org/ --enable-languages=c,c++,ada,fortran,go,lto,objc,obj-c++ --enable-shared --enable-threads=posix --enable-libmpx --with-system-zlib --with-isl --enable-__cxa_atexit --disable-libunwind-exceptions --enable-clocale=gnu --disable-libstdcxx-pch --disable-libssp --enable-gnu-unique-object --enable-linker-build-id --enable-lto --enable-plugin --enable-install-libiberty --with-linker-hash-style=gnu --enable-gnu-indirect-function --disable-multilib --disable-werror --enable-checking=release Thread model: posix gcc version 6.3.1 20170109 (GCC) COMPILER_PATH=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/ LIBRARY_PATH=/home/florian/software/petsc/arch-linux2-c-debug/lib/../lib/:./../lib/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/:/lib/../lib/:/usr/lib/../lib/:/home/florian/software/petsc/arch-linux2-c-debug/lib/:./:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../:/lib/:/usr/lib/ COLLECT_GCC_OPTIONS='-o' '/tmp/petsc-KvGRNM/config.compilers/conftest' '-v' '-fPIC' '-Wall' '-Wwrite-strings' '-Wno-strict-aliasing' '-Wno-unknown-pragmas' '-fvisibility=hidden' '-g3' '-pthread' '-L/usr/lib/openmpi' '-mtune=generic' '-march=x86-64' /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/collect2 -plugin /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/liblto_plugin.so -plugin-opt=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/lto-wrapper -plugin-opt=-fresolution=/tmp/cc2E5BUv.res 
-plugin-opt=-pass-through=-lgcc -plugin-opt=-pass-through=-lgcc_s -plugin-opt=-pass-through=-lpthread -plugin-opt=-pass-through=-lc -plugin-opt=-pass-through=-lgcc -plugin-opt=-pass-through=-lgcc_s --build-id --eh-frame-hdr --hash-style=gnu -m elf_x86_64 -dynamic-linker /lib64/ld-linux-x86-64.so.2 -o /tmp/petsc-KvGRNM/config.compilers/conftest /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crt1.o /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crti.o /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/crtbegin.o -L/usr/lib/openmpi -L/home/florian/software/petsc/arch-linux2-c-debug/lib/../lib -L./../lib -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib -L/lib/../lib -L/usr/lib/../lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -L. -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../.. /tmp/petsc-KvGRNM/config.compilers/conftest.o -ldl -rpath /usr/lib/openmpi --enable-new-dtags -lmpi -lgcc --as-needed -lgcc_s --no-as-needed -lpthread -lc -lgcc --as-needed -lgcc_s --no-as-needed /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/crtend.o /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crtn.o COLLECT_GCC_OPTIONS='-o' '/tmp/petsc-KvGRNM/config.compilers/conftest' '-v' '-fPIC' '-Wall' '-Wwrite-strings' '-Wno-strict-aliasing' '-Wno-unknown-pragmas' '-fvisibility=hidden' '-g3' '-pthread' '-L/usr/lib/openmpi' '-mtune=generic' '-march=x86-64' Popping language C compilers: Checking arg Using compilers: Unknown arg Using compilers: Checking arg built-in compilers: Unknown arg built-in compilers: Checking arg specs. compilers: Unknown arg specs. 
compilers: Checking arg COLLECT_GCC=/usr/bin/gcc compilers: Unknown arg COLLECT_GCC=/usr/bin/gcc compilers: Checking arg COLLECT_LTO_WRAPPER=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/lto-wrapper compilers: Unknown arg COLLECT_LTO_WRAPPER=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/lto-wrapper compilers: Checking arg Target: compilers: Unknown arg Target: compilers: Checking arg x86_64-pc-linux-gnu compilers: Unknown arg x86_64-pc-linux-gnu compilers: Checking arg Configured compilers: Unknown arg Configured compilers: Checking arg with: compilers: Unknown arg with: compilers: Checking arg /build/gcc/src/gcc/configure compilers: Unknown arg /build/gcc/src/gcc/configure compilers: Checking arg --prefix=/usr compilers: Unknown arg --prefix=/usr compilers: Checking arg --libdir=/usr/lib compilers: Unknown arg --libdir=/usr/lib compilers: Checking arg --libexecdir=/usr/lib compilers: Unknown arg --libexecdir=/usr/lib compilers: Checking arg --mandir=/usr/share/man compilers: Unknown arg --mandir=/usr/share/man compilers: Checking arg --infodir=/usr/share/info compilers: Unknown arg --infodir=/usr/share/info compilers: Checking arg --with-bugurl=https://bugs.archlinux.org/ compilers: Unknown arg --with-bugurl=https://bugs.archlinux.org/ compilers: Checking arg --enable-languages=c,c++,ada,fortran,go,lto,objc,obj-c++ compilers: Unknown arg --enable-languages=c,c++,ada,fortran,go,lto,objc,obj-c++ compilers: Checking arg --enable-shared compilers: Unknown arg --enable-shared compilers: Checking arg --enable-threads=posix compilers: Unknown arg --enable-threads=posix compilers: Checking arg --enable-libmpx compilers: Unknown arg --enable-libmpx compilers: Checking arg --with-system-zlib compilers: Unknown arg --with-system-zlib compilers: Checking arg --with-isl compilers: Unknown arg --with-isl compilers: Checking arg --enable-__cxa_atexit compilers: Unknown arg --enable-__cxa_atexit compilers: Checking arg --disable-libunwind-exceptions compilers: Unknown arg 
--disable-libunwind-exceptions compilers: Checking arg --enable-clocale=gnu compilers: Unknown arg --enable-clocale=gnu compilers: Checking arg --disable-libstdcxx-pch compilers: Unknown arg --disable-libstdcxx-pch compilers: Checking arg --disable-libssp compilers: Unknown arg --disable-libssp compilers: Checking arg --enable-gnu-unique-object compilers: Unknown arg --enable-gnu-unique-object compilers: Checking arg --enable-linker-build-id compilers: Unknown arg --enable-linker-build-id compilers: Checking arg --enable-lto compilers: Unknown arg --enable-lto compilers: Checking arg --enable-plugin compilers: Unknown arg --enable-plugin compilers: Checking arg --enable-install-libiberty compilers: Unknown arg --enable-install-libiberty compilers: Checking arg --with-linker-hash-style=gnu compilers: Unknown arg --with-linker-hash-style=gnu compilers: Checking arg --enable-gnu-indirect-function compilers: Unknown arg --enable-gnu-indirect-function compilers: Checking arg --disable-multilib compilers: Unknown arg --disable-multilib compilers: Checking arg --disable-werror compilers: Unknown arg --disable-werror compilers: Checking arg --enable-checking=release compilers: Unknown arg --enable-checking=release compilers: Checking arg Thread compilers: Unknown arg Thread compilers: Checking arg model: compilers: Unknown arg model: compilers: Checking arg posix compilers: Unknown arg posix compilers: Checking arg gcc compilers: Unknown arg gcc compilers: Checking arg version compilers: Unknown arg version compilers: Checking arg 6.3.1 compilers: Unknown arg 6.3.1 compilers: Checking arg 20170109 compilers: Unknown arg 20170109 compilers: Checking arg (GCC) compilers: Unknown arg (GCC) compilers: Checking arg COMPILER_PATH=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/ compilers: Unknown arg 
COMPILER_PATH=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/ compilers: Checking arg LIBRARY_PATH=/home/florian/software/petsc/arch-linux2-c-debug/lib/../lib/:./../lib/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/:/lib/../lib/:/usr/lib/../lib/:/home/florian/software/petsc/arch-linux2-c-debug/lib/:./:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../:/lib/:/usr/lib/ compilers: Unknown arg LIBRARY_PATH=/home/florian/software/petsc/arch-linux2-c-debug/lib/../lib/:./../lib/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/:/lib/../lib/:/usr/lib/../lib/:/home/florian/software/petsc/arch-linux2-c-debug/lib/:./:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../:/lib/:/usr/lib/ compilers: Checking arg COLLECT_GCC_OPTIONS= compilers: Unknown arg COLLECT_GCC_OPTIONS= compilers: Checking arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/collect2 compilers: Unknown arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/collect2 compilers: Checking arg -plugin compilers: Unknown arg -plugin compilers: Checking arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/liblto_plugin.so compilers: Unknown arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/liblto_plugin.so compilers: Checking arg -plugin-opt=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/lto-wrapper compilers: Unknown arg -plugin-opt=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/lto-wrapper compilers: Checking arg -plugin-opt=-fresolution=/tmp/cc2E5BUv.res compilers: Unknown arg -plugin-opt=-fresolution=/tmp/cc2E5BUv.res compilers: Checking arg -plugin-opt=-pass-through=-lgcc compilers: Unknown arg -plugin-opt=-pass-through=-lgcc compilers: Checking arg -plugin-opt=-pass-through=-lgcc_s compilers: Unknown arg -plugin-opt=-pass-through=-lgcc_s compilers: Checking arg -plugin-opt=-pass-through=-lpthread compilers: Unknown arg 
-plugin-opt=-pass-through=-lpthread compilers: Checking arg -plugin-opt=-pass-through=-lc compilers: Unknown arg -plugin-opt=-pass-through=-lc compilers: Checking arg -plugin-opt=-pass-through=-lgcc compilers: Unknown arg -plugin-opt=-pass-through=-lgcc compilers: Checking arg -plugin-opt=-pass-through=-lgcc_s compilers: Unknown arg -plugin-opt=-pass-through=-lgcc_s compilers: Checking arg --build-id compilers: Unknown arg --build-id compilers: Checking arg --eh-frame-hdr compilers: Unknown arg --eh-frame-hdr compilers: Checking arg --hash-style=gnu compilers: Unknown arg --hash-style=gnu compilers: Checking arg -m compilers: Unknown arg -m compilers: Checking arg elf_x86_64 compilers: Unknown arg elf_x86_64 compilers: Checking arg -dynamic-linker compilers: Unknown arg -dynamic-linker compilers: Checking arg /lib64/ld-linux-x86-64.so.2 compilers: Unknown arg /lib64/ld-linux-x86-64.so.2 compilers: Checking arg -o compilers: Unknown arg -o compilers: Checking arg /tmp/petsc-KvGRNM/config.compilers/conftest compilers: Unknown arg /tmp/petsc-KvGRNM/config.compilers/conftest compilers: Checking arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crt1.o compilers: Unknown arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crt1.o compilers: Checking arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crti.o compilers: Unknown arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crti.o compilers: Checking arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/crtbegin.o compilers: Unknown arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/crtbegin.o compilers: Checking arg -L/usr/lib/openmpi compilers: Found library directory: -L/usr/lib/openmpi compilers: Checking arg -L/home/florian/software/petsc/arch-linux2-c-debug/lib/../lib compilers: Found library directory: -L/home/florian/software/petsc/arch-linux2-c-debug/lib compilers: Checking arg -L./../lib compilers: Found library directory: -L/home/florian/software/lib compilers: Checking arg 
-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 compilers: Found library directory: -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 compilers: Checking arg -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib compilers: Checking arg -L/lib/../lib compilers: Checking arg -L/usr/lib/../lib compilers: Checking arg -L/home/florian/software/petsc/arch-linux2-c-debug/lib compilers: Found library directory: -L/home/florian/software/petsc/arch-linux2-c-debug/lib compilers: Checking arg -L. compilers: Found library directory: -L/home/florian/software/petsc compilers: Checking arg -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../.. compilers: Checking arg /tmp/petsc-KvGRNM/config.compilers/conftest.o compilers: Unknown arg /tmp/petsc-KvGRNM/config.compilers/conftest.o compilers: Checking arg -ldl compilers: Found library : -ldl compilers: Checking arg -rpath compilers: Found -rpath library: /usr/lib/openmpi compilers: Checking arg --enable-new-dtags compilers: Unknown arg --enable-new-dtags compilers: Checking arg -lmpi compilers: Found library : -lmpi compilers: Checking arg -lgcc compilers: Skipping system library: -lgcc compilers: Checking arg --as-needed compilers: Unknown arg --as-needed compilers: Checking arg -lgcc_s compilers: Found library : -lgcc_s compilers: Checking arg --no-as-needed compilers: Unknown arg --no-as-needed compilers: Checking arg -lpthread compilers: Found library : -lpthread compilers: Checking arg -lc compilers: Skipping system library: -lc compilers: Checking arg -lgcc compilers: Skipping system library: -lgcc compilers: Checking arg --as-needed compilers: Unknown arg --as-needed compilers: Checking arg -lgcc_s compilers: Checking arg --no-as-needed compilers: Unknown arg --no-as-needed compilers: Checking arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/crtend.o compilers: Unknown arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/crtend.o compilers: Checking arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crtn.o compilers: Unknown arg 
/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crtn.o compilers: Checking arg COLLECT_GCC_OPTIONS= compilers: Unknown arg COLLECT_GCC_OPTIONS= compilers: Libraries needed to link C code with another linker: ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-ldl', '-Wl,-rpath,/usr/lib/openmpi', '-lmpi', '-lgcc_s', '-lpthread'] compilers: Check that C libraries can be used from Fortran Pushing language FC Popping language FC Pushing language FC Popping language FC Pushing language FC Popping language FC Pushing language FC Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: program main end Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: program main end Executing: mpif90 -o /tmp/petsc-KvGRNM/config.setCompilers/conftest -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib 
-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: program main end Executing: mpif90 -o /tmp/petsc-KvGRNM/config.setCompilers/conftest -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl -lpetsc-ufod4vtr9mqHvKIQiVAm Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lpetsc-ufod4vtr9mqHvKIQiVAm collect2: error: ld returned 1 exit status Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: program main end Executing: mpif90 -o /tmp/petsc-KvGRNM/config.setCompilers/conftest -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g 
/tmp/petsc-KvGRNM/config.setCompilers/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/config.setCompilers/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.setCompilers/conftest Executing: /tmp/petsc-KvGRNM/config.setCompilers/conftest Popping language FC ================================================================================ TEST checkDependencyGenerationFlag from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:1363) TESTING: checkDependencyGenerationFlag from config.compilers(config/BuildSystem/config/compilers.py:1363) Check if -MMD works for dependency generation, and add it if it does Trying C compiler flag -MMD -MP Defined make macro "C_DEPFLAGS" to "-MMD -MP" Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 -MMD -MP /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Popping language C Trying Cxx compiler flag -MMD -MP Defined make macro "CXX_DEPFLAGS" to "-MMD -MP" Pushing language Cxx Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o 
-I/tmp/petsc-KvGRNM/config.setCompilers -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC -MMD -MP /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Popping language Cxx Trying FC compiler flag -MMD -MP Defined make macro "FC_DEPFLAGS" to "-MMD -MP" Pushing language FC Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -MMD -MP /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: program main end Popping language FC ================================================================================ TEST checkC99Flag from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:1409) TESTING: checkC99Flag from config.compilers(config/BuildSystem/config/compilers.py:1409) Check for -std=c99 or equivalent flag Accepted C99 compile flag: Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.setCompilers/conftest.c: In function 'main': /tmp/petsc-KvGRNM/config.setCompilers/conftest.c:7:11: warning: variable 'x' set but not used [-Wunused-but-set-variable] float x[2],y; ^ Source: #include "confdefs.h" #include "conffix.h" #include <float.h> int main() { float x[2],y; y = FLT_ROUNDS; // c++ comment int j = 2; for (int i=0; i<2; i++){ x[i] = i*j*y; } ; return 0; } Popping language C ================================================================================ TEST checkRestrict from
config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:137) TESTING: checkRestrict from config.compilers(config/BuildSystem/config/compilers.py:137) Check for the C/CXX restrict keyword Executing: mpicc -V Pushing language Cxx Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC /tmp/petsc-KvGRNM/config.compilers/conftest.cc Possible ERROR while running compiler: exit code 256 stderr: /tmp/petsc-KvGRNM/config.compilers/conftest.cc: In function 'int main()': /tmp/petsc-KvGRNM/config.compilers/conftest.cc:5:18: error: expected initializer before 'x' float * restrict x;; ^ Source: #include "confdefs.h" #include "conffix.h" int main() { float * restrict x;; return 0; } Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC /tmp/petsc-KvGRNM/config.compilers/conftest.cc Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.compilers/conftest.cc: In function 'int main()': /tmp/petsc-KvGRNM/config.compilers/conftest.cc:5:23: warning: unused variable 'x' [-Wunused-variable] float * __restrict__ x;; ^ Source: #include "confdefs.h" #include "conffix.h" int main() { float * __restrict__ x;; return 0; } compilers: Set Cxx restrict keyword to __restrict__ Defined "CXX_RESTRICT" to " __restrict__" Popping language Cxx ================================================================================ TEST checkCxxNamespace from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:372) TESTING: checkCxxNamespace from config.compilers(config/BuildSystem/config/compilers.py:372) Checks that C++ compiler supports namespaces, and if it does 
defines HAVE_CXX_NAMESPACE Pushing language Cxx Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC /tmp/petsc-KvGRNM/config.compilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" namespace petsc {int dummy;} int main() { ; return 0; } Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC /tmp/petsc-KvGRNM/config.compilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" template <class dummy> struct a {}; namespace trouble{ template <class dummy> struct a : public ::a<dummy> {}; } trouble::a<int> uugh; int main() { ; return 0; } Popping language Cxx compilers: C++ has namespaces Defined "HAVE_CXX_NAMESPACE" to "1" ================================================================================ TEST checkCxxOptionalExtensions from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:345) TESTING: checkCxxOptionalExtensions from config.compilers(config/BuildSystem/config/compilers.py:345) Check whether the C++ compiler (IBM xlC, OSF5) need special flag for .c files which contain C++ Pushing language Cxx Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { class somename { int i; };; return 0; } Added Cxx compiler flag Popping language Cxx ================================================================================
TEST checkCxxStaticInline from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:122) TESTING: checkCxxStaticInline from config.compilers(config/BuildSystem/config/compilers.py:122) Check for C++ keyword: static inline Pushing language Cxx Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC /tmp/petsc-KvGRNM/config.compilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" static inline int foo(int a) {return a;} int main() { foo(1);; return 0; } compilers: Set Cxx StaticInline keyword to static inline Popping language Cxx Defined "CXX_STATIC_INLINE" to "static inline" ================================================================================ TEST checkCxxLibraries from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:430) TESTING: checkCxxLibraries from config.compilers(config/BuildSystem/config/compilers.py:430) Determines the libraries needed to link with C++ Pushing language Cxx Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC /tmp/petsc-KvGRNM/config.compilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Pushing language CXX Popping language CXX Executing: mpicxx -o /tmp/petsc-KvGRNM/config.compilers/conftest -v -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g /tmp/petsc-KvGRNM/config.compilers/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib 
-Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: stderr: Using built-in specs. COLLECT_GCC=/usr/bin/g++ COLLECT_LTO_WRAPPER=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/lto-wrapper Target: x86_64-pc-linux-gnu Configured with: /build/gcc/src/gcc/configure --prefix=/usr --libdir=/usr/lib --libexecdir=/usr/lib --mandir=/usr/share/man --infodir=/usr/share/info --with-bugurl=https://bugs.archlinux.org/ --enable-languages=c,c++,ada,fortran,go,lto,objc,obj-c++ --enable-shared --enable-threads=posix --enable-libmpx --with-system-zlib --with-isl --enable-__cxa_atexit --disable-libunwind-exceptions --enable-clocale=gnu --disable-libstdcxx-pch --disable-libssp --enable-gnu-unique-object --enable-linker-build-id --enable-lto --enable-plugin --enable-install-libiberty --with-linker-hash-style=gnu --enable-gnu-indirect-function --disable-multilib --disable-werror --enable-checking=release Thread model: posix gcc version 6.3.1 20170109 (GCC) COMPILER_PATH=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/ LIBRARY_PATH=/home/florian/software/petsc/arch-linux2-c-debug/lib/../lib/:./../lib/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/:/lib/../lib/:/usr/lib/../lib/:/home/florian/software/petsc/arch-linux2-c-debug/lib/:./:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../:/lib/:/usr/lib/ COLLECT_GCC_OPTIONS='-o' '/tmp/petsc-KvGRNM/config.compilers/conftest' '-v' '-Wall' '-Wwrite-strings' '-Wno-strict-aliasing' 
'-Wno-unknown-pragmas' '-fvisibility=hidden' '-g' '-L/usr/lib/openmpi' '-L/home/florian/software/petsc/arch-linux2-c-debug/lib' '-L/home/florian/software/lib' '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1' '-L/home/florian/software/petsc/arch-linux2-c-debug/lib' '-L/home/florian/software/petsc' '-pthread' '-L/usr/lib/openmpi' '-shared-libgcc' '-mtune=generic' '-march=x86-64' /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/collect2 -plugin /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/liblto_plugin.so -plugin-opt=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/lto-wrapper -plugin-opt=-fresolution=/tmp/cc3AHR5X.res -plugin-opt=-pass-through=-lgcc_s -plugin-opt=-pass-through=-lgcc -plugin-opt=-pass-through=-lpthread -plugin-opt=-pass-through=-lc -plugin-opt=-pass-through=-lgcc_s -plugin-opt=-pass-through=-lgcc --build-id --eh-frame-hdr --hash-style=gnu -m elf_x86_64 -dynamic-linker /lib64/ld-linux-x86-64.so.2 -o /tmp/petsc-KvGRNM/config.compilers/conftest /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crt1.o /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crti.o /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/crtbegin.o -L/usr/lib/openmpi -L/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/lib -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc -L/usr/lib/openmpi -L/home/florian/software/petsc/arch-linux2-c-debug/lib/../lib -L./../lib -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib -L/lib/../lib -L/usr/lib/../lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -L. -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../.. 
/tmp/petsc-KvGRNM/config.compilers/conftest.o -rpath /usr/lib/openmpi -rpath /home/florian/software/petsc/arch-linux2-c-debug/lib -rpath /home/florian/software/lib -rpath /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -rpath /home/florian/software/petsc/arch-linux2-c-debug/lib -rpath /home/florian/software/petsc -ldl -rpath /usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl -rpath /usr/lib/openmpi --enable-new-dtags -lmpi_cxx -lmpi -lstdc++ -lm -lgcc_s -lgcc -lpthread -lc -lgcc_s -lgcc /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/crtend.o /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crtn.o COLLECT_GCC_OPTIONS='-o' '/tmp/petsc-KvGRNM/config.compilers/conftest' '-v' '-Wall' '-Wwrite-strings' '-Wno-strict-aliasing' '-Wno-unknown-pragmas' '-fvisibility=hidden' '-g' '-L/usr/lib/openmpi' '-L/home/florian/software/petsc/arch-linux2-c-debug/lib' '-L/home/florian/software/lib' '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1' '-L/home/florian/software/petsc/arch-linux2-c-debug/lib' '-L/home/florian/software/petsc' '-pthread' '-L/usr/lib/openmpi' '-shared-libgcc' '-mtune=generic' '-march=x86-64' Popping language Cxx compilers: Checking arg Using compilers: Unknown arg Using compilers: Checking arg built-in compilers: Unknown arg built-in compilers: Checking arg specs. compilers: Unknown arg specs. 
compilers: Checking arg COLLECT_GCC=/usr/bin/g++ compilers: Unknown arg COLLECT_GCC=/usr/bin/g++ compilers: Checking arg COLLECT_LTO_WRAPPER=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/lto-wrapper compilers: Unknown arg COLLECT_LTO_WRAPPER=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/lto-wrapper compilers: Checking arg Target: compilers: Unknown arg Target: compilers: Checking arg x86_64-pc-linux-gnu compilers: Unknown arg x86_64-pc-linux-gnu compilers: Checking arg Configured compilers: Unknown arg Configured compilers: Checking arg with: compilers: Unknown arg with: compilers: Checking arg /build/gcc/src/gcc/configure compilers: Unknown arg /build/gcc/src/gcc/configure compilers: Checking arg --prefix=/usr compilers: Unknown arg --prefix=/usr compilers: Checking arg --libdir=/usr/lib compilers: Unknown arg --libdir=/usr/lib compilers: Checking arg --libexecdir=/usr/lib compilers: Unknown arg --libexecdir=/usr/lib compilers: Checking arg --mandir=/usr/share/man compilers: Unknown arg --mandir=/usr/share/man compilers: Checking arg --infodir=/usr/share/info compilers: Unknown arg --infodir=/usr/share/info compilers: Checking arg --with-bugurl=https://bugs.archlinux.org/ compilers: Unknown arg --with-bugurl=https://bugs.archlinux.org/ compilers: Checking arg --enable-languages=c,c++,ada,fortran,go,lto,objc,obj-c++ compilers: Unknown arg --enable-languages=c,c++,ada,fortran,go,lto,objc,obj-c++ compilers: Checking arg --enable-shared compilers: Unknown arg --enable-shared compilers: Checking arg --enable-threads=posix compilers: Unknown arg --enable-threads=posix compilers: Checking arg --enable-libmpx compilers: Unknown arg --enable-libmpx compilers: Checking arg --with-system-zlib compilers: Unknown arg --with-system-zlib compilers: Checking arg --with-isl compilers: Unknown arg --with-isl compilers: Checking arg --enable-__cxa_atexit compilers: Unknown arg --enable-__cxa_atexit compilers: Checking arg --disable-libunwind-exceptions compilers: Unknown arg 
--disable-libunwind-exceptions compilers: Checking arg --enable-clocale=gnu compilers: Unknown arg --enable-clocale=gnu compilers: Checking arg --disable-libstdcxx-pch compilers: Unknown arg --disable-libstdcxx-pch compilers: Checking arg --disable-libssp compilers: Unknown arg --disable-libssp compilers: Checking arg --enable-gnu-unique-object compilers: Unknown arg --enable-gnu-unique-object compilers: Checking arg --enable-linker-build-id compilers: Unknown arg --enable-linker-build-id compilers: Checking arg --enable-lto compilers: Unknown arg --enable-lto compilers: Checking arg --enable-plugin compilers: Unknown arg --enable-plugin compilers: Checking arg --enable-install-libiberty compilers: Unknown arg --enable-install-libiberty compilers: Checking arg --with-linker-hash-style=gnu compilers: Unknown arg --with-linker-hash-style=gnu compilers: Checking arg --enable-gnu-indirect-function compilers: Unknown arg --enable-gnu-indirect-function compilers: Checking arg --disable-multilib compilers: Unknown arg --disable-multilib compilers: Checking arg --disable-werror compilers: Unknown arg --disable-werror compilers: Checking arg --enable-checking=release compilers: Unknown arg --enable-checking=release compilers: Checking arg Thread compilers: Unknown arg Thread compilers: Checking arg model: compilers: Unknown arg model: compilers: Checking arg posix compilers: Unknown arg posix compilers: Checking arg gcc compilers: Unknown arg gcc compilers: Checking arg version compilers: Unknown arg version compilers: Checking arg 6.3.1 compilers: Unknown arg 6.3.1 compilers: Checking arg 20170109 compilers: Unknown arg 20170109 compilers: Checking arg (GCC) compilers: Unknown arg (GCC) compilers: Checking arg COMPILER_PATH=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/ compilers: Unknown arg 
COMPILER_PATH=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/ compilers: Checking arg LIBRARY_PATH=/home/florian/software/petsc/arch-linux2-c-debug/lib/../lib/:./../lib/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/:/lib/../lib/:/usr/lib/../lib/:/home/florian/software/petsc/arch-linux2-c-debug/lib/:./:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../:/lib/:/usr/lib/ compilers: Unknown arg LIBRARY_PATH=/home/florian/software/petsc/arch-linux2-c-debug/lib/../lib/:./../lib/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/:/lib/../lib/:/usr/lib/../lib/:/home/florian/software/petsc/arch-linux2-c-debug/lib/:./:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../:/lib/:/usr/lib/ compilers: Checking arg COLLECT_GCC_OPTIONS= compilers: Unknown arg COLLECT_GCC_OPTIONS= compilers: Checking arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/collect2 compilers: Unknown arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/collect2 compilers: Checking arg -plugin compilers: Unknown arg -plugin compilers: Checking arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/liblto_plugin.so compilers: Unknown arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/liblto_plugin.so compilers: Checking arg -plugin-opt=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/lto-wrapper compilers: Unknown arg -plugin-opt=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/lto-wrapper compilers: Checking arg -plugin-opt=-fresolution=/tmp/cc3AHR5X.res compilers: Unknown arg -plugin-opt=-fresolution=/tmp/cc3AHR5X.res compilers: Checking arg -plugin-opt=-pass-through=-lgcc_s compilers: Unknown arg -plugin-opt=-pass-through=-lgcc_s compilers: Checking arg -plugin-opt=-pass-through=-lgcc compilers: Unknown arg -plugin-opt=-pass-through=-lgcc compilers: Checking arg -plugin-opt=-pass-through=-lpthread compilers: Unknown arg 
-plugin-opt=-pass-through=-lpthread compilers: Checking arg -plugin-opt=-pass-through=-lc compilers: Unknown arg -plugin-opt=-pass-through=-lc compilers: Checking arg -plugin-opt=-pass-through=-lgcc_s compilers: Unknown arg -plugin-opt=-pass-through=-lgcc_s compilers: Checking arg -plugin-opt=-pass-through=-lgcc compilers: Unknown arg -plugin-opt=-pass-through=-lgcc compilers: Checking arg --build-id compilers: Unknown arg --build-id compilers: Checking arg --eh-frame-hdr compilers: Unknown arg --eh-frame-hdr compilers: Checking arg --hash-style=gnu compilers: Unknown arg --hash-style=gnu compilers: Checking arg -m compilers: Unknown arg -m compilers: Checking arg elf_x86_64 compilers: Unknown arg elf_x86_64 compilers: Checking arg -dynamic-linker compilers: Unknown arg -dynamic-linker compilers: Checking arg /lib64/ld-linux-x86-64.so.2 compilers: Unknown arg /lib64/ld-linux-x86-64.so.2 compilers: Checking arg -o compilers: Unknown arg -o compilers: Checking arg /tmp/petsc-KvGRNM/config.compilers/conftest compilers: Unknown arg /tmp/petsc-KvGRNM/config.compilers/conftest compilers: Checking arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crt1.o compilers: Unknown arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crt1.o compilers: Checking arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crti.o compilers: Unknown arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crti.o compilers: Checking arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/crtbegin.o compilers: Unknown arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/crtbegin.o compilers: Checking arg -L/usr/lib/openmpi compilers: Found library directory: -L/usr/lib/openmpi compilers: Checking arg -L/home/florian/software/petsc/arch-linux2-c-debug/lib compilers: Found library directory: -L/home/florian/software/petsc/arch-linux2-c-debug/lib compilers: Checking arg -L/home/florian/software/lib compilers: Found library directory: -L/home/florian/software/lib compilers: Checking arg 
-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 compilers: Found library directory: -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 compilers: Checking arg -L/home/florian/software/petsc/arch-linux2-c-debug/lib compilers: Checking arg -L/home/florian/software/petsc compilers: Found library directory: -L/home/florian/software/petsc compilers: Checking arg -L/usr/lib/openmpi compilers: Checking arg -L/home/florian/software/petsc/arch-linux2-c-debug/lib/../lib compilers: Checking arg -L./../lib compilers: Checking arg -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 compilers: Checking arg -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib compilers: Checking arg -L/lib/../lib compilers: Checking arg -L/usr/lib/../lib compilers: Checking arg -L/home/florian/software/petsc/arch-linux2-c-debug/lib compilers: Checking arg -L. compilers: Checking arg -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../.. compilers: Checking arg /tmp/petsc-KvGRNM/config.compilers/conftest.o compilers: Unknown arg /tmp/petsc-KvGRNM/config.compilers/conftest.o compilers: Checking arg -rpath compilers: Found -rpath library: /usr/lib/openmpi compilers: Checking arg -rpath compilers: Found -rpath library: /home/florian/software/petsc/arch-linux2-c-debug/lib compilers: Checking arg -rpath compilers: Found -rpath library: /home/florian/software/lib compilers: Checking arg -rpath compilers: Found -rpath library: /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 compilers: Checking arg -rpath compilers: Already in rpathflags, skipping:-rpath compilers: Checking arg -rpath compilers: Found -rpath library: /home/florian/software/petsc compilers: Checking arg -ldl compilers: Found library: -ldl Library already in C list so skipping in C++ compilers: Checking arg -rpath compilers: Already in rpathflags, skipping:-rpath compilers: Checking arg -lmpi compilers: Found library: -lmpi Library already in C list so skipping in C++ compilers: Checking arg -lgcc_s compilers: Found library: -lgcc_s Library already in C list so skipping in C++ 
compilers: Checking arg -lpthread compilers: Found library: -lpthread Library already in C list so skipping in C++ compilers: Checking arg -ldl compilers: Checking arg -rpath compilers: Already in rpathflags, skipping:-rpath compilers: Checking arg --enable-new-dtags compilers: Unknown arg --enable-new-dtags compilers: Checking arg -lmpi_cxx compilers: Found library: -lmpi_cxx compilers: Checking arg -lmpi compilers: Checking arg -lstdc++ compilers: Found library: -lstdc++ compilers: Checking arg -lm compilers: Checking arg -lgcc_s compilers: Checking arg -lgcc compilers: Skipping system library: -lgcc compilers: Checking arg -lpthread compilers: Checking arg -lc compilers: Skipping system library: -lc compilers: Checking arg -lgcc_s compilers: Checking arg -lgcc compilers: Skipping system library: -lgcc compilers: Checking arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/crtend.o compilers: Unknown arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/crtend.o compilers: Checking arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crtn.o compilers: Unknown arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crtn.o compilers: Checking arg COLLECT_GCC_OPTIONS= compilers: Unknown arg COLLECT_GCC_OPTIONS= compilers: Libraries needed to link Cxx code with another linker: ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lmpi_cxx', '-lstdc++'] compilers: Check that Cxx 
libraries can be used from C Pushing language C Popping language C Pushing language C Popping language C Pushing language C Popping language C Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Executing: mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lmpi_cxx -lstdc++ -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib 
-L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Executing: mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lmpi_cxx -lstdc++ -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib 
-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl -lpetsc-ufod4vtr9mqHvKIQiVAm Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lpetsc-ufod4vtr9mqHvKIQiVAm collect2: error: ld returned 1 exit status Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Executing: mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lmpi_cxx -lstdc++ -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib 
-Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/config.setCompilers/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.setCompilers/conftest Executing: /tmp/petsc-KvGRNM/config.setCompilers/conftest Popping language C compilers: Check that Cxx libraries can be used from Fortran Pushing language FC Popping language FC Pushing language FC Popping language FC Pushing language FC Popping language FC Pushing language FC Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: program main end Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: program main end Executing: mpif90 -o /tmp/petsc-KvGRNM/config.setCompilers/conftest -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -Wl,-rpath,/usr/lib/openmpi 
-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lmpi_cxx -lstdc++ -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: program main end Executing: mpif90 -o /tmp/petsc-KvGRNM/config.setCompilers/conftest -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lmpi_cxx -lstdc++ -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib 
-L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl -lpetsc-ufod4vtr9mqHvKIQiVAm Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lpetsc-ufod4vtr9mqHvKIQiVAm collect2: error: ld returned 1 exit status Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: program main end Executing: mpif90 -o /tmp/petsc-KvGRNM/config.setCompilers/conftest -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lmpi_cxx -lstdc++ -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib 
-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/config.setCompilers/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.setCompilers/conftest Executing: /tmp/petsc-KvGRNM/config.setCompilers/conftest Popping language FC ================================================================================ TEST checkCxx11 from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:387) TESTING: checkCxx11 from config.compilers(config/BuildSystem/config/compilers.py:387) Determine the option needed to support the C++11 dialect We auto-detect C++11 if the compiler supports it without options, otherwise we require with-cxx-dialect=C++11 to try adding flags to support it. 
Pushing language Cxx Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc: In function 'int main()': /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc:13:24: warning: unused variable 'x' [-Wunused-variable] const double x = dist(mt); ^ Source: #include "confdefs.h" #include "conffix.h" #include <random> template <typename T> constexpr T Cubed( T x ) { return x*x*x; } int main() { std::random_device rd; std::mt19937 mt(rd()); std::normal_distribution<double> dist(0,1); const double x = dist(mt); ; return 0; } Popping language Cxx ================================================================================ TEST checkFortranTypeSizes from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:593) TESTING: checkFortranTypeSizes from config.compilers(config/BuildSystem/config/compilers.py:593) Check whether real*8 is supported and suggest flags which will allow support Pushing language FC Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.compilers/conftest.F Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.compilers/conftest.F:2:21: real*8 variable 1 Warning: Unused variable 'variable' declared at (1) [-Wunused-variable] Source: program main real*8 variable end Popping language FC ================================================================================ TEST checkFortranNameMangling from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:652) TESTING: checkFortranNameMangling from
config.compilers(config/BuildSystem/config/compilers.py:652) Checks Fortran name mangling, and defines HAVE_FORTRAN_UNDERSCORE, HAVE_FORTRAN_NOUNDERSCORE, HAVE_FORTRAN_CAPS, or HAVE_FORTRAN_STDCALL Testing Fortran mangling type underscore with code void d1chk_(void){return;} Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.compilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" void d1chk_(void){return;} Popping language C Pushing language FC Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.compilers/conftest.F Successful compile: Source: program main call d1chk() end Pushing language FC Popping language FC Executing: mpif90 -o /tmp/petsc-KvGRNM/config.compilers/conftest -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.compilers/conftest.o /tmp/petsc-KvGRNM/config.compilers/confc.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi 
-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Popping language FC compilers: Fortran name mangling is underscore Defined "HAVE_FORTRAN_UNDERSCORE" to "1" ================================================================================ TEST checkFortranNameManglingDouble from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:689) TESTING: checkFortranNameManglingDouble from config.compilers(config/BuildSystem/config/compilers.py:689) Checks if symbols containing an underscore append an extra underscore, and defines HAVE_FORTRAN_UNDERSCORE_UNDERSCORE if necessary Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.compilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" void d1_chk__(void){return;} Popping language C Pushing language FC Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.compilers/conftest.F Successful compile: Source: program main call d1_chk() end Pushing language FC Popping language FC Executing: mpif90 -o /tmp/petsc-KvGRNM/config.compilers/conftest -fPIC -Wall 
-ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.compilers/conftest.o /tmp/petsc-KvGRNM/config.compilers/confc.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/config.compilers/conftest.o: In function `MAIN__': /tmp/petsc-KvGRNM/config.compilers/conftest.F:2: undefined reference to `d1_chk_' collect2: error: ld returned 1 exit status Popping language FC ================================================================================ TEST checkFortranPreprocessor from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:699) TESTING: checkFortranPreprocessor from config.compilers(config/BuildSystem/config/compilers.py:699) Determine if Fortran handles preprocessing properly compilers: Fortran uses CPP preprocessor Pushing language FC Executing: mpif90 -c 
-o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: program main #define dummy dummy #ifndef dummy fooey #endif end Added FC compiler flag Popping language FC ================================================================================ TEST checkFortranDefineCompilerOption from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:723) TESTING: checkFortranDefineCompilerOption from config.compilers(config/BuildSystem/config/compilers.py:723) Check if -WF,-Dfoobar or -Dfoobar is the compiler option to define a macro Defined make macro "FC_DEFINE_FLAG" to "-D" compilers: Fortran uses -D for defining macro Pushing language FC Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g -DTesting /tmp/petsc-KvGRNM/config.setCompilers/conftest.F Successful compile: Source: program main #define dummy dummy #ifndef Testing fooey #endif end Popping language FC ================================================================================ TEST checkFortranLibraries from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:743) TESTING: checkFortranLibraries from config.compilers(config/BuildSystem/config/compilers.py:743) Substitutes for FLIBS the libraries needed to link with Fortran This macro is intended to be used in those situations when it is necessary to mix, e.g. C++ and Fortran 77, source code into a single program or shared library. 
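[Editor's aside] The checkFortranNameMangling and checkFortranNameManglingDouble probes above work by trial linking: a C object defines a candidate symbol (`d1chk_`, then `d1_chk__`), a Fortran main calls the routine, and the first combination that links names the convention. A hypothetical helper (not PETSc's code) mapping a Fortran procedure name to the C symbol each convention expects:

```python
def fortran_mangled(name, convention, extra_underscore=False):
    """Return the symbol a C object must define so that a Fortran
    `call name()` links, under a given name-mangling convention."""
    if convention == "underscore":        # gfortran default: d1chk -> d1chk_
        sym = name.lower() + "_"
        # some compilers append a second '_' to names already containing one
        if extra_underscore and "_" in name:
            sym += "_"
        return sym
    if convention == "nounderscore":      # HAVE_FORTRAN_NOUNDERSCORE
        return name.lower()
    if convention == "caps":              # HAVE_FORTRAN_CAPS: d1chk -> D1CHK
        return name.upper()
    raise ValueError("unknown convention: %s" % convention)
```

This matches the log: `d1chk_` linked, so HAVE_FORTRAN_UNDERSCORE is defined; the later double-underscore probe failed with `undefined reference to 'd1_chk_'` because gfortran wanted a single trailing underscore even for `d1_chk`, so HAVE_FORTRAN_UNDERSCORE_UNDERSCORE is not defined.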
For example, if object files from a C++ and Fortran 77 compiler must be linked together, then the C++ compiler/linker must be used for linking (since special C++-ish things need to happen at link time like calling global constructors, instantiating templates, enabling exception support, etc.). However, the Fortran 77 intrinsic and run-time libraries must be linked in as well, but the C++ compiler/linker does not know how to add these Fortran 77 libraries. This code was translated from the autoconf macro which was packaged in its current form by Matthew D. Langston . However, nearly all of this macro came from the OCTAVE_FLIBS macro in octave-2.0.13/aclocal.m4, and full credit should go to John W. Eaton for writing this extremely useful macro. Pushing language FC Executing: mpif90 -V Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.compilers/conftest.F Successful compile: Source: program main end Pushing language FC Popping language FC Executing: mpif90 -o /tmp/petsc-KvGRNM/config.compilers/conftest -v -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.compilers/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: stderr: Driving: /usr/bin/gfortran -o 
/tmp/petsc-KvGRNM/config.compilers/conftest -v -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.compilers/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl -I/usr/include -pthread -I/usr/lib/openmpi -Wl,-rpath -Wl,/usr/lib/openmpi -Wl,--enable-new-dtags -L/usr/lib/openmpi -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -l gfortran -l m -shared-libgcc Using built-in specs. COLLECT_GCC=/usr/bin/gfortran COLLECT_LTO_WRAPPER=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/lto-wrapper Target: x86_64-pc-linux-gnu Configured with: /build/gcc/src/gcc/configure --prefix=/usr --libdir=/usr/lib --libexecdir=/usr/lib --mandir=/usr/share/man --infodir=/usr/share/info --with-bugurl=https://bugs.archlinux.org/ --enable-languages=c,c++,ada,fortran,go,lto,objc,obj-c++ --enable-shared --enable-threads=posix --enable-libmpx --with-system-zlib --with-isl --enable-__cxa_atexit --disable-libunwind-exceptions --enable-clocale=gnu --disable-libstdcxx-pch --disable-libssp --enable-gnu-unique-object --enable-linker-build-id --enable-lto --enable-plugin --enable-install-libiberty --with-linker-hash-style=gnu --enable-gnu-indirect-function --disable-multilib --disable-werror --enable-checking=release Thread model: posix gcc version 6.3.1 20170109 (GCC) Reading specs from /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/libgfortran.spec rename spec lib to liborig COLLECT_GCC_OPTIONS='-o' 
'/tmp/petsc-KvGRNM/config.compilers/conftest' '-v' '-fPIC' '-Wall' '-ffree-line-length-0' '-Wno-unused-dummy-argument' '-g' '-L/usr/lib/openmpi' '-L/home/florian/software/petsc/arch-linux2-c-debug/lib' '-L/home/florian/software/lib' '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1' '-L/home/florian/software/petsc/arch-linux2-c-debug/lib' '-L/home/florian/software/petsc' '-I' '/usr/include' '-pthread' '-I' '/usr/lib/openmpi' '-L/usr/lib/openmpi' '-shared-libgcc' '-mtune=generic' '-march=x86-64' COMPILER_PATH=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/ LIBRARY_PATH=/home/florian/software/petsc/arch-linux2-c-debug/lib/../lib/:./../lib/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/:/lib/../lib/:/usr/lib/../lib/:/home/florian/software/petsc/arch-linux2-c-debug/lib/:./:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../:/lib/:/usr/lib/ COLLECT_GCC_OPTIONS='-o' '/tmp/petsc-KvGRNM/config.compilers/conftest' '-v' '-fPIC' '-Wall' '-ffree-line-length-0' '-Wno-unused-dummy-argument' '-g' '-L/usr/lib/openmpi' '-L/home/florian/software/petsc/arch-linux2-c-debug/lib' '-L/home/florian/software/lib' '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1' '-L/home/florian/software/petsc/arch-linux2-c-debug/lib' '-L/home/florian/software/petsc' '-I' '/usr/include' '-pthread' '-I' '/usr/lib/openmpi' '-L/usr/lib/openmpi' '-shared-libgcc' '-mtune=generic' '-march=x86-64' /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/collect2 -plugin /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/liblto_plugin.so -plugin-opt=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/lto-wrapper -plugin-opt=-fresolution=/tmp/ccsQQaXF.res -plugin-opt=-pass-through=-lgcc_s -plugin-opt=-pass-through=-lgcc -plugin-opt=-pass-through=-lquadmath -plugin-opt=-pass-through=-lm -plugin-opt=-pass-through=-lgcc_s -plugin-opt=-pass-through=-lgcc -plugin-opt=-pass-through=-lpthread 
-plugin-opt=-pass-through=-lc -plugin-opt=-pass-through=-lgcc_s -plugin-opt=-pass-through=-lgcc --build-id --eh-frame-hdr --hash-style=gnu -m elf_x86_64 -dynamic-linker /lib64/ld-linux-x86-64.so.2 -o /tmp/petsc-KvGRNM/config.compilers/conftest /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crt1.o /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crti.o /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/crtbegin.o -L/usr/lib/openmpi -L/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/lib -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc -L/usr/lib/openmpi -L/home/florian/software/petsc/arch-linux2-c-debug/lib/../lib -L./../lib -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib -L/lib/../lib -L/usr/lib/../lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -L. -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../.. /tmp/petsc-KvGRNM/config.compilers/conftest.o -rpath /usr/lib/openmpi -rpath /home/florian/software/petsc/arch-linux2-c-debug/lib -rpath /home/florian/software/lib -rpath /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -rpath /home/florian/software/petsc/arch-linux2-c-debug/lib -rpath /home/florian/software/petsc -ldl -rpath /usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl -rpath /usr/lib/openmpi --enable-new-dtags -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi -lgfortran -lm -lgcc_s -lgcc -lquadmath -lm -lgcc_s -lgcc -lpthread -lc -lgcc_s -lgcc /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/crtend.o /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crtn.o COLLECT_GCC_OPTIONS='-o' '/tmp/petsc-KvGRNM/config.compilers/conftest' '-v' '-fPIC' '-Wall' '-ffree-line-length-0' '-Wno-unused-dummy-argument' '-g' '-L/usr/lib/openmpi' '-L/home/florian/software/petsc/arch-linux2-c-debug/lib' '-L/home/florian/software/lib' '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1' '-L/home/florian/software/petsc/arch-linux2-c-debug/lib' 
'-L/home/florian/software/petsc' '-I' '/usr/include' '-pthread' '-I' '/usr/lib/openmpi' '-L/usr/lib/openmpi' '-shared-libgcc' '-mtune=generic' '-march=x86-64' Popping language FC compilers: Checking arg Driving: compilers: Unknown arg Driving: compilers: Checking arg /usr/bin/gfortran compilers: Unknown arg /usr/bin/gfortran compilers: Checking arg -o compilers: Unknown arg -o compilers: Checking arg /tmp/petsc-KvGRNM/config.compilers/conftest compilers: Unknown arg /tmp/petsc-KvGRNM/config.compilers/conftest compilers: Checking arg -v compilers: Unknown arg -v compilers: Checking arg -fPIC compilers: Unknown arg -fPIC compilers: Checking arg -Wall compilers: Unknown arg -Wall compilers: Checking arg -ffree-line-length-0 compilers: Unknown arg -ffree-line-length-0 compilers: Checking arg -Wno-unused-dummy-argument compilers: Unknown arg -Wno-unused-dummy-argument compilers: Checking arg -g compilers: Unknown arg -g compilers: Checking arg /tmp/petsc-KvGRNM/config.compilers/conftest.o compilers: Unknown arg /tmp/petsc-KvGRNM/config.compilers/conftest.o compilers: Checking arg -Wl,-rpath,/usr/lib/openmpi compilers: Unknown arg -Wl,-rpath,/usr/lib/openmpi compilers: Checking arg -L/usr/lib/openmpi compilers: Found library directory: -L/usr/lib/openmpi compilers: Checking arg -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib compilers: Unknown arg -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib compilers: Checking arg -L/home/florian/software/petsc/arch-linux2-c-debug/lib compilers: Found library directory: -L/home/florian/software/petsc/arch-linux2-c-debug/lib compilers: Checking arg -Wl,-rpath,/home/florian/software/lib compilers: Unknown arg -Wl,-rpath,/home/florian/software/lib compilers: Checking arg -L/home/florian/software/lib compilers: Found library directory: -L/home/florian/software/lib compilers: Checking arg -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 compilers: Unknown arg 
-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 compilers: Checking arg -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 compilers: Found library directory: -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 compilers: Checking arg -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib compilers: Unknown arg -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib compilers: Checking arg -L/home/florian/software/petsc/arch-linux2-c-debug/lib compilers: Already in lflags so skipping: -L/home/florian/software/petsc/arch-linux2-c-debug/lib compilers: Checking arg -Wl,-rpath,/home/florian/software/petsc compilers: Unknown arg -Wl,-rpath,/home/florian/software/petsc compilers: Checking arg -L/home/florian/software/petsc compilers: Found library directory: -L/home/florian/software/petsc compilers: Checking arg -ldl compilers: Found library: -ldl Library already in C list so skipping in Fortran compilers: Checking arg -Wl,-rpath,/usr/lib/openmpi compilers: Unknown arg -Wl,-rpath,/usr/lib/openmpi compilers: Checking arg -lmpi compilers: Found library: -lmpi Library already in C list so skipping in Fortran compilers: Checking arg -lgcc_s compilers: Found library: -lgcc_s Library already in C list so skipping in Fortran compilers: Checking arg -lpthread compilers: Found library: -lpthread Library already in C list so skipping in Fortran compilers: Checking arg -ldl compilers: Already in lflags: -ldl compilers: Checking arg -I/usr/include compilers: Found include directory: /usr/include compilers: Checking arg -pthread compilers: Unknown arg -pthread compilers: Checking arg -I/usr/lib/openmpi compilers: Found include directory: /usr/lib/openmpi compilers: Checking arg -Wl,-rpath compilers: Unknown arg -Wl,-rpath compilers: Checking arg -Wl,/usr/lib/openmpi compilers: Unknown arg -Wl,/usr/lib/openmpi compilers: Checking arg -Wl,--enable-new-dtags compilers: Unknown arg -Wl,--enable-new-dtags compilers: Checking arg -L/usr/lib/openmpi compilers: Already in lflags so skipping: 
-L/usr/lib/openmpi compilers: Checking arg -lmpi_usempif08 compilers: Found library: -lmpi_usempif08 compilers: Checking arg -lmpi_usempi_ignore_tkr compilers: Found library: -lmpi_usempi_ignore_tkr compilers: Checking arg -lmpi_mpifh compilers: Found library: -lmpi_mpifh compilers: Checking arg -lmpi compilers: Already in lflags: -lmpi compilers: Checking arg -l compilers: Found canonical library: -lgfortran compilers: Checking arg -l compilers: Found canonical library: -lm compilers: Checking arg -shared-libgcc compilers: Unknown arg -shared-libgcc compilers: Checking arg Using compilers: Unknown arg Using compilers: Checking arg built-in compilers: Unknown arg built-in compilers: Checking arg specs. compilers: Unknown arg specs. compilers: Checking arg COLLECT_GCC=/usr/bin/gfortran compilers: Unknown arg COLLECT_GCC=/usr/bin/gfortran compilers: Checking arg COLLECT_LTO_WRAPPER=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/lto-wrapper compilers: Unknown arg COLLECT_LTO_WRAPPER=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/lto-wrapper compilers: Checking arg Target: compilers: Unknown arg Target: compilers: Checking arg x86_64-pc-linux-gnu compilers: Unknown arg x86_64-pc-linux-gnu compilers: Checking arg Configured compilers: Unknown arg Configured compilers: Checking arg with: compilers: Unknown arg with: compilers: Checking arg /build/gcc/src/gcc/configure compilers: Unknown arg /build/gcc/src/gcc/configure compilers: Checking arg --prefix=/usr compilers: Unknown arg --prefix=/usr compilers: Checking arg --libdir=/usr/lib compilers: Unknown arg --libdir=/usr/lib compilers: Checking arg --libexecdir=/usr/lib compilers: Unknown arg --libexecdir=/usr/lib compilers: Checking arg --mandir=/usr/share/man compilers: Unknown arg --mandir=/usr/share/man compilers: Checking arg --infodir=/usr/share/info compilers: Unknown arg --infodir=/usr/share/info compilers: Checking arg --with-bugurl=https://bugs.archlinux.org/ compilers: Unknown arg --with-bugurl=https://bugs.archlinux.org/ 
compilers: Checking arg --enable-languages=c,c++,ada,fortran,go,lto,objc,obj-c++ compilers: Unknown arg --enable-languages=c,c++,ada,fortran,go,lto,objc,obj-c++ compilers: Checking arg --enable-shared compilers: Unknown arg --enable-shared compilers: Checking arg --enable-threads=posix compilers: Unknown arg --enable-threads=posix compilers: Checking arg --enable-libmpx compilers: Unknown arg --enable-libmpx compilers: Checking arg --with-system-zlib compilers: Unknown arg --with-system-zlib compilers: Checking arg --with-isl compilers: Unknown arg --with-isl compilers: Checking arg --enable-__cxa_atexit compilers: Unknown arg --enable-__cxa_atexit compilers: Checking arg --disable-libunwind-exceptions compilers: Unknown arg --disable-libunwind-exceptions compilers: Checking arg --enable-clocale=gnu compilers: Unknown arg --enable-clocale=gnu compilers: Checking arg --disable-libstdcxx-pch compilers: Unknown arg --disable-libstdcxx-pch compilers: Checking arg --disable-libssp compilers: Unknown arg --disable-libssp compilers: Checking arg --enable-gnu-unique-object compilers: Unknown arg --enable-gnu-unique-object compilers: Checking arg --enable-linker-build-id compilers: Unknown arg --enable-linker-build-id compilers: Checking arg --enable-lto compilers: Unknown arg --enable-lto compilers: Checking arg --enable-plugin compilers: Unknown arg --enable-plugin compilers: Checking arg --enable-install-libiberty compilers: Unknown arg --enable-install-libiberty compilers: Checking arg --with-linker-hash-style=gnu compilers: Unknown arg --with-linker-hash-style=gnu compilers: Checking arg --enable-gnu-indirect-function compilers: Unknown arg --enable-gnu-indirect-function compilers: Checking arg --disable-multilib compilers: Unknown arg --disable-multilib compilers: Checking arg --disable-werror compilers: Unknown arg --disable-werror compilers: Checking arg --enable-checking=release compilers: Unknown arg --enable-checking=release compilers: Checking arg Thread 
compilers: Unknown arg Thread compilers: Checking arg model: compilers: Unknown arg model: compilers: Checking arg posix compilers: Unknown arg posix compilers: Checking arg gcc compilers: Unknown arg gcc compilers: Checking arg version compilers: Unknown arg version compilers: Checking arg 6.3.1 compilers: Unknown arg 6.3.1 compilers: Checking arg 20170109 compilers: Unknown arg 20170109 compilers: Checking arg (GCC) compilers: Unknown arg (GCC) compilers: Checking arg Reading compilers: Unknown arg Reading compilers: Checking arg specs compilers: Unknown arg specs compilers: Checking arg from compilers: Unknown arg from compilers: Checking arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/libgfortran.spec compilers: Unknown arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/libgfortran.spec compilers: Checking arg rename compilers: Unknown arg rename compilers: Checking arg spec compilers: Unknown arg spec compilers: Checking arg lib compilers: Unknown arg lib compilers: Checking arg to compilers: Unknown arg to compilers: Checking arg liborig compilers: Unknown arg liborig compilers: Checking arg COLLECT_GCC_OPTIONS= compilers: Unknown arg COLLECT_GCC_OPTIONS= compilers: Checking arg COMPILER_PATH=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/ compilers: Skipping arg COMPILER_PATH=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/ compilers: Checking arg 
LIBRARY_PATH=/home/florian/software/petsc/arch-linux2-c-debug/lib/../lib/:./../lib/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/:/lib/../lib/:/usr/lib/../lib/:/home/florian/software/petsc/arch-linux2-c-debug/lib/:./:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../:/lib/:/usr/lib/ compilers: Skipping arg LIBRARY_PATH=/home/florian/software/petsc/arch-linux2-c-debug/lib/../lib/:./../lib/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/:/lib/../lib/:/usr/lib/../lib/:/home/florian/software/petsc/arch-linux2-c-debug/lib/:./:/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../:/lib/:/usr/lib/ compilers: Checking arg COLLECT_GCC_OPTIONS= compilers: Unknown arg COLLECT_GCC_OPTIONS= compilers: Checking arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/collect2 compilers: Unknown arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/collect2 compilers: Checking arg -plugin compilers: Unknown arg -plugin compilers: Checking arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/liblto_plugin.so compilers: Unknown arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/liblto_plugin.so compilers: Checking arg -plugin-opt=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/lto-wrapper compilers: Unknown arg -plugin-opt=/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/lto-wrapper compilers: Checking arg -plugin-opt=-fresolution=/tmp/ccsQQaXF.res compilers: Unknown arg -plugin-opt=-fresolution=/tmp/ccsQQaXF.res compilers: Checking arg -plugin-opt=-pass-through=-lgcc_s compilers: Unknown arg -plugin-opt=-pass-through=-lgcc_s compilers: Checking arg -plugin-opt=-pass-through=-lgcc compilers: Unknown arg -plugin-opt=-pass-through=-lgcc compilers: Checking arg -plugin-opt=-pass-through=-lquadmath compilers: Unknown arg -plugin-opt=-pass-through=-lquadmath compilers: Checking arg -plugin-opt=-pass-through=-lm compilers: Unknown arg -plugin-opt=-pass-through=-lm compilers: Checking arg -plugin-opt=-pass-through=-lgcc_s compilers: Unknown arg 
-plugin-opt=-pass-through=-lgcc_s compilers: Checking arg -plugin-opt=-pass-through=-lgcc compilers: Unknown arg -plugin-opt=-pass-through=-lgcc compilers: Checking arg -plugin-opt=-pass-through=-lpthread compilers: Unknown arg -plugin-opt=-pass-through=-lpthread compilers: Checking arg -plugin-opt=-pass-through=-lc compilers: Unknown arg -plugin-opt=-pass-through=-lc compilers: Checking arg -plugin-opt=-pass-through=-lgcc_s compilers: Unknown arg -plugin-opt=-pass-through=-lgcc_s compilers: Checking arg -plugin-opt=-pass-through=-lgcc compilers: Unknown arg -plugin-opt=-pass-through=-lgcc compilers: Checking arg --build-id compilers: Unknown arg --build-id compilers: Checking arg --eh-frame-hdr compilers: Unknown arg --eh-frame-hdr compilers: Checking arg --hash-style=gnu compilers: Unknown arg --hash-style=gnu compilers: Checking arg -m compilers: Unknown arg -m compilers: Checking arg elf_x86_64 compilers: Unknown arg elf_x86_64 compilers: Checking arg -dynamic-linker compilers: Unknown arg -dynamic-linker compilers: Checking arg /lib64/ld-linux-x86-64.so.2 compilers: Unknown arg /lib64/ld-linux-x86-64.so.2 compilers: Checking arg -o compilers: Unknown arg -o compilers: Checking arg /tmp/petsc-KvGRNM/config.compilers/conftest compilers: Unknown arg /tmp/petsc-KvGRNM/config.compilers/conftest compilers: Checking arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crt1.o compilers: Unknown arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crt1.o compilers: Checking arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crti.o compilers: Unknown arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crti.o compilers: Checking arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/crtbegin.o compilers: Unknown arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/crtbegin.o compilers: Checking arg -L/usr/lib/openmpi compilers: Already in lflags so skipping: -L/usr/lib/openmpi compilers: Checking arg -L/home/florian/software/petsc/arch-linux2-c-debug/lib compilers: 
Already in lflags so skipping: -L/home/florian/software/petsc/arch-linux2-c-debug/lib compilers: Checking arg -L/home/florian/software/lib compilers: Already in lflags so skipping: -L/home/florian/software/lib compilers: Checking arg -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 compilers: Already in lflags so skipping: -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 compilers: Checking arg -L/home/florian/software/petsc/arch-linux2-c-debug/lib compilers: Already in lflags so skipping: -L/home/florian/software/petsc/arch-linux2-c-debug/lib compilers: Checking arg -L/home/florian/software/petsc compilers: Already in lflags so skipping: -L/home/florian/software/petsc compilers: Checking arg -L/usr/lib/openmpi compilers: Already in lflags so skipping: -L/usr/lib/openmpi compilers: Checking arg -L/home/florian/software/petsc/arch-linux2-c-debug/lib/../lib compilers: Already in lflags so skipping: -L/home/florian/software/petsc/arch-linux2-c-debug/lib compilers: Checking arg -L./../lib compilers: Already in lflags so skipping: -L/home/florian/software/lib compilers: Checking arg -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 compilers: Already in lflags so skipping: -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 compilers: Checking arg -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib compilers: Checking arg -L/lib/../lib compilers: Checking arg -L/usr/lib/../lib compilers: Checking arg -L/home/florian/software/petsc/arch-linux2-c-debug/lib compilers: Already in lflags so skipping: -L/home/florian/software/petsc/arch-linux2-c-debug/lib compilers: Checking arg -L. compilers: Already in lflags so skipping: -L/home/florian/software/petsc compilers: Checking arg -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../.. 
compilers: Checking arg /tmp/petsc-KvGRNM/config.compilers/conftest.o compilers: Unknown arg /tmp/petsc-KvGRNM/config.compilers/conftest.o compilers: Checking arg -rpath compilers: Found -rpath library: /usr/lib/openmpi compilers: Checking arg -rpath compilers: Found -rpath library: /home/florian/software/petsc/arch-linux2-c-debug/lib compilers: Checking arg -rpath compilers: Found -rpath library: /home/florian/software/lib compilers: Checking arg -rpath compilers: Found -rpath library: /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 compilers: Checking arg -rpath compilers: Already in rpathflags so skipping: -rpath compilers: Checking arg -rpath compilers: Found -rpath library: /home/florian/software/petsc compilers: Checking arg -ldl compilers: Already in lflags: -ldl compilers: Checking arg -rpath compilers: Already in rpathflags so skipping: -rpath compilers: Checking arg -lmpi compilers: Already in lflags: -lmpi compilers: Checking arg -lgcc_s compilers: Already in lflags: -lgcc_s compilers: Checking arg -lpthread compilers: Already in lflags: -lpthread compilers: Checking arg -ldl compilers: Already in lflags: -ldl compilers: Checking arg -rpath compilers: Already in rpathflags so skipping: -rpath compilers: Checking arg --enable-new-dtags compilers: Unknown arg --enable-new-dtags compilers: Checking arg -lmpi_usempif08 compilers: Already in lflags: -lmpi_usempif08 compilers: Checking arg -lmpi_usempi_ignore_tkr compilers: Already in lflags: -lmpi_usempi_ignore_tkr compilers: Checking arg -lmpi_mpifh compilers: Already in lflags: -lmpi_mpifh compilers: Checking arg -lmpi compilers: Already in lflags: -lmpi compilers: Checking arg -lgfortran compilers: Found library: -lgfortran compilers: Checking arg -lm compilers: Found library: -lm compilers: Checking arg -lgcc_s compilers: Already in lflags: -lgcc_s compilers: Checking arg -lgcc compilers: Found system library therefor skipping: -lgcc compilers: Checking arg -lquadmath compilers: Found library: -lquadmath 
compilers: Checking arg -lm compilers: Found library: -lm compilers: Checking arg -lgcc_s compilers: Already in lflags: -lgcc_s compilers: Checking arg -lgcc compilers: Found system library therefor skipping: -lgcc compilers: Checking arg -lpthread compilers: Already in lflags: -lpthread compilers: Checking arg -lc compilers: Found system library therefor skipping: -lc compilers: Checking arg -lgcc_s compilers: Already in lflags: -lgcc_s compilers: Checking arg -lgcc compilers: Found system library therefor skipping: -lgcc compilers: Checking arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/crtend.o compilers: Unknown arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/crtend.o compilers: Checking arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crtn.o compilers: Unknown arg /usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/../../../../lib/crtn.o compilers: Checking arg COLLECT_GCC_OPTIONS= compilers: Unknown arg COLLECT_GCC_OPTIONS= compilers: Libraries needed to link Fortran code with the C linker: ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm'] compilers: Libraries needed to link Fortran main with the C linker: [] compilers: Check that Fortran libraries can be used from C Pushing language C Popping language C Pushing language C Popping language C Pushing 
language C Popping language C Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Executing: mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib 
-Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Executing: mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib 
-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl -lpetsc-ufod4vtr9mqHvKIQiVAm Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lpetsc-ufod4vtr9mqHvKIQiVAm collect2: error: ld returned 1 exit status Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Executing: mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib 
-L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/config.setCompilers/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.setCompilers/conftest Executing: /tmp/petsc-KvGRNM/config.setCompilers/conftest Popping language C compilers: Check that Fortran libraries can be used from C++ Pushing language CXX Popping language CXX Pushing language CXX Popping language CXX Pushing language CXX Popping language CXX compilers: Fortran libraries can be used from C++ Pushing language Cxx Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Executing: mpicxx -o /tmp/petsc-KvGRNM/config.setCompilers/conftest -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi 
-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Executing: mpicxx -o /tmp/petsc-KvGRNM/config.setCompilers/conftest -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib 
-L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl -lpetsc-ufod4vtr9mqHvKIQiVAm Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lpetsc-ufod4vtr9mqHvKIQiVAm collect2: error: ld returned 1 exit status Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC /tmp/petsc-KvGRNM/config.setCompilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { ; return 0; } Executing: mpicxx -o /tmp/petsc-KvGRNM/config.setCompilers/conftest -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -Wl,-rpath,/usr/lib/openmpi 
-L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/config.setCompilers/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.setCompilers/conftest Executing: /tmp/petsc-KvGRNM/config.setCompilers/conftest Popping language Cxx ================================================================================ TEST checkFortranLinkingCxx from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:1097) TESTING: checkFortranLinkingCxx from config.compilers(config/BuildSystem/config/compilers.py:1097) Check that Fortran can be linked against C++ Pushing language Cxx Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers 
-I/tmp/petsc-KvGRNM/config.compilers -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC /tmp/petsc-KvGRNM/config.compilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" extern "C" void d1chk_(void); void foo(void){d1chk_();} Popping language Cxx Pushing language Cxx Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC /tmp/petsc-KvGRNM/config.compilers/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" extern "C" void d1chk_(void); void d1chk_(void){return;} Popping language Cxx Pushing language FC Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.compilers/conftest.F Successful compile: Source: program main call d1chk() end Pushing language FC Popping language FC Executing: mpif90 -o /tmp/petsc-KvGRNM/config.compilers/conftest -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.compilers/conftest.o /tmp/petsc-KvGRNM/config.compilers/cxxobj.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread /tmp/petsc-KvGRNM/config.compilers/confc.o 
-Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Popping language FC compilers: Fortran can link C++ functions ================================================================================ TEST checkFortran90 from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:1132) TESTING: checkFortran90 from config.compilers(config/BuildSystem/config/compilers.py:1132) Determine whether the Fortran compiler handles F90 Pushing language FC Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.compilers/conftest.F Successful compile: Source: program main INTEGER, PARAMETER :: int = SELECTED_INT_KIND(8) INTEGER (KIND=int) :: ierr ierr = 1 end Pushing language FC Popping language FC 
Executing: mpif90 -o /tmp/petsc-KvGRNM/config.compilers/conftest -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.compilers/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "USING_F90" to "1" Fortran compiler supports F90 Popping language FC ================================================================================ TEST checkFortran2003 from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:1145) TESTING: checkFortran2003 from config.compilers(config/BuildSystem/config/compilers.py:1145) Determine whether the Fortran compiler handles F2003 Pushing language FC Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.compilers/conftest.F Successful compile: Source: program main use,intrinsic :: iso_c_binding Type(C_Ptr),Dimension(:),Pointer :: CArray character(kind=c_char),pointer :: nullc => null() character(kind=c_char,len=5),dimension(:),pointer::list1 allocate(list1(5)) CArray = (/(c_loc(list1(i)),i=1,5),c_loc(nullc)/) end Pushing language FC Popping language FC Executing: mpif90 -o /tmp/petsc-KvGRNM/config.compilers/conftest -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.compilers/conftest.o -Wl,-rpath,/usr/lib/openmpi 
-L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "USING_F2003" to "1" Fortran compiler supports F2003 Popping language FC ================================================================================ TEST checkFortran90Array from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:1165) TESTING: checkFortran90Array from config.compilers(config/BuildSystem/config/compilers.py:1165) Check for F90 array interfaces Executing: uname -s stdout: Linux Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.compilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <stdio.h> #include <stdlib.h> void f90arraytest_(void* a1, void* a2,void* a3, void* i) { printf("arrays [%p %p %p]\n",a1,a2,a3); fflush(stdout); return; } void f90ptrtest_(void* a1, void* a2,void* a3, void* i, void* p1 ,void* p2, void* p3) { printf("arrays [%p %p %p]\n",a1,a2,a3); if ((p1 == p3) && (p1 != p2)) { printf("pointers match! [%p %p] [%p]\n",p1,p3,p2); fflush(stdout); } else { printf("pointers do not match!
[%p %p] [%p]\n",p1,p3,p2); fflush(stdout); exit(111); } return; } Popping language C Pushing language FC Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.compilers/conftest.F Successful compile: Source: program main Interface Subroutine f90ptrtest(p1,p2,p3,i) integer, pointer :: p1(:,:) integer, pointer :: p2(:,:) integer, pointer :: p3(:,:) integer i End Subroutine End Interface integer, pointer :: ptr1(:,:),ptr2(:,:) integer, target :: array(6:8,9:21) integer in in = 25 ptr1 => array ptr2 => array call f90arraytest(ptr1,ptr2,ptr1,in) call f90ptrtest(ptr1,ptr2,ptr1,in) end Pushing language FC Popping language FC Executing: mpif90 -o /tmp/petsc-KvGRNM/config.compilers/conftest -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.compilers/conftest.o /tmp/petsc-KvGRNM/config.compilers/fooobj.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/config.compilers/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.compilers/conftest Executing: /tmp/petsc-KvGRNM/config.compilers/conftest stdout: arrays [0x7fff20a6eb50 0x7fff20a6eb50 0x7fff20a6eb50] arrays [0x7fff20a6eb00 0x7fff20a6eab0 0x7fff20a6eb00] pointers do not match! 
[0x7fecde205740 0x7fff20a6eb50] [0x7fecdfbe9d40] ERROR while running executable: Could not execute "/tmp/petsc-KvGRNM/config.compilers/conftest": arrays [0x7fff20a6eb50 0x7fff20a6eb50 0x7fff20a6eb50] arrays [0x7fff20a6eb00 0x7fff20a6eab0 0x7fff20a6eb00] pointers do not match! [0x7fecde205740 0x7fff20a6eb50] [0x7fecdfbe9d40] Popping language FC compilers: F90 uses a single argument for array pointers ================================================================================ TEST checkFortranModuleInclude from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:1252) TESTING: checkFortranModuleInclude from config.compilers(config/BuildSystem/config/compilers.py:1252) Figures out what flag is used to specify the include path for Fortran modules Pushing language FC Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.compilers/conftest.F Successful compile: Source: module configtest integer testint parameter (testint = 42) end module configtest Pushing language FC Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.compilers/confdir -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.compilers/conftest.F Successful compile: Source: program main use configtest write(*,*) testint end Pushing language FC Popping language FC Executing: mpif90 -o /tmp/petsc-KvGRNM/config.compilers/conftest -I/tmp/petsc-KvGRNM/config.compilers/confdir -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.compilers/conftest.o /tmp/petsc-KvGRNM/config.compilers/configtest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib 
-L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl compilers: Fortran module include flag -I found Popping language FC ================================================================================ TEST checkFortranModuleOutput from config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:1318) TESTING: checkFortranModuleOutput from config.compilers(config/BuildSystem/config/compilers.py:1318) Figures out what flag is used to specify the include path for Fortran modules Pushing language FC Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -module /tmp/petsc-KvGRNM/config.compilers/confdir -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.compilers/conftest.F Possible ERROR while running compiler: exit code 256 stderr: gfortran: error: unrecognized command line option '-module'; did you mean '-mhle'? 
Source: module configtest integer testint parameter (testint = 42) end module configtest compilers: Fortran module output flag -module compile failed Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -module:/tmp/petsc-KvGRNM/config.compilers/confdir -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.compilers/conftest.F Possible ERROR while running compiler: exit code 256 stderr: gfortran: error: unrecognized command line option '-module:/tmp/petsc-KvGRNM/config.compilers/confdir' Source: module configtest integer testint parameter (testint = 42) end module configtest compilers: Fortran module output flag -module: compile failed Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fmod=/tmp/petsc-KvGRNM/config.compilers/confdir -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.compilers/conftest.F Possible ERROR while running compiler: exit code 256 stderr: gfortran: error: unrecognized command line option '-fmod=/tmp/petsc-KvGRNM/config.compilers/confdir' Source: module configtest integer testint parameter (testint = 42) end module configtest compilers: Fortran module output flag -fmod= compile failed Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.compilers/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -J/tmp/petsc-KvGRNM/config.compilers/confdir -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.compilers/conftest.F Successful compile: Source: module configtest integer testint parameter (testint = 42) end module configtest compilers: Fortran module output flag -J found Popping language FC ================================================================================ TEST setupFrameworkCompilers from 
config.compilers(/home/florian/software/petsc/config/BuildSystem/config/compilers.py:1476) TESTING: setupFrameworkCompilers from config.compilers(config/BuildSystem/config/compilers.py:1476) ================================================================================ TEST configureClosure from config.utilities.closure(/home/florian/software/petsc/config/BuildSystem/config/utilities/closure.py:18) TESTING: configureClosure from config.utilities.closure(config/BuildSystem/config/utilities/closure.py:18) Determine if Apple ^close syntax is supported in C Pushing language C All intermediate test results are stored in /tmp/petsc-KvGRNM/config.utilities.closure Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.closure/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.closure/conftest.c Possible ERROR while running compiler: exit code 256 stderr: /tmp/petsc-KvGRNM/config.utilities.closure/conftest.c: In function 'main': /tmp/petsc-KvGRNM/config.utilities.closure/conftest.c:6:6: error: expected identifier or '(' before '^' token int (^closure)(int);; ^ Source: #include "confdefs.h" #include "conffix.h" #include int main() { int (^closure)(int);; return 0; } Compile failed inside link ================================================================================ TEST configureFortranCPP from PETSc.options.fortranCPP(/home/florian/software/petsc/config/PETSc/options/fortranCPP.py:27) TESTING: configureFortranCPP from PETSc.options.fortranCPP(config/PETSc/options/fortranCPP.py:27) Handle case where Fortran cannot preprocess properly Defined make rule ".f.o .f90.o .f95.o" with dependencies "" and code ['${PETSC_MAKE_STOP_ON_ERROR}${FC} -c ${FC_FLAGS} ${FFLAGS} -o $@ $<'] Defined make rule ".f.a" with dependencies "" 
and code ['${PETSC_MAKE_STOP_ON_ERROR}${FC} -c ${FC_FLAGS} ${FFLAGS} $<', '-${AR} ${AR_FLAGS} ${LIBNAME} $*.o', '-${RM} $*.o'] Defined make rule ".F.o .F90.o .F95.o" with dependencies "" and code ['${PETSC_MAKE_STOP_ON_ERROR}${FC} -c ${FC_FLAGS} ${FFLAGS} ${FCPPFLAGS} -o $@ $<'] Defined make rule ".F.a" with dependencies "" and code ['${PETSC_MAKE_STOP_ON_ERROR}${FC} -c ${FC_FLAGS} ${FFLAGS} ${FCPPFLAGS} $<', '-${AR} ${AR_FLAGS} ${LIBNAME} $*.o', '-${RM} $*.o'] ================================================================================ TEST checkStdC from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:105) TESTING: checkStdC from config.headers(config/BuildSystem/config/headers.py:105) Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.headers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.headers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.headers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <stdlib.h> #include <stdarg.h> #include <string.h> #include <float.h> int main() { ; return 0; } Source: #include "confdefs.h" #include "conffix.h" #include <string.h> Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/string.h" 1 3 4 # 25 "/usr/include/string.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1
"/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 26 "/usr/include/string.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 33 "/usr/include/string.h" 2 3 4 extern void *memcpy (void *__restrict __dest, const void *__restrict __src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern void *memmove (void *__dest, const void *__src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern void *memccpy (void *__restrict __dest, const void *__restrict __src, int __c, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern void *memset (void *__s, int __c, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int memcmp (const void *__s1, const void *__s2, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); # 92 "/usr/include/string.h" 3 4 extern void *memchr (const void *__s, int __c, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); # 123 "/usr/include/string.h" 3 4 extern char *strcpy (char *__restrict __dest, const char *__restrict __src) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char *strncpy (char *__restrict __dest, const char *__restrict __src, size_t __n) __attribute__ 
((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char *strcat (char *__restrict __dest, const char *__restrict __src) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char *strncat (char *__restrict __dest, const char *__restrict __src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int strcmp (const char *__s1, const char *__s2) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); extern int strncmp (const char *__s1, const char *__s2, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); extern int strcoll (const char *__s1, const char *__s2) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); extern size_t strxfrm (char *__restrict __dest, const char *__restrict __src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); # 1 "/usr/include/xlocale.h" 1 3 4 # 27 "/usr/include/xlocale.h" 3 4 typedef struct __locale_struct { struct __locale_data *__locales[13]; const unsigned short int *__ctype_b; const int *__ctype_tolower; const int *__ctype_toupper; const char *__names[13]; } *__locale_t; typedef __locale_t locale_t; # 160 "/usr/include/string.h" 2 3 4 extern int strcoll_l (const char *__s1, const char *__s2, __locale_t __l) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2, 3))); extern size_t strxfrm_l (char *__dest, const char *__src, size_t __n, __locale_t __l) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 4))); extern char *strdup (const char *__s) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) __attribute__ ((__nonnull__ (1))); extern char *strndup (const char *__string, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ 
((__malloc__)) __attribute__ ((__nonnull__ (1))); # 206 "/usr/include/string.h" 3 4 # 231 "/usr/include/string.h" 3 4 extern char *strchr (const char *__s, int __c) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); # 258 "/usr/include/string.h" 3 4 extern char *strrchr (const char *__s, int __c) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); # 277 "/usr/include/string.h" 3 4 extern size_t strcspn (const char *__s, const char *__reject) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); extern size_t strspn (const char *__s, const char *__accept) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); # 310 "/usr/include/string.h" 3 4 extern char *strpbrk (const char *__s, const char *__accept) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); # 337 "/usr/include/string.h" 3 4 extern char *strstr (const char *__haystack, const char *__needle) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); extern char *strtok (char *__restrict __s, const char *__restrict __delim) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern char *__strtok_r (char *__restrict __s, const char *__restrict __delim, char **__restrict __save_ptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 3))); extern char *strtok_r (char *__restrict __s, const char *__restrict __delim, char **__restrict __save_ptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 3))); # 392 "/usr/include/string.h" 3 4 extern size_t strlen (const char *__s) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); extern size_t strnlen (const char *__string, size_t __maxlen) 
__attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); extern char *strerror (int __errnum) __attribute__ ((__nothrow__ , __leaf__)); # 422 "/usr/include/string.h" 3 4 extern int strerror_r (int __errnum, char *__buf, size_t __buflen) __asm__ ("" "__xpg_strerror_r") __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); # 440 "/usr/include/string.h" 3 4 extern char *strerror_l (int __errnum, __locale_t __l) __attribute__ ((__nothrow__ , __leaf__)); extern void __bzero (void *__s, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern void bcopy (const void *__src, void *__dest, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern void bzero (void *__s, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int bcmp (const void *__s1, const void *__s2, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); # 484 "/usr/include/string.h" 3 4 extern char *index (const char *__s, int __c) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); # 512 "/usr/include/string.h" 3 4 extern char *rindex (const char *__s, int __c) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); extern int ffs (int __i) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 529 "/usr/include/string.h" 3 4 extern int strcasecmp (const char *__s1, const char *__s2) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); extern int strncasecmp (const char *__s1, const char *__s2, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); # 552 "/usr/include/string.h" 3 4 extern char *strsep (char **__restrict __stringp, 
const char *__restrict __delim) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char *strsignal (int __sig) __attribute__ ((__nothrow__ , __leaf__)); extern char *__stpcpy (char *__restrict __dest, const char *__restrict __src) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char *stpcpy (char *__restrict __dest, const char *__restrict __src) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char *__stpncpy (char *__restrict __dest, const char *__restrict __src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char *stpncpy (char *__restrict __dest, const char *__restrict __src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); # 656 "/usr/include/string.h" 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Source: #include "confdefs.h" #include "conffix.h" #include <stdlib.h> Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/stdlib.h" 1 3 4 # 24 "/usr/include/stdlib.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11
"/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 25 "/usr/include/stdlib.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 328 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef int wchar_t; # 33 "/usr/include/stdlib.h" 2 3 4 # 1 "/usr/include/bits/waitflags.h" 1 3 4 # 42 "/usr/include/stdlib.h" 2 3 4 # 1 "/usr/include/bits/waitstatus.h" 1 3 4 # 43 "/usr/include/stdlib.h" 2 3 4 # 56 "/usr/include/stdlib.h" 3 4 typedef struct { int quot; int rem; } div_t; typedef struct { long int quot; long int rem; } ldiv_t; __extension__ typedef struct { long long int quot; long long int rem; } lldiv_t; # 100 "/usr/include/stdlib.h" 3 4 extern size_t __ctype_get_mb_cur_max (void) __attribute__ ((__nothrow__ , __leaf__)) ; extern double atof (const char *__nptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; extern int atoi (const char *__nptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; extern long int atol (const char *__nptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; __extension__ extern long long int atoll (const char *__nptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; extern double strtod (const char *__restrict __nptr, char **__restrict __endptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern float strtof (const char *__restrict __nptr, char **__restrict __endptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern long double strtold (const char *__restrict __nptr, char **__restrict __endptr) __attribute__ ((__nothrow__ , __leaf__)) 
__attribute__ ((__nonnull__ (1))); extern long int strtol (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern unsigned long int strtoul (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); __extension__ extern long long int strtoq (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); __extension__ extern unsigned long long int strtouq (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); __extension__ extern long long int strtoll (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); __extension__ extern unsigned long long int strtoull (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 266 "/usr/include/stdlib.h" 3 4 extern char *l64a (long int __n) __attribute__ ((__nothrow__ , __leaf__)) ; extern long int a64l (const char *__s) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; # 1 "/usr/include/sys/types.h" 1 3 4 # 27 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef 
signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 30 "/usr/include/sys/types.h" 2 3 4 typedef __u_char u_char; typedef __u_short u_short; typedef __u_int u_int; typedef __u_long u_long; typedef __quad_t quad_t; typedef __u_quad_t u_quad_t; typedef __fsid_t fsid_t; typedef __loff_t loff_t; typedef __ino_t ino_t; # 60 "/usr/include/sys/types.h" 3 4 typedef __dev_t dev_t; typedef __gid_t gid_t; typedef __mode_t mode_t; typedef __nlink_t nlink_t; typedef __uid_t uid_t; typedef __off_t off_t; # 98 "/usr/include/sys/types.h" 3 4 typedef __pid_t pid_t; typedef __id_t id_t; typedef 
__ssize_t ssize_t; typedef __daddr_t daddr_t; typedef __caddr_t caddr_t; typedef __key_t key_t; # 132 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/time.h" 1 3 4 # 57 "/usr/include/time.h" 3 4 typedef __clock_t clock_t; # 73 "/usr/include/time.h" 3 4 typedef __time_t time_t; # 91 "/usr/include/time.h" 3 4 typedef __clockid_t clockid_t; # 103 "/usr/include/time.h" 3 4 typedef __timer_t timer_t; # 133 "/usr/include/sys/types.h" 2 3 4 # 146 "/usr/include/sys/types.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 147 "/usr/include/sys/types.h" 2 3 4 typedef unsigned long int ulong; typedef unsigned short int ushort; typedef unsigned int uint; # 194 "/usr/include/sys/types.h" 3 4 typedef int int8_t __attribute__ ((__mode__ (__QI__))); typedef int int16_t __attribute__ ((__mode__ (__HI__))); typedef int int32_t __attribute__ ((__mode__ (__SI__))); typedef int int64_t __attribute__ ((__mode__ (__DI__))); typedef unsigned int u_int8_t __attribute__ ((__mode__ (__QI__))); typedef unsigned int u_int16_t __attribute__ ((__mode__ (__HI__))); typedef unsigned int u_int32_t __attribute__ ((__mode__ (__SI__))); typedef unsigned int u_int64_t __attribute__ ((__mode__ (__DI__))); typedef int register_t __attribute__ ((__mode__ (__word__))); # 216 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/endian.h" 1 3 4 # 36 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/endian.h" 1 3 4 # 37 "/usr/include/endian.h" 2 3 4 # 60 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/byteswap.h" 1 3 4 # 28 "/usr/include/bits/byteswap.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 29 "/usr/include/bits/byteswap.h" 2 3 4 # 1 "/usr/include/bits/byteswap-16.h" 1 3 4 # 36 "/usr/include/bits/byteswap.h" 2 3 4 # 44 "/usr/include/bits/byteswap.h" 3 4 static __inline unsigned int __bswap_32 (unsigned int __bsx) { return __builtin_bswap32 (__bsx); } # 108 "/usr/include/bits/byteswap.h" 3 4 static __inline __uint64_t __bswap_64 (__uint64_t __bsx) { return __builtin_bswap64 
(__bsx); } # 61 "/usr/include/endian.h" 2 3 4 # 217 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/select.h" 1 3 4 # 30 "/usr/include/sys/select.h" 3 4 # 1 "/usr/include/bits/select.h" 1 3 4 # 22 "/usr/include/bits/select.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 23 "/usr/include/bits/select.h" 2 3 4 # 31 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/sigset.h" 1 3 4 # 22 "/usr/include/bits/sigset.h" 3 4 typedef int __sig_atomic_t; typedef struct { unsigned long int __val[(1024 / (8 * sizeof (unsigned long int)))]; } __sigset_t; # 34 "/usr/include/sys/select.h" 2 3 4 typedef __sigset_t sigset_t; # 1 "/usr/include/time.h" 1 3 4 # 120 "/usr/include/time.h" 3 4 struct timespec { __time_t tv_sec; __syscall_slong_t tv_nsec; }; # 46 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 30 "/usr/include/bits/time.h" 3 4 struct timeval { __time_t tv_sec; __suseconds_t tv_usec; }; # 48 "/usr/include/sys/select.h" 2 3 4 typedef __suseconds_t suseconds_t; typedef long int __fd_mask; # 66 "/usr/include/sys/select.h" 3 4 typedef struct { __fd_mask __fds_bits[1024 / (8 * (int) sizeof (__fd_mask))]; } fd_set; typedef __fd_mask fd_mask; # 98 "/usr/include/sys/select.h" 3 4 # 108 "/usr/include/sys/select.h" 3 4 extern int select (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, struct timeval *__restrict __timeout); # 120 "/usr/include/sys/select.h" 3 4 extern int pselect (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, const struct timespec *__restrict __timeout, const __sigset_t *__restrict __sigmask); # 133 "/usr/include/sys/select.h" 3 4 # 220 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/sysmacros.h" 1 3 4 # 24 "/usr/include/sys/sysmacros.h" 3 4 __extension__ extern unsigned int gnu_dev_major (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern 
unsigned int gnu_dev_minor (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned long long int gnu_dev_makedev (unsigned int __major, unsigned int __minor) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 58 "/usr/include/sys/sysmacros.h" 3 4 # 223 "/usr/include/sys/types.h" 2 3 4 typedef __blksize_t blksize_t; typedef __blkcnt_t blkcnt_t; typedef __fsblkcnt_t fsblkcnt_t; typedef __fsfilcnt_t fsfilcnt_t; # 270 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/bits/pthreadtypes.h" 1 3 4 # 21 "/usr/include/bits/pthreadtypes.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 22 "/usr/include/bits/pthreadtypes.h" 2 3 4 # 60 "/usr/include/bits/pthreadtypes.h" 3 4 typedef unsigned long int pthread_t; union pthread_attr_t { char __size[56]; long int __align; }; typedef union pthread_attr_t pthread_attr_t; typedef struct __pthread_internal_list { struct __pthread_internal_list *__prev; struct __pthread_internal_list *__next; } __pthread_list_t; # 90 "/usr/include/bits/pthreadtypes.h" 3 4 typedef union { struct __pthread_mutex_s { int __lock; unsigned int __count; int __owner; unsigned int __nusers; int __kind; short __spins; short __elision; __pthread_list_t __list; # 125 "/usr/include/bits/pthreadtypes.h" 3 4 } __data; char __size[40]; long int __align; } pthread_mutex_t; typedef union { char __size[4]; int __align; } pthread_mutexattr_t; typedef union { struct { int __lock; unsigned int __futex; __extension__ unsigned long long int __total_seq; __extension__ unsigned long long int __wakeup_seq; __extension__ unsigned long long int __woken_seq; void *__mutex; unsigned int __nwaiters; unsigned int __broadcast_seq; } __data; char __size[48]; __extension__ long long int __align; } pthread_cond_t; typedef union { char __size[4]; int __align; } pthread_condattr_t; typedef unsigned int pthread_key_t; typedef int pthread_once_t; typedef union { struct { int __lock; unsigned int 
__nr_readers; unsigned int __readers_wakeup; unsigned int __writer_wakeup; unsigned int __nr_readers_queued; unsigned int __nr_writers_queued; int __writer; int __shared; signed char __rwelision; unsigned char __pad1[7]; unsigned long int __pad2; unsigned int __flags; } __data; # 220 "/usr/include/bits/pthreadtypes.h" 3 4 char __size[56]; long int __align; } pthread_rwlock_t; typedef union { char __size[8]; long int __align; } pthread_rwlockattr_t; typedef volatile int pthread_spinlock_t; typedef union { char __size[32]; long int __align; } pthread_barrier_t; typedef union { char __size[4]; int __align; } pthread_barrierattr_t; # 271 "/usr/include/sys/types.h" 2 3 4 # 276 "/usr/include/stdlib.h" 2 3 4 extern long int random (void) __attribute__ ((__nothrow__ , __leaf__)); extern void srandom (unsigned int __seed) __attribute__ ((__nothrow__ , __leaf__)); extern char *initstate (unsigned int __seed, char *__statebuf, size_t __statelen) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern char *setstate (char *__statebuf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); struct random_data { int32_t *fptr; int32_t *rptr; int32_t *state; int rand_type; int rand_deg; int rand_sep; int32_t *end_ptr; }; extern int random_r (struct random_data *__restrict __buf, int32_t *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int srandom_r (unsigned int __seed, struct random_data *__buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern int initstate_r (unsigned int __seed, char *__restrict __statebuf, size_t __statelen, struct random_data *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 4))); extern int setstate_r (char *__restrict __statebuf, struct random_data *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int rand (void) 
__attribute__ ((__nothrow__ , __leaf__)); extern void srand (unsigned int __seed) __attribute__ ((__nothrow__ , __leaf__)); extern int rand_r (unsigned int *__seed) __attribute__ ((__nothrow__ , __leaf__)); extern double drand48 (void) __attribute__ ((__nothrow__ , __leaf__)); extern double erand48 (unsigned short int __xsubi[3]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern long int lrand48 (void) __attribute__ ((__nothrow__ , __leaf__)); extern long int nrand48 (unsigned short int __xsubi[3]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern long int mrand48 (void) __attribute__ ((__nothrow__ , __leaf__)); extern long int jrand48 (unsigned short int __xsubi[3]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern void srand48 (long int __seedval) __attribute__ ((__nothrow__ , __leaf__)); extern unsigned short int *seed48 (unsigned short int __seed16v[3]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern void lcong48 (unsigned short int __param[7]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); struct drand48_data { unsigned short int __x[3]; unsigned short int __old_x[3]; unsigned short int __c; unsigned short int __init; __extension__ unsigned long long int __a; }; extern int drand48_r (struct drand48_data *__restrict __buffer, double *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int erand48_r (unsigned short int __xsubi[3], struct drand48_data *__restrict __buffer, double *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int lrand48_r (struct drand48_data *__restrict __buffer, long int *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int nrand48_r (unsigned short int __xsubi[3], struct drand48_data *__restrict __buffer, long 
int *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int mrand48_r (struct drand48_data *__restrict __buffer, long int *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int jrand48_r (unsigned short int __xsubi[3], struct drand48_data *__restrict __buffer, long int *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int srand48_r (long int __seedval, struct drand48_data *__buffer) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern int seed48_r (unsigned short int __seed16v[3], struct drand48_data *__buffer) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int lcong48_r (unsigned short int __param[7], struct drand48_data *__buffer) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern void *malloc (size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) ; extern void *calloc (size_t __nmemb, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) ; extern void *realloc (void *__ptr, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__warn_unused_result__)); extern void free (void *__ptr) __attribute__ ((__nothrow__ , __leaf__)); extern void cfree (void *__ptr) __attribute__ ((__nothrow__ , __leaf__)); # 1 "/usr/include/alloca.h" 1 3 4 # 24 "/usr/include/alloca.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 25 "/usr/include/alloca.h" 2 3 4 extern void *alloca (size_t __size) __attribute__ ((__nothrow__ , __leaf__)); # 454 "/usr/include/stdlib.h" 2 3 4 extern void *valloc (size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) ; extern int posix_memalign (void **__memptr, size_t __alignment, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ 
((__nonnull__ (1))) ; extern void *aligned_alloc (size_t __alignment, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) __attribute__ ((__alloc_size__ (2))) ; extern void abort (void) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__noreturn__)); extern int atexit (void (*__func) (void)) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int at_quick_exit (void (*__func) (void)) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int on_exit (void (*__func) (int __status, void *__arg), void *__arg) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern void exit (int __status) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__noreturn__)); extern void quick_exit (int __status) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__noreturn__)); extern void _Exit (int __status) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__noreturn__)); extern char *getenv (const char *__name) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; # 539 "/usr/include/stdlib.h" 3 4 extern int putenv (char *__string) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int setenv (const char *__name, const char *__value, int __replace) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern int unsetenv (const char *__name) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int clearenv (void) __attribute__ ((__nothrow__ , __leaf__)); # 567 "/usr/include/stdlib.h" 3 4 extern char *mktemp (char *__template) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 580 "/usr/include/stdlib.h" 3 4 extern int mkstemp (char *__template) __attribute__ ((__nonnull__ (1))) ; # 602 "/usr/include/stdlib.h" 3 4 extern int mkstemps (char *__template, int __suffixlen) __attribute__ ((__nonnull__ (1))) ; # 623 
"/usr/include/stdlib.h" 3 4 extern char *mkdtemp (char *__template) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; # 672 "/usr/include/stdlib.h" 3 4 extern int system (const char *__command) ; # 694 "/usr/include/stdlib.h" 3 4 extern char *realpath (const char *__restrict __name, char *__restrict __resolved) __attribute__ ((__nothrow__ , __leaf__)) ; typedef int (*__compar_fn_t) (const void *, const void *); # 712 "/usr/include/stdlib.h" 3 4 extern void *bsearch (const void *__key, const void *__base, size_t __nmemb, size_t __size, __compar_fn_t __compar) __attribute__ ((__nonnull__ (1, 2, 5))) ; extern void qsort (void *__base, size_t __nmemb, size_t __size, __compar_fn_t __compar) __attribute__ ((__nonnull__ (1, 4))); # 735 "/usr/include/stdlib.h" 3 4 extern int abs (int __x) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; extern long int labs (long int __x) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; __extension__ extern long long int llabs (long long int __x) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; extern div_t div (int __numer, int __denom) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; extern ldiv_t ldiv (long int __numer, long int __denom) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; __extension__ extern lldiv_t lldiv (long long int __numer, long long int __denom) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; # 772 "/usr/include/stdlib.h" 3 4 extern char *ecvt (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4))) ; extern char *fcvt (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4))) ; extern char *gcvt (double __value, int __ndigit, char *__buf) __attribute__ 
((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3))) ; extern char *qecvt (long double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4))) ; extern char *qfcvt (long double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4))) ; extern char *qgcvt (long double __value, int __ndigit, char *__buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3))) ; extern int ecvt_r (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4, 5))); extern int fcvt_r (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4, 5))); extern int qecvt_r (long double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4, 5))); extern int qfcvt_r (long double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4, 5))); extern int mblen (const char *__s, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); extern int mbtowc (wchar_t *__restrict __pwc, const char *__restrict __s, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); extern int wctomb (char *__s, wchar_t __wchar) __attribute__ ((__nothrow__ , __leaf__)); extern size_t mbstowcs (wchar_t *__restrict __pwcs, const char *__restrict __s, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); extern size_t wcstombs (char *__restrict __s, const wchar_t *__restrict __pwcs, size_t __n) 
__attribute__ ((__nothrow__ , __leaf__)); extern int rpmatch (const char *__response) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; # 859 "/usr/include/stdlib.h" 3 4 extern int getsubopt (char **__restrict __optionp, char *const *__restrict __tokens, char **__restrict __valuep) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2, 3))) ; # 911 "/usr/include/stdlib.h" 3 4 extern int getloadavg (double __loadavg[], int __nelem) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 921 "/usr/include/stdlib.h" 3 4 # 1 "/usr/include/bits/stdlib-float.h" 1 3 4 # 922 "/usr/include/stdlib.h" 2 3 4 # 934 "/usr/include/stdlib.h" 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.headers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.headers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.headers/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
#include
#include
#define ISLOWER(c) ('a' <= (c) && (c) <= 'z')
#define TOUPPER(c) (ISLOWER(c) ? 'A' + ((c) - 'a') : (c))
#define XOR(e, f) (((e) && !(f)) || (!(e) && (f)))
int main() {
int i; for(i = 0; i < 256; i++) if (XOR(islower(i), ISLOWER(i)) || toupper(i) != TOUPPER(i)) exit(2); exit(0);
;
  return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.headers/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.headers/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Testing executable /tmp/petsc-KvGRNM/config.headers/conftest to see if it can be run
Executing: /tmp/petsc-KvGRNM/config.headers/conftest
Executing: /tmp/petsc-KvGRNM/config.headers/conftest
Defined "STDC_HEADERS" to "1"
================================================================================
TEST checkStat from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:138)
TESTING: checkStat from config.headers(config/BuildSystem/config/headers.py:138)
  Checks whether stat file-mode macros are broken, and defines STAT_MACROS_BROKEN if they are
Source:
#include "confdefs.h"
#include "conffix.h"
#include <sys/types.h>
#include <sys/stat.h>
#if defined(S_ISBLK) && defined(S_IFDIR)
# if S_ISBLK (S_IFDIR)
You lose.
# endif
#endif
#if defined(S_ISBLK) && defined(S_IFCHR)
# if S_ISBLK (S_IFCHR)
You lose.
# endif
#endif
#if defined(S_ISLNK) && defined(S_IFREG)
# if S_ISLNK (S_IFREG)
You lose.
# endif #endif #if defined(S_ISSOCK) && defined(S_IFREG) # if S_ISSOCK (S_IFREG) You lose. # endif #endif Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/sys/types.h" 1 3 4 # 25 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 26 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 # 30 "/usr/include/bits/types.h" 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 
"/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 30 "/usr/include/sys/types.h" 2 3 4 typedef __u_char u_char; typedef __u_short u_short; typedef __u_int u_int; typedef __u_long u_long; typedef __quad_t quad_t; typedef __u_quad_t u_quad_t; typedef __fsid_t fsid_t; typedef __loff_t loff_t; typedef __ino_t ino_t; # 60 "/usr/include/sys/types.h" 3 4 typedef __dev_t dev_t; typedef __gid_t gid_t; typedef __mode_t mode_t; typedef __nlink_t nlink_t; typedef __uid_t uid_t; typedef __off_t off_t; # 98 "/usr/include/sys/types.h" 3 4 typedef __pid_t pid_t; typedef __id_t id_t; typedef __ssize_t ssize_t; typedef __daddr_t daddr_t; typedef __caddr_t caddr_t; typedef __key_t key_t; # 132 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/time.h" 1 3 4 # 57 "/usr/include/time.h" 3 4 typedef __clock_t clock_t; 
# 73 "/usr/include/time.h" 3 4 typedef __time_t time_t; # 91 "/usr/include/time.h" 3 4 typedef __clockid_t clockid_t; # 103 "/usr/include/time.h" 3 4 typedef __timer_t timer_t; # 133 "/usr/include/sys/types.h" 2 3 4 # 146 "/usr/include/sys/types.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 147 "/usr/include/sys/types.h" 2 3 4 typedef unsigned long int ulong; typedef unsigned short int ushort; typedef unsigned int uint; # 194 "/usr/include/sys/types.h" 3 4 typedef int int8_t __attribute__ ((__mode__ (__QI__))); typedef int int16_t __attribute__ ((__mode__ (__HI__))); typedef int int32_t __attribute__ ((__mode__ (__SI__))); typedef int int64_t __attribute__ ((__mode__ (__DI__))); typedef unsigned int u_int8_t __attribute__ ((__mode__ (__QI__))); typedef unsigned int u_int16_t __attribute__ ((__mode__ (__HI__))); typedef unsigned int u_int32_t __attribute__ ((__mode__ (__SI__))); typedef unsigned int u_int64_t __attribute__ ((__mode__ (__DI__))); typedef int register_t __attribute__ ((__mode__ (__word__))); # 216 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/endian.h" 1 3 4 # 36 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/endian.h" 1 3 4 # 37 "/usr/include/endian.h" 2 3 4 # 60 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/byteswap.h" 1 3 4 # 28 "/usr/include/bits/byteswap.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 29 "/usr/include/bits/byteswap.h" 2 3 4 # 1 "/usr/include/bits/byteswap-16.h" 1 3 4 # 36 "/usr/include/bits/byteswap.h" 2 3 4 # 44 "/usr/include/bits/byteswap.h" 3 4 static __inline unsigned int __bswap_32 (unsigned int __bsx) { return __builtin_bswap32 (__bsx); } # 108 "/usr/include/bits/byteswap.h" 3 4 static __inline __uint64_t __bswap_64 (__uint64_t __bsx) { return __builtin_bswap64 (__bsx); } # 61 "/usr/include/endian.h" 2 3 4 # 217 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/select.h" 1 3 
4 # 30 "/usr/include/sys/select.h" 3 4 # 1 "/usr/include/bits/select.h" 1 3 4 # 22 "/usr/include/bits/select.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 23 "/usr/include/bits/select.h" 2 3 4 # 31 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/sigset.h" 1 3 4 # 22 "/usr/include/bits/sigset.h" 3 4 typedef int __sig_atomic_t; typedef struct { unsigned long int __val[(1024 / (8 * sizeof (unsigned long int)))]; } __sigset_t; # 34 "/usr/include/sys/select.h" 2 3 4 typedef __sigset_t sigset_t; # 1 "/usr/include/time.h" 1 3 4 # 120 "/usr/include/time.h" 3 4 struct timespec { __time_t tv_sec; __syscall_slong_t tv_nsec; }; # 46 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 30 "/usr/include/bits/time.h" 3 4 struct timeval { __time_t tv_sec; __suseconds_t tv_usec; }; # 48 "/usr/include/sys/select.h" 2 3 4 typedef __suseconds_t suseconds_t; typedef long int __fd_mask; # 66 "/usr/include/sys/select.h" 3 4 typedef struct { __fd_mask __fds_bits[1024 / (8 * (int) sizeof (__fd_mask))]; } fd_set; typedef __fd_mask fd_mask; # 98 "/usr/include/sys/select.h" 3 4 # 108 "/usr/include/sys/select.h" 3 4 extern int select (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, struct timeval *__restrict __timeout); # 120 "/usr/include/sys/select.h" 3 4 extern int pselect (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, const struct timespec *__restrict __timeout, const __sigset_t *__restrict __sigmask); # 133 "/usr/include/sys/select.h" 3 4 # 220 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/sysmacros.h" 1 3 4 # 24 "/usr/include/sys/sysmacros.h" 3 4 __extension__ extern unsigned int gnu_dev_major (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned int gnu_dev_minor (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ 
((__const__)); __extension__ extern unsigned long long int gnu_dev_makedev (unsigned int __major, unsigned int __minor) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 58 "/usr/include/sys/sysmacros.h" 3 4 # 223 "/usr/include/sys/types.h" 2 3 4 typedef __blksize_t blksize_t; typedef __blkcnt_t blkcnt_t; typedef __fsblkcnt_t fsblkcnt_t; typedef __fsfilcnt_t fsfilcnt_t; # 270 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/bits/pthreadtypes.h" 1 3 4 # 21 "/usr/include/bits/pthreadtypes.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 22 "/usr/include/bits/pthreadtypes.h" 2 3 4 # 60 "/usr/include/bits/pthreadtypes.h" 3 4 typedef unsigned long int pthread_t; union pthread_attr_t { char __size[56]; long int __align; }; typedef union pthread_attr_t pthread_attr_t; typedef struct __pthread_internal_list { struct __pthread_internal_list *__prev; struct __pthread_internal_list *__next; } __pthread_list_t; # 90 "/usr/include/bits/pthreadtypes.h" 3 4 typedef union { struct __pthread_mutex_s { int __lock; unsigned int __count; int __owner; unsigned int __nusers; int __kind; short __spins; short __elision; __pthread_list_t __list; # 125 "/usr/include/bits/pthreadtypes.h" 3 4 } __data; char __size[40]; long int __align; } pthread_mutex_t; typedef union { char __size[4]; int __align; } pthread_mutexattr_t; typedef union { struct { int __lock; unsigned int __futex; __extension__ unsigned long long int __total_seq; __extension__ unsigned long long int __wakeup_seq; __extension__ unsigned long long int __woken_seq; void *__mutex; unsigned int __nwaiters; unsigned int __broadcast_seq; } __data; char __size[48]; __extension__ long long int __align; } pthread_cond_t; typedef union { char __size[4]; int __align; } pthread_condattr_t; typedef unsigned int pthread_key_t; typedef int pthread_once_t; typedef union { struct { int __lock; unsigned int __nr_readers; unsigned int __readers_wakeup; unsigned int __writer_wakeup; unsigned int __nr_readers_queued; unsigned 
int __nr_writers_queued; int __writer; int __shared; signed char __rwelision; unsigned char __pad1[7]; unsigned long int __pad2; unsigned int __flags; } __data; # 220 "/usr/include/bits/pthreadtypes.h" 3 4 char __size[56]; long int __align; } pthread_rwlock_t; typedef union { char __size[8]; long int __align; } pthread_rwlockattr_t; typedef volatile int pthread_spinlock_t; typedef union { char __size[32]; long int __align; } pthread_barrier_t; typedef union { char __size[4]; int __align; } pthread_barrierattr_t; # 271 "/usr/include/sys/types.h" 2 3 4 # 5 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/sys/stat.h" 1 3 4 # 36 "/usr/include/sys/stat.h" 3 4 # 1 "/usr/include/time.h" 1 3 4 # 37 "/usr/include/sys/stat.h" 2 3 4 # 102 "/usr/include/sys/stat.h" 3 4 # 1 "/usr/include/bits/stat.h" 1 3 4 # 46 "/usr/include/bits/stat.h" 3 4 struct stat { __dev_t st_dev; __ino_t st_ino; __nlink_t st_nlink; __mode_t st_mode; __uid_t st_uid; __gid_t st_gid; int __pad0; __dev_t st_rdev; __off_t st_size; __blksize_t st_blksize; __blkcnt_t st_blocks; # 91 "/usr/include/bits/stat.h" 3 4 struct timespec st_atim; struct timespec st_mtim; struct timespec st_ctim; # 106 "/usr/include/bits/stat.h" 3 4 __syscall_slong_t __glibc_reserved[3]; # 115 "/usr/include/bits/stat.h" 3 4 }; # 105 "/usr/include/sys/stat.h" 2 3 4 # 208 "/usr/include/sys/stat.h" 3 4 extern int stat (const char *__restrict __file, struct stat *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int fstat (int __fd, struct stat *__buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); # 237 "/usr/include/sys/stat.h" 3 4 extern int fstatat (int __fd, const char *__restrict __file, struct stat *__restrict __buf, int __flag) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 3))); # 262 "/usr/include/sys/stat.h" 3 4 extern int lstat (const char *__restrict __file, struct stat *__restrict __buf) __attribute__ 
((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); # 283 "/usr/include/sys/stat.h" 3 4 extern int chmod (const char *__file, __mode_t __mode) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int lchmod (const char *__file, __mode_t __mode) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int fchmod (int __fd, __mode_t __mode) __attribute__ ((__nothrow__ , __leaf__)); extern int fchmodat (int __fd, const char *__file, __mode_t __mode, int __flag) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))) ; extern __mode_t umask (__mode_t __mask) __attribute__ ((__nothrow__ , __leaf__)); # 320 "/usr/include/sys/stat.h" 3 4 extern int mkdir (const char *__path, __mode_t __mode) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int mkdirat (int __fd, const char *__path, __mode_t __mode) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern int mknod (const char *__path, __mode_t __mode, __dev_t __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int mknodat (int __fd, const char *__path, __mode_t __mode, __dev_t __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern int mkfifo (const char *__path, __mode_t __mode) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int mkfifoat (int __fd, const char *__path, __mode_t __mode) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern int utimensat (int __fd, const char *__path, const struct timespec __times[2], int __flags) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern int futimens (int __fd, const struct timespec __times[2]) __attribute__ ((__nothrow__ , __leaf__)); # 398 "/usr/include/sys/stat.h" 3 4 extern int __fxstat (int __ver, int __fildes, struct stat *__stat_buf) __attribute__ ((__nothrow__ , 
__leaf__)) __attribute__ ((__nonnull__ (3))); extern int __xstat (int __ver, const char *__filename, struct stat *__stat_buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 3))); extern int __lxstat (int __ver, const char *__filename, struct stat *__stat_buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 3))); extern int __fxstatat (int __ver, int __fildes, const char *__filename, struct stat *__stat_buf, int __flag) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4))); # 441 "/usr/include/sys/stat.h" 3 4 extern int __xmknod (int __ver, const char *__path, __mode_t __mode, __dev_t *__dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 4))); extern int __xmknodat (int __ver, int __fd, const char *__path, __mode_t __mode, __dev_t *__dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 5))); # 533 "/usr/include/sys/stat.h" 3 4 # 6 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 ================================================================================ TEST checkSysWait from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:173) TESTING: checkSysWait from config.headers(config/BuildSystem/config/headers.py:173) Check for POSIX.1 compatible sys/wait.h, and defines HAVE_SYS_WAIT_H if found Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.headers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.headers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.headers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <sys/types.h> #include <sys/wait.h> #ifndef WEXITSTATUS #define WEXITSTATUS(stat_val) ((unsigned)(stat_val) >> 8) #endif #ifndef WIFEXITED #define WIFEXITED(stat_val)
(((stat_val) & 255) == 0) #endif int main() { int s; wait (&s); s = WIFEXITED (s) ? WEXITSTATUS (s) : 1; ; return 0; } Defined "HAVE_SYS_WAIT_H" to "1" ================================================================================ TEST checkTime from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:195) TESTING: checkTime from config.headers(config/BuildSystem/config/headers.py:195) Checks if you can safely include both <sys/time.h> and <time.h>, and if so defines TIME_WITH_SYS_TIME Checking for header: time.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/time.h" 1 3 4 # 27 "/usr/include/time.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 28 "/usr/include/time.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 38 "/usr/include/time.h" 2 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 42 "/usr/include/time.h" 2 3 4 # 55
"/usr/include/time.h" 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 56 
"/usr/include/time.h" 2 3 4 typedef __clock_t clock_t; # 73 "/usr/include/time.h" 3 4 typedef __time_t time_t; # 91 "/usr/include/time.h" 3 4 typedef __clockid_t clockid_t; # 103 "/usr/include/time.h" 3 4 typedef __timer_t timer_t; # 120 "/usr/include/time.h" 3 4 struct timespec { __time_t tv_sec; __syscall_slong_t tv_nsec; }; struct tm { int tm_sec; int tm_min; int tm_hour; int tm_mday; int tm_mon; int tm_year; int tm_wday; int tm_yday; int tm_isdst; long int tm_gmtoff; const char *tm_zone; }; struct itimerspec { struct timespec it_interval; struct timespec it_value; }; struct sigevent; typedef __pid_t pid_t; # 186 "/usr/include/time.h" 3 4 extern clock_t clock (void) __attribute__ ((__nothrow__ , __leaf__)); extern time_t time (time_t *__timer) __attribute__ ((__nothrow__ , __leaf__)); extern double difftime (time_t __time1, time_t __time0) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern time_t mktime (struct tm *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern size_t strftime (char *__restrict __s, size_t __maxsize, const char *__restrict __format, const struct tm *__restrict __tp) __attribute__ ((__nothrow__ , __leaf__)); # 221 "/usr/include/time.h" 3 4 # 1 "/usr/include/xlocale.h" 1 3 4 # 27 "/usr/include/xlocale.h" 3 4 typedef struct __locale_struct { struct __locale_data *__locales[13]; const unsigned short int *__ctype_b; const int *__ctype_tolower; const int *__ctype_toupper; const char *__names[13]; } *__locale_t; typedef __locale_t locale_t; # 222 "/usr/include/time.h" 2 3 4 extern size_t strftime_l (char *__restrict __s, size_t __maxsize, const char *__restrict __format, const struct tm *__restrict __tp, __locale_t __loc) __attribute__ ((__nothrow__ , __leaf__)); # 236 "/usr/include/time.h" 3 4 extern struct tm *gmtime (const time_t *__timer) __attribute__ ((__nothrow__ , __leaf__)); extern struct tm *localtime (const time_t *__timer) __attribute__ ((__nothrow__ , __leaf__)); extern struct tm *gmtime_r (const time_t 
*__restrict __timer, struct tm *__restrict __tp) __attribute__ ((__nothrow__ , __leaf__)); extern struct tm *localtime_r (const time_t *__restrict __timer, struct tm *__restrict __tp) __attribute__ ((__nothrow__ , __leaf__)); extern char *asctime (const struct tm *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern char *ctime (const time_t *__timer) __attribute__ ((__nothrow__ , __leaf__)); extern char *asctime_r (const struct tm *__restrict __tp, char *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)); extern char *ctime_r (const time_t *__restrict __timer, char *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)); extern char *__tzname[2]; extern int __daylight; extern long int __timezone; extern char *tzname[2]; extern void tzset (void) __attribute__ ((__nothrow__ , __leaf__)); extern int daylight; extern long int timezone; extern int stime (const time_t *__when) __attribute__ ((__nothrow__ , __leaf__)); # 319 "/usr/include/time.h" 3 4 extern time_t timegm (struct tm *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern time_t timelocal (struct tm *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern int dysize (int __year) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 334 "/usr/include/time.h" 3 4 extern int nanosleep (const struct timespec *__requested_time, struct timespec *__remaining); extern int clock_getres (clockid_t __clock_id, struct timespec *__res) __attribute__ ((__nothrow__ , __leaf__)); extern int clock_gettime (clockid_t __clock_id, struct timespec *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern int clock_settime (clockid_t __clock_id, const struct timespec *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern int clock_nanosleep (clockid_t __clock_id, int __flags, const struct timespec *__req, struct timespec *__rem); extern int clock_getcpuclockid (pid_t __pid, clockid_t *__clock_id) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_create (clockid_t __clock_id, struct 
sigevent *__restrict __evp, timer_t *__restrict __timerid) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_delete (timer_t __timerid) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_settime (timer_t __timerid, int __flags, const struct itimerspec *__restrict __value, struct itimerspec *__restrict __ovalue) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_gettime (timer_t __timerid, struct itimerspec *__value) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_getoverrun (timer_t __timerid) __attribute__ ((__nothrow__ , __leaf__)); extern int timespec_get (struct timespec *__ts, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 430 "/usr/include/time.h" 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_TIME_H" to "1" Checking for header: sys/time.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/sys/time.h" 1 3 4 # 21 "/usr/include/sys/time.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 
"/usr/include/features.h" 2 3 4 # 22 "/usr/include/sys/time.h" 2 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 # 30 "/usr/include/bits/types.h" 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char 
*__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 24 "/usr/include/sys/time.h" 2 3 4 # 1 "/usr/include/time.h" 1 3 4 # 73 "/usr/include/time.h" 3 4 typedef __time_t time_t; # 26 "/usr/include/sys/time.h" 2 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 30 "/usr/include/bits/time.h" 3 4 struct timeval { __time_t tv_sec; __suseconds_t tv_usec; }; # 28 "/usr/include/sys/time.h" 2 3 4 # 1 "/usr/include/sys/select.h" 1 3 4 # 30 "/usr/include/sys/select.h" 3 4 # 1 "/usr/include/bits/select.h" 1 3 4 # 22 "/usr/include/bits/select.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 23 "/usr/include/bits/select.h" 2 3 4 # 31 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/sigset.h" 1 3 4 # 22 "/usr/include/bits/sigset.h" 3 4 typedef int __sig_atomic_t; typedef struct { unsigned long int __val[(1024 / (8 * sizeof (unsigned long int)))]; } __sigset_t; # 34 "/usr/include/sys/select.h" 2 3 4 typedef __sigset_t sigset_t; # 1 "/usr/include/time.h" 1 3 4 # 120 "/usr/include/time.h" 3 4 struct timespec { __time_t tv_sec; __syscall_slong_t tv_nsec; }; # 46 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 48 "/usr/include/sys/select.h" 2 3 4 typedef __suseconds_t suseconds_t; typedef long int __fd_mask; # 66 "/usr/include/sys/select.h" 3 4 typedef struct { __fd_mask __fds_bits[1024 / (8 * (int) sizeof (__fd_mask))]; } fd_set; typedef __fd_mask fd_mask; # 98 "/usr/include/sys/select.h" 3 4 # 108 "/usr/include/sys/select.h" 3 4 extern int select (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, struct timeval *__restrict __timeout); # 120 "/usr/include/sys/select.h" 3 4 extern int pselect (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, const struct timespec *__restrict __timeout, const __sigset_t *__restrict __sigmask); # 133 "/usr/include/sys/select.h" 3 4 # 30 "/usr/include/sys/time.h" 2 3 4 # 55 
"/usr/include/sys/time.h" 3 4 struct timezone { int tz_minuteswest; int tz_dsttime; }; typedef struct timezone *__restrict __timezone_ptr_t; # 71 "/usr/include/sys/time.h" 3 4 extern int gettimeofday (struct timeval *__restrict __tv, __timezone_ptr_t __tz) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int settimeofday (const struct timeval *__tv, const struct timezone *__tz) __attribute__ ((__nothrow__ , __leaf__)); extern int adjtime (const struct timeval *__delta, struct timeval *__olddelta) __attribute__ ((__nothrow__ , __leaf__)); enum __itimer_which { ITIMER_REAL = 0, ITIMER_VIRTUAL = 1, ITIMER_PROF = 2 }; struct itimerval { struct timeval it_interval; struct timeval it_value; }; typedef int __itimer_which_t; extern int getitimer (__itimer_which_t __which, struct itimerval *__value) __attribute__ ((__nothrow__ , __leaf__)); extern int setitimer (__itimer_which_t __which, const struct itimerval *__restrict __new, struct itimerval *__restrict __old) __attribute__ ((__nothrow__ , __leaf__)); extern int utimes (const char *__file, const struct timeval __tvp[2]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int lutimes (const char *__file, const struct timeval __tvp[2]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int futimes (int __fd, const struct timeval __tvp[2]) __attribute__ ((__nothrow__ , __leaf__)); # 189 "/usr/include/sys/time.h" 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_SYS_TIME_H" to "1" Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.headers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.headers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 
/tmp/petsc-KvGRNM/config.headers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <sys/types.h> #include <sys/time.h> #include <time.h> int main() { struct tm *tp = 0; if (tp); ; return 0; } Defined "TIME_WITH_SYS_TIME" to "1" ================================================================================ TEST checkMath from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:203) TESTING: checkMath from config.headers(config/BuildSystem/config/headers.py:203) Checks for the math headers and defines Checking for header: math.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/math.h" 1 3 4 # 26 "/usr/include/math.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 27 "/usr/include/math.h" 2 3 4 # 1 "/usr/include/bits/math-vector.h" 1 3 4 # 25 "/usr/include/bits/math-vector.h" 3 4 # 1 "/usr/include/bits/libm-simd-decl-stubs.h" 1 3 4 # 26 "/usr/include/bits/math-vector.h" 2 3 4 # 32 "/usr/include/math.h" 2 3 4 # 1 "/usr/include/bits/huge_val.h" 1 3 4 # 36 "/usr/include/math.h" 2 3 4 # 1
"/usr/include/bits/huge_valf.h" 1 3 4 # 38 "/usr/include/math.h" 2 3 4 # 1 "/usr/include/bits/huge_vall.h" 1 3 4 # 39 "/usr/include/math.h" 2 3 4 # 1 "/usr/include/bits/inf.h" 1 3 4 # 42 "/usr/include/math.h" 2 3 4 # 1 "/usr/include/bits/nan.h" 1 3 4 # 45 "/usr/include/math.h" 2 3 4 # 1 "/usr/include/bits/mathdef.h" 1 3 4 # 28 "/usr/include/bits/mathdef.h" 3 4 # 28 "/usr/include/bits/mathdef.h" 3 4 typedef float float_t; typedef double double_t; # 49 "/usr/include/math.h" 2 3 4 # 83 "/usr/include/math.h" 3 4 # 1 "/usr/include/bits/mathcalls.h" 1 3 4 # 52 "/usr/include/bits/mathcalls.h" 3 4 extern double acos (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double __acos (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double asin (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double __asin (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double atan (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double __atan (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double atan2 (double __y, double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double __atan2 (double __y, double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double cos (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double __cos (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double sin (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double __sin (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double tan (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double __tan (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double cosh (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double __cosh (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double sinh (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double __sinh (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double tanh (double __x) 
__attribute__ ((__nothrow__ , __leaf__)); extern double __tanh (double __x) __attribute__ ((__nothrow__ , __leaf__)); # 86 "/usr/include/bits/mathcalls.h" 3 4 extern double acosh (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double __acosh (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double asinh (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double __asinh (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double atanh (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double __atanh (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double exp (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double __exp (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double frexp (double __x, int *__exponent) __attribute__ ((__nothrow__ , __leaf__)); extern double __frexp (double __x, int *__exponent) __attribute__ ((__nothrow__ , __leaf__)); extern double ldexp (double __x, int __exponent) __attribute__ ((__nothrow__ , __leaf__)); extern double __ldexp (double __x, int __exponent) __attribute__ ((__nothrow__ , __leaf__)); extern double log (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double __log (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double log10 (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double __log10 (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double modf (double __x, double *__iptr) __attribute__ ((__nothrow__ , __leaf__)); extern double __modf (double __x, double *__iptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); # 126 "/usr/include/bits/mathcalls.h" 3 4 extern double expm1 (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double __expm1 (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double log1p (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double __log1p (double __x) __attribute__ ((__nothrow__ , __leaf__)); 
extern double logb (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double __logb (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double exp2 (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double __exp2 (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double log2 (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double __log2 (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double pow (double __x, double __y) __attribute__ ((__nothrow__ , __leaf__)); extern double __pow (double __x, double __y) __attribute__ ((__nothrow__ , __leaf__)); extern double sqrt (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double __sqrt (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double hypot (double __x, double __y) __attribute__ ((__nothrow__ , __leaf__)); extern double __hypot (double __x, double __y) __attribute__ ((__nothrow__ , __leaf__)); extern double cbrt (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double __cbrt (double __x) __attribute__ ((__nothrow__ , __leaf__)); extern double ceil (double __x) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern double __ceil (double __x) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern double fabs (double __x) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern double __fabs (double __x) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern double floor (double __x) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern double __floor (double __x) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern double fmod (double __x, double __y) __attribute__ ((__nothrow__ , __leaf__)); extern double __fmod (double __x, double __y) __attribute__ ((__nothrow__ , __leaf__)); extern int __isinf (double __value) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); 
[... preprocessed output of /usr/include/math.h elided: float, double, and long double variants of the standard libm declarations (trig, exp/log, rounding, classification, Bessel, gamma), plus the FP_* classification enum, _LIB_VERSION, and struct exception ...]
Preprocess stderr before filtering::
Preprocess stderr after filtering::
Defined "HAVE_MATH_H" to "1"
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.headers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.headers  -fPIC  -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3  /tmp/petsc-KvGRNM/config.headers/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
#include <math.h>
int main() {
double pi = M_PI;
if (pi);
;
  return 0;
}
Found math #defines, like M_PI
================================================================================
TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77)
TESTING: check from config.headers(config/BuildSystem/config/headers.py:77)
  Checks for "header", and defines HAVE_"header" if found
Checking for header: sys/socket.h
Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c
stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1
"" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/sys/socket.h" 1 3 4 # 22 "/usr/include/sys/socket.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 23 "/usr/include/sys/socket.h" 2 3 4 # 1 "/usr/include/sys/uio.h" 1 3 4 # 23 "/usr/include/sys/uio.h" 3 4 # 1 "/usr/include/sys/types.h" 1 3 4 # 27 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 # 30 "/usr/include/bits/types.h" 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef 
unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 30 "/usr/include/sys/types.h" 2 3 4 typedef __u_char u_char; typedef __u_short u_short; typedef __u_int u_int; typedef __u_long u_long; typedef __quad_t quad_t; typedef __u_quad_t u_quad_t; typedef __fsid_t fsid_t; typedef __loff_t loff_t; typedef __ino_t ino_t; # 60 "/usr/include/sys/types.h" 3 4 typedef __dev_t dev_t; typedef __gid_t gid_t; typedef __mode_t mode_t; typedef __nlink_t nlink_t; typedef __uid_t uid_t; typedef __off_t off_t; # 98 "/usr/include/sys/types.h" 3 4 typedef __pid_t pid_t; typedef __id_t id_t; typedef __ssize_t ssize_t; typedef __daddr_t daddr_t; typedef __caddr_t caddr_t; typedef __key_t key_t; # 132 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/time.h" 1 3 4 # 57 "/usr/include/time.h" 3 4 typedef __clock_t clock_t; # 73 "/usr/include/time.h" 3 4 typedef __time_t time_t; # 91 "/usr/include/time.h" 3 4 typedef __clockid_t clockid_t; # 103 "/usr/include/time.h" 3 4 typedef __timer_t 
timer_t; # 133 "/usr/include/sys/types.h" 2 3 4 # 146 "/usr/include/sys/types.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 147 "/usr/include/sys/types.h" 2 3 4 typedef unsigned long int ulong; typedef unsigned short int ushort; typedef unsigned int uint; # 194 "/usr/include/sys/types.h" 3 4 typedef int int8_t __attribute__ ((__mode__ (__QI__))); typedef int int16_t __attribute__ ((__mode__ (__HI__))); typedef int int32_t __attribute__ ((__mode__ (__SI__))); typedef int int64_t __attribute__ ((__mode__ (__DI__))); typedef unsigned int u_int8_t __attribute__ ((__mode__ (__QI__))); typedef unsigned int u_int16_t __attribute__ ((__mode__ (__HI__))); typedef unsigned int u_int32_t __attribute__ ((__mode__ (__SI__))); typedef unsigned int u_int64_t __attribute__ ((__mode__ (__DI__))); typedef int register_t __attribute__ ((__mode__ (__word__))); # 216 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/endian.h" 1 3 4 # 36 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/endian.h" 1 3 4 # 37 "/usr/include/endian.h" 2 3 4 # 60 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/byteswap.h" 1 3 4 # 28 "/usr/include/bits/byteswap.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 29 "/usr/include/bits/byteswap.h" 2 3 4 # 1 "/usr/include/bits/byteswap-16.h" 1 3 4 # 36 "/usr/include/bits/byteswap.h" 2 3 4 # 44 "/usr/include/bits/byteswap.h" 3 4 static __inline unsigned int __bswap_32 (unsigned int __bsx) { return __builtin_bswap32 (__bsx); } # 108 "/usr/include/bits/byteswap.h" 3 4 static __inline __uint64_t __bswap_64 (__uint64_t __bsx) { return __builtin_bswap64 (__bsx); } # 61 "/usr/include/endian.h" 2 3 4 # 217 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/select.h" 1 3 4 # 30 "/usr/include/sys/select.h" 3 4 # 1 "/usr/include/bits/select.h" 1 3 4 # 22 "/usr/include/bits/select.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 23 
"/usr/include/bits/select.h" 2 3 4 # 31 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/sigset.h" 1 3 4 # 22 "/usr/include/bits/sigset.h" 3 4 typedef int __sig_atomic_t; typedef struct { unsigned long int __val[(1024 / (8 * sizeof (unsigned long int)))]; } __sigset_t; # 34 "/usr/include/sys/select.h" 2 3 4 typedef __sigset_t sigset_t; # 1 "/usr/include/time.h" 1 3 4 # 120 "/usr/include/time.h" 3 4 struct timespec { __time_t tv_sec; __syscall_slong_t tv_nsec; }; # 46 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 30 "/usr/include/bits/time.h" 3 4 struct timeval { __time_t tv_sec; __suseconds_t tv_usec; }; # 48 "/usr/include/sys/select.h" 2 3 4 typedef __suseconds_t suseconds_t; typedef long int __fd_mask; # 66 "/usr/include/sys/select.h" 3 4 typedef struct { __fd_mask __fds_bits[1024 / (8 * (int) sizeof (__fd_mask))]; } fd_set; typedef __fd_mask fd_mask; # 98 "/usr/include/sys/select.h" 3 4 # 108 "/usr/include/sys/select.h" 3 4 extern int select (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, struct timeval *__restrict __timeout); # 120 "/usr/include/sys/select.h" 3 4 extern int pselect (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, const struct timespec *__restrict __timeout, const __sigset_t *__restrict __sigmask); # 133 "/usr/include/sys/select.h" 3 4 # 220 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/sysmacros.h" 1 3 4 # 24 "/usr/include/sys/sysmacros.h" 3 4 __extension__ extern unsigned int gnu_dev_major (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned int gnu_dev_minor (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned long long int gnu_dev_makedev (unsigned int __major, unsigned int __minor) __attribute__ ((__nothrow__ , __leaf__)) 
__attribute__ ((__const__)); # 58 "/usr/include/sys/sysmacros.h" 3 4 # 223 "/usr/include/sys/types.h" 2 3 4 typedef __blksize_t blksize_t; typedef __blkcnt_t blkcnt_t; typedef __fsblkcnt_t fsblkcnt_t; typedef __fsfilcnt_t fsfilcnt_t; # 270 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/bits/pthreadtypes.h" 1 3 4 # 21 "/usr/include/bits/pthreadtypes.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 22 "/usr/include/bits/pthreadtypes.h" 2 3 4 # 60 "/usr/include/bits/pthreadtypes.h" 3 4 typedef unsigned long int pthread_t; union pthread_attr_t { char __size[56]; long int __align; }; typedef union pthread_attr_t pthread_attr_t; typedef struct __pthread_internal_list { struct __pthread_internal_list *__prev; struct __pthread_internal_list *__next; } __pthread_list_t; # 90 "/usr/include/bits/pthreadtypes.h" 3 4 typedef union { struct __pthread_mutex_s { int __lock; unsigned int __count; int __owner; unsigned int __nusers; int __kind; short __spins; short __elision; __pthread_list_t __list; # 125 "/usr/include/bits/pthreadtypes.h" 3 4 } __data; char __size[40]; long int __align; } pthread_mutex_t; typedef union { char __size[4]; int __align; } pthread_mutexattr_t; typedef union { struct { int __lock; unsigned int __futex; __extension__ unsigned long long int __total_seq; __extension__ unsigned long long int __wakeup_seq; __extension__ unsigned long long int __woken_seq; void *__mutex; unsigned int __nwaiters; unsigned int __broadcast_seq; } __data; char __size[48]; __extension__ long long int __align; } pthread_cond_t; typedef union { char __size[4]; int __align; } pthread_condattr_t; typedef unsigned int pthread_key_t; typedef int pthread_once_t; typedef union { struct { int __lock; unsigned int __nr_readers; unsigned int __readers_wakeup; unsigned int __writer_wakeup; unsigned int __nr_readers_queued; unsigned int __nr_writers_queued; int __writer; int __shared; signed char __rwelision; unsigned char __pad1[7]; unsigned long int __pad2; unsigned int __flags; } __data; 
# 220 "/usr/include/bits/pthreadtypes.h" 3 4 char __size[56]; long int __align; } pthread_rwlock_t; typedef union { char __size[8]; long int __align; } pthread_rwlockattr_t; typedef volatile int pthread_spinlock_t; typedef union { char __size[32]; long int __align; } pthread_barrier_t; typedef union { char __size[4]; int __align; } pthread_barrierattr_t; # 271 "/usr/include/sys/types.h" 2 3 4 # 24 "/usr/include/sys/uio.h" 2 3 4 # 1 "/usr/include/bits/uio.h" 1 3 4 # 43 "/usr/include/bits/uio.h" 3 4 struct iovec { void *iov_base; size_t iov_len; }; # 29 "/usr/include/sys/uio.h" 2 3 4 # 39 "/usr/include/sys/uio.h" 3 4 extern ssize_t readv (int __fd, const struct iovec *__iovec, int __count) ; # 50 "/usr/include/sys/uio.h" 3 4 extern ssize_t writev (int __fd, const struct iovec *__iovec, int __count) ; # 65 "/usr/include/sys/uio.h" 3 4 extern ssize_t preadv (int __fd, const struct iovec *__iovec, int __count, __off_t __offset) ; # 77 "/usr/include/sys/uio.h" 3 4 extern ssize_t pwritev (int __fd, const struct iovec *__iovec, int __count, __off_t __offset) ; # 120 "/usr/include/sys/uio.h" 3 4 # 27 "/usr/include/sys/socket.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 29 "/usr/include/sys/socket.h" 2 3 4 # 38 "/usr/include/sys/socket.h" 3 4 # 1 "/usr/include/bits/socket.h" 1 3 4 # 27 "/usr/include/bits/socket.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 28 "/usr/include/bits/socket.h" 2 3 4 typedef __socklen_t socklen_t; # 1 "/usr/include/bits/socket_type.h" 1 3 4 # 24 "/usr/include/bits/socket_type.h" 3 4 enum __socket_type { SOCK_STREAM = 1, SOCK_DGRAM = 2, SOCK_RAW = 3, SOCK_RDM = 4, SOCK_SEQPACKET = 5, SOCK_DCCP = 6, SOCK_PACKET = 10, SOCK_CLOEXEC = 02000000, SOCK_NONBLOCK = 00004000 }; # 39 "/usr/include/bits/socket.h" 2 3 4 # 167 "/usr/include/bits/socket.h" 3 4 # 1 "/usr/include/bits/sockaddr.h" 1 3 4 # 28 "/usr/include/bits/sockaddr.h" 3 4 typedef unsigned short int sa_family_t; # 168 
"/usr/include/bits/socket.h" 2 3 4 struct sockaddr { sa_family_t sa_family; char sa_data[14]; }; # 183 "/usr/include/bits/socket.h" 3 4 struct sockaddr_storage { sa_family_t ss_family; char __ss_padding[(128 - (sizeof (unsigned short int)) - sizeof (unsigned long int))]; unsigned long int __ss_align; }; enum { MSG_OOB = 0x01, MSG_PEEK = 0x02, MSG_DONTROUTE = 0x04, MSG_CTRUNC = 0x08, MSG_PROXY = 0x10, MSG_TRUNC = 0x20, MSG_DONTWAIT = 0x40, MSG_EOR = 0x80, MSG_WAITALL = 0x100, MSG_FIN = 0x200, MSG_SYN = 0x400, MSG_CONFIRM = 0x800, MSG_RST = 0x1000, MSG_ERRQUEUE = 0x2000, MSG_NOSIGNAL = 0x4000, MSG_MORE = 0x8000, MSG_WAITFORONE = 0x10000, MSG_BATCH = 0x40000, MSG_FASTOPEN = 0x20000000, MSG_CMSG_CLOEXEC = 0x40000000 }; struct msghdr { void *msg_name; socklen_t msg_namelen; struct iovec *msg_iov; size_t msg_iovlen; void *msg_control; size_t msg_controllen; int msg_flags; }; struct cmsghdr { size_t cmsg_len; int cmsg_level; int cmsg_type; __extension__ unsigned char __cmsg_data []; }; # 295 "/usr/include/bits/socket.h" 3 4 extern struct cmsghdr *__cmsg_nxthdr (struct msghdr *__mhdr, struct cmsghdr *__cmsg) __attribute__ ((__nothrow__ , __leaf__)); # 322 "/usr/include/bits/socket.h" 3 4 enum { SCM_RIGHTS = 0x01 }; # 368 "/usr/include/bits/socket.h" 3 4 # 1 "/usr/include/asm/socket.h" 1 3 4 # 1 "/usr/include/asm-generic/socket.h" 1 3 4 # 1 "/usr/include/asm/sockios.h" 1 3 4 # 1 "/usr/include/asm-generic/sockios.h" 1 3 4 # 1 "/usr/include/asm/sockios.h" 2 3 4 # 5 "/usr/include/asm-generic/socket.h" 2 3 4 # 1 "/usr/include/asm/socket.h" 2 3 4 # 369 "/usr/include/bits/socket.h" 2 3 4 # 402 "/usr/include/bits/socket.h" 3 4 struct linger { int l_onoff; int l_linger; }; # 39 "/usr/include/sys/socket.h" 2 3 4 struct osockaddr { unsigned short int sa_family; unsigned char sa_data[14]; }; enum { SHUT_RD = 0, SHUT_WR, SHUT_RDWR }; # 113 "/usr/include/sys/socket.h" 3 4 extern int socket (int __domain, int __type, int __protocol) __attribute__ ((__nothrow__ , __leaf__)); extern int 
socketpair (int __domain, int __type, int __protocol, int __fds[2]) __attribute__ ((__nothrow__ , __leaf__)); extern int bind (int __fd, const struct sockaddr * __addr, socklen_t __len) __attribute__ ((__nothrow__ , __leaf__)); extern int getsockname (int __fd, struct sockaddr *__restrict __addr, socklen_t *__restrict __len) __attribute__ ((__nothrow__ , __leaf__)); # 137 "/usr/include/sys/socket.h" 3 4 extern int connect (int __fd, const struct sockaddr * __addr, socklen_t __len); extern int getpeername (int __fd, struct sockaddr *__restrict __addr, socklen_t *__restrict __len) __attribute__ ((__nothrow__ , __leaf__)); extern ssize_t send (int __fd, const void *__buf, size_t __n, int __flags); extern ssize_t recv (int __fd, void *__buf, size_t __n, int __flags); extern ssize_t sendto (int __fd, const void *__buf, size_t __n, int __flags, const struct sockaddr * __addr, socklen_t __addr_len); # 174 "/usr/include/sys/socket.h" 3 4 extern ssize_t recvfrom (int __fd, void *__restrict __buf, size_t __n, int __flags, struct sockaddr *__restrict __addr, socklen_t *__restrict __addr_len); extern ssize_t sendmsg (int __fd, const struct msghdr *__message, int __flags); # 202 "/usr/include/sys/socket.h" 3 4 extern ssize_t recvmsg (int __fd, struct msghdr *__message, int __flags); # 219 "/usr/include/sys/socket.h" 3 4 extern int getsockopt (int __fd, int __level, int __optname, void *__restrict __optval, socklen_t *__restrict __optlen) __attribute__ ((__nothrow__ , __leaf__)); extern int setsockopt (int __fd, int __level, int __optname, const void *__optval, socklen_t __optlen) __attribute__ ((__nothrow__ , __leaf__)); extern int listen (int __fd, int __n) __attribute__ ((__nothrow__ , __leaf__)); # 243 "/usr/include/sys/socket.h" 3 4 extern int accept (int __fd, struct sockaddr *__restrict __addr, socklen_t *__restrict __addr_len); # 261 "/usr/include/sys/socket.h" 3 4 extern int shutdown (int __fd, int __how) __attribute__ ((__nothrow__ , __leaf__)); extern int sockatmark 
(int __fd) __attribute__ ((__nothrow__ , __leaf__)); extern int isfdtype (int __fd, int __fdtype) __attribute__ ((__nothrow__ , __leaf__)); # 283 "/usr/include/sys/socket.h" 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_SYS_SOCKET_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: sys/types.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/sys/types.h" 1 3 4 # 25 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 26 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 # 30 "/usr/include/bits/types.h" 3
4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 30 "/usr/include/sys/types.h" 2 3 4 typedef __u_char u_char; typedef __u_short u_short; typedef __u_int u_int; typedef __u_long u_long; typedef __quad_t quad_t; typedef __u_quad_t u_quad_t; 
typedef __fsid_t fsid_t; typedef __loff_t loff_t; typedef __ino_t ino_t; # 60 "/usr/include/sys/types.h" 3 4 typedef __dev_t dev_t; typedef __gid_t gid_t; typedef __mode_t mode_t; typedef __nlink_t nlink_t; typedef __uid_t uid_t; typedef __off_t off_t; # 98 "/usr/include/sys/types.h" 3 4 typedef __pid_t pid_t; typedef __id_t id_t; typedef __ssize_t ssize_t; typedef __daddr_t daddr_t; typedef __caddr_t caddr_t; typedef __key_t key_t; # 132 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/time.h" 1 3 4 # 57 "/usr/include/time.h" 3 4 typedef __clock_t clock_t; # 73 "/usr/include/time.h" 3 4 typedef __time_t time_t; # 91 "/usr/include/time.h" 3 4 typedef __clockid_t clockid_t; # 103 "/usr/include/time.h" 3 4 typedef __timer_t timer_t; # 133 "/usr/include/sys/types.h" 2 3 4 # 146 "/usr/include/sys/types.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 147 "/usr/include/sys/types.h" 2 3 4 typedef unsigned long int ulong; typedef unsigned short int ushort; typedef unsigned int uint; # 194 "/usr/include/sys/types.h" 3 4 typedef int int8_t __attribute__ ((__mode__ (__QI__))); typedef int int16_t __attribute__ ((__mode__ (__HI__))); typedef int int32_t __attribute__ ((__mode__ (__SI__))); typedef int int64_t __attribute__ ((__mode__ (__DI__))); typedef unsigned int u_int8_t __attribute__ ((__mode__ (__QI__))); typedef unsigned int u_int16_t __attribute__ ((__mode__ (__HI__))); typedef unsigned int u_int32_t __attribute__ ((__mode__ (__SI__))); typedef unsigned int u_int64_t __attribute__ ((__mode__ (__DI__))); typedef int register_t __attribute__ ((__mode__ (__word__))); # 216 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/endian.h" 1 3 4 # 36 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/endian.h" 1 3 4 # 37 "/usr/include/endian.h" 2 3 4 # 60 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/byteswap.h" 1 3 4 # 28 
"/usr/include/bits/byteswap.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 29 "/usr/include/bits/byteswap.h" 2 3 4 # 1 "/usr/include/bits/byteswap-16.h" 1 3 4 # 36 "/usr/include/bits/byteswap.h" 2 3 4 # 44 "/usr/include/bits/byteswap.h" 3 4 static __inline unsigned int __bswap_32 (unsigned int __bsx) { return __builtin_bswap32 (__bsx); } # 108 "/usr/include/bits/byteswap.h" 3 4 static __inline __uint64_t __bswap_64 (__uint64_t __bsx) { return __builtin_bswap64 (__bsx); } # 61 "/usr/include/endian.h" 2 3 4 # 217 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/select.h" 1 3 4 # 30 "/usr/include/sys/select.h" 3 4 # 1 "/usr/include/bits/select.h" 1 3 4 # 22 "/usr/include/bits/select.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 23 "/usr/include/bits/select.h" 2 3 4 # 31 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/sigset.h" 1 3 4 # 22 "/usr/include/bits/sigset.h" 3 4 typedef int __sig_atomic_t; typedef struct { unsigned long int __val[(1024 / (8 * sizeof (unsigned long int)))]; } __sigset_t; # 34 "/usr/include/sys/select.h" 2 3 4 typedef __sigset_t sigset_t; # 1 "/usr/include/time.h" 1 3 4 # 120 "/usr/include/time.h" 3 4 struct timespec { __time_t tv_sec; __syscall_slong_t tv_nsec; }; # 46 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 30 "/usr/include/bits/time.h" 3 4 struct timeval { __time_t tv_sec; __suseconds_t tv_usec; }; # 48 "/usr/include/sys/select.h" 2 3 4 typedef __suseconds_t suseconds_t; typedef long int __fd_mask; # 66 "/usr/include/sys/select.h" 3 4 typedef struct { __fd_mask __fds_bits[1024 / (8 * (int) sizeof (__fd_mask))]; } fd_set; typedef __fd_mask fd_mask; # 98 "/usr/include/sys/select.h" 3 4 # 108 "/usr/include/sys/select.h" 3 4 extern int select (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, struct timeval *__restrict __timeout); # 120 "/usr/include/sys/select.h" 3 4 extern int pselect (int __nfds, fd_set *__restrict __readfds, fd_set 
*__restrict __writefds, fd_set *__restrict __exceptfds, const struct timespec *__restrict __timeout, const __sigset_t *__restrict __sigmask); # 133 "/usr/include/sys/select.h" 3 4 # 220 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/sysmacros.h" 1 3 4 # 24 "/usr/include/sys/sysmacros.h" 3 4 __extension__ extern unsigned int gnu_dev_major (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned int gnu_dev_minor (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned long long int gnu_dev_makedev (unsigned int __major, unsigned int __minor) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 58 "/usr/include/sys/sysmacros.h" 3 4 # 223 "/usr/include/sys/types.h" 2 3 4 typedef __blksize_t blksize_t; typedef __blkcnt_t blkcnt_t; typedef __fsblkcnt_t fsblkcnt_t; typedef __fsfilcnt_t fsfilcnt_t; # 270 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/bits/pthreadtypes.h" 1 3 4 # 21 "/usr/include/bits/pthreadtypes.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 22 "/usr/include/bits/pthreadtypes.h" 2 3 4 # 60 "/usr/include/bits/pthreadtypes.h" 3 4 typedef unsigned long int pthread_t; union pthread_attr_t { char __size[56]; long int __align; }; typedef union pthread_attr_t pthread_attr_t; typedef struct __pthread_internal_list { struct __pthread_internal_list *__prev; struct __pthread_internal_list *__next; } __pthread_list_t; # 90 "/usr/include/bits/pthreadtypes.h" 3 4 typedef union { struct __pthread_mutex_s { int __lock; unsigned int __count; int __owner; unsigned int __nusers; int __kind; short __spins; short __elision; __pthread_list_t __list; # 125 "/usr/include/bits/pthreadtypes.h" 3 4 } __data; char __size[40]; long int __align; } pthread_mutex_t; typedef union { char __size[4]; int __align; } pthread_mutexattr_t; typedef union { struct { int __lock; unsigned int __futex; __extension__ 
unsigned long long int __total_seq; __extension__ unsigned long long int __wakeup_seq; __extension__ unsigned long long int __woken_seq; void *__mutex; unsigned int __nwaiters; unsigned int __broadcast_seq; } __data; char __size[48]; __extension__ long long int __align; } pthread_cond_t; typedef union { char __size[4]; int __align; } pthread_condattr_t; typedef unsigned int pthread_key_t; typedef int pthread_once_t; typedef union { struct { int __lock; unsigned int __nr_readers; unsigned int __readers_wakeup; unsigned int __writer_wakeup; unsigned int __nr_readers_queued; unsigned int __nr_writers_queued; int __writer; int __shared; signed char __rwelision; unsigned char __pad1[7]; unsigned long int __pad2; unsigned int __flags; } __data; # 220 "/usr/include/bits/pthreadtypes.h" 3 4 char __size[56]; long int __align; } pthread_rwlock_t; typedef union { char __size[8]; long int __align; } pthread_rwlockattr_t; typedef volatile int pthread_spinlock_t; typedef union { char __size[32]; long int __align; } pthread_barrier_t; typedef union { char __size[4]; int __align; } pthread_barrierattr_t; # 271 "/usr/include/sys/types.h" 2 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_SYS_TYPES_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: malloc.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1
"/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/malloc.h" 1 3 4 # 22 "/usr/include/malloc.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 23 "/usr/include/malloc.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 149 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 # 149 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long int ptrdiff_t; # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 328 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef int wchar_t; # 426 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef struct { long long __max_align_ll __attribute__((__aligned__(__alignof__(long long)))); long double __max_align_ld __attribute__((__aligned__(__alignof__(long double)))); } max_align_t; # 24 "/usr/include/malloc.h" 2 3 4 # 1 "/usr/include/stdio.h" 1 3 4 # 29 "/usr/include/stdio.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 34 "/usr/include/stdio.h" 2 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char 
__int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 36 "/usr/include/stdio.h" 2 3 4 # 44 "/usr/include/stdio.h" 3 4 struct _IO_FILE; typedef struct _IO_FILE FILE; # 64 "/usr/include/stdio.h" 3 4 typedef struct _IO_FILE __FILE; # 74 "/usr/include/stdio.h" 3 4 # 1 "/usr/include/libio.h" 1 3 4 # 31 "/usr/include/libio.h" 3 4 # 1 "/usr/include/_G_config.h" 1 3 4 # 15 "/usr/include/_G_config.h" 3 4 # 1 
"/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 16 "/usr/include/_G_config.h" 2 3 4 # 1 "/usr/include/wchar.h" 1 3 4 # 82 "/usr/include/wchar.h" 3 4 typedef struct { int __count; union { unsigned int __wch; char __wchb[4]; } __value; } __mbstate_t; # 21 "/usr/include/_G_config.h" 2 3 4 typedef struct { __off_t __pos; __mbstate_t __state; } _G_fpos_t; typedef struct { __off64_t __pos; __mbstate_t __state; } _G_fpos64_t; # 32 "/usr/include/libio.h" 2 3 4 # 49 "/usr/include/libio.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdarg.h" 1 3 4 # 40 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdarg.h" 3 4 typedef __builtin_va_list __gnuc_va_list; # 50 "/usr/include/libio.h" 2 3 4 # 144 "/usr/include/libio.h" 3 4 struct _IO_jump_t; struct _IO_FILE; typedef void _IO_lock_t; struct _IO_marker { struct _IO_marker *_next; struct _IO_FILE *_sbuf; int _pos; # 173 "/usr/include/libio.h" 3 4 }; enum __codecvt_result { __codecvt_ok, __codecvt_partial, __codecvt_error, __codecvt_noconv }; # 241 "/usr/include/libio.h" 3 4 struct _IO_FILE { int _flags; char* _IO_read_ptr; char* _IO_read_end; char* _IO_read_base; char* _IO_write_base; char* _IO_write_ptr; char* _IO_write_end; char* _IO_buf_base; char* _IO_buf_end; char *_IO_save_base; char *_IO_backup_base; char *_IO_save_end; struct _IO_marker *_markers; struct _IO_FILE *_chain; int _fileno; int _flags2; __off_t _old_offset; unsigned short _cur_column; signed char _vtable_offset; char _shortbuf[1]; _IO_lock_t *_lock; # 289 "/usr/include/libio.h" 3 4 __off64_t _offset; void *__pad1; void *__pad2; void *__pad3; void *__pad4; size_t __pad5; int _mode; char _unused2[15 * sizeof (int) - 4 * sizeof (void *) - sizeof (size_t)]; }; typedef struct _IO_FILE _IO_FILE; struct _IO_FILE_plus; extern struct _IO_FILE_plus _IO_2_1_stdin_; extern struct _IO_FILE_plus _IO_2_1_stdout_; extern struct _IO_FILE_plus _IO_2_1_stderr_; # 333 "/usr/include/libio.h" 3 4 typedef __ssize_t __io_read_fn (void *__cookie, char 
*__buf, size_t __nbytes); typedef __ssize_t __io_write_fn (void *__cookie, const char *__buf, size_t __n); typedef int __io_seek_fn (void *__cookie, __off64_t *__pos, int __w); typedef int __io_close_fn (void *__cookie); # 385 "/usr/include/libio.h" 3 4 extern int __underflow (_IO_FILE *); extern int __uflow (_IO_FILE *); extern int __overflow (_IO_FILE *, int); # 429 "/usr/include/libio.h" 3 4 extern int _IO_getc (_IO_FILE *__fp); extern int _IO_putc (int __c, _IO_FILE *__fp); extern int _IO_feof (_IO_FILE *__fp) __attribute__ ((__nothrow__ , __leaf__)); extern int _IO_ferror (_IO_FILE *__fp) __attribute__ ((__nothrow__ , __leaf__)); extern int _IO_peekc_locked (_IO_FILE *__fp); extern void _IO_flockfile (_IO_FILE *) __attribute__ ((__nothrow__ , __leaf__)); extern void _IO_funlockfile (_IO_FILE *) __attribute__ ((__nothrow__ , __leaf__)); extern int _IO_ftrylockfile (_IO_FILE *) __attribute__ ((__nothrow__ , __leaf__)); # 459 "/usr/include/libio.h" 3 4 extern int _IO_vfscanf (_IO_FILE * __restrict, const char * __restrict, __gnuc_va_list, int *__restrict); extern int _IO_vfprintf (_IO_FILE *__restrict, const char *__restrict, __gnuc_va_list); extern __ssize_t _IO_padn (_IO_FILE *, int, __ssize_t); extern size_t _IO_sgetn (_IO_FILE *, void *, size_t); extern __off64_t _IO_seekoff (_IO_FILE *, __off64_t, int, int); extern __off64_t _IO_seekpos (_IO_FILE *, __off64_t, int); extern void _IO_free_backup_area (_IO_FILE *) __attribute__ ((__nothrow__ , __leaf__)); # 75 "/usr/include/stdio.h" 2 3 4 typedef __gnuc_va_list va_list; # 90 "/usr/include/stdio.h" 3 4 typedef __off_t off_t; # 104 "/usr/include/stdio.h" 3 4 typedef __ssize_t ssize_t; typedef _G_fpos_t fpos_t; # 166 "/usr/include/stdio.h" 3 4 # 1 "/usr/include/bits/stdio_lim.h" 1 3 4 # 167 "/usr/include/stdio.h" 2 3 4 extern struct _IO_FILE *stdin; extern struct _IO_FILE *stdout; extern struct _IO_FILE *stderr; extern int remove (const char *__filename) __attribute__ ((__nothrow__ , __leaf__)); extern int rename 
(const char *__old, const char *__new) __attribute__ ((__nothrow__ , __leaf__)); extern int renameat (int __oldfd, const char *__old, int __newfd, const char *__new) __attribute__ ((__nothrow__ , __leaf__)); extern FILE *tmpfile (void) ; # 211 "/usr/include/stdio.h" 3 4 extern char *tmpnam (char *__s) __attribute__ ((__nothrow__ , __leaf__)) ; extern char *tmpnam_r (char *__s) __attribute__ ((__nothrow__ , __leaf__)) ; # 229 "/usr/include/stdio.h" 3 4 extern char *tempnam (const char *__dir, const char *__pfx) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) ; extern int fclose (FILE *__stream); extern int fflush (FILE *__stream); # 254 "/usr/include/stdio.h" 3 4 extern int fflush_unlocked (FILE *__stream); # 268 "/usr/include/stdio.h" 3 4 extern FILE *fopen (const char *__restrict __filename, const char *__restrict __modes) ; extern FILE *freopen (const char *__restrict __filename, const char *__restrict __modes, FILE *__restrict __stream) ; # 297 "/usr/include/stdio.h" 3 4 # 308 "/usr/include/stdio.h" 3 4 extern FILE *fdopen (int __fd, const char *__modes) __attribute__ ((__nothrow__ , __leaf__)) ; # 321 "/usr/include/stdio.h" 3 4 extern FILE *fmemopen (void *__s, size_t __len, const char *__modes) __attribute__ ((__nothrow__ , __leaf__)) ; extern FILE *open_memstream (char **__bufloc, size_t *__sizeloc) __attribute__ ((__nothrow__ , __leaf__)) ; extern void setbuf (FILE *__restrict __stream, char *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)); extern int setvbuf (FILE *__restrict __stream, char *__restrict __buf, int __modes, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); extern void setbuffer (FILE *__restrict __stream, char *__restrict __buf, size_t __size) __attribute__ ((__nothrow__ , __leaf__)); extern void setlinebuf (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)); extern int fprintf (FILE *__restrict __stream, const char *__restrict __format, ...); extern int printf (const char *__restrict __format, ...); 
extern int sprintf (char *__restrict __s, const char *__restrict __format, ...) __attribute__ ((__nothrow__)); extern int vfprintf (FILE *__restrict __s, const char *__restrict __format, __gnuc_va_list __arg); extern int vprintf (const char *__restrict __format, __gnuc_va_list __arg); extern int vsprintf (char *__restrict __s, const char *__restrict __format, __gnuc_va_list __arg) __attribute__ ((__nothrow__)); extern int snprintf (char *__restrict __s, size_t __maxlen, const char *__restrict __format, ...) __attribute__ ((__nothrow__)) __attribute__ ((__format__ (__printf__, 3, 4))); extern int vsnprintf (char *__restrict __s, size_t __maxlen, const char *__restrict __format, __gnuc_va_list __arg) __attribute__ ((__nothrow__)) __attribute__ ((__format__ (__printf__, 3, 0))); # 414 "/usr/include/stdio.h" 3 4 extern int vdprintf (int __fd, const char *__restrict __fmt, __gnuc_va_list __arg) __attribute__ ((__format__ (__printf__, 2, 0))); extern int dprintf (int __fd, const char *__restrict __fmt, ...) __attribute__ ((__format__ (__printf__, 2, 3))); extern int fscanf (FILE *__restrict __stream, const char *__restrict __format, ...) ; extern int scanf (const char *__restrict __format, ...) ; extern int sscanf (const char *__restrict __s, const char *__restrict __format, ...) __attribute__ ((__nothrow__ , __leaf__)); # 445 "/usr/include/stdio.h" 3 4 extern int fscanf (FILE *__restrict __stream, const char *__restrict __format, ...) __asm__ ("" "__isoc99_fscanf") ; extern int scanf (const char *__restrict __format, ...) __asm__ ("" "__isoc99_scanf") ; extern int sscanf (const char *__restrict __s, const char *__restrict __format, ...) 
__asm__ ("" "__isoc99_sscanf") __attribute__ ((__nothrow__ , __leaf__)) ; # 465 "/usr/include/stdio.h" 3 4 extern int vfscanf (FILE *__restrict __s, const char *__restrict __format, __gnuc_va_list __arg) __attribute__ ((__format__ (__scanf__, 2, 0))) ; extern int vscanf (const char *__restrict __format, __gnuc_va_list __arg) __attribute__ ((__format__ (__scanf__, 1, 0))) ; extern int vsscanf (const char *__restrict __s, const char *__restrict __format, __gnuc_va_list __arg) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__format__ (__scanf__, 2, 0))); # 496 "/usr/include/stdio.h" 3 4 extern int vfscanf (FILE *__restrict __s, const char *__restrict __format, __gnuc_va_list __arg) __asm__ ("" "__isoc99_vfscanf") __attribute__ ((__format__ (__scanf__, 2, 0))) ; extern int vscanf (const char *__restrict __format, __gnuc_va_list __arg) __asm__ ("" "__isoc99_vscanf") __attribute__ ((__format__ (__scanf__, 1, 0))) ; extern int vsscanf (const char *__restrict __s, const char *__restrict __format, __gnuc_va_list __arg) __asm__ ("" "__isoc99_vsscanf") __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__format__ (__scanf__, 2, 0))); # 524 "/usr/include/stdio.h" 3 4 extern int fgetc (FILE *__stream); extern int getc (FILE *__stream); extern int getchar (void); # 552 "/usr/include/stdio.h" 3 4 extern int getc_unlocked (FILE *__stream); extern int getchar_unlocked (void); # 563 "/usr/include/stdio.h" 3 4 extern int fgetc_unlocked (FILE *__stream); extern int fputc (int __c, FILE *__stream); extern int putc (int __c, FILE *__stream); extern int putchar (int __c); # 596 "/usr/include/stdio.h" 3 4 extern int fputc_unlocked (int __c, FILE *__stream); extern int putc_unlocked (int __c, FILE *__stream); extern int putchar_unlocked (int __c); extern int getw (FILE *__stream); extern int putw (int __w, FILE *__stream); extern char *fgets (char *__restrict __s, int __n, FILE *__restrict __stream) ; # 642 "/usr/include/stdio.h" 3 4 # 667 "/usr/include/stdio.h" 3 4 extern 
__ssize_t __getdelim (char **__restrict __lineptr, size_t *__restrict __n, int __delimiter, FILE *__restrict __stream) ; extern __ssize_t getdelim (char **__restrict __lineptr, size_t *__restrict __n, int __delimiter, FILE *__restrict __stream) ; extern __ssize_t getline (char **__restrict __lineptr, size_t *__restrict __n, FILE *__restrict __stream) ; extern int fputs (const char *__restrict __s, FILE *__restrict __stream); extern int puts (const char *__s); extern int ungetc (int __c, FILE *__stream); extern size_t fread (void *__restrict __ptr, size_t __size, size_t __n, FILE *__restrict __stream) ; extern size_t fwrite (const void *__restrict __ptr, size_t __size, size_t __n, FILE *__restrict __s); # 739 "/usr/include/stdio.h" 3 4 extern size_t fread_unlocked (void *__restrict __ptr, size_t __size, size_t __n, FILE *__restrict __stream) ; extern size_t fwrite_unlocked (const void *__restrict __ptr, size_t __size, size_t __n, FILE *__restrict __stream); extern int fseek (FILE *__stream, long int __off, int __whence); extern long int ftell (FILE *__stream) ; extern void rewind (FILE *__stream); # 775 "/usr/include/stdio.h" 3 4 extern int fseeko (FILE *__stream, __off_t __off, int __whence); extern __off_t ftello (FILE *__stream) ; # 794 "/usr/include/stdio.h" 3 4 extern int fgetpos (FILE *__restrict __stream, fpos_t *__restrict __pos); extern int fsetpos (FILE *__stream, const fpos_t *__pos); # 817 "/usr/include/stdio.h" 3 4 # 826 "/usr/include/stdio.h" 3 4 extern void clearerr (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)); extern int feof (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)) ; extern int ferror (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)) ; extern void clearerr_unlocked (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)); extern int feof_unlocked (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)) ; extern int ferror_unlocked (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)) ; extern void perror 
(const char *__s); # 1 "/usr/include/bits/sys_errlist.h" 1 3 4 # 26 "/usr/include/bits/sys_errlist.h" 3 4 extern int sys_nerr; extern const char *const sys_errlist[]; # 856 "/usr/include/stdio.h" 2 3 4 extern int fileno (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)) ; extern int fileno_unlocked (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)) ; # 874 "/usr/include/stdio.h" 3 4 extern FILE *popen (const char *__command, const char *__modes) ; extern int pclose (FILE *__stream); extern char *ctermid (char *__s) __attribute__ ((__nothrow__ , __leaf__)); # 914 "/usr/include/stdio.h" 3 4 extern void flockfile (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)); extern int ftrylockfile (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)) ; extern void funlockfile (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)); # 944 "/usr/include/stdio.h" 3 4 # 25 "/usr/include/malloc.h" 2 3 4 # 35 "/usr/include/malloc.h" 3 4 extern void *malloc (size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) ; extern void *calloc (size_t __nmemb, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) ; extern void *realloc (void *__ptr, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__warn_unused_result__)); extern void free (void *__ptr) __attribute__ ((__nothrow__ , __leaf__)); extern void cfree (void *__ptr) __attribute__ ((__nothrow__ , __leaf__)); extern void *memalign (size_t __alignment, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) ; extern void *valloc (size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) ; extern void *pvalloc (size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) ; extern void *(*__morecore) (ptrdiff_t __size); extern void *__default_morecore (ptrdiff_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)); struct 
mallinfo { int arena; int ordblks; int smblks; int hblks; int hblkhd; int usmblks; int fsmblks; int uordblks; int fordblks; int keepcost; }; extern struct mallinfo mallinfo (void) __attribute__ ((__nothrow__ , __leaf__)); # 121 "/usr/include/malloc.h" 3 4 extern int mallopt (int __param, int __val) __attribute__ ((__nothrow__ , __leaf__)); extern int malloc_trim (size_t __pad) __attribute__ ((__nothrow__ , __leaf__)); extern size_t malloc_usable_size (void *__ptr) __attribute__ ((__nothrow__ , __leaf__)); extern void malloc_stats (void) __attribute__ ((__nothrow__ , __leaf__)); extern int malloc_info (int __options, FILE *__fp) __attribute__ ((__nothrow__ , __leaf__)); extern void *malloc_get_state (void) __attribute__ ((__nothrow__ , __leaf__)); extern int malloc_set_state (void *__ptr) __attribute__ ((__nothrow__ , __leaf__)); extern void (*volatile __free_hook) (void *__ptr, const void *) __attribute__ ((__deprecated__)); extern void *(*volatile __malloc_hook)(size_t __size, const void *) __attribute__ ((__deprecated__)); extern void *(*volatile __realloc_hook)(void *__ptr, size_t __size, const void *) __attribute__ ((__deprecated__)); extern void *(*volatile __memalign_hook)(size_t __alignment, size_t __size, const void *) __attribute__ ((__deprecated__)); extern void (*volatile __after_morecore_hook) (void); extern void __malloc_check_init (void) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__deprecated__)); # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_MALLOC_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: time.h Executing: mpicc -E 
-I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/time.h" 1 3 4 # 27 "/usr/include/time.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 28 "/usr/include/time.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 38 "/usr/include/time.h" 2 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 42 "/usr/include/time.h" 2 3 4 # 55 "/usr/include/time.h" 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef 
unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 56 "/usr/include/time.h" 2 3 4 typedef __clock_t clock_t; # 73 "/usr/include/time.h" 3 4 typedef __time_t time_t; # 91 "/usr/include/time.h" 3 4 typedef __clockid_t clockid_t; # 103 "/usr/include/time.h" 3 4 typedef __timer_t timer_t; # 120 "/usr/include/time.h" 3 4 struct timespec { __time_t tv_sec; __syscall_slong_t tv_nsec; }; struct tm { int tm_sec; int tm_min; int tm_hour; int tm_mday; int tm_mon; int tm_year; int tm_wday; int tm_yday; int tm_isdst; long int tm_gmtoff; const char *tm_zone; }; struct itimerspec { struct timespec it_interval; struct timespec 
it_value; }; struct sigevent; typedef __pid_t pid_t; # 186 "/usr/include/time.h" 3 4 extern clock_t clock (void) __attribute__ ((__nothrow__ , __leaf__)); extern time_t time (time_t *__timer) __attribute__ ((__nothrow__ , __leaf__)); extern double difftime (time_t __time1, time_t __time0) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern time_t mktime (struct tm *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern size_t strftime (char *__restrict __s, size_t __maxsize, const char *__restrict __format, const struct tm *__restrict __tp) __attribute__ ((__nothrow__ , __leaf__)); # 221 "/usr/include/time.h" 3 4 # 1 "/usr/include/xlocale.h" 1 3 4 # 27 "/usr/include/xlocale.h" 3 4 typedef struct __locale_struct { struct __locale_data *__locales[13]; const unsigned short int *__ctype_b; const int *__ctype_tolower; const int *__ctype_toupper; const char *__names[13]; } *__locale_t; typedef __locale_t locale_t; # 222 "/usr/include/time.h" 2 3 4 extern size_t strftime_l (char *__restrict __s, size_t __maxsize, const char *__restrict __format, const struct tm *__restrict __tp, __locale_t __loc) __attribute__ ((__nothrow__ , __leaf__)); # 236 "/usr/include/time.h" 3 4 extern struct tm *gmtime (const time_t *__timer) __attribute__ ((__nothrow__ , __leaf__)); extern struct tm *localtime (const time_t *__timer) __attribute__ ((__nothrow__ , __leaf__)); extern struct tm *gmtime_r (const time_t *__restrict __timer, struct tm *__restrict __tp) __attribute__ ((__nothrow__ , __leaf__)); extern struct tm *localtime_r (const time_t *__restrict __timer, struct tm *__restrict __tp) __attribute__ ((__nothrow__ , __leaf__)); extern char *asctime (const struct tm *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern char *ctime (const time_t *__timer) __attribute__ ((__nothrow__ , __leaf__)); extern char *asctime_r (const struct tm *__restrict __tp, char *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)); extern char *ctime_r (const time_t 
*__restrict __timer, char *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)); extern char *__tzname[2]; extern int __daylight; extern long int __timezone; extern char *tzname[2]; extern void tzset (void) __attribute__ ((__nothrow__ , __leaf__)); extern int daylight; extern long int timezone; extern int stime (const time_t *__when) __attribute__ ((__nothrow__ , __leaf__)); # 319 "/usr/include/time.h" 3 4 extern time_t timegm (struct tm *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern time_t timelocal (struct tm *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern int dysize (int __year) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 334 "/usr/include/time.h" 3 4 extern int nanosleep (const struct timespec *__requested_time, struct timespec *__remaining); extern int clock_getres (clockid_t __clock_id, struct timespec *__res) __attribute__ ((__nothrow__ , __leaf__)); extern int clock_gettime (clockid_t __clock_id, struct timespec *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern int clock_settime (clockid_t __clock_id, const struct timespec *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern int clock_nanosleep (clockid_t __clock_id, int __flags, const struct timespec *__req, struct timespec *__rem); extern int clock_getcpuclockid (pid_t __pid, clockid_t *__clock_id) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_create (clockid_t __clock_id, struct sigevent *__restrict __evp, timer_t *__restrict __timerid) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_delete (timer_t __timerid) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_settime (timer_t __timerid, int __flags, const struct itimerspec *__restrict __value, struct itimerspec *__restrict __ovalue) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_gettime (timer_t __timerid, struct itimerspec *__value) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_getoverrun (timer_t __timerid) __attribute__ ((__nothrow__ , 
__leaf__)); extern int timespec_get (struct timespec *__ts, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 430 "/usr/include/time.h" 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_TIME_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: Direct.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Possible ERROR while running preprocessor: exit code 256 stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2stderr: /tmp/petsc-KvGRNM/config.headers/conftest.c:3:20: fatal error: Direct.h: No such file or directory #include <Direct.h> ^ compilation terminated.
Source: #include "confdefs.h" #include "conffix.h" #include <Direct.h> Preprocess stderr before filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:20: fatal error: Direct.h: No such file or directory #include <Direct.h> ^ compilation terminated. : Preprocess stderr after filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:20: fatal error: Direct.h: No such file or directory #include <Direct.h> ^compilation terminated.: ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: Ws2tcpip.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Possible ERROR while running preprocessor: exit code 256 stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2stderr: /tmp/petsc-KvGRNM/config.headers/conftest.c:3:22: fatal error: Ws2tcpip.h: No such file or directory #include <Ws2tcpip.h> ^ compilation terminated.
Source: #include "confdefs.h" #include "conffix.h" #include <Ws2tcpip.h> Preprocess stderr before filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:22: fatal error: Ws2tcpip.h: No such file or directory #include <Ws2tcpip.h> ^ compilation terminated. : Preprocess stderr after filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:22: fatal error: Ws2tcpip.h: No such file or directory #include <Ws2tcpip.h> ^compilation terminated.: ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: endian.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/endian.h" 1 3 4 # 21 "/usr/include/endian.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 22 "/usr/include/endian.h" 2 3 4 # 36 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/endian.h" 1 3 4 # 37 "/usr/include/endian.h" 2 3 4 # 60 "/usr/include/endian.h"
3 4 # 1 "/usr/include/bits/byteswap.h" 1 3 4 # 27 "/usr/include/bits/byteswap.h" 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 # 30 "/usr/include/bits/types.h" 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; 
typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 28 "/usr/include/bits/byteswap.h" 2 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 29 "/usr/include/bits/byteswap.h" 2 3 4 # 1 "/usr/include/bits/byteswap-16.h" 1 3 4 # 36 "/usr/include/bits/byteswap.h" 2 3 4 # 44 "/usr/include/bits/byteswap.h" 3 4 static __inline unsigned int __bswap_32 (unsigned int __bsx) { return __builtin_bswap32 (__bsx); } # 108 "/usr/include/bits/byteswap.h" 3 4 static __inline __uint64_t __bswap_64 (__uint64_t __bsx) { return __builtin_bswap64 (__bsx); } # 61 "/usr/include/endian.h" 2 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_ENDIAN_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: ieeefp.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Possible ERROR while running preprocessor: exit code 256 stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1
"/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2stderr: /tmp/petsc-KvGRNM/config.headers/conftest.c:3:20: fatal error: ieeefp.h: No such file or directory #include <ieeefp.h> ^ compilation terminated. Source: #include "confdefs.h" #include "conffix.h" #include <ieeefp.h> Preprocess stderr before filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:20: fatal error: ieeefp.h: No such file or directory #include <ieeefp.h> ^ compilation terminated. : Preprocess stderr after filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:20: fatal error: ieeefp.h: No such file or directory #include <ieeefp.h> ^compilation terminated.: ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: strings.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/strings.h" 1 3 4 # 26 "/usr/include/strings.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11
"/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 27 "/usr/include/strings.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 29 "/usr/include/strings.h" 2 3 4 extern int bcmp (const void *__s1, const void *__s2, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)); extern void bcopy (const void *__src, void *__dest, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); extern void bzero (void *__s, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); # 72 "/usr/include/strings.h" 3 4 extern char *index (const char *__s, int __c) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); # 100 "/usr/include/strings.h" 3 4 extern char *rindex (const char *__s, int __c) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); extern int ffs (int __i) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((const)); extern int strcasecmp (const char *__s1, const char *__s2) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)); extern int strncasecmp (const char *__s1, const char *__s2, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)); # 1 "/usr/include/xlocale.h" 1 3 4 # 27 "/usr/include/xlocale.h" 3 4 typedef struct __locale_struct { struct __locale_data *__locales[13]; const unsigned short int *__ctype_b; const int *__ctype_tolower; const int *__ctype_toupper; const char *__names[13]; } *__locale_t; typedef __locale_t locale_t; # 124 "/usr/include/strings.h" 2 3 4 extern int strcasecmp_l (const char *__s1, const char *__s2, __locale_t __loc) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2, 3))); extern int strncasecmp_l (const char 
*__s1, const char *__s2, size_t __n, __locale_t __loc) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2, 4))); # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_STRINGS_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: inttypes.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/inttypes.h" 1 3 4 # 25 "/usr/include/inttypes.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 26 "/usr/include/inttypes.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdint.h" 1 3 4 # 9 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdint.h" 3 4 # 1 "/usr/include/stdint.h" 1 3 4 # 26 "/usr/include/stdint.h" 3 4 # 1 
"/usr/include/bits/wchar.h" 1 3 4 # 27 "/usr/include/stdint.h" 2 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/stdint.h" 2 3 4 # 36 "/usr/include/stdint.h" 3 4 # 36 "/usr/include/stdint.h" 3 4 typedef signed char int8_t; typedef short int int16_t; typedef int int32_t; typedef long int int64_t; typedef unsigned char uint8_t; typedef unsigned short int uint16_t; typedef unsigned int uint32_t; typedef unsigned long int uint64_t; # 65 "/usr/include/stdint.h" 3 4 typedef signed char int_least8_t; typedef short int int_least16_t; typedef int int_least32_t; typedef long int int_least64_t; typedef unsigned char uint_least8_t; typedef unsigned short int uint_least16_t; typedef unsigned int uint_least32_t; typedef unsigned long int uint_least64_t; # 90 "/usr/include/stdint.h" 3 4 typedef signed char int_fast8_t; typedef long int int_fast16_t; typedef long int int_fast32_t; typedef long int int_fast64_t; # 103 "/usr/include/stdint.h" 3 4 typedef unsigned char uint_fast8_t; typedef unsigned long int uint_fast16_t; typedef unsigned long int uint_fast32_t; typedef unsigned long int uint_fast64_t; # 119 "/usr/include/stdint.h" 3 4 typedef long int intptr_t; typedef unsigned long int uintptr_t; # 134 "/usr/include/stdint.h" 3 4 typedef long int intmax_t; typedef unsigned long int uintmax_t; # 10 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdint.h" 2 3 4 # 28 "/usr/include/inttypes.h" 2 3 4 typedef int __gwchar_t; # 266 "/usr/include/inttypes.h" 3 4 typedef struct { long int quot; long int rem; } imaxdiv_t; # 290 "/usr/include/inttypes.h" 3 4 extern intmax_t imaxabs (intmax_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern imaxdiv_t imaxdiv (intmax_t __numer, intmax_t __denom) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern intmax_t strtoimax (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)); extern uintmax_t strtoumax (const char 
*__restrict __nptr, char ** __restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)); extern intmax_t wcstoimax (const __gwchar_t *__restrict __nptr, __gwchar_t **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)); extern uintmax_t wcstoumax (const __gwchar_t *__restrict __nptr, __gwchar_t ** __restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)); # 432 "/usr/include/inttypes.h" 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_INTTYPES_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: sched.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/sched.h" 1 3 4 # 22 "/usr/include/sched.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 
"/usr/include/features.h" 2 3 4 # 23 "/usr/include/sched.h" 2 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 # 30 "/usr/include/bits/types.h" 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; 
typedef long int __intptr_t; typedef unsigned int __socklen_t; # 26 "/usr/include/sched.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 29 "/usr/include/sched.h" 2 3 4 # 1 "/usr/include/time.h" 1 3 4 # 73 "/usr/include/time.h" 3 4 typedef __time_t time_t; # 120 "/usr/include/time.h" 3 4 struct timespec { __time_t tv_sec; __syscall_slong_t tv_nsec; }; # 35 "/usr/include/sched.h" 2 3 4 typedef __pid_t pid_t; # 1 "/usr/include/bits/sched.h" 1 3 4 # 73 "/usr/include/bits/sched.h" 3 4 struct sched_param { int __sched_priority; }; # 96 "/usr/include/bits/sched.h" 3 4 struct __sched_param { int __sched_priority; }; # 119 "/usr/include/bits/sched.h" 3 4 typedef unsigned long int __cpu_mask; typedef struct { __cpu_mask __bits[1024 / (8 * sizeof (__cpu_mask))]; } cpu_set_t; # 202 "/usr/include/bits/sched.h" 3 4 extern int __sched_cpucount (size_t __setsize, const cpu_set_t *__setp) __attribute__ ((__nothrow__ , __leaf__)); extern cpu_set_t *__sched_cpualloc (size_t __count) __attribute__ ((__nothrow__ , __leaf__)) ; extern void __sched_cpufree (cpu_set_t *__set) __attribute__ ((__nothrow__ , __leaf__)); # 44 "/usr/include/sched.h" 2 3 4 extern int sched_setparam (__pid_t __pid, const struct sched_param *__param) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_getparam (__pid_t __pid, struct sched_param *__param) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_setscheduler (__pid_t __pid, int __policy, const struct sched_param *__param) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_getscheduler (__pid_t __pid) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_yield (void) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_get_priority_max (int __algorithm) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_get_priority_min (int __algorithm) __attribute__ ((__nothrow__ , __leaf__)); 
extern int sched_rr_get_interval (__pid_t __pid, struct timespec *__t) __attribute__ ((__nothrow__ , __leaf__)); # 126 "/usr/include/sched.h" 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_SCHED_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: cxxabi.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Possible ERROR while running preprocessor: exit code 256 stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2stderr: /tmp/petsc-KvGRNM/config.headers/conftest.c:3:20: fatal error: cxxabi.h: No such file or directory #include <cxxabi.h> ^ compilation terminated. Source: #include "confdefs.h" #include "conffix.h" #include <cxxabi.h> Preprocess stderr before filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:20: fatal error: cxxabi.h: No such file or directory #include <cxxabi.h> ^ compilation terminated. 
: Preprocess stderr after filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:20: fatal error: cxxabi.h: No such file or directory #include <cxxabi.h> ^compilation terminated.: ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: sys/systeminfo.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Possible ERROR while running preprocessor: exit code 256 stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2stderr: /tmp/petsc-KvGRNM/config.headers/conftest.c:3:28: fatal error: sys/systeminfo.h: No such file or directory #include <sys/systeminfo.h> ^ compilation terminated. Source: #include "confdefs.h" #include "conffix.h" #include <sys/systeminfo.h> Preprocess stderr before filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:28: fatal error: sys/systeminfo.h: No such file or directory #include <sys/systeminfo.h> ^ compilation terminated. 
: Preprocess stderr after filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:28: fatal error: sys/systeminfo.h: No such file or directory #include <sys/systeminfo.h> ^compilation terminated.: ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: dos.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Possible ERROR while running preprocessor: exit code 256 stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2stderr: /tmp/petsc-KvGRNM/config.headers/conftest.c:3:17: fatal error: dos.h: No such file or directory #include <dos.h> ^ compilation terminated. Source: #include "confdefs.h" #include "conffix.h" #include <dos.h> Preprocess stderr before filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:17: fatal error: dos.h: No such file or directory #include <dos.h> ^ compilation terminated. 
: Preprocess stderr after filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:17: fatal error: dos.h: No such file or directory #include <dos.h> ^compilation terminated.: ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: WindowsX.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Possible ERROR while running preprocessor: exit code 256 stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2stderr: /tmp/petsc-KvGRNM/config.headers/conftest.c:3:22: fatal error: WindowsX.h: No such file or directory #include <WindowsX.h> ^ compilation terminated. Source: #include "confdefs.h" #include "conffix.h" #include <WindowsX.h> Preprocess stderr before filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:22: fatal error: WindowsX.h: No such file or directory #include <WindowsX.h> ^ compilation terminated. 
: Preprocess stderr after filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:22: fatal error: WindowsX.h: No such file or directory #include <WindowsX.h> ^compilation terminated.: ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: sys/sysinfo.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/sys/sysinfo.h" 1 3 4 # 21 "/usr/include/sys/sysinfo.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 22 "/usr/include/sys/sysinfo.h" 2 3 4 # 1 "/usr/include/linux/kernel.h" 1 3 4 # 1 "/usr/include/linux/sysinfo.h" 1 3 4 # 1 "/usr/include/linux/types.h" 1 3 4 # 1 "/usr/include/asm/types.h" 1 3 4 # 1 "/usr/include/asm-generic/types.h" 1 3 4 # 1 "/usr/include/asm-generic/int-ll64.h" 1 3 4 # 11 "/usr/include/asm-generic/int-ll64.h" 3 4 # 1 "/usr/include/asm/bitsperlong.h" 1 3 4 # 10 
"/usr/include/asm/bitsperlong.h" 3 4 # 1 "/usr/include/asm-generic/bitsperlong.h" 1 3 4 # 11 "/usr/include/asm/bitsperlong.h" 2 3 4 # 12 "/usr/include/asm-generic/int-ll64.h" 2 3 4 # 19 "/usr/include/asm-generic/int-ll64.h" 3 4 typedef __signed__ char __s8; typedef unsigned char __u8; typedef __signed__ short __s16; typedef unsigned short __u16; typedef __signed__ int __s32; typedef unsigned int __u32; __extension__ typedef __signed__ long long __s64; __extension__ typedef unsigned long long __u64; # 7 "/usr/include/asm-generic/types.h" 2 3 4 # 5 "/usr/include/asm/types.h" 2 3 4 # 5 "/usr/include/linux/types.h" 2 3 4 # 1 "/usr/include/linux/posix_types.h" 1 3 4 # 1 "/usr/include/linux/stddef.h" 1 3 4 # 5 "/usr/include/linux/posix_types.h" 2 3 4 # 24 "/usr/include/linux/posix_types.h" 3 4 typedef struct { unsigned long fds_bits[1024 / (8 * sizeof(long))]; } __kernel_fd_set; typedef void (*__kernel_sighandler_t)(int); typedef int __kernel_key_t; typedef int __kernel_mqd_t; # 1 "/usr/include/asm/posix_types.h" 1 3 4 # 1 "/usr/include/asm/posix_types_64.h" 1 3 4 # 10 "/usr/include/asm/posix_types_64.h" 3 4 typedef unsigned short __kernel_old_uid_t; typedef unsigned short __kernel_old_gid_t; typedef unsigned long __kernel_old_dev_t; # 1 "/usr/include/asm-generic/posix_types.h" 1 3 4 # 14 "/usr/include/asm-generic/posix_types.h" 3 4 typedef long __kernel_long_t; typedef unsigned long __kernel_ulong_t; typedef __kernel_ulong_t __kernel_ino_t; typedef unsigned int __kernel_mode_t; typedef int __kernel_pid_t; typedef int __kernel_ipc_pid_t; typedef unsigned int __kernel_uid_t; typedef unsigned int __kernel_gid_t; typedef __kernel_long_t __kernel_suseconds_t; typedef int __kernel_daddr_t; typedef unsigned int __kernel_uid32_t; typedef unsigned int __kernel_gid32_t; # 71 "/usr/include/asm-generic/posix_types.h" 3 4 typedef __kernel_ulong_t __kernel_size_t; typedef __kernel_long_t __kernel_ssize_t; typedef __kernel_long_t __kernel_ptrdiff_t; typedef struct { int val[2]; } 
__kernel_fsid_t; typedef __kernel_long_t __kernel_off_t; typedef long long __kernel_loff_t; typedef __kernel_long_t __kernel_time_t; typedef __kernel_long_t __kernel_clock_t; typedef int __kernel_timer_t; typedef int __kernel_clockid_t; typedef char * __kernel_caddr_t; typedef unsigned short __kernel_uid16_t; typedef unsigned short __kernel_gid16_t; # 18 "/usr/include/asm/posix_types_64.h" 2 3 4 # 7 "/usr/include/asm/posix_types.h" 2 3 4 # 36 "/usr/include/linux/posix_types.h" 2 3 4 # 9 "/usr/include/linux/types.h" 2 3 4 # 27 "/usr/include/linux/types.h" 3 4 typedef __u16 __le16; typedef __u16 __be16; typedef __u32 __le32; typedef __u32 __be32; typedef __u64 __le64; typedef __u64 __be64; typedef __u16 __sum16; typedef __u32 __wsum; # 5 "/usr/include/linux/sysinfo.h" 2 3 4 struct sysinfo { __kernel_long_t uptime; __kernel_ulong_t loads[3]; __kernel_ulong_t totalram; __kernel_ulong_t freeram; __kernel_ulong_t sharedram; __kernel_ulong_t bufferram; __kernel_ulong_t totalswap; __kernel_ulong_t freeswap; __u16 procs; __u16 pad; __kernel_ulong_t totalhigh; __kernel_ulong_t freehigh; __u32 mem_unit; char _f[20-2*sizeof(__kernel_ulong_t)-sizeof(__u32)]; }; # 5 "/usr/include/linux/kernel.h" 2 3 4 # 25 "/usr/include/sys/sysinfo.h" 2 3 4 extern int sysinfo (struct sysinfo *__info) __attribute__ ((__nothrow__ , __leaf__)); extern int get_nprocs_conf (void) __attribute__ ((__nothrow__ , __leaf__)); extern int get_nprocs (void) __attribute__ ((__nothrow__ , __leaf__)); extern long int get_phys_pages (void) __attribute__ ((__nothrow__ , __leaf__)); extern long int get_avphys_pages (void) __attribute__ ((__nothrow__ , __leaf__)); # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_SYS_SYSINFO_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check 
from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: sys/wait.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/sys/wait.h" 1 3 4 # 25 "/usr/include/sys/wait.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 26 "/usr/include/sys/wait.h" 2 3 4 # 1 "/usr/include/signal.h" 1 3 4 # 30 "/usr/include/signal.h" 3 4 # 1 "/usr/include/bits/sigset.h" 1 3 4 # 22 "/usr/include/bits/sigset.h" 3 4 # 22 "/usr/include/bits/sigset.h" 3 4 typedef int __sig_atomic_t; typedef struct { unsigned long int __val[(1024 / (8 * sizeof (unsigned long int)))]; } __sigset_t; # 102 "/usr/include/bits/sigset.h" 3 4 extern int __sigismember (const __sigset_t *, int); extern int __sigaddset (__sigset_t *, int); extern int __sigdelset (__sigset_t *, int); # 33 "/usr/include/signal.h" 2 3 4 typedef __sig_atomic_t sig_atomic_t; typedef __sigset_t sigset_t; # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 
"/usr/include/bits/types.h" 2 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 57 "/usr/include/signal.h" 2 3 4 # 1 "/usr/include/bits/signum.h" 1 3 4 # 58 "/usr/include/signal.h" 2 3 4 typedef __pid_t pid_t; typedef __uid_t uid_t; # 1 
"/usr/include/time.h" 1 3 4 # 120 "/usr/include/time.h" 3 4 struct timespec { __time_t tv_sec; __syscall_slong_t tv_nsec; }; # 76 "/usr/include/signal.h" 2 3 4 # 1 "/usr/include/bits/siginfo.h" 1 3 4 # 24 "/usr/include/bits/siginfo.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 25 "/usr/include/bits/siginfo.h" 2 3 4 typedef union sigval { int sival_int; void *sival_ptr; } sigval_t; # 58 "/usr/include/bits/siginfo.h" 3 4 typedef __clock_t __sigchld_clock_t; typedef struct { int si_signo; int si_errno; int si_code; union { int _pad[((128 / sizeof (int)) - 4)]; struct { __pid_t si_pid; __uid_t si_uid; } _kill; struct { int si_tid; int si_overrun; sigval_t si_sigval; } _timer; struct { __pid_t si_pid; __uid_t si_uid; sigval_t si_sigval; } _rt; struct { __pid_t si_pid; __uid_t si_uid; int si_status; __sigchld_clock_t si_utime; __sigchld_clock_t si_stime; } _sigchld; struct { void *si_addr; short int si_addr_lsb; struct { void *_lower; void *_upper; } si_addr_bnd; } _sigfault; struct { long int si_band; int si_fd; } _sigpoll; struct { void *_call_addr; int _syscall; unsigned int _arch; } _sigsys; } _sifields; } siginfo_t ; # 160 "/usr/include/bits/siginfo.h" 3 4 enum { SI_ASYNCNL = -60, SI_TKILL = -6, SI_SIGIO, SI_ASYNCIO, SI_MESGQ, SI_TIMER, SI_QUEUE, SI_USER, SI_KERNEL = 0x80 }; enum { ILL_ILLOPC = 1, ILL_ILLOPN, ILL_ILLADR, ILL_ILLTRP, ILL_PRVOPC, ILL_PRVREG, ILL_COPROC, ILL_BADSTK }; enum { FPE_INTDIV = 1, FPE_INTOVF, FPE_FLTDIV, FPE_FLTOVF, FPE_FLTUND, FPE_FLTRES, FPE_FLTINV, FPE_FLTSUB }; enum { SEGV_MAPERR = 1, SEGV_ACCERR }; enum { BUS_ADRALN = 1, BUS_ADRERR, BUS_OBJERR, BUS_MCEERR_AR, BUS_MCEERR_AO }; # 264 "/usr/include/bits/siginfo.h" 3 4 enum { CLD_EXITED = 1, CLD_KILLED, CLD_DUMPED, CLD_TRAPPED, CLD_STOPPED, CLD_CONTINUED }; enum { POLL_IN = 1, POLL_OUT, POLL_MSG, POLL_ERR, POLL_PRI, POLL_HUP }; # 316 "/usr/include/bits/siginfo.h" 3 4 typedef union pthread_attr_t pthread_attr_t; typedef struct sigevent { sigval_t sigev_value; int sigev_signo; int 
sigev_notify; union { int _pad[((64 / sizeof (int)) - 4)]; __pid_t _tid; struct { void (*_function) (sigval_t); pthread_attr_t *_attribute; } _sigev_thread; } _sigev_un; } sigevent_t; enum { SIGEV_SIGNAL = 0, SIGEV_NONE, SIGEV_THREAD, SIGEV_THREAD_ID = 4 }; # 81 "/usr/include/signal.h" 2 3 4 typedef void (*__sighandler_t) (int); extern __sighandler_t __sysv_signal (int __sig, __sighandler_t __handler) __attribute__ ((__nothrow__ , __leaf__)); # 100 "/usr/include/signal.h" 3 4 extern __sighandler_t signal (int __sig, __sighandler_t __handler) __attribute__ ((__nothrow__ , __leaf__)); # 114 "/usr/include/signal.h" 3 4 # 127 "/usr/include/signal.h" 3 4 extern int kill (__pid_t __pid, int __sig) __attribute__ ((__nothrow__ , __leaf__)); extern int killpg (__pid_t __pgrp, int __sig) __attribute__ ((__nothrow__ , __leaf__)); extern int raise (int __sig) __attribute__ ((__nothrow__ , __leaf__)); extern __sighandler_t ssignal (int __sig, __sighandler_t __handler) __attribute__ ((__nothrow__ , __leaf__)); extern int gsignal (int __sig) __attribute__ ((__nothrow__ , __leaf__)); extern void psignal (int __sig, const char *__s); extern void psiginfo (const siginfo_t *__pinfo, const char *__s); # 187 "/usr/include/signal.h" 3 4 extern int sigblock (int __mask) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__deprecated__)); extern int sigsetmask (int __mask) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__deprecated__)); extern int siggetmask (void) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__deprecated__)); # 207 "/usr/include/signal.h" 3 4 typedef __sighandler_t sig_t; extern int sigemptyset (sigset_t *__set) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int sigfillset (sigset_t *__set) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int sigaddset (sigset_t *__set, int __signo) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int 
sigdelset (sigset_t *__set, int __signo) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int sigismember (const sigset_t *__set, int __signo) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 243 "/usr/include/signal.h" 3 4 # 1 "/usr/include/bits/sigaction.h" 1 3 4 # 24 "/usr/include/bits/sigaction.h" 3 4 struct sigaction { union { __sighandler_t sa_handler; void (*sa_sigaction) (int, siginfo_t *, void *); } __sigaction_handler; __sigset_t sa_mask; int sa_flags; void (*sa_restorer) (void); }; # 244 "/usr/include/signal.h" 2 3 4 extern int sigprocmask (int __how, const sigset_t *__restrict __set, sigset_t *__restrict __oset) __attribute__ ((__nothrow__ , __leaf__)); extern int sigsuspend (const sigset_t *__set) __attribute__ ((__nonnull__ (1))); extern int sigaction (int __sig, const struct sigaction *__restrict __act, struct sigaction *__restrict __oact) __attribute__ ((__nothrow__ , __leaf__)); extern int sigpending (sigset_t *__set) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int sigwait (const sigset_t *__restrict __set, int *__restrict __sig) __attribute__ ((__nonnull__ (1, 2))); extern int sigwaitinfo (const sigset_t *__restrict __set, siginfo_t *__restrict __info) __attribute__ ((__nonnull__ (1))); extern int sigtimedwait (const sigset_t *__restrict __set, siginfo_t *__restrict __info, const struct timespec *__restrict __timeout) __attribute__ ((__nonnull__ (1))); extern int sigqueue (__pid_t __pid, int __sig, const union sigval __val) __attribute__ ((__nothrow__ , __leaf__)); # 301 "/usr/include/signal.h" 3 4 extern const char *const _sys_siglist[65]; extern const char *const sys_siglist[65]; # 1 "/usr/include/bits/sigcontext.h" 1 3 4 # 29 "/usr/include/bits/sigcontext.h" 3 4 struct _fpx_sw_bytes { __uint32_t magic1; __uint32_t extended_size; __uint64_t xstate_bv; __uint32_t xstate_size; __uint32_t padding[7]; }; struct _fpreg { unsigned short 
significand[4]; unsigned short exponent; }; struct _fpxreg { unsigned short significand[4]; unsigned short exponent; unsigned short padding[3]; }; struct _xmmreg { __uint32_t element[4]; }; # 121 "/usr/include/bits/sigcontext.h" 3 4 struct _fpstate { __uint16_t cwd; __uint16_t swd; __uint16_t ftw; __uint16_t fop; __uint64_t rip; __uint64_t rdp; __uint32_t mxcsr; __uint32_t mxcr_mask; struct _fpxreg _st[8]; struct _xmmreg _xmm[16]; __uint32_t padding[24]; }; struct sigcontext { __uint64_t r8; __uint64_t r9; __uint64_t r10; __uint64_t r11; __uint64_t r12; __uint64_t r13; __uint64_t r14; __uint64_t r15; __uint64_t rdi; __uint64_t rsi; __uint64_t rbp; __uint64_t rbx; __uint64_t rdx; __uint64_t rax; __uint64_t rcx; __uint64_t rsp; __uint64_t rip; __uint64_t eflags; unsigned short cs; unsigned short gs; unsigned short fs; unsigned short __pad0; __uint64_t err; __uint64_t trapno; __uint64_t oldmask; __uint64_t cr2; __extension__ union { struct _fpstate * fpstate; __uint64_t __fpstate_word; }; __uint64_t __reserved1 [8]; }; struct _xsave_hdr { __uint64_t xstate_bv; __uint64_t reserved1[2]; __uint64_t reserved2[5]; }; struct _ymmh_state { __uint32_t ymmh_space[64]; }; struct _xstate { struct _fpstate fpstate; struct _xsave_hdr xstate_hdr; struct _ymmh_state ymmh; }; # 307 "/usr/include/signal.h" 2 3 4 extern int sigreturn (struct sigcontext *__scp) __attribute__ ((__nothrow__ , __leaf__)); # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 317 "/usr/include/signal.h" 2 3 4 extern int siginterrupt (int __sig, int __interrupt) __attribute__ ((__nothrow__ , __leaf__)); # 1 "/usr/include/bits/sigstack.h" 1 3 4 # 25 "/usr/include/bits/sigstack.h" 3 4 struct sigstack { void *ss_sp; int ss_onstack; }; enum { SS_ONSTACK = 1, SS_DISABLE }; # 49 "/usr/include/bits/sigstack.h" 3 4 typedef struct sigaltstack { void *ss_sp; int ss_flags; size_t ss_size; } stack_t; # 
324 "/usr/include/signal.h" 2 3 4 # 1 "/usr/include/sys/ucontext.h" 1 3 4 # 22 "/usr/include/sys/ucontext.h" 3 4 # 1 "/usr/include/signal.h" 1 3 4 # 23 "/usr/include/sys/ucontext.h" 2 3 4 # 31 "/usr/include/sys/ucontext.h" 3 4 __extension__ typedef long long int greg_t; typedef greg_t gregset_t[23]; # 92 "/usr/include/sys/ucontext.h" 3 4 struct _libc_fpxreg { unsigned short int significand[4]; unsigned short int exponent; unsigned short int padding[3]; }; struct _libc_xmmreg { __uint32_t element[4]; }; struct _libc_fpstate { __uint16_t cwd; __uint16_t swd; __uint16_t ftw; __uint16_t fop; __uint64_t rip; __uint64_t rdp; __uint32_t mxcsr; __uint32_t mxcr_mask; struct _libc_fpxreg _st[8]; struct _libc_xmmreg _xmm[16]; __uint32_t padding[24]; }; typedef struct _libc_fpstate *fpregset_t; typedef struct { gregset_t gregs; fpregset_t fpregs; __extension__ unsigned long long __reserved1 [8]; } mcontext_t; typedef struct ucontext { unsigned long int uc_flags; struct ucontext *uc_link; stack_t uc_stack; mcontext_t uc_mcontext; __sigset_t uc_sigmask; struct _libc_fpstate __fpregs_mem; } ucontext_t; # 327 "/usr/include/signal.h" 2 3 4 extern int sigstack (struct sigstack *__ss, struct sigstack *__oss) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__deprecated__)); extern int sigaltstack (const struct sigaltstack *__restrict __ss, struct sigaltstack *__restrict __oss) __attribute__ ((__nothrow__ , __leaf__)); # 361 "/usr/include/signal.h" 3 4 # 1 "/usr/include/bits/pthreadtypes.h" 1 3 4 # 21 "/usr/include/bits/pthreadtypes.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 22 "/usr/include/bits/pthreadtypes.h" 2 3 4 # 60 "/usr/include/bits/pthreadtypes.h" 3 4 typedef unsigned long int pthread_t; union pthread_attr_t { char __size[56]; long int __align; }; typedef struct __pthread_internal_list { struct __pthread_internal_list *__prev; struct __pthread_internal_list *__next; } __pthread_list_t; # 90 "/usr/include/bits/pthreadtypes.h" 3 4 typedef union { struct 
__pthread_mutex_s { int __lock; unsigned int __count; int __owner; unsigned int __nusers; int __kind; short __spins; short __elision; __pthread_list_t __list; # 125 "/usr/include/bits/pthreadtypes.h" 3 4 } __data; char __size[40]; long int __align; } pthread_mutex_t; typedef union { char __size[4]; int __align; } pthread_mutexattr_t; typedef union { struct { int __lock; unsigned int __futex; __extension__ unsigned long long int __total_seq; __extension__ unsigned long long int __wakeup_seq; __extension__ unsigned long long int __woken_seq; void *__mutex; unsigned int __nwaiters; unsigned int __broadcast_seq; } __data; char __size[48]; __extension__ long long int __align; } pthread_cond_t; typedef union { char __size[4]; int __align; } pthread_condattr_t; typedef unsigned int pthread_key_t; typedef int pthread_once_t; typedef union { struct { int __lock; unsigned int __nr_readers; unsigned int __readers_wakeup; unsigned int __writer_wakeup; unsigned int __nr_readers_queued; unsigned int __nr_writers_queued; int __writer; int __shared; signed char __rwelision; unsigned char __pad1[7]; unsigned long int __pad2; unsigned int __flags; } __data; # 220 "/usr/include/bits/pthreadtypes.h" 3 4 char __size[56]; long int __align; } pthread_rwlock_t; typedef union { char __size[8]; long int __align; } pthread_rwlockattr_t; typedef volatile int pthread_spinlock_t; typedef union { char __size[32]; long int __align; } pthread_barrier_t; typedef union { char __size[4]; int __align; } pthread_barrierattr_t; # 362 "/usr/include/signal.h" 2 3 4 # 1 "/usr/include/bits/sigthread.h" 1 3 4 # 30 "/usr/include/bits/sigthread.h" 3 4 extern int pthread_sigmask (int __how, const __sigset_t *__restrict __newmask, __sigset_t *__restrict __oldmask)__attribute__ ((__nothrow__ , __leaf__)); extern int pthread_kill (pthread_t __threadid, int __signo) __attribute__ ((__nothrow__ , __leaf__)); # 363 "/usr/include/signal.h" 2 3 4 extern int __libc_current_sigrtmin (void) __attribute__ ((__nothrow__ , 
__leaf__)); extern int __libc_current_sigrtmax (void) __attribute__ ((__nothrow__ , __leaf__)); # 30 "/usr/include/sys/wait.h" 2 3 4 # 1 "/usr/include/bits/waitflags.h" 1 3 4 # 36 "/usr/include/sys/wait.h" 2 3 4 # 1 "/usr/include/bits/waitstatus.h" 1 3 4 # 39 "/usr/include/sys/wait.h" 2 3 4 # 60 "/usr/include/sys/wait.h" 3 4 typedef enum { P_ALL, P_PID, P_PGID } idtype_t; # 74 "/usr/include/sys/wait.h" 3 4 extern __pid_t wait (int *__stat_loc); # 97 "/usr/include/sys/wait.h" 3 4 extern __pid_t waitpid (__pid_t __pid, int *__stat_loc, int __options); typedef __id_t id_t; # 1 "/usr/include/bits/siginfo.h" 1 3 4 # 24 "/usr/include/bits/siginfo.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 25 "/usr/include/bits/siginfo.h" 2 3 4 # 108 "/usr/include/sys/wait.h" 2 3 4 # 120 "/usr/include/sys/wait.h" 3 4 extern int waitid (idtype_t __idtype, __id_t __id, siginfo_t *__infop, int __options); struct rusage; extern __pid_t wait3 (int *__stat_loc, int __options, struct rusage * __usage) __attribute__ ((__nothrow__)); extern __pid_t wait4 (__pid_t __pid, int *__stat_loc, int __options, struct rusage *__usage) __attribute__ ((__nothrow__)); # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_SYS_WAIT_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: stdlib.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 
"/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/stdlib.h" 1 3 4 # 24 "/usr/include/stdlib.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 25 "/usr/include/stdlib.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 328 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef int wchar_t; # 33 "/usr/include/stdlib.h" 2 3 4 # 1 "/usr/include/bits/waitflags.h" 1 3 4 # 42 "/usr/include/stdlib.h" 2 3 4 # 1 "/usr/include/bits/waitstatus.h" 1 3 4 # 43 "/usr/include/stdlib.h" 2 3 4 # 56 "/usr/include/stdlib.h" 3 4 typedef struct { int quot; int rem; } div_t; typedef struct { long int quot; long int rem; } ldiv_t; __extension__ typedef struct { long long int quot; long long int rem; } lldiv_t; # 100 "/usr/include/stdlib.h" 3 4 extern size_t __ctype_get_mb_cur_max (void) __attribute__ ((__nothrow__ , __leaf__)) ; extern double atof (const char *__nptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; extern int atoi (const char *__nptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; extern long int atol (const char *__nptr) __attribute__ 
((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; __extension__ extern long long int atoll (const char *__nptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; extern double strtod (const char *__restrict __nptr, char **__restrict __endptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern float strtof (const char *__restrict __nptr, char **__restrict __endptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern long double strtold (const char *__restrict __nptr, char **__restrict __endptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern long int strtol (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern unsigned long int strtoul (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); __extension__ extern long long int strtoq (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); __extension__ extern unsigned long long int strtouq (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); __extension__ extern long long int strtoll (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); __extension__ extern unsigned long long int strtoull (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 266 "/usr/include/stdlib.h" 3 4 extern char *l64a (long int __n) __attribute__ ((__nothrow__ , __leaf__)) ; extern long int a64l 
(const char *__s) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; # 1 "/usr/include/sys/types.h" 1 3 4 # 27 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int 
__syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 30 "/usr/include/sys/types.h" 2 3 4 typedef __u_char u_char; typedef __u_short u_short; typedef __u_int u_int; typedef __u_long u_long; typedef __quad_t quad_t; typedef __u_quad_t u_quad_t; typedef __fsid_t fsid_t; typedef __loff_t loff_t; typedef __ino_t ino_t; # 60 "/usr/include/sys/types.h" 3 4 typedef __dev_t dev_t; typedef __gid_t gid_t; typedef __mode_t mode_t; typedef __nlink_t nlink_t; typedef __uid_t uid_t; typedef __off_t off_t; # 98 "/usr/include/sys/types.h" 3 4 typedef __pid_t pid_t; typedef __id_t id_t; typedef __ssize_t ssize_t; typedef __daddr_t daddr_t; typedef __caddr_t caddr_t; typedef __key_t key_t; # 132 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/time.h" 1 3 4 # 57 "/usr/include/time.h" 3 4 typedef __clock_t clock_t; # 73 "/usr/include/time.h" 3 4 typedef __time_t time_t; # 91 "/usr/include/time.h" 3 4 typedef __clockid_t clockid_t; # 103 "/usr/include/time.h" 3 4 typedef __timer_t timer_t; # 133 "/usr/include/sys/types.h" 2 3 4 # 146 "/usr/include/sys/types.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 147 "/usr/include/sys/types.h" 2 3 4 typedef unsigned long int ulong; typedef unsigned short int ushort; typedef unsigned int uint; # 194 "/usr/include/sys/types.h" 3 4 typedef int int8_t __attribute__ ((__mode__ (__QI__))); typedef int int16_t __attribute__ ((__mode__ (__HI__))); typedef int int32_t __attribute__ ((__mode__ (__SI__))); typedef int int64_t __attribute__ ((__mode__ (__DI__))); typedef unsigned int u_int8_t __attribute__ ((__mode__ (__QI__))); typedef unsigned int u_int16_t __attribute__ ((__mode__ (__HI__))); typedef unsigned int u_int32_t __attribute__ ((__mode__ (__SI__))); typedef unsigned int u_int64_t __attribute__ ((__mode__ (__DI__))); typedef int register_t __attribute__ ((__mode__ (__word__))); # 216 
"/usr/include/sys/types.h" 3 4 # 1 "/usr/include/endian.h" 1 3 4 # 36 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/endian.h" 1 3 4 # 37 "/usr/include/endian.h" 2 3 4 # 60 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/byteswap.h" 1 3 4 # 28 "/usr/include/bits/byteswap.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 29 "/usr/include/bits/byteswap.h" 2 3 4 # 1 "/usr/include/bits/byteswap-16.h" 1 3 4 # 36 "/usr/include/bits/byteswap.h" 2 3 4 # 44 "/usr/include/bits/byteswap.h" 3 4 static __inline unsigned int __bswap_32 (unsigned int __bsx) { return __builtin_bswap32 (__bsx); } # 108 "/usr/include/bits/byteswap.h" 3 4 static __inline __uint64_t __bswap_64 (__uint64_t __bsx) { return __builtin_bswap64 (__bsx); } # 61 "/usr/include/endian.h" 2 3 4 # 217 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/select.h" 1 3 4 # 30 "/usr/include/sys/select.h" 3 4 # 1 "/usr/include/bits/select.h" 1 3 4 # 22 "/usr/include/bits/select.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 23 "/usr/include/bits/select.h" 2 3 4 # 31 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/sigset.h" 1 3 4 # 22 "/usr/include/bits/sigset.h" 3 4 typedef int __sig_atomic_t; typedef struct { unsigned long int __val[(1024 / (8 * sizeof (unsigned long int)))]; } __sigset_t; # 34 "/usr/include/sys/select.h" 2 3 4 typedef __sigset_t sigset_t; # 1 "/usr/include/time.h" 1 3 4 # 120 "/usr/include/time.h" 3 4 struct timespec { __time_t tv_sec; __syscall_slong_t tv_nsec; }; # 46 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 30 "/usr/include/bits/time.h" 3 4 struct timeval { __time_t tv_sec; __suseconds_t tv_usec; }; # 48 "/usr/include/sys/select.h" 2 3 4 typedef __suseconds_t suseconds_t; typedef long int __fd_mask; # 66 "/usr/include/sys/select.h" 3 4 typedef struct { __fd_mask __fds_bits[1024 / (8 * (int) sizeof (__fd_mask))]; } fd_set; typedef __fd_mask fd_mask; # 98 "/usr/include/sys/select.h" 3 4 # 108 "/usr/include/sys/select.h" 3 4 extern int select 
(int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, struct timeval *__restrict __timeout); # 120 "/usr/include/sys/select.h" 3 4 extern int pselect (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, const struct timespec *__restrict __timeout, const __sigset_t *__restrict __sigmask); # 133 "/usr/include/sys/select.h" 3 4 # 220 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/sysmacros.h" 1 3 4 # 24 "/usr/include/sys/sysmacros.h" 3 4 __extension__ extern unsigned int gnu_dev_major (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned int gnu_dev_minor (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned long long int gnu_dev_makedev (unsigned int __major, unsigned int __minor) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 58 "/usr/include/sys/sysmacros.h" 3 4 # 223 "/usr/include/sys/types.h" 2 3 4 typedef __blksize_t blksize_t; typedef __blkcnt_t blkcnt_t; typedef __fsblkcnt_t fsblkcnt_t; typedef __fsfilcnt_t fsfilcnt_t; # 270 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/bits/pthreadtypes.h" 1 3 4 # 21 "/usr/include/bits/pthreadtypes.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 22 "/usr/include/bits/pthreadtypes.h" 2 3 4 # 60 "/usr/include/bits/pthreadtypes.h" 3 4 typedef unsigned long int pthread_t; union pthread_attr_t { char __size[56]; long int __align; }; typedef union pthread_attr_t pthread_attr_t; typedef struct __pthread_internal_list { struct __pthread_internal_list *__prev; struct __pthread_internal_list *__next; } __pthread_list_t; # 90 "/usr/include/bits/pthreadtypes.h" 3 4 typedef union { struct __pthread_mutex_s { int __lock; unsigned int __count; int __owner; unsigned int __nusers; int __kind; short __spins; short __elision; __pthread_list_t __list; # 
125 "/usr/include/bits/pthreadtypes.h" 3 4 } __data; char __size[40]; long int __align; } pthread_mutex_t; typedef union { char __size[4]; int __align; } pthread_mutexattr_t; typedef union { struct { int __lock; unsigned int __futex; __extension__ unsigned long long int __total_seq; __extension__ unsigned long long int __wakeup_seq; __extension__ unsigned long long int __woken_seq; void *__mutex; unsigned int __nwaiters; unsigned int __broadcast_seq; } __data; char __size[48]; __extension__ long long int __align; } pthread_cond_t; typedef union { char __size[4]; int __align; } pthread_condattr_t; typedef unsigned int pthread_key_t; typedef int pthread_once_t; typedef union { struct { int __lock; unsigned int __nr_readers; unsigned int __readers_wakeup; unsigned int __writer_wakeup; unsigned int __nr_readers_queued; unsigned int __nr_writers_queued; int __writer; int __shared; signed char __rwelision; unsigned char __pad1[7]; unsigned long int __pad2; unsigned int __flags; } __data; # 220 "/usr/include/bits/pthreadtypes.h" 3 4 char __size[56]; long int __align; } pthread_rwlock_t; typedef union { char __size[8]; long int __align; } pthread_rwlockattr_t; typedef volatile int pthread_spinlock_t; typedef union { char __size[32]; long int __align; } pthread_barrier_t; typedef union { char __size[4]; int __align; } pthread_barrierattr_t; # 271 "/usr/include/sys/types.h" 2 3 4 # 276 "/usr/include/stdlib.h" 2 3 4 extern long int random (void) __attribute__ ((__nothrow__ , __leaf__)); extern void srandom (unsigned int __seed) __attribute__ ((__nothrow__ , __leaf__)); extern char *initstate (unsigned int __seed, char *__statebuf, size_t __statelen) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern char *setstate (char *__statebuf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); struct random_data { int32_t *fptr; int32_t *rptr; int32_t *state; int rand_type; int rand_deg; int rand_sep; int32_t *end_ptr; }; extern 
int random_r (struct random_data *__restrict __buf, int32_t *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int srandom_r (unsigned int __seed, struct random_data *__buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern int initstate_r (unsigned int __seed, char *__restrict __statebuf, size_t __statelen, struct random_data *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 4))); extern int setstate_r (char *__restrict __statebuf, struct random_data *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int rand (void) __attribute__ ((__nothrow__ , __leaf__)); extern void srand (unsigned int __seed) __attribute__ ((__nothrow__ , __leaf__)); extern int rand_r (unsigned int *__seed) __attribute__ ((__nothrow__ , __leaf__)); extern double drand48 (void) __attribute__ ((__nothrow__ , __leaf__)); extern double erand48 (unsigned short int __xsubi[3]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern long int lrand48 (void) __attribute__ ((__nothrow__ , __leaf__)); extern long int nrand48 (unsigned short int __xsubi[3]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern long int mrand48 (void) __attribute__ ((__nothrow__ , __leaf__)); extern long int jrand48 (unsigned short int __xsubi[3]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern void srand48 (long int __seedval) __attribute__ ((__nothrow__ , __leaf__)); extern unsigned short int *seed48 (unsigned short int __seed16v[3]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern void lcong48 (unsigned short int __param[7]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); struct drand48_data { unsigned short int __x[3]; unsigned short int __old_x[3]; unsigned short int __c; unsigned 
short int __init; __extension__ unsigned long long int __a; }; extern int drand48_r (struct drand48_data *__restrict __buffer, double *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int erand48_r (unsigned short int __xsubi[3], struct drand48_data *__restrict __buffer, double *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int lrand48_r (struct drand48_data *__restrict __buffer, long int *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int nrand48_r (unsigned short int __xsubi[3], struct drand48_data *__restrict __buffer, long int *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int mrand48_r (struct drand48_data *__restrict __buffer, long int *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int jrand48_r (unsigned short int __xsubi[3], struct drand48_data *__restrict __buffer, long int *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int srand48_r (long int __seedval, struct drand48_data *__buffer) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern int seed48_r (unsigned short int __seed16v[3], struct drand48_data *__buffer) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int lcong48_r (unsigned short int __param[7], struct drand48_data *__buffer) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern void *malloc (size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) ; extern void *calloc (size_t __nmemb, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) ; extern void *realloc (void *__ptr, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) 
__attribute__ ((__warn_unused_result__)); extern void free (void *__ptr) __attribute__ ((__nothrow__ , __leaf__)); extern void cfree (void *__ptr) __attribute__ ((__nothrow__ , __leaf__)); # 1 "/usr/include/alloca.h" 1 3 4 # 24 "/usr/include/alloca.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 25 "/usr/include/alloca.h" 2 3 4 extern void *alloca (size_t __size) __attribute__ ((__nothrow__ , __leaf__)); # 454 "/usr/include/stdlib.h" 2 3 4 extern void *valloc (size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) ; extern int posix_memalign (void **__memptr, size_t __alignment, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; extern void *aligned_alloc (size_t __alignment, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) __attribute__ ((__alloc_size__ (2))) ; extern void abort (void) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__noreturn__)); extern int atexit (void (*__func) (void)) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int at_quick_exit (void (*__func) (void)) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int on_exit (void (*__func) (int __status, void *__arg), void *__arg) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern void exit (int __status) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__noreturn__)); extern void quick_exit (int __status) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__noreturn__)); extern void _Exit (int __status) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__noreturn__)); extern char *getenv (const char *__name) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; # 539 "/usr/include/stdlib.h" 3 4 extern int putenv (char *__string) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int 
setenv (const char *__name, const char *__value, int __replace) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern int unsetenv (const char *__name) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int clearenv (void) __attribute__ ((__nothrow__ , __leaf__)); # 567 "/usr/include/stdlib.h" 3 4 extern char *mktemp (char *__template) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 580 "/usr/include/stdlib.h" 3 4 extern int mkstemp (char *__template) __attribute__ ((__nonnull__ (1))) ; # 602 "/usr/include/stdlib.h" 3 4 extern int mkstemps (char *__template, int __suffixlen) __attribute__ ((__nonnull__ (1))) ; # 623 "/usr/include/stdlib.h" 3 4 extern char *mkdtemp (char *__template) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; # 672 "/usr/include/stdlib.h" 3 4 extern int system (const char *__command) ; # 694 "/usr/include/stdlib.h" 3 4 extern char *realpath (const char *__restrict __name, char *__restrict __resolved) __attribute__ ((__nothrow__ , __leaf__)) ; typedef int (*__compar_fn_t) (const void *, const void *); # 712 "/usr/include/stdlib.h" 3 4 extern void *bsearch (const void *__key, const void *__base, size_t __nmemb, size_t __size, __compar_fn_t __compar) __attribute__ ((__nonnull__ (1, 2, 5))) ; extern void qsort (void *__base, size_t __nmemb, size_t __size, __compar_fn_t __compar) __attribute__ ((__nonnull__ (1, 4))); # 735 "/usr/include/stdlib.h" 3 4 extern int abs (int __x) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; extern long int labs (long int __x) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; __extension__ extern long long int llabs (long long int __x) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; extern div_t div (int __numer, int __denom) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; extern ldiv_t ldiv (long int __numer, 
long int __denom) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; __extension__ extern lldiv_t lldiv (long long int __numer, long long int __denom) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; # 772 "/usr/include/stdlib.h" 3 4 extern char *ecvt (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4))) ; extern char *fcvt (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4))) ; extern char *gcvt (double __value, int __ndigit, char *__buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3))) ; extern char *qecvt (long double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4))) ; extern char *qfcvt (long double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4))) ; extern char *qgcvt (long double __value, int __ndigit, char *__buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3))) ; extern int ecvt_r (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4, 5))); extern int fcvt_r (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4, 5))); extern int qecvt_r (long double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4, 5))); extern int qfcvt_r (long double __value, int __ndigit, 
int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4, 5))); extern int mblen (const char *__s, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); extern int mbtowc (wchar_t *__restrict __pwc, const char *__restrict __s, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); extern int wctomb (char *__s, wchar_t __wchar) __attribute__ ((__nothrow__ , __leaf__)); extern size_t mbstowcs (wchar_t *__restrict __pwcs, const char *__restrict __s, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); extern size_t wcstombs (char *__restrict __s, const wchar_t *__restrict __pwcs, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); extern int rpmatch (const char *__response) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; # 859 "/usr/include/stdlib.h" 3 4 extern int getsubopt (char **__restrict __optionp, char *const *__restrict __tokens, char **__restrict __valuep) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2, 3))) ; # 911 "/usr/include/stdlib.h" 3 4 extern int getloadavg (double __loadavg[], int __nelem) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 921 "/usr/include/stdlib.h" 3 4 # 1 "/usr/include/bits/stdlib-float.h" 1 3 4 # 922 "/usr/include/stdlib.h" 2 3 4 # 934 "/usr/include/stdlib.h" 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_STDLIB_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: pthread.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers 
-I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/pthread.h" 1 3 4 # 21 "/usr/include/pthread.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 22 "/usr/include/pthread.h" 2 3 4 # 1 "/usr/include/endian.h" 1 3 4 # 36 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/endian.h" 1 3 4 # 37 "/usr/include/endian.h" 2 3 4 # 60 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/byteswap.h" 1 3 4 # 27 "/usr/include/bits/byteswap.h" 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 # 30 "/usr/include/bits/types.h" 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 
"/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 28 "/usr/include/bits/byteswap.h" 2 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 29 "/usr/include/bits/byteswap.h" 2 3 4 # 1 "/usr/include/bits/byteswap-16.h" 1 3 4 # 36 "/usr/include/bits/byteswap.h" 2 3 4 # 44 "/usr/include/bits/byteswap.h" 3 4 static __inline unsigned int __bswap_32 (unsigned int __bsx) { return __builtin_bswap32 (__bsx); } # 108 "/usr/include/bits/byteswap.h" 3 4 static __inline __uint64_t __bswap_64 (__uint64_t __bsx) { return __builtin_bswap64 (__bsx); } # 61 "/usr/include/endian.h" 2 3 4 # 23 "/usr/include/pthread.h" 2 3 4 # 1 "/usr/include/sched.h" 1 3 4 # 28 "/usr/include/sched.h" 3 4 # 1 
"/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 29 "/usr/include/sched.h" 2 3 4 # 1 "/usr/include/time.h" 1 3 4 # 73 "/usr/include/time.h" 3 4 typedef __time_t time_t; # 120 "/usr/include/time.h" 3 4 struct timespec { __time_t tv_sec; __syscall_slong_t tv_nsec; }; # 35 "/usr/include/sched.h" 2 3 4 typedef __pid_t pid_t; # 1 "/usr/include/bits/sched.h" 1 3 4 # 73 "/usr/include/bits/sched.h" 3 4 struct sched_param { int __sched_priority; }; # 96 "/usr/include/bits/sched.h" 3 4 struct __sched_param { int __sched_priority; }; # 119 "/usr/include/bits/sched.h" 3 4 typedef unsigned long int __cpu_mask; typedef struct { __cpu_mask __bits[1024 / (8 * sizeof (__cpu_mask))]; } cpu_set_t; # 202 "/usr/include/bits/sched.h" 3 4 extern int __sched_cpucount (size_t __setsize, const cpu_set_t *__setp) __attribute__ ((__nothrow__ , __leaf__)); extern cpu_set_t *__sched_cpualloc (size_t __count) __attribute__ ((__nothrow__ , __leaf__)) ; extern void __sched_cpufree (cpu_set_t *__set) __attribute__ ((__nothrow__ , __leaf__)); # 44 "/usr/include/sched.h" 2 3 4 extern int sched_setparam (__pid_t __pid, const struct sched_param *__param) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_getparam (__pid_t __pid, struct sched_param *__param) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_setscheduler (__pid_t __pid, int __policy, const struct sched_param *__param) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_getscheduler (__pid_t __pid) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_yield (void) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_get_priority_max (int __algorithm) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_get_priority_min (int __algorithm) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_rr_get_interval (__pid_t __pid, struct timespec *__t) __attribute__ ((__nothrow__ , 
__leaf__)); # 126 "/usr/include/sched.h" 3 4 # 24 "/usr/include/pthread.h" 2 3 4 # 1 "/usr/include/time.h" 1 3 4 # 29 "/usr/include/time.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 38 "/usr/include/time.h" 2 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 42 "/usr/include/time.h" 2 3 4 # 57 "/usr/include/time.h" 3 4 typedef __clock_t clock_t; # 91 "/usr/include/time.h" 3 4 typedef __clockid_t clockid_t; # 103 "/usr/include/time.h" 3 4 typedef __timer_t timer_t; # 131 "/usr/include/time.h" 3 4 struct tm { int tm_sec; int tm_min; int tm_hour; int tm_mday; int tm_mon; int tm_year; int tm_wday; int tm_yday; int tm_isdst; long int tm_gmtoff; const char *tm_zone; }; struct itimerspec { struct timespec it_interval; struct timespec it_value; }; struct sigevent; # 186 "/usr/include/time.h" 3 4 extern clock_t clock (void) __attribute__ ((__nothrow__ , __leaf__)); extern time_t time (time_t *__timer) __attribute__ ((__nothrow__ , __leaf__)); extern double difftime (time_t __time1, time_t __time0) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern time_t mktime (struct tm *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern size_t strftime (char *__restrict __s, size_t __maxsize, const char *__restrict __format, const struct tm *__restrict __tp) __attribute__ ((__nothrow__ , __leaf__)); # 221 "/usr/include/time.h" 3 4 # 1 "/usr/include/xlocale.h" 1 3 4 # 27 "/usr/include/xlocale.h" 3 4 typedef struct __locale_struct { struct __locale_data *__locales[13]; const unsigned short int *__ctype_b; const int *__ctype_tolower; const int *__ctype_toupper; const char *__names[13]; } *__locale_t; typedef __locale_t locale_t; # 222 "/usr/include/time.h" 2 3 4 extern size_t strftime_l (char *__restrict __s, size_t __maxsize, const char *__restrict __format, const struct tm *__restrict __tp, __locale_t __loc) __attribute__ ((__nothrow__ , __leaf__)); # 236 "/usr/include/time.h" 3 4 extern struct tm *gmtime (const time_t *__timer) 
__attribute__ ((__nothrow__ , __leaf__)); extern struct tm *localtime (const time_t *__timer) __attribute__ ((__nothrow__ , __leaf__)); extern struct tm *gmtime_r (const time_t *__restrict __timer, struct tm *__restrict __tp) __attribute__ ((__nothrow__ , __leaf__)); extern struct tm *localtime_r (const time_t *__restrict __timer, struct tm *__restrict __tp) __attribute__ ((__nothrow__ , __leaf__)); extern char *asctime (const struct tm *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern char *ctime (const time_t *__timer) __attribute__ ((__nothrow__ , __leaf__)); extern char *asctime_r (const struct tm *__restrict __tp, char *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)); extern char *ctime_r (const time_t *__restrict __timer, char *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)); extern char *__tzname[2]; extern int __daylight; extern long int __timezone; extern char *tzname[2]; extern void tzset (void) __attribute__ ((__nothrow__ , __leaf__)); extern int daylight; extern long int timezone; extern int stime (const time_t *__when) __attribute__ ((__nothrow__ , __leaf__)); # 319 "/usr/include/time.h" 3 4 extern time_t timegm (struct tm *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern time_t timelocal (struct tm *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern int dysize (int __year) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 334 "/usr/include/time.h" 3 4 extern int nanosleep (const struct timespec *__requested_time, struct timespec *__remaining); extern int clock_getres (clockid_t __clock_id, struct timespec *__res) __attribute__ ((__nothrow__ , __leaf__)); extern int clock_gettime (clockid_t __clock_id, struct timespec *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern int clock_settime (clockid_t __clock_id, const struct timespec *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern int clock_nanosleep (clockid_t __clock_id, int __flags, const struct timespec *__req, struct timespec 
*__rem); extern int clock_getcpuclockid (pid_t __pid, clockid_t *__clock_id) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_create (clockid_t __clock_id, struct sigevent *__restrict __evp, timer_t *__restrict __timerid) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_delete (timer_t __timerid) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_settime (timer_t __timerid, int __flags, const struct itimerspec *__restrict __value, struct itimerspec *__restrict __ovalue) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_gettime (timer_t __timerid, struct itimerspec *__value) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_getoverrun (timer_t __timerid) __attribute__ ((__nothrow__ , __leaf__)); extern int timespec_get (struct timespec *__ts, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 430 "/usr/include/time.h" 3 4 # 25 "/usr/include/pthread.h" 2 3 4 # 1 "/usr/include/bits/pthreadtypes.h" 1 3 4 # 21 "/usr/include/bits/pthreadtypes.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 22 "/usr/include/bits/pthreadtypes.h" 2 3 4 # 60 "/usr/include/bits/pthreadtypes.h" 3 4 typedef unsigned long int pthread_t; union pthread_attr_t { char __size[56]; long int __align; }; typedef union pthread_attr_t pthread_attr_t; typedef struct __pthread_internal_list { struct __pthread_internal_list *__prev; struct __pthread_internal_list *__next; } __pthread_list_t; # 90 "/usr/include/bits/pthreadtypes.h" 3 4 typedef union { struct __pthread_mutex_s { int __lock; unsigned int __count; int __owner; unsigned int __nusers; int __kind; short __spins; short __elision; __pthread_list_t __list; # 125 "/usr/include/bits/pthreadtypes.h" 3 4 } __data; char __size[40]; long int __align; } pthread_mutex_t; typedef union { char __size[4]; int __align; } pthread_mutexattr_t; typedef union { struct { int __lock; unsigned int __futex; __extension__ unsigned long long int __total_seq; __extension__ unsigned long 
long int __wakeup_seq; __extension__ unsigned long long int __woken_seq; void *__mutex; unsigned int __nwaiters; unsigned int __broadcast_seq; } __data; char __size[48]; __extension__ long long int __align; } pthread_cond_t; typedef union { char __size[4]; int __align; } pthread_condattr_t; typedef unsigned int pthread_key_t; typedef int pthread_once_t; typedef union { struct { int __lock; unsigned int __nr_readers; unsigned int __readers_wakeup; unsigned int __writer_wakeup; unsigned int __nr_readers_queued; unsigned int __nr_writers_queued; int __writer; int __shared; signed char __rwelision; unsigned char __pad1[7]; unsigned long int __pad2; unsigned int __flags; } __data; # 220 "/usr/include/bits/pthreadtypes.h" 3 4 char __size[56]; long int __align; } pthread_rwlock_t; typedef union { char __size[8]; long int __align; } pthread_rwlockattr_t; typedef volatile int pthread_spinlock_t; typedef union { char __size[32]; long int __align; } pthread_barrier_t; typedef union { char __size[4]; int __align; } pthread_barrierattr_t; # 27 "/usr/include/pthread.h" 2 3 4 # 1 "/usr/include/bits/setjmp.h" 1 3 4 # 26 "/usr/include/bits/setjmp.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 27 "/usr/include/bits/setjmp.h" 2 3 4 typedef long int __jmp_buf[8]; # 28 "/usr/include/pthread.h" 2 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 29 "/usr/include/pthread.h" 2 3 4 enum { PTHREAD_CREATE_JOINABLE, PTHREAD_CREATE_DETACHED }; enum { PTHREAD_MUTEX_TIMED_NP, PTHREAD_MUTEX_RECURSIVE_NP, PTHREAD_MUTEX_ERRORCHECK_NP, PTHREAD_MUTEX_ADAPTIVE_NP , PTHREAD_MUTEX_NORMAL = PTHREAD_MUTEX_TIMED_NP, PTHREAD_MUTEX_RECURSIVE = PTHREAD_MUTEX_RECURSIVE_NP, PTHREAD_MUTEX_ERRORCHECK = PTHREAD_MUTEX_ERRORCHECK_NP, PTHREAD_MUTEX_DEFAULT = PTHREAD_MUTEX_NORMAL }; enum { PTHREAD_MUTEX_STALLED, PTHREAD_MUTEX_STALLED_NP = PTHREAD_MUTEX_STALLED, PTHREAD_MUTEX_ROBUST, PTHREAD_MUTEX_ROBUST_NP = PTHREAD_MUTEX_ROBUST }; enum { PTHREAD_PRIO_NONE, PTHREAD_PRIO_INHERIT, PTHREAD_PRIO_PROTECT }; # 114 
"/usr/include/pthread.h" 3 4 enum { PTHREAD_RWLOCK_PREFER_READER_NP, PTHREAD_RWLOCK_PREFER_WRITER_NP, PTHREAD_RWLOCK_PREFER_WRITER_NONRECURSIVE_NP, PTHREAD_RWLOCK_DEFAULT_NP = PTHREAD_RWLOCK_PREFER_READER_NP }; # 155 "/usr/include/pthread.h" 3 4 enum { PTHREAD_INHERIT_SCHED, PTHREAD_EXPLICIT_SCHED }; enum { PTHREAD_SCOPE_SYSTEM, PTHREAD_SCOPE_PROCESS }; enum { PTHREAD_PROCESS_PRIVATE, PTHREAD_PROCESS_SHARED }; # 190 "/usr/include/pthread.h" 3 4 struct _pthread_cleanup_buffer { void (*__routine) (void *); void *__arg; int __canceltype; struct _pthread_cleanup_buffer *__prev; }; enum { PTHREAD_CANCEL_ENABLE, PTHREAD_CANCEL_DISABLE }; enum { PTHREAD_CANCEL_DEFERRED, PTHREAD_CANCEL_ASYNCHRONOUS }; # 228 "/usr/include/pthread.h" 3 4 extern int pthread_create (pthread_t *__restrict __newthread, const pthread_attr_t *__restrict __attr, void *(*__start_routine) (void *), void *__restrict __arg) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1, 3))); extern void pthread_exit (void *__retval) __attribute__ ((__noreturn__)); extern int pthread_join (pthread_t __th, void **__thread_return); # 271 "/usr/include/pthread.h" 3 4 extern int pthread_detach (pthread_t __th) __attribute__ ((__nothrow__ , __leaf__)); extern pthread_t pthread_self (void) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern int pthread_equal (pthread_t __thread1, pthread_t __thread2) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern int pthread_attr_init (pthread_attr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_attr_destroy (pthread_attr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_attr_getdetachstate (const pthread_attr_t *__attr, int *__detachstate) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_setdetachstate (pthread_attr_t *__attr, int __detachstate) 
__attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_attr_getguardsize (const pthread_attr_t *__attr, size_t *__guardsize) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_setguardsize (pthread_attr_t *__attr, size_t __guardsize) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_attr_getschedparam (const pthread_attr_t *__restrict __attr, struct sched_param *__restrict __param) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_setschedparam (pthread_attr_t *__restrict __attr, const struct sched_param *__restrict __param) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_getschedpolicy (const pthread_attr_t *__restrict __attr, int *__restrict __policy) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_setschedpolicy (pthread_attr_t *__attr, int __policy) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_attr_getinheritsched (const pthread_attr_t *__restrict __attr, int *__restrict __inherit) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_setinheritsched (pthread_attr_t *__attr, int __inherit) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_attr_getscope (const pthread_attr_t *__restrict __attr, int *__restrict __scope) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_setscope (pthread_attr_t *__attr, int __scope) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_attr_getstackaddr (const pthread_attr_t *__restrict __attr, void **__restrict __stackaddr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))) 
__attribute__ ((__deprecated__)); extern int pthread_attr_setstackaddr (pthread_attr_t *__attr, void *__stackaddr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) __attribute__ ((__deprecated__)); extern int pthread_attr_getstacksize (const pthread_attr_t *__restrict __attr, size_t *__restrict __stacksize) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_setstacksize (pthread_attr_t *__attr, size_t __stacksize) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_attr_getstack (const pthread_attr_t *__restrict __attr, void **__restrict __stackaddr, size_t *__restrict __stacksize) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2, 3))); extern int pthread_attr_setstack (pthread_attr_t *__attr, void *__stackaddr, size_t __stacksize) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 429 "/usr/include/pthread.h" 3 4 extern int pthread_setschedparam (pthread_t __target_thread, int __policy, const struct sched_param *__param) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3))); extern int pthread_getschedparam (pthread_t __target_thread, int *__restrict __policy, struct sched_param *__restrict __param) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 3))); extern int pthread_setschedprio (pthread_t __target_thread, int __prio) __attribute__ ((__nothrow__ , __leaf__)); # 494 "/usr/include/pthread.h" 3 4 extern int pthread_once (pthread_once_t *__once_control, void (*__init_routine) (void)) __attribute__ ((__nonnull__ (1, 2))); # 506 "/usr/include/pthread.h" 3 4 extern int pthread_setcancelstate (int __state, int *__oldstate); extern int pthread_setcanceltype (int __type, int *__oldtype); extern int pthread_cancel (pthread_t __th); extern void pthread_testcancel (void); typedef struct { struct { __jmp_buf __cancel_jmp_buf; int __mask_was_saved; } 
__cancel_jmp_buf[1]; void *__pad[4]; } __pthread_unwind_buf_t __attribute__ ((__aligned__)); # 540 "/usr/include/pthread.h" 3 4 struct __pthread_cleanup_frame { void (*__cancel_routine) (void *); void *__cancel_arg; int __do_it; int __cancel_type; }; # 680 "/usr/include/pthread.h" 3 4 extern void __pthread_register_cancel (__pthread_unwind_buf_t *__buf) ; # 692 "/usr/include/pthread.h" 3 4 extern void __pthread_unregister_cancel (__pthread_unwind_buf_t *__buf) ; # 733 "/usr/include/pthread.h" 3 4 extern void __pthread_unwind_next (__pthread_unwind_buf_t *__buf) __attribute__ ((__noreturn__)) __attribute__ ((__weak__)) ; struct __jmp_buf_tag; extern int __sigsetjmp (struct __jmp_buf_tag *__env, int __savemask) __attribute__ ((__nothrow__)); extern int pthread_mutex_init (pthread_mutex_t *__mutex, const pthread_mutexattr_t *__mutexattr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutex_destroy (pthread_mutex_t *__mutex) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutex_trylock (pthread_mutex_t *__mutex) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutex_lock (pthread_mutex_t *__mutex) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutex_timedlock (pthread_mutex_t *__restrict __mutex, const struct timespec *__restrict __abstime) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_mutex_unlock (pthread_mutex_t *__mutex) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutex_getprioceiling (const pthread_mutex_t * __restrict __mutex, int *__restrict __prioceiling) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_mutex_setprioceiling (pthread_mutex_t *__restrict __mutex, int __prioceiling, int *__restrict __old_ceiling) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ 
((__nonnull__ (1, 3))); extern int pthread_mutex_consistent (pthread_mutex_t *__mutex) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 806 "/usr/include/pthread.h" 3 4 extern int pthread_mutexattr_init (pthread_mutexattr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutexattr_destroy (pthread_mutexattr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutexattr_getpshared (const pthread_mutexattr_t * __restrict __attr, int *__restrict __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_mutexattr_setpshared (pthread_mutexattr_t *__attr, int __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutexattr_gettype (const pthread_mutexattr_t *__restrict __attr, int *__restrict __kind) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_mutexattr_settype (pthread_mutexattr_t *__attr, int __kind) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutexattr_getprotocol (const pthread_mutexattr_t * __restrict __attr, int *__restrict __protocol) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_mutexattr_setprotocol (pthread_mutexattr_t *__attr, int __protocol) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutexattr_getprioceiling (const pthread_mutexattr_t * __restrict __attr, int *__restrict __prioceiling) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_mutexattr_setprioceiling (pthread_mutexattr_t *__attr, int __prioceiling) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutexattr_getrobust (const pthread_mutexattr_t *__attr, int 
*__robustness) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_mutexattr_setrobust (pthread_mutexattr_t *__attr, int __robustness) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 888 "/usr/include/pthread.h" 3 4 extern int pthread_rwlock_init (pthread_rwlock_t *__restrict __rwlock, const pthread_rwlockattr_t *__restrict __attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlock_destroy (pthread_rwlock_t *__rwlock) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlock_rdlock (pthread_rwlock_t *__rwlock) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlock_tryrdlock (pthread_rwlock_t *__rwlock) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlock_timedrdlock (pthread_rwlock_t *__restrict __rwlock, const struct timespec *__restrict __abstime) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_rwlock_wrlock (pthread_rwlock_t *__rwlock) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlock_trywrlock (pthread_rwlock_t *__rwlock) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlock_timedwrlock (pthread_rwlock_t *__restrict __rwlock, const struct timespec *__restrict __abstime) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_rwlock_unlock (pthread_rwlock_t *__rwlock) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlockattr_init (pthread_rwlockattr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlockattr_destroy (pthread_rwlockattr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlockattr_getpshared (const 
pthread_rwlockattr_t * __restrict __attr, int *__restrict __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_rwlockattr_setpshared (pthread_rwlockattr_t *__attr, int __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlockattr_getkind_np (const pthread_rwlockattr_t * __restrict __attr, int *__restrict __pref) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_rwlockattr_setkind_np (pthread_rwlockattr_t *__attr, int __pref) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_cond_init (pthread_cond_t *__restrict __cond, const pthread_condattr_t *__restrict __cond_attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_cond_destroy (pthread_cond_t *__cond) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_cond_signal (pthread_cond_t *__cond) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_cond_broadcast (pthread_cond_t *__cond) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_cond_wait (pthread_cond_t *__restrict __cond, pthread_mutex_t *__restrict __mutex) __attribute__ ((__nonnull__ (1, 2))); # 1000 "/usr/include/pthread.h" 3 4 extern int pthread_cond_timedwait (pthread_cond_t *__restrict __cond, pthread_mutex_t *__restrict __mutex, const struct timespec *__restrict __abstime) __attribute__ ((__nonnull__ (1, 2, 3))); extern int pthread_condattr_init (pthread_condattr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_condattr_destroy (pthread_condattr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_condattr_getpshared (const pthread_condattr_t * __restrict __attr, int *__restrict __pshared) 
__attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_condattr_setpshared (pthread_condattr_t *__attr, int __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_condattr_getclock (const pthread_condattr_t * __restrict __attr, __clockid_t *__restrict __clock_id) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_condattr_setclock (pthread_condattr_t *__attr, __clockid_t __clock_id) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 1044 "/usr/include/pthread.h" 3 4 extern int pthread_spin_init (pthread_spinlock_t *__lock, int __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_spin_destroy (pthread_spinlock_t *__lock) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_spin_lock (pthread_spinlock_t *__lock) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_spin_trylock (pthread_spinlock_t *__lock) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_spin_unlock (pthread_spinlock_t *__lock) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_barrier_init (pthread_barrier_t *__restrict __barrier, const pthread_barrierattr_t *__restrict __attr, unsigned int __count) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_barrier_destroy (pthread_barrier_t *__barrier) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_barrier_wait (pthread_barrier_t *__barrier) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_barrierattr_init (pthread_barrierattr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_barrierattr_destroy (pthread_barrierattr_t 
*__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_barrierattr_getpshared (const pthread_barrierattr_t * __restrict __attr, int *__restrict __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_barrierattr_setpshared (pthread_barrierattr_t *__attr, int __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 1111 "/usr/include/pthread.h" 3 4 extern int pthread_key_create (pthread_key_t *__key, void (*__destr_function) (void *)) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_key_delete (pthread_key_t __key) __attribute__ ((__nothrow__ , __leaf__)); extern void *pthread_getspecific (pthread_key_t __key) __attribute__ ((__nothrow__ , __leaf__)); extern int pthread_setspecific (pthread_key_t __key, const void *__pointer) __attribute__ ((__nothrow__ , __leaf__)) ; extern int pthread_getcpuclockid (pthread_t __thread_id, __clockid_t *__clock_id) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); # 1145 "/usr/include/pthread.h" 3 4 extern int pthread_atfork (void (*__prepare) (void), void (*__parent) (void), void (*__child) (void)) __attribute__ ((__nothrow__ , __leaf__)); # 1159 "/usr/include/pthread.h" 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_PTHREAD_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: setjmp.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 
"/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/setjmp.h" 1 3 4 # 25 "/usr/include/setjmp.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 26 "/usr/include/setjmp.h" 2 3 4 # 1 "/usr/include/bits/setjmp.h" 1 3 4 # 26 "/usr/include/bits/setjmp.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 27 "/usr/include/bits/setjmp.h" 2 3 4 # 31 "/usr/include/bits/setjmp.h" 3 4 typedef long int __jmp_buf[8]; # 30 "/usr/include/setjmp.h" 2 3 4 # 1 "/usr/include/bits/sigset.h" 1 3 4 # 22 "/usr/include/bits/sigset.h" 3 4 typedef int __sig_atomic_t; typedef struct { unsigned long int __val[(1024 / (8 * sizeof (unsigned long int)))]; } __sigset_t; # 31 "/usr/include/setjmp.h" 2 3 4 struct __jmp_buf_tag { __jmp_buf __jmpbuf; int __mask_was_saved; __sigset_t __saved_mask; }; typedef struct __jmp_buf_tag jmp_buf[1]; extern int setjmp (jmp_buf __env) __attribute__ ((__nothrow__)); extern int __sigsetjmp (struct __jmp_buf_tag __env[1], int __savemask) __attribute__ ((__nothrow__)); extern int _setjmp (struct __jmp_buf_tag __env[1]) __attribute__ ((__nothrow__)); extern void longjmp (struct __jmp_buf_tag __env[1], int __val) __attribute__ ((__nothrow__)) __attribute__ ((__noreturn__)); extern void _longjmp 
(struct __jmp_buf_tag __env[1], int __val) __attribute__ ((__nothrow__)) __attribute__ ((__noreturn__)); typedef struct __jmp_buf_tag sigjmp_buf[1]; # 102 "/usr/include/setjmp.h" 3 4 extern void siglongjmp (sigjmp_buf __env, int __val) __attribute__ ((__nothrow__)) __attribute__ ((__noreturn__)); # 112 "/usr/include/setjmp.h" 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_SETJMP_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: sys/utsname.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/sys/utsname.h" 1 3 4 # 25 "/usr/include/sys/utsname.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 26 "/usr/include/sys/utsname.h" 2 3 4 # 1 
"/usr/include/bits/utsname.h" 1 3 4 # 30 "/usr/include/sys/utsname.h" 2 3 4 # 48 "/usr/include/sys/utsname.h" 3 4 # 48 "/usr/include/sys/utsname.h" 3 4 struct utsname { char sysname[65]; char nodename[65]; char release[65]; char version[65]; char machine[65]; char __domainname[65]; }; # 81 "/usr/include/sys/utsname.h" 3 4 extern int uname (struct utsname *__name) __attribute__ ((__nothrow__ , __leaf__)); # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_SYS_UTSNAME_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: machine/endian.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Possible ERROR while running preprocessor: exit code 256 stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2stderr: /tmp/petsc-KvGRNM/config.headers/conftest.c:3:28: fatal error: machine/endian.h: No such file or 
directory #include <machine/endian.h> ^ compilation terminated. Source: #include "confdefs.h" #include "conffix.h" #include <machine/endian.h> Preprocess stderr before filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:28: fatal error: machine/endian.h: No such file or directory #include <machine/endian.h> ^ compilation terminated. : Preprocess stderr after filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:28: fatal error: machine/endian.h: No such file or directory #include <machine/endian.h> ^compilation terminated.: ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: limits.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 1 3 4 # 34 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/syslimits.h" 1 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 1 3 4 # 168 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 3 4 # 1 "/usr/include/limits.h" 1 3 4 # 25 "/usr/include/limits.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369
"/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 26 "/usr/include/limits.h" 2 3 4 # 143 "/usr/include/limits.h" 3 4 # 1 "/usr/include/bits/posix1_lim.h" 1 3 4 # 160 "/usr/include/bits/posix1_lim.h" 3 4 # 1 "/usr/include/bits/local_lim.h" 1 3 4 # 38 "/usr/include/bits/local_lim.h" 3 4 # 1 "/usr/include/linux/limits.h" 1 3 4 # 39 "/usr/include/bits/local_lim.h" 2 3 4 # 161 "/usr/include/bits/posix1_lim.h" 2 3 4 # 144 "/usr/include/limits.h" 2 3 4 # 1 "/usr/include/bits/posix2_lim.h" 1 3 4 # 148 "/usr/include/limits.h" 2 3 4 # 169 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 2 3 4 # 8 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/syslimits.h" 2 3 4 # 35 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 2 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_LIMITS_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: fcntl.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 
"/usr/include/fcntl.h" 1 3 4 # 25 "/usr/include/fcntl.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 26 "/usr/include/fcntl.h" 2 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 # 30 "/usr/include/bits/types.h" 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; 
typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 32 "/usr/include/fcntl.h" 2 3 4 # 1 "/usr/include/bits/fcntl.h" 1 3 4 # 35 "/usr/include/bits/fcntl.h" 3 4 struct flock { short int l_type; short int l_whence; __off_t l_start; __off_t l_len; __pid_t l_pid; }; # 61 "/usr/include/bits/fcntl.h" 3 4 # 1 "/usr/include/bits/fcntl-linux.h" 1 3 4 # 345 "/usr/include/bits/fcntl-linux.h" 3 4 # 419 "/usr/include/bits/fcntl-linux.h" 3 4 # 61 "/usr/include/bits/fcntl.h" 2 3 4 # 36 "/usr/include/fcntl.h" 2 3 4 # 50 "/usr/include/fcntl.h" 3 4 typedef __mode_t mode_t; typedef __off_t off_t; # 69 "/usr/include/fcntl.h" 3 4 typedef __pid_t pid_t; # 1 "/usr/include/time.h" 1 3 4 # 120 "/usr/include/time.h" 3 4 struct timespec { __time_t tv_sec; __syscall_slong_t tv_nsec; }; # 77 "/usr/include/fcntl.h" 2 3 4 # 1 "/usr/include/bits/stat.h" 1 3 4 # 46 "/usr/include/bits/stat.h" 3 4 struct stat { __dev_t st_dev; __ino_t st_ino; __nlink_t st_nlink; __mode_t st_mode; __uid_t st_uid; __gid_t st_gid; int __pad0; __dev_t st_rdev; __off_t st_size; __blksize_t st_blksize; __blkcnt_t st_blocks; # 91 "/usr/include/bits/stat.h" 3 4 struct timespec st_atim; struct timespec st_mtim; struct timespec st_ctim; # 106 "/usr/include/bits/stat.h" 3 4 __syscall_slong_t __glibc_reserved[3]; # 115 "/usr/include/bits/stat.h" 3 4 }; # 80 "/usr/include/fcntl.h" 2 3 4 # 171 "/usr/include/fcntl.h" 3 4 extern int fcntl (int __fd, int __cmd, ...); # 181 "/usr/include/fcntl.h" 3 4 extern 
int open (const char *__file, int __oflag, ...) __attribute__ ((__nonnull__ (1))); # 205 "/usr/include/fcntl.h" 3 4 extern int openat (int __fd, const char *__file, int __oflag, ...) __attribute__ ((__nonnull__ (2))); # 227 "/usr/include/fcntl.h" 3 4 extern int creat (const char *__file, mode_t __mode) __attribute__ ((__nonnull__ (1))); # 256 "/usr/include/fcntl.h" 3 4 extern int lockf (int __fd, int __cmd, off_t __len); # 273 "/usr/include/fcntl.h" 3 4 extern int posix_fadvise (int __fd, off_t __offset, off_t __len, int __advise) __attribute__ ((__nothrow__ , __leaf__)); # 295 "/usr/include/fcntl.h" 3 4 extern int posix_fallocate (int __fd, off_t __offset, off_t __len); # 317 "/usr/include/fcntl.h" 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_FCNTL_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: string.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/string.h" 1 3 4 # 25 "/usr/include/string.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 
"/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 26 "/usr/include/string.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 33 "/usr/include/string.h" 2 3 4 extern void *memcpy (void *__restrict __dest, const void *__restrict __src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern void *memmove (void *__dest, const void *__src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern void *memccpy (void *__restrict __dest, const void *__restrict __src, int __c, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern void *memset (void *__s, int __c, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int memcmp (const void *__s1, const void *__s2, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); # 92 "/usr/include/string.h" 3 4 extern void *memchr (const void *__s, int __c, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); # 123 "/usr/include/string.h" 3 4 extern char *strcpy (char *__restrict __dest, const char *__restrict __src) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char *strncpy (char *__restrict __dest, const char *__restrict __src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char *strcat (char *__restrict __dest, const 
char *__restrict __src) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char *strncat (char *__restrict __dest, const char *__restrict __src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int strcmp (const char *__s1, const char *__s2) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); extern int strncmp (const char *__s1, const char *__s2, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); extern int strcoll (const char *__s1, const char *__s2) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); extern size_t strxfrm (char *__restrict __dest, const char *__restrict __src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); # 1 "/usr/include/xlocale.h" 1 3 4 # 27 "/usr/include/xlocale.h" 3 4 typedef struct __locale_struct { struct __locale_data *__locales[13]; const unsigned short int *__ctype_b; const int *__ctype_tolower; const int *__ctype_toupper; const char *__names[13]; } *__locale_t; typedef __locale_t locale_t; # 160 "/usr/include/string.h" 2 3 4 extern int strcoll_l (const char *__s1, const char *__s2, __locale_t __l) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2, 3))); extern size_t strxfrm_l (char *__dest, const char *__src, size_t __n, __locale_t __l) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 4))); extern char *strdup (const char *__s) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) __attribute__ ((__nonnull__ (1))); extern char *strndup (const char *__string, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) __attribute__ ((__nonnull__ (1))); # 206 "/usr/include/string.h" 3 4 # 231 "/usr/include/string.h" 3 4 extern 
char *strchr (const char *__s, int __c) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); # 258 "/usr/include/string.h" 3 4 extern char *strrchr (const char *__s, int __c) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); # 277 "/usr/include/string.h" 3 4 extern size_t strcspn (const char *__s, const char *__reject) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); extern size_t strspn (const char *__s, const char *__accept) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); # 310 "/usr/include/string.h" 3 4 extern char *strpbrk (const char *__s, const char *__accept) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); # 337 "/usr/include/string.h" 3 4 extern char *strstr (const char *__haystack, const char *__needle) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); extern char *strtok (char *__restrict __s, const char *__restrict __delim) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern char *__strtok_r (char *__restrict __s, const char *__restrict __delim, char **__restrict __save_ptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 3))); extern char *strtok_r (char *__restrict __s, const char *__restrict __delim, char **__restrict __save_ptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 3))); # 392 "/usr/include/string.h" 3 4 extern size_t strlen (const char *__s) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); extern size_t strnlen (const char *__string, size_t __maxlen) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); extern char *strerror (int 
__errnum) __attribute__ ((__nothrow__ , __leaf__)); # 422 "/usr/include/string.h" 3 4 extern int strerror_r (int __errnum, char *__buf, size_t __buflen) __asm__ ("" "__xpg_strerror_r") __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); # 440 "/usr/include/string.h" 3 4 extern char *strerror_l (int __errnum, __locale_t __l) __attribute__ ((__nothrow__ , __leaf__)); extern void __bzero (void *__s, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern void bcopy (const void *__src, void *__dest, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern void bzero (void *__s, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int bcmp (const void *__s1, const void *__s2, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); # 484 "/usr/include/string.h" 3 4 extern char *index (const char *__s, int __c) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); # 512 "/usr/include/string.h" 3 4 extern char *rindex (const char *__s, int __c) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); extern int ffs (int __i) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 529 "/usr/include/string.h" 3 4 extern int strcasecmp (const char *__s1, const char *__s2) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); extern int strncasecmp (const char *__s1, const char *__s2, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); # 552 "/usr/include/string.h" 3 4 extern char *strsep (char **__restrict __stringp, const char *__restrict __delim) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char 
*strsignal (int __sig) __attribute__ ((__nothrow__ , __leaf__)); extern char *__stpcpy (char *__restrict __dest, const char *__restrict __src) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char *stpcpy (char *__restrict __dest, const char *__restrict __src) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char *__stpncpy (char *__restrict __dest, const char *__restrict __src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char *stpncpy (char *__restrict __dest, const char *__restrict __src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); # 656 "/usr/include/string.h" 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_STRING_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: sys/times.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/sys/times.h" 1 3 4 # 25 "/usr/include/sys/times.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 
"/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 26 "/usr/include/sys/times.h" 2 3 4 # 1 "/usr/include/time.h" 1 3 4 # 55 "/usr/include/time.h" 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 # 30 "/usr/include/bits/types.h" 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef 
unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 56 "/usr/include/time.h" 2 3 4 typedef __clock_t clock_t; # 29 "/usr/include/sys/times.h" 2 3 4 struct tms { clock_t tms_utime; clock_t tms_stime; clock_t tms_cutime; clock_t tms_cstime; }; extern clock_t times (struct tms *__buffer) __attribute__ ((__nothrow__ , __leaf__)); # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_SYS_TIMES_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: io.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Possible ERROR while running preprocessor: exit code 256 stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 
"/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2stderr: /tmp/petsc-KvGRNM/config.headers/conftest.c:3:16: fatal error: io.h: No such file or directory #include <io.h> ^ compilation terminated. Source: #include "confdefs.h" #include "conffix.h" #include <io.h> Preprocess stderr before filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:16: fatal error: io.h: No such file or directory #include <io.h> ^ compilation terminated. : Preprocess stderr after filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:16: fatal error: io.h: No such file or directory #include <io.h> ^compilation terminated.: ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: stdint.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdint.h" 1 3 4 # 9 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdint.h" 3 4 # 1 "/usr/include/stdint.h" 1 3 4 # 25 "/usr/include/stdint.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416
"/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 26 "/usr/include/stdint.h" 2 3 4 # 1 "/usr/include/bits/wchar.h" 1 3 4 # 27 "/usr/include/stdint.h" 2 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/stdint.h" 2 3 4 # 36 "/usr/include/stdint.h" 3 4 # 36 "/usr/include/stdint.h" 3 4 typedef signed char int8_t; typedef short int int16_t; typedef int int32_t; typedef long int int64_t; typedef unsigned char uint8_t; typedef unsigned short int uint16_t; typedef unsigned int uint32_t; typedef unsigned long int uint64_t; # 65 "/usr/include/stdint.h" 3 4 typedef signed char int_least8_t; typedef short int int_least16_t; typedef int int_least32_t; typedef long int int_least64_t; typedef unsigned char uint_least8_t; typedef unsigned short int uint_least16_t; typedef unsigned int uint_least32_t; typedef unsigned long int uint_least64_t; # 90 "/usr/include/stdint.h" 3 4 typedef signed char int_fast8_t; typedef long int int_fast16_t; typedef long int int_fast32_t; typedef long int int_fast64_t; # 103 "/usr/include/stdint.h" 3 4 typedef unsigned char uint_fast8_t; typedef unsigned long int uint_fast16_t; typedef unsigned long int uint_fast32_t; typedef unsigned long int uint_fast64_t; # 119 "/usr/include/stdint.h" 3 4 typedef long int intptr_t; typedef unsigned long int uintptr_t; # 134 "/usr/include/stdint.h" 3 4 typedef long int intmax_t; typedef unsigned long int uintmax_t; # 10 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdint.h" 2 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_STDINT_H" to "1" ================================================================================ TEST check from 
config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: pwd.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/pwd.h" 1 3 4 # 25 "/usr/include/pwd.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 26 "/usr/include/pwd.h" 2 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 # 30 "/usr/include/bits/types.h" 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned 
long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 30 "/usr/include/pwd.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 33 "/usr/include/pwd.h" 2 3 4 typedef __gid_t gid_t; typedef __uid_t uid_t; struct passwd { char *pw_name; char *pw_passwd; __uid_t pw_uid; __gid_t pw_gid; char *pw_gecos; char *pw_dir; char *pw_shell; }; # 1 "/usr/include/stdio.h" 1 3 4 # 44 "/usr/include/stdio.h" 3 4 struct _IO_FILE; typedef struct _IO_FILE FILE; # 64 "/usr/include/pwd.h" 2 3 4 # 72 "/usr/include/pwd.h" 3 4 extern void setpwent (void); extern void endpwent (void); extern 
struct passwd *getpwent (void); # 94 "/usr/include/pwd.h" 3 4 extern struct passwd *fgetpwent (FILE *__stream) __attribute__ ((__nonnull__ (1))); extern int putpwent (const struct passwd *__restrict __p, FILE *__restrict __f); extern struct passwd *getpwuid (__uid_t __uid); extern struct passwd *getpwnam (const char *__name) __attribute__ ((__nonnull__ (1))); # 139 "/usr/include/pwd.h" 3 4 extern int getpwent_r (struct passwd *__restrict __resultbuf, char *__restrict __buffer, size_t __buflen, struct passwd **__restrict __result) __attribute__ ((__nonnull__ (1, 2, 4))); extern int getpwuid_r (__uid_t __uid, struct passwd *__restrict __resultbuf, char *__restrict __buffer, size_t __buflen, struct passwd **__restrict __result) __attribute__ ((__nonnull__ (2, 3, 5))); extern int getpwnam_r (const char *__restrict __name, struct passwd *__restrict __resultbuf, char *__restrict __buffer, size_t __buflen, struct passwd **__restrict __result) __attribute__ ((__nonnull__ (1, 2, 3, 5))); # 166 "/usr/include/pwd.h" 3 4 extern int fgetpwent_r (FILE *__restrict __stream, struct passwd *__restrict __resultbuf, char *__restrict __buffer, size_t __buflen, struct passwd **__restrict __result) __attribute__ ((__nonnull__ (1, 2, 3, 5))); # 187 "/usr/include/pwd.h" 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_PWD_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: float.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 
"/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/float.h" 1 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_FLOAT_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: sys/param.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/sys/param.h" 1 3 4 # 23 "/usr/include/sys/param.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 24 "/usr/include/sys/param.h" 2 3 4 # 1 "/usr/include/sys/types.h" 1 3 4 # 25 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 
"/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 26 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 # 30 "/usr/include/bits/types.h" 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef 
unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 30 "/usr/include/sys/types.h" 2 3 4 typedef __u_char u_char; typedef __u_short u_short; typedef __u_int u_int; typedef __u_long u_long; typedef __quad_t quad_t; typedef __u_quad_t u_quad_t; typedef __fsid_t fsid_t; typedef __loff_t loff_t; typedef __ino_t ino_t; # 60 "/usr/include/sys/types.h" 3 4 typedef __dev_t dev_t; typedef __gid_t gid_t; typedef __mode_t mode_t; typedef __nlink_t nlink_t; typedef __uid_t uid_t; typedef __off_t off_t; # 98 "/usr/include/sys/types.h" 3 4 typedef __pid_t pid_t; typedef __id_t id_t; typedef __ssize_t ssize_t; typedef __daddr_t daddr_t; typedef __caddr_t caddr_t; typedef __key_t key_t; # 132 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/time.h" 1 3 4 # 57 "/usr/include/time.h" 3 4 typedef __clock_t clock_t; # 73 "/usr/include/time.h" 3 4 typedef __time_t time_t; # 91 "/usr/include/time.h" 3 4 typedef __clockid_t clockid_t; # 103 "/usr/include/time.h" 3 4 typedef __timer_t timer_t; # 133 "/usr/include/sys/types.h" 2 3 4 # 146 "/usr/include/sys/types.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 147 "/usr/include/sys/types.h" 2 3 4 typedef unsigned long int ulong; typedef unsigned short int ushort; typedef unsigned int uint; # 194 "/usr/include/sys/types.h" 3 4 typedef int int8_t __attribute__ ((__mode__ (__QI__))); typedef int int16_t __attribute__ ((__mode__ (__HI__))); typedef int int32_t __attribute__ ((__mode__ (__SI__))); typedef int int64_t __attribute__ ((__mode__ (__DI__))); typedef unsigned int u_int8_t __attribute__ ((__mode__ (__QI__))); typedef unsigned int u_int16_t __attribute__ ((__mode__ (__HI__))); typedef unsigned int u_int32_t __attribute__ ((__mode__ (__SI__))); typedef unsigned int u_int64_t 
__attribute__ ((__mode__ (__DI__))); typedef int register_t __attribute__ ((__mode__ (__word__))); # 216 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/endian.h" 1 3 4 # 36 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/endian.h" 1 3 4 # 37 "/usr/include/endian.h" 2 3 4 # 60 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/byteswap.h" 1 3 4 # 28 "/usr/include/bits/byteswap.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 29 "/usr/include/bits/byteswap.h" 2 3 4 # 1 "/usr/include/bits/byteswap-16.h" 1 3 4 # 36 "/usr/include/bits/byteswap.h" 2 3 4 # 44 "/usr/include/bits/byteswap.h" 3 4 static __inline unsigned int __bswap_32 (unsigned int __bsx) { return __builtin_bswap32 (__bsx); } # 108 "/usr/include/bits/byteswap.h" 3 4 static __inline __uint64_t __bswap_64 (__uint64_t __bsx) { return __builtin_bswap64 (__bsx); } # 61 "/usr/include/endian.h" 2 3 4 # 217 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/select.h" 1 3 4 # 30 "/usr/include/sys/select.h" 3 4 # 1 "/usr/include/bits/select.h" 1 3 4 # 22 "/usr/include/bits/select.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 23 "/usr/include/bits/select.h" 2 3 4 # 31 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/sigset.h" 1 3 4 # 22 "/usr/include/bits/sigset.h" 3 4 typedef int __sig_atomic_t; typedef struct { unsigned long int __val[(1024 / (8 * sizeof (unsigned long int)))]; } __sigset_t; # 34 "/usr/include/sys/select.h" 2 3 4 typedef __sigset_t sigset_t; # 1 "/usr/include/time.h" 1 3 4 # 120 "/usr/include/time.h" 3 4 struct timespec { __time_t tv_sec; __syscall_slong_t tv_nsec; }; # 46 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 30 "/usr/include/bits/time.h" 3 4 struct timeval { __time_t tv_sec; __suseconds_t tv_usec; }; # 48 "/usr/include/sys/select.h" 2 3 4 typedef __suseconds_t suseconds_t; typedef long int __fd_mask; # 66 "/usr/include/sys/select.h" 3 4 typedef struct { __fd_mask __fds_bits[1024 / (8 * (int) sizeof (__fd_mask))]; } fd_set; typedef __fd_mask 
fd_mask; # 98 "/usr/include/sys/select.h" 3 4 # 108 "/usr/include/sys/select.h" 3 4 extern int select (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, struct timeval *__restrict __timeout); # 120 "/usr/include/sys/select.h" 3 4 extern int pselect (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, const struct timespec *__restrict __timeout, const __sigset_t *__restrict __sigmask); # 133 "/usr/include/sys/select.h" 3 4 # 220 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/sysmacros.h" 1 3 4 # 24 "/usr/include/sys/sysmacros.h" 3 4 __extension__ extern unsigned int gnu_dev_major (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned int gnu_dev_minor (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned long long int gnu_dev_makedev (unsigned int __major, unsigned int __minor) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 58 "/usr/include/sys/sysmacros.h" 3 4 # 223 "/usr/include/sys/types.h" 2 3 4 typedef __blksize_t blksize_t; typedef __blkcnt_t blkcnt_t; typedef __fsblkcnt_t fsblkcnt_t; typedef __fsfilcnt_t fsfilcnt_t; # 270 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/bits/pthreadtypes.h" 1 3 4 # 21 "/usr/include/bits/pthreadtypes.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 22 "/usr/include/bits/pthreadtypes.h" 2 3 4 # 60 "/usr/include/bits/pthreadtypes.h" 3 4 typedef unsigned long int pthread_t; union pthread_attr_t { char __size[56]; long int __align; }; typedef union pthread_attr_t pthread_attr_t; typedef struct __pthread_internal_list { struct __pthread_internal_list *__prev; struct __pthread_internal_list *__next; } __pthread_list_t; # 90 "/usr/include/bits/pthreadtypes.h" 3 4 typedef union { struct __pthread_mutex_s { int __lock; unsigned int __count; int 
__owner; unsigned int __nusers; int __kind; short __spins; short __elision; __pthread_list_t __list; # 125 "/usr/include/bits/pthreadtypes.h" 3 4 } __data; char __size[40]; long int __align; } pthread_mutex_t; typedef union { char __size[4]; int __align; } pthread_mutexattr_t; typedef union { struct { int __lock; unsigned int __futex; __extension__ unsigned long long int __total_seq; __extension__ unsigned long long int __wakeup_seq; __extension__ unsigned long long int __woken_seq; void *__mutex; unsigned int __nwaiters; unsigned int __broadcast_seq; } __data; char __size[48]; __extension__ long long int __align; } pthread_cond_t; typedef union { char __size[4]; int __align; } pthread_condattr_t; typedef unsigned int pthread_key_t; typedef int pthread_once_t; typedef union { struct { int __lock; unsigned int __nr_readers; unsigned int __readers_wakeup; unsigned int __writer_wakeup; unsigned int __nr_readers_queued; unsigned int __nr_writers_queued; int __writer; int __shared; signed char __rwelision; unsigned char __pad1[7]; unsigned long int __pad2; unsigned int __flags; } __data; # 220 "/usr/include/bits/pthreadtypes.h" 3 4 char __size[56]; long int __align; } pthread_rwlock_t; typedef union { char __size[8]; long int __align; } pthread_rwlockattr_t; typedef volatile int pthread_spinlock_t; typedef union { char __size[32]; long int __align; } pthread_barrier_t; typedef union { char __size[4]; int __align; } pthread_barrierattr_t; # 271 "/usr/include/sys/types.h" 2 3 4 # 26 "/usr/include/sys/param.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 1 3 4 # 34 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/syslimits.h" 1 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 1 3 4 # 168 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 3 4 # 1 "/usr/include/limits.h" 1 3 4 # 143 "/usr/include/limits.h" 3 4 # 1 
"/usr/include/bits/posix1_lim.h" 1 3 4 # 160 "/usr/include/bits/posix1_lim.h" 3 4 # 1 "/usr/include/bits/local_lim.h" 1 3 4 # 38 "/usr/include/bits/local_lim.h" 3 4 # 1 "/usr/include/linux/limits.h" 1 3 4 # 39 "/usr/include/bits/local_lim.h" 2 3 4 # 161 "/usr/include/bits/posix1_lim.h" 2 3 4 # 144 "/usr/include/limits.h" 2 3 4 # 1 "/usr/include/bits/posix2_lim.h" 1 3 4 # 148 "/usr/include/limits.h" 2 3 4 # 169 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 2 3 4 # 8 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/syslimits.h" 2 3 4 # 35 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 2 3 4 # 27 "/usr/include/sys/param.h" 2 3 4 # 1 "/usr/include/signal.h" 1 3 4 # 30 "/usr/include/signal.h" 3 4 # 1 "/usr/include/bits/sigset.h" 1 3 4 # 102 "/usr/include/bits/sigset.h" 3 4 extern int __sigismember (const __sigset_t *, int); extern int __sigaddset (__sigset_t *, int); extern int __sigdelset (__sigset_t *, int); # 33 "/usr/include/signal.h" 2 3 4 typedef __sig_atomic_t sig_atomic_t; # 57 "/usr/include/signal.h" 3 4 # 1 "/usr/include/bits/signum.h" 1 3 4 # 58 "/usr/include/signal.h" 2 3 4 # 75 "/usr/include/signal.h" 3 4 # 1 "/usr/include/time.h" 1 3 4 # 76 "/usr/include/signal.h" 2 3 4 # 1 "/usr/include/bits/siginfo.h" 1 3 4 # 24 "/usr/include/bits/siginfo.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 25 "/usr/include/bits/siginfo.h" 2 3 4 typedef union sigval { int sival_int; void *sival_ptr; } sigval_t; # 58 "/usr/include/bits/siginfo.h" 3 4 typedef __clock_t __sigchld_clock_t; typedef struct { int si_signo; int si_errno; int si_code; union { int _pad[((128 / sizeof (int)) - 4)]; struct { __pid_t si_pid; __uid_t si_uid; } _kill; struct { int si_tid; int si_overrun; sigval_t si_sigval; } _timer; struct { __pid_t si_pid; __uid_t si_uid; sigval_t si_sigval; } _rt; struct { __pid_t si_pid; __uid_t si_uid; int si_status; __sigchld_clock_t si_utime; __sigchld_clock_t si_stime; } _sigchld; struct { void *si_addr; short int 
si_addr_lsb; struct { void *_lower; void *_upper; } si_addr_bnd; } _sigfault; struct { long int si_band; int si_fd; } _sigpoll; struct { void *_call_addr; int _syscall; unsigned int _arch; } _sigsys; } _sifields; } siginfo_t ; # 160 "/usr/include/bits/siginfo.h" 3 4 enum { SI_ASYNCNL = -60, SI_TKILL = -6, SI_SIGIO, SI_ASYNCIO, SI_MESGQ, SI_TIMER, SI_QUEUE, SI_USER, SI_KERNEL = 0x80 }; enum { ILL_ILLOPC = 1, ILL_ILLOPN, ILL_ILLADR, ILL_ILLTRP, ILL_PRVOPC, ILL_PRVREG, ILL_COPROC, ILL_BADSTK }; enum { FPE_INTDIV = 1, FPE_INTOVF, FPE_FLTDIV, FPE_FLTOVF, FPE_FLTUND, FPE_FLTRES, FPE_FLTINV, FPE_FLTSUB }; enum { SEGV_MAPERR = 1, SEGV_ACCERR }; enum { BUS_ADRALN = 1, BUS_ADRERR, BUS_OBJERR, BUS_MCEERR_AR, BUS_MCEERR_AO }; # 264 "/usr/include/bits/siginfo.h" 3 4 enum { CLD_EXITED = 1, CLD_KILLED, CLD_DUMPED, CLD_TRAPPED, CLD_STOPPED, CLD_CONTINUED }; enum { POLL_IN = 1, POLL_OUT, POLL_MSG, POLL_ERR, POLL_PRI, POLL_HUP }; # 320 "/usr/include/bits/siginfo.h" 3 4 typedef struct sigevent { sigval_t sigev_value; int sigev_signo; int sigev_notify; union { int _pad[((64 / sizeof (int)) - 4)]; __pid_t _tid; struct { void (*_function) (sigval_t); pthread_attr_t *_attribute; } _sigev_thread; } _sigev_un; } sigevent_t; enum { SIGEV_SIGNAL = 0, SIGEV_NONE, SIGEV_THREAD, SIGEV_THREAD_ID = 4 }; # 81 "/usr/include/signal.h" 2 3 4 typedef void (*__sighandler_t) (int); extern __sighandler_t __sysv_signal (int __sig, __sighandler_t __handler) __attribute__ ((__nothrow__ , __leaf__)); # 100 "/usr/include/signal.h" 3 4 extern __sighandler_t signal (int __sig, __sighandler_t __handler) __attribute__ ((__nothrow__ , __leaf__)); # 114 "/usr/include/signal.h" 3 4 # 127 "/usr/include/signal.h" 3 4 extern int kill (__pid_t __pid, int __sig) __attribute__ ((__nothrow__ , __leaf__)); extern int killpg (__pid_t __pgrp, int __sig) __attribute__ ((__nothrow__ , __leaf__)); extern int raise (int __sig) __attribute__ ((__nothrow__ , __leaf__)); extern __sighandler_t ssignal (int __sig, __sighandler_t 
__handler) __attribute__ ((__nothrow__ , __leaf__)); extern int gsignal (int __sig) __attribute__ ((__nothrow__ , __leaf__)); extern void psignal (int __sig, const char *__s); extern void psiginfo (const siginfo_t *__pinfo, const char *__s); # 187 "/usr/include/signal.h" 3 4 extern int sigblock (int __mask) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__deprecated__)); extern int sigsetmask (int __mask) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__deprecated__)); extern int siggetmask (void) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__deprecated__)); # 207 "/usr/include/signal.h" 3 4 typedef __sighandler_t sig_t; extern int sigemptyset (sigset_t *__set) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int sigfillset (sigset_t *__set) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int sigaddset (sigset_t *__set, int __signo) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int sigdelset (sigset_t *__set, int __signo) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int sigismember (const sigset_t *__set, int __signo) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 243 "/usr/include/signal.h" 3 4 # 1 "/usr/include/bits/sigaction.h" 1 3 4 # 24 "/usr/include/bits/sigaction.h" 3 4 struct sigaction { union { __sighandler_t sa_handler; void (*sa_sigaction) (int, siginfo_t *, void *); } __sigaction_handler; __sigset_t sa_mask; int sa_flags; void (*sa_restorer) (void); }; # 244 "/usr/include/signal.h" 2 3 4 extern int sigprocmask (int __how, const sigset_t *__restrict __set, sigset_t *__restrict __oset) __attribute__ ((__nothrow__ , __leaf__)); extern int sigsuspend (const sigset_t *__set) __attribute__ ((__nonnull__ (1))); extern int sigaction (int __sig, const struct sigaction *__restrict __act, struct sigaction *__restrict __oact) __attribute__ ((__nothrow__ , 
__leaf__)); extern int sigpending (sigset_t *__set) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int sigwait (const sigset_t *__restrict __set, int *__restrict __sig) __attribute__ ((__nonnull__ (1, 2))); extern int sigwaitinfo (const sigset_t *__restrict __set, siginfo_t *__restrict __info) __attribute__ ((__nonnull__ (1))); extern int sigtimedwait (const sigset_t *__restrict __set, siginfo_t *__restrict __info, const struct timespec *__restrict __timeout) __attribute__ ((__nonnull__ (1))); extern int sigqueue (__pid_t __pid, int __sig, const union sigval __val) __attribute__ ((__nothrow__ , __leaf__)); # 301 "/usr/include/signal.h" 3 4 extern const char *const _sys_siglist[65]; extern const char *const sys_siglist[65]; # 1 "/usr/include/bits/sigcontext.h" 1 3 4 # 29 "/usr/include/bits/sigcontext.h" 3 4 struct _fpx_sw_bytes { __uint32_t magic1; __uint32_t extended_size; __uint64_t xstate_bv; __uint32_t xstate_size; __uint32_t padding[7]; }; struct _fpreg { unsigned short significand[4]; unsigned short exponent; }; struct _fpxreg { unsigned short significand[4]; unsigned short exponent; unsigned short padding[3]; }; struct _xmmreg { __uint32_t element[4]; }; # 121 "/usr/include/bits/sigcontext.h" 3 4 struct _fpstate { __uint16_t cwd; __uint16_t swd; __uint16_t ftw; __uint16_t fop; __uint64_t rip; __uint64_t rdp; __uint32_t mxcsr; __uint32_t mxcr_mask; struct _fpxreg _st[8]; struct _xmmreg _xmm[16]; __uint32_t padding[24]; }; struct sigcontext { __uint64_t r8; __uint64_t r9; __uint64_t r10; __uint64_t r11; __uint64_t r12; __uint64_t r13; __uint64_t r14; __uint64_t r15; __uint64_t rdi; __uint64_t rsi; __uint64_t rbp; __uint64_t rbx; __uint64_t rdx; __uint64_t rax; __uint64_t rcx; __uint64_t rsp; __uint64_t rip; __uint64_t eflags; unsigned short cs; unsigned short gs; unsigned short fs; unsigned short __pad0; __uint64_t err; __uint64_t trapno; __uint64_t oldmask; __uint64_t cr2; __extension__ union { struct _fpstate * fpstate; 
__uint64_t __fpstate_word; }; __uint64_t __reserved1 [8]; }; struct _xsave_hdr { __uint64_t xstate_bv; __uint64_t reserved1[2]; __uint64_t reserved2[5]; }; struct _ymmh_state { __uint32_t ymmh_space[64]; }; struct _xstate { struct _fpstate fpstate; struct _xsave_hdr xstate_hdr; struct _ymmh_state ymmh; }; # 307 "/usr/include/signal.h" 2 3 4 extern int sigreturn (struct sigcontext *__scp) __attribute__ ((__nothrow__ , __leaf__)); # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 317 "/usr/include/signal.h" 2 3 4 extern int siginterrupt (int __sig, int __interrupt) __attribute__ ((__nothrow__ , __leaf__)); # 1 "/usr/include/bits/sigstack.h" 1 3 4 # 25 "/usr/include/bits/sigstack.h" 3 4 struct sigstack { void *ss_sp; int ss_onstack; }; enum { SS_ONSTACK = 1, SS_DISABLE }; # 49 "/usr/include/bits/sigstack.h" 3 4 typedef struct sigaltstack { void *ss_sp; int ss_flags; size_t ss_size; } stack_t; # 324 "/usr/include/signal.h" 2 3 4 # 1 "/usr/include/sys/ucontext.h" 1 3 4 # 22 "/usr/include/sys/ucontext.h" 3 4 # 1 "/usr/include/signal.h" 1 3 4 # 23 "/usr/include/sys/ucontext.h" 2 3 4 # 31 "/usr/include/sys/ucontext.h" 3 4 __extension__ typedef long long int greg_t; typedef greg_t gregset_t[23]; # 92 "/usr/include/sys/ucontext.h" 3 4 struct _libc_fpxreg { unsigned short int significand[4]; unsigned short int exponent; unsigned short int padding[3]; }; struct _libc_xmmreg { __uint32_t element[4]; }; struct _libc_fpstate { __uint16_t cwd; __uint16_t swd; __uint16_t ftw; __uint16_t fop; __uint64_t rip; __uint64_t rdp; __uint32_t mxcsr; __uint32_t mxcr_mask; struct _libc_fpxreg _st[8]; struct _libc_xmmreg _xmm[16]; __uint32_t padding[24]; }; typedef struct _libc_fpstate *fpregset_t; typedef struct { gregset_t gregs; fpregset_t fpregs; __extension__ unsigned long long __reserved1 [8]; } mcontext_t; typedef struct ucontext { unsigned long int uc_flags; struct ucontext *uc_link; stack_t uc_stack; mcontext_t uc_mcontext; __sigset_t uc_sigmask; struct 
_libc_fpstate __fpregs_mem; } ucontext_t; # 327 "/usr/include/signal.h" 2 3 4 extern int sigstack (struct sigstack *__ss, struct sigstack *__oss) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__deprecated__)); extern int sigaltstack (const struct sigaltstack *__restrict __ss, struct sigaltstack *__restrict __oss) __attribute__ ((__nothrow__ , __leaf__)); # 362 "/usr/include/signal.h" 3 4 # 1 "/usr/include/bits/sigthread.h" 1 3 4 # 30 "/usr/include/bits/sigthread.h" 3 4 extern int pthread_sigmask (int __how, const __sigset_t *__restrict __newmask, __sigset_t *__restrict __oldmask)__attribute__ ((__nothrow__ , __leaf__)); extern int pthread_kill (pthread_t __threadid, int __signo) __attribute__ ((__nothrow__ , __leaf__)); # 363 "/usr/include/signal.h" 2 3 4 extern int __libc_current_sigrtmin (void) __attribute__ ((__nothrow__ , __leaf__)); extern int __libc_current_sigrtmax (void) __attribute__ ((__nothrow__ , __leaf__)); # 29 "/usr/include/sys/param.h" 2 3 4 # 1 "/usr/include/bits/param.h" 1 3 4 # 28 "/usr/include/bits/param.h" 3 4 # 1 "/usr/include/linux/param.h" 1 3 4 # 1 "/usr/include/asm/param.h" 1 3 4 # 1 "/usr/include/asm-generic/param.h" 1 3 4 # 1 "/usr/include/asm/param.h" 2 3 4 # 5 "/usr/include/linux/param.h" 2 3 4 # 29 "/usr/include/bits/param.h" 2 3 4 # 32 "/usr/include/sys/param.h" 2 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_SYS_PARAM_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: netdb.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 
"/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/netdb.h" 1 3 4 # 25 "/usr/include/netdb.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 26 "/usr/include/netdb.h" 2 3 4 # 1 "/usr/include/netinet/in.h" 1 3 4 # 22 "/usr/include/netinet/in.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdint.h" 1 3 4 # 9 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdint.h" 3 4 # 1 "/usr/include/stdint.h" 1 3 4 # 26 "/usr/include/stdint.h" 3 4 # 1 "/usr/include/bits/wchar.h" 1 3 4 # 27 "/usr/include/stdint.h" 2 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/stdint.h" 2 3 4 # 36 "/usr/include/stdint.h" 3 4 # 36 "/usr/include/stdint.h" 3 4 typedef signed char int8_t; typedef short int int16_t; typedef int int32_t; typedef long int int64_t; typedef unsigned char uint8_t; typedef unsigned short int uint16_t; typedef unsigned int uint32_t; typedef unsigned long int uint64_t; # 65 "/usr/include/stdint.h" 3 4 typedef signed char int_least8_t; typedef short int int_least16_t; typedef int int_least32_t; typedef long int int_least64_t; typedef unsigned char uint_least8_t; typedef unsigned short int uint_least16_t; typedef unsigned int uint_least32_t; typedef unsigned long int 
uint_least64_t; # 90 "/usr/include/stdint.h" 3 4 typedef signed char int_fast8_t; typedef long int int_fast16_t; typedef long int int_fast32_t; typedef long int int_fast64_t; # 103 "/usr/include/stdint.h" 3 4 typedef unsigned char uint_fast8_t; typedef unsigned long int uint_fast16_t; typedef unsigned long int uint_fast32_t; typedef unsigned long int uint_fast64_t; # 119 "/usr/include/stdint.h" 3 4 typedef long int intptr_t; typedef unsigned long int uintptr_t; # 134 "/usr/include/stdint.h" 3 4 typedef long int intmax_t; typedef unsigned long int uintmax_t; # 10 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdint.h" 2 3 4 # 23 "/usr/include/netinet/in.h" 2 3 4 # 1 "/usr/include/sys/socket.h" 1 3 4 # 24 "/usr/include/sys/socket.h" 3 4 # 1 "/usr/include/sys/uio.h" 1 3 4 # 23 "/usr/include/sys/uio.h" 3 4 # 1 "/usr/include/sys/types.h" 1 3 4 # 27 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef 
long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 30 "/usr/include/sys/types.h" 2 3 4 typedef __u_char u_char; typedef __u_short u_short; typedef __u_int u_int; typedef __u_long u_long; typedef __quad_t quad_t; typedef __u_quad_t u_quad_t; typedef __fsid_t fsid_t; typedef __loff_t loff_t; typedef __ino_t ino_t; # 60 "/usr/include/sys/types.h" 3 4 typedef __dev_t dev_t; typedef __gid_t gid_t; typedef __mode_t mode_t; typedef __nlink_t nlink_t; typedef __uid_t uid_t; typedef __off_t off_t; # 98 "/usr/include/sys/types.h" 3 4 typedef __pid_t pid_t; typedef __id_t id_t; typedef __ssize_t ssize_t; typedef __daddr_t daddr_t; typedef __caddr_t caddr_t; typedef __key_t key_t; # 132 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/time.h" 1 3 4 # 57 "/usr/include/time.h" 3 4 typedef __clock_t clock_t; # 73 "/usr/include/time.h" 3 4 typedef __time_t time_t; # 91 "/usr/include/time.h" 3 4 typedef __clockid_t clockid_t; # 103 "/usr/include/time.h" 3 4 typedef __timer_t timer_t; # 133 "/usr/include/sys/types.h" 2 3 4 # 146 "/usr/include/sys/types.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 
typedef long unsigned int size_t; # 147 "/usr/include/sys/types.h" 2 3 4 typedef unsigned long int ulong; typedef unsigned short int ushort; typedef unsigned int uint; # 200 "/usr/include/sys/types.h" 3 4 typedef unsigned int u_int8_t __attribute__ ((__mode__ (__QI__))); typedef unsigned int u_int16_t __attribute__ ((__mode__ (__HI__))); typedef unsigned int u_int32_t __attribute__ ((__mode__ (__SI__))); typedef unsigned int u_int64_t __attribute__ ((__mode__ (__DI__))); typedef int register_t __attribute__ ((__mode__ (__word__))); # 216 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/endian.h" 1 3 4 # 36 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/endian.h" 1 3 4 # 37 "/usr/include/endian.h" 2 3 4 # 60 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/byteswap.h" 1 3 4 # 28 "/usr/include/bits/byteswap.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 29 "/usr/include/bits/byteswap.h" 2 3 4 # 1 "/usr/include/bits/byteswap-16.h" 1 3 4 # 36 "/usr/include/bits/byteswap.h" 2 3 4 # 44 "/usr/include/bits/byteswap.h" 3 4 static __inline unsigned int __bswap_32 (unsigned int __bsx) { return __builtin_bswap32 (__bsx); } # 108 "/usr/include/bits/byteswap.h" 3 4 static __inline __uint64_t __bswap_64 (__uint64_t __bsx) { return __builtin_bswap64 (__bsx); } # 61 "/usr/include/endian.h" 2 3 4 # 217 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/select.h" 1 3 4 # 30 "/usr/include/sys/select.h" 3 4 # 1 "/usr/include/bits/select.h" 1 3 4 # 22 "/usr/include/bits/select.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 23 "/usr/include/bits/select.h" 2 3 4 # 31 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/sigset.h" 1 3 4 # 22 "/usr/include/bits/sigset.h" 3 4 typedef int __sig_atomic_t; typedef struct { unsigned long int __val[(1024 / (8 * sizeof (unsigned long int)))]; } __sigset_t; # 34 "/usr/include/sys/select.h" 2 3 4 typedef __sigset_t sigset_t; # 1 "/usr/include/time.h" 1 3 4 # 120 "/usr/include/time.h" 3 4 struct timespec { __time_t tv_sec; 
__syscall_slong_t tv_nsec; }; # 46 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 30 "/usr/include/bits/time.h" 3 4 struct timeval { __time_t tv_sec; __suseconds_t tv_usec; }; # 48 "/usr/include/sys/select.h" 2 3 4 typedef __suseconds_t suseconds_t; typedef long int __fd_mask; # 66 "/usr/include/sys/select.h" 3 4 typedef struct { __fd_mask __fds_bits[1024 / (8 * (int) sizeof (__fd_mask))]; } fd_set; typedef __fd_mask fd_mask; # 98 "/usr/include/sys/select.h" 3 4 # 108 "/usr/include/sys/select.h" 3 4 extern int select (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, struct timeval *__restrict __timeout); # 120 "/usr/include/sys/select.h" 3 4 extern int pselect (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, const struct timespec *__restrict __timeout, const __sigset_t *__restrict __sigmask); # 133 "/usr/include/sys/select.h" 3 4 # 220 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/sysmacros.h" 1 3 4 # 24 "/usr/include/sys/sysmacros.h" 3 4 __extension__ extern unsigned int gnu_dev_major (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned int gnu_dev_minor (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned long long int gnu_dev_makedev (unsigned int __major, unsigned int __minor) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 58 "/usr/include/sys/sysmacros.h" 3 4 # 223 "/usr/include/sys/types.h" 2 3 4 typedef __blksize_t blksize_t; typedef __blkcnt_t blkcnt_t; typedef __fsblkcnt_t fsblkcnt_t; typedef __fsfilcnt_t fsfilcnt_t; # 270 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/bits/pthreadtypes.h" 1 3 4 # 21 "/usr/include/bits/pthreadtypes.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 22 "/usr/include/bits/pthreadtypes.h" 2 3 4 # 60 
"/usr/include/bits/pthreadtypes.h" 3 4 typedef unsigned long int pthread_t; union pthread_attr_t { char __size[56]; long int __align; }; typedef union pthread_attr_t pthread_attr_t; typedef struct __pthread_internal_list { struct __pthread_internal_list *__prev; struct __pthread_internal_list *__next; } __pthread_list_t; # 90 "/usr/include/bits/pthreadtypes.h" 3 4 typedef union { struct __pthread_mutex_s { int __lock; unsigned int __count; int __owner; unsigned int __nusers; int __kind; short __spins; short __elision; __pthread_list_t __list; # 125 "/usr/include/bits/pthreadtypes.h" 3 4 } __data; char __size[40]; long int __align; } pthread_mutex_t; typedef union { char __size[4]; int __align; } pthread_mutexattr_t; typedef union { struct { int __lock; unsigned int __futex; __extension__ unsigned long long int __total_seq; __extension__ unsigned long long int __wakeup_seq; __extension__ unsigned long long int __woken_seq; void *__mutex; unsigned int __nwaiters; unsigned int __broadcast_seq; } __data; char __size[48]; __extension__ long long int __align; } pthread_cond_t; typedef union { char __size[4]; int __align; } pthread_condattr_t; typedef unsigned int pthread_key_t; typedef int pthread_once_t; typedef union { struct { int __lock; unsigned int __nr_readers; unsigned int __readers_wakeup; unsigned int __writer_wakeup; unsigned int __nr_readers_queued; unsigned int __nr_writers_queued; int __writer; int __shared; signed char __rwelision; unsigned char __pad1[7]; unsigned long int __pad2; unsigned int __flags; } __data; # 220 "/usr/include/bits/pthreadtypes.h" 3 4 char __size[56]; long int __align; } pthread_rwlock_t; typedef union { char __size[8]; long int __align; } pthread_rwlockattr_t; typedef volatile int pthread_spinlock_t; typedef union { char __size[32]; long int __align; } pthread_barrier_t; typedef union { char __size[4]; int __align; } pthread_barrierattr_t; # 271 "/usr/include/sys/types.h" 2 3 4 # 24 "/usr/include/sys/uio.h" 2 3 4 # 1 
"/usr/include/bits/uio.h" 1 3 4 # 43 "/usr/include/bits/uio.h" 3 4 struct iovec { void *iov_base; size_t iov_len; }; # 29 "/usr/include/sys/uio.h" 2 3 4 # 39 "/usr/include/sys/uio.h" 3 4 extern ssize_t readv (int __fd, const struct iovec *__iovec, int __count) ; # 50 "/usr/include/sys/uio.h" 3 4 extern ssize_t writev (int __fd, const struct iovec *__iovec, int __count) ; # 65 "/usr/include/sys/uio.h" 3 4 extern ssize_t preadv (int __fd, const struct iovec *__iovec, int __count, __off_t __offset) ; # 77 "/usr/include/sys/uio.h" 3 4 extern ssize_t pwritev (int __fd, const struct iovec *__iovec, int __count, __off_t __offset) ; # 120 "/usr/include/sys/uio.h" 3 4 # 27 "/usr/include/sys/socket.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 29 "/usr/include/sys/socket.h" 2 3 4 # 38 "/usr/include/sys/socket.h" 3 4 # 1 "/usr/include/bits/socket.h" 1 3 4 # 27 "/usr/include/bits/socket.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 28 "/usr/include/bits/socket.h" 2 3 4 typedef __socklen_t socklen_t; # 1 "/usr/include/bits/socket_type.h" 1 3 4 # 24 "/usr/include/bits/socket_type.h" 3 4 enum __socket_type { SOCK_STREAM = 1, SOCK_DGRAM = 2, SOCK_RAW = 3, SOCK_RDM = 4, SOCK_SEQPACKET = 5, SOCK_DCCP = 6, SOCK_PACKET = 10, SOCK_CLOEXEC = 02000000, SOCK_NONBLOCK = 00004000 }; # 39 "/usr/include/bits/socket.h" 2 3 4 # 167 "/usr/include/bits/socket.h" 3 4 # 1 "/usr/include/bits/sockaddr.h" 1 3 4 # 28 "/usr/include/bits/sockaddr.h" 3 4 typedef unsigned short int sa_family_t; # 168 "/usr/include/bits/socket.h" 2 3 4 struct sockaddr { sa_family_t sa_family; char sa_data[14]; }; # 183 "/usr/include/bits/socket.h" 3 4 struct sockaddr_storage { sa_family_t ss_family; char __ss_padding[(128 - (sizeof (unsigned short int)) - sizeof (unsigned long int))]; unsigned long int __ss_align; }; enum { MSG_OOB = 0x01, MSG_PEEK = 0x02, MSG_DONTROUTE = 0x04, MSG_CTRUNC = 0x08, MSG_PROXY = 0x10, MSG_TRUNC = 0x20, MSG_DONTWAIT = 0x40, MSG_EOR 
= 0x80, MSG_WAITALL = 0x100, MSG_FIN = 0x200, MSG_SYN = 0x400, MSG_CONFIRM = 0x800, MSG_RST = 0x1000, MSG_ERRQUEUE = 0x2000, MSG_NOSIGNAL = 0x4000, MSG_MORE = 0x8000, MSG_WAITFORONE = 0x10000, MSG_BATCH = 0x40000, MSG_FASTOPEN = 0x20000000, MSG_CMSG_CLOEXEC = 0x40000000 }; struct msghdr { void *msg_name; socklen_t msg_namelen; struct iovec *msg_iov; size_t msg_iovlen; void *msg_control; size_t msg_controllen; int msg_flags; }; struct cmsghdr { size_t cmsg_len; int cmsg_level; int cmsg_type; __extension__ unsigned char __cmsg_data []; }; # 295 "/usr/include/bits/socket.h" 3 4 extern struct cmsghdr *__cmsg_nxthdr (struct msghdr *__mhdr, struct cmsghdr *__cmsg) __attribute__ ((__nothrow__ , __leaf__)); # 322 "/usr/include/bits/socket.h" 3 4 enum { SCM_RIGHTS = 0x01 }; # 368 "/usr/include/bits/socket.h" 3 4 # 1 "/usr/include/asm/socket.h" 1 3 4 # 1 "/usr/include/asm-generic/socket.h" 1 3 4 # 1 "/usr/include/asm/sockios.h" 1 3 4 # 1 "/usr/include/asm-generic/sockios.h" 1 3 4 # 1 "/usr/include/asm/sockios.h" 2 3 4 # 5 "/usr/include/asm-generic/socket.h" 2 3 4 # 1 "/usr/include/asm/socket.h" 2 3 4 # 369 "/usr/include/bits/socket.h" 2 3 4 # 402 "/usr/include/bits/socket.h" 3 4 struct linger { int l_onoff; int l_linger; }; # 39 "/usr/include/sys/socket.h" 2 3 4 struct osockaddr { unsigned short int sa_family; unsigned char sa_data[14]; }; enum { SHUT_RD = 0, SHUT_WR, SHUT_RDWR }; # 113 "/usr/include/sys/socket.h" 3 4 extern int socket (int __domain, int __type, int __protocol) __attribute__ ((__nothrow__ , __leaf__)); extern int socketpair (int __domain, int __type, int __protocol, int __fds[2]) __attribute__ ((__nothrow__ , __leaf__)); extern int bind (int __fd, const struct sockaddr * __addr, socklen_t __len) __attribute__ ((__nothrow__ , __leaf__)); extern int getsockname (int __fd, struct sockaddr *__restrict __addr, socklen_t *__restrict __len) __attribute__ ((__nothrow__ , __leaf__)); # 137 "/usr/include/sys/socket.h" 3 4 extern int connect (int __fd, const struct 
sockaddr * __addr, socklen_t __len); extern int getpeername (int __fd, struct sockaddr *__restrict __addr, socklen_t *__restrict __len) __attribute__ ((__nothrow__ , __leaf__)); extern ssize_t send (int __fd, const void *__buf, size_t __n, int __flags); extern ssize_t recv (int __fd, void *__buf, size_t __n, int __flags); extern ssize_t sendto (int __fd, const void *__buf, size_t __n, int __flags, const struct sockaddr * __addr, socklen_t __addr_len); # 174 "/usr/include/sys/socket.h" 3 4 extern ssize_t recvfrom (int __fd, void *__restrict __buf, size_t __n, int __flags, struct sockaddr *__restrict __addr, socklen_t *__restrict __addr_len); extern ssize_t sendmsg (int __fd, const struct msghdr *__message, int __flags); # 202 "/usr/include/sys/socket.h" 3 4 extern ssize_t recvmsg (int __fd, struct msghdr *__message, int __flags); # 219 "/usr/include/sys/socket.h" 3 4 extern int getsockopt (int __fd, int __level, int __optname, void *__restrict __optval, socklen_t *__restrict __optlen) __attribute__ ((__nothrow__ , __leaf__)); extern int setsockopt (int __fd, int __level, int __optname, const void *__optval, socklen_t __optlen) __attribute__ ((__nothrow__ , __leaf__)); extern int listen (int __fd, int __n) __attribute__ ((__nothrow__ , __leaf__)); # 243 "/usr/include/sys/socket.h" 3 4 extern int accept (int __fd, struct sockaddr *__restrict __addr, socklen_t *__restrict __addr_len); # 261 "/usr/include/sys/socket.h" 3 4 extern int shutdown (int __fd, int __how) __attribute__ ((__nothrow__ , __leaf__)); extern int sockatmark (int __fd) __attribute__ ((__nothrow__ , __leaf__)); extern int isfdtype (int __fd, int __fdtype) __attribute__ ((__nothrow__ , __leaf__)); # 283 "/usr/include/sys/socket.h" 3 4 # 24 "/usr/include/netinet/in.h" 2 3 4 typedef uint32_t in_addr_t; struct in_addr { in_addr_t s_addr; }; # 1 "/usr/include/bits/in.h" 1 3 4 # 141 "/usr/include/bits/in.h" 3 4 struct ip_opts { struct in_addr ip_dst; char ip_opts[40]; }; struct ip_mreqn { struct in_addr 
imr_multiaddr; struct in_addr imr_address; int imr_ifindex; }; struct in_pktinfo { int ipi_ifindex; struct in_addr ipi_spec_dst; struct in_addr ipi_addr; }; # 38 "/usr/include/netinet/in.h" 2 3 4 enum { IPPROTO_IP = 0, IPPROTO_ICMP = 1, IPPROTO_IGMP = 2, IPPROTO_IPIP = 4, IPPROTO_TCP = 6, IPPROTO_EGP = 8, IPPROTO_PUP = 12, IPPROTO_UDP = 17, IPPROTO_IDP = 22, IPPROTO_TP = 29, IPPROTO_DCCP = 33, IPPROTO_IPV6 = 41, IPPROTO_RSVP = 46, IPPROTO_GRE = 47, IPPROTO_ESP = 50, IPPROTO_AH = 51, IPPROTO_MTP = 92, IPPROTO_BEETPH = 94, IPPROTO_ENCAP = 98, IPPROTO_PIM = 103, IPPROTO_COMP = 108, IPPROTO_SCTP = 132, IPPROTO_UDPLITE = 136, IPPROTO_MPLS = 137, IPPROTO_RAW = 255, IPPROTO_MAX }; enum { IPPROTO_HOPOPTS = 0, IPPROTO_ROUTING = 43, IPPROTO_FRAGMENT = 44, IPPROTO_ICMPV6 = 58, IPPROTO_NONE = 59, IPPROTO_DSTOPTS = 60, IPPROTO_MH = 135 }; typedef uint16_t in_port_t; enum { IPPORT_ECHO = 7, IPPORT_DISCARD = 9, IPPORT_SYSTAT = 11, IPPORT_DAYTIME = 13, IPPORT_NETSTAT = 15, IPPORT_FTP = 21, IPPORT_TELNET = 23, IPPORT_SMTP = 25, IPPORT_TIMESERVER = 37, IPPORT_NAMESERVER = 42, IPPORT_WHOIS = 43, IPPORT_MTP = 57, IPPORT_TFTP = 69, IPPORT_RJE = 77, IPPORT_FINGER = 79, IPPORT_TTYLINK = 87, IPPORT_SUPDUP = 95, IPPORT_EXECSERVER = 512, IPPORT_LOGINSERVER = 513, IPPORT_CMDSERVER = 514, IPPORT_EFSSERVER = 520, IPPORT_BIFFUDP = 512, IPPORT_WHOSERVER = 513, IPPORT_ROUTESERVER = 520, IPPORT_RESERVED = 1024, IPPORT_USERRESERVED = 5000 }; # 211 "/usr/include/netinet/in.h" 3 4 struct in6_addr { union { uint8_t __u6_addr8[16]; uint16_t __u6_addr16[8]; uint32_t __u6_addr32[4]; } __in6_u; }; extern const struct in6_addr in6addr_any; extern const struct in6_addr in6addr_loopback; # 239 "/usr/include/netinet/in.h" 3 4 struct sockaddr_in { sa_family_t sin_family; in_port_t sin_port; struct in_addr sin_addr; unsigned char sin_zero[sizeof (struct sockaddr) - (sizeof (unsigned short int)) - sizeof (in_port_t) - sizeof (struct in_addr)]; }; struct sockaddr_in6 { sa_family_t sin6_family; in_port_t 
sin6_port; uint32_t sin6_flowinfo; struct in6_addr sin6_addr; uint32_t sin6_scope_id; }; struct ip_mreq { struct in_addr imr_multiaddr; struct in_addr imr_interface; }; struct ip_mreq_source { struct in_addr imr_multiaddr; struct in_addr imr_interface; struct in_addr imr_sourceaddr; }; struct ipv6_mreq { struct in6_addr ipv6mr_multiaddr; unsigned int ipv6mr_interface; }; struct group_req { uint32_t gr_interface; struct sockaddr_storage gr_group; }; struct group_source_req { uint32_t gsr_interface; struct sockaddr_storage gsr_group; struct sockaddr_storage gsr_source; }; struct ip_msfilter { struct in_addr imsf_multiaddr; struct in_addr imsf_interface; uint32_t imsf_fmode; uint32_t imsf_numsrc; struct in_addr imsf_slist[1]; }; struct group_filter { uint32_t gf_interface; struct sockaddr_storage gf_group; uint32_t gf_fmode; uint32_t gf_numsrc; struct sockaddr_storage gf_slist[1]; }; # 376 "/usr/include/netinet/in.h" 3 4 extern uint32_t ntohl (uint32_t __netlong) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern uint16_t ntohs (uint16_t __netshort) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern uint32_t htonl (uint32_t __hostlong) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern uint16_t htons (uint16_t __hostshort) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 1 "/usr/include/bits/byteswap.h" 1 3 4 # 388 "/usr/include/netinet/in.h" 2 3 4 # 503 "/usr/include/netinet/in.h" 3 4 extern int bindresvport (int __sockfd, struct sockaddr_in *__sock_in) __attribute__ ((__nothrow__ , __leaf__)); extern int bindresvport6 (int __sockfd, struct sockaddr_in6 *__sock_in) __attribute__ ((__nothrow__ , __leaf__)); # 631 "/usr/include/netinet/in.h" 3 4 # 28 "/usr/include/netdb.h" 2 3 4 # 1 "/usr/include/rpc/netdb.h" 1 3 4 # 42 "/usr/include/rpc/netdb.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 43 "/usr/include/rpc/netdb.h" 2 3 4 struct rpcent { 
char *r_name; char **r_aliases; int r_number; }; extern void setrpcent (int __stayopen) __attribute__ ((__nothrow__ , __leaf__)); extern void endrpcent (void) __attribute__ ((__nothrow__ , __leaf__)); extern struct rpcent *getrpcbyname (const char *__name) __attribute__ ((__nothrow__ , __leaf__)); extern struct rpcent *getrpcbynumber (int __number) __attribute__ ((__nothrow__ , __leaf__)); extern struct rpcent *getrpcent (void) __attribute__ ((__nothrow__ , __leaf__)); extern int getrpcbyname_r (const char *__name, struct rpcent *__result_buf, char *__buffer, size_t __buflen, struct rpcent **__result) __attribute__ ((__nothrow__ , __leaf__)); extern int getrpcbynumber_r (int __number, struct rpcent *__result_buf, char *__buffer, size_t __buflen, struct rpcent **__result) __attribute__ ((__nothrow__ , __leaf__)); extern int getrpcent_r (struct rpcent *__result_buf, char *__buffer, size_t __buflen, struct rpcent **__result) __attribute__ ((__nothrow__ , __leaf__)); # 33 "/usr/include/netdb.h" 2 3 4 # 42 "/usr/include/netdb.h" 3 4 # 1 "/usr/include/bits/netdb.h" 1 3 4 # 26 "/usr/include/bits/netdb.h" 3 4 struct netent { char *n_name; char **n_aliases; int n_addrtype; uint32_t n_net; }; # 43 "/usr/include/netdb.h" 2 3 4 # 53 "/usr/include/netdb.h" 3 4 extern int *__h_errno_location (void) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 92 "/usr/include/netdb.h" 3 4 extern void herror (const char *__str) __attribute__ ((__nothrow__ , __leaf__)); extern const char *hstrerror (int __err_num) __attribute__ ((__nothrow__ , __leaf__)); struct hostent { char *h_name; char **h_aliases; int h_addrtype; int h_length; char **h_addr_list; }; extern void sethostent (int __stay_open); extern void endhostent (void); extern struct hostent *gethostent (void); extern struct hostent *gethostbyaddr (const void *__addr, __socklen_t __len, int __type); extern struct hostent *gethostbyname (const char *__name); # 155 "/usr/include/netdb.h" 3 4 extern struct hostent 
*gethostbyname2 (const char *__name, int __af); # 167 "/usr/include/netdb.h" 3 4 extern int gethostent_r (struct hostent *__restrict __result_buf, char *__restrict __buf, size_t __buflen, struct hostent **__restrict __result, int *__restrict __h_errnop); extern int gethostbyaddr_r (const void *__restrict __addr, __socklen_t __len, int __type, struct hostent *__restrict __result_buf, char *__restrict __buf, size_t __buflen, struct hostent **__restrict __result, int *__restrict __h_errnop); extern int gethostbyname_r (const char *__restrict __name, struct hostent *__restrict __result_buf, char *__restrict __buf, size_t __buflen, struct hostent **__restrict __result, int *__restrict __h_errnop); extern int gethostbyname2_r (const char *__restrict __name, int __af, struct hostent *__restrict __result_buf, char *__restrict __buf, size_t __buflen, struct hostent **__restrict __result, int *__restrict __h_errnop); # 198 "/usr/include/netdb.h" 3 4 extern void setnetent (int __stay_open); extern void endnetent (void); extern struct netent *getnetent (void); extern struct netent *getnetbyaddr (uint32_t __net, int __type); extern struct netent *getnetbyname (const char *__name); # 237 "/usr/include/netdb.h" 3 4 extern int getnetent_r (struct netent *__restrict __result_buf, char *__restrict __buf, size_t __buflen, struct netent **__restrict __result, int *__restrict __h_errnop); extern int getnetbyaddr_r (uint32_t __net, int __type, struct netent *__restrict __result_buf, char *__restrict __buf, size_t __buflen, struct netent **__restrict __result, int *__restrict __h_errnop); extern int getnetbyname_r (const char *__restrict __name, struct netent *__restrict __result_buf, char *__restrict __buf, size_t __buflen, struct netent **__restrict __result, int *__restrict __h_errnop); struct servent { char *s_name; char **s_aliases; int s_port; char *s_proto; }; extern void setservent (int __stay_open); extern void endservent (void); extern struct servent *getservent (void); extern 
struct servent *getservbyname (const char *__name, const char *__proto); extern struct servent *getservbyport (int __port, const char *__proto); # 308 "/usr/include/netdb.h" 3 4 extern int getservent_r (struct servent *__restrict __result_buf, char *__restrict __buf, size_t __buflen, struct servent **__restrict __result); extern int getservbyname_r (const char *__restrict __name, const char *__restrict __proto, struct servent *__restrict __result_buf, char *__restrict __buf, size_t __buflen, struct servent **__restrict __result); extern int getservbyport_r (int __port, const char *__restrict __proto, struct servent *__restrict __result_buf, char *__restrict __buf, size_t __buflen, struct servent **__restrict __result); struct protoent { char *p_name; char **p_aliases; int p_proto; }; extern void setprotoent (int __stay_open); extern void endprotoent (void); extern struct protoent *getprotoent (void); extern struct protoent *getprotobyname (const char *__name); extern struct protoent *getprotobynumber (int __proto); # 374 "/usr/include/netdb.h" 3 4 extern int getprotoent_r (struct protoent *__restrict __result_buf, char *__restrict __buf, size_t __buflen, struct protoent **__restrict __result); extern int getprotobyname_r (const char *__restrict __name, struct protoent *__restrict __result_buf, char *__restrict __buf, size_t __buflen, struct protoent **__restrict __result); extern int getprotobynumber_r (int __proto, struct protoent *__restrict __result_buf, char *__restrict __buf, size_t __buflen, struct protoent **__restrict __result); # 395 "/usr/include/netdb.h" 3 4 extern int setnetgrent (const char *__netgroup); extern void endnetgrent (void); # 412 "/usr/include/netdb.h" 3 4 extern int getnetgrent (char **__restrict __hostp, char **__restrict __userp, char **__restrict __domainp); # 423 "/usr/include/netdb.h" 3 4 extern int innetgr (const char *__netgroup, const char *__host, const char *__user, const char *__domain); extern int getnetgrent_r (char 
**__restrict __hostp, char **__restrict __userp, char **__restrict __domainp, char *__restrict __buffer, size_t __buflen); # 451 "/usr/include/netdb.h" 3 4 extern int rcmd (char **__restrict __ahost, unsigned short int __rport, const char *__restrict __locuser, const char *__restrict __remuser, const char *__restrict __cmd, int *__restrict __fd2p); # 463 "/usr/include/netdb.h" 3 4 extern int rcmd_af (char **__restrict __ahost, unsigned short int __rport, const char *__restrict __locuser, const char *__restrict __remuser, const char *__restrict __cmd, int *__restrict __fd2p, sa_family_t __af); # 479 "/usr/include/netdb.h" 3 4 extern int rexec (char **__restrict __ahost, int __rport, const char *__restrict __name, const char *__restrict __pass, const char *__restrict __cmd, int *__restrict __fd2p); # 491 "/usr/include/netdb.h" 3 4 extern int rexec_af (char **__restrict __ahost, int __rport, const char *__restrict __name, const char *__restrict __pass, const char *__restrict __cmd, int *__restrict __fd2p, sa_family_t __af); # 505 "/usr/include/netdb.h" 3 4 extern int ruserok (const char *__rhost, int __suser, const char *__remuser, const char *__locuser); # 515 "/usr/include/netdb.h" 3 4 extern int ruserok_af (const char *__rhost, int __suser, const char *__remuser, const char *__locuser, sa_family_t __af); # 528 "/usr/include/netdb.h" 3 4 extern int iruserok (uint32_t __raddr, int __suser, const char *__remuser, const char *__locuser); # 539 "/usr/include/netdb.h" 3 4 extern int iruserok_af (const void *__raddr, int __suser, const char *__remuser, const char *__locuser, sa_family_t __af); # 551 "/usr/include/netdb.h" 3 4 extern int rresvport (int *__alport); # 560 "/usr/include/netdb.h" 3 4 extern int rresvport_af (int *__alport, sa_family_t __af); struct addrinfo { int ai_flags; int ai_family; int ai_socktype; int ai_protocol; socklen_t ai_addrlen; struct sockaddr *ai_addr; char *ai_canonname; struct addrinfo *ai_next; }; # 662 "/usr/include/netdb.h" 3 4 extern int 
getaddrinfo (const char *__restrict __name, const char *__restrict __service, const struct addrinfo *__restrict __req, struct addrinfo **__restrict __pai); extern void freeaddrinfo (struct addrinfo *__ai) __attribute__ ((__nothrow__ , __leaf__)); extern const char *gai_strerror (int __ecode) __attribute__ ((__nothrow__ , __leaf__)); extern int getnameinfo (const struct sockaddr *__restrict __sa, socklen_t __salen, char *__restrict __host, socklen_t __hostlen, char *__restrict __serv, socklen_t __servlen, int __flags); # 713 "/usr/include/netdb.h" 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_NETDB_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: search.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/search.h" 1 3 4 # 22 "/usr/include/search.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10
"/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 23 "/usr/include/search.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 26 "/usr/include/search.h" 2 3 4 # 44 "/usr/include/search.h" 3 4 extern void insque (void *__elem, void *__prev) __attribute__ ((__nothrow__ , __leaf__)); extern void remque (void *__elem) __attribute__ ((__nothrow__ , __leaf__)); typedef int (*__compar_fn_t) (const void *, const void *); typedef enum { FIND, ENTER } ACTION; typedef struct entry { char *key; void *data; } ENTRY; struct _ENTRY; # 87 "/usr/include/search.h" 3 4 extern ENTRY *hsearch (ENTRY __item, ACTION __action) __attribute__ ((__nothrow__ , __leaf__)); extern int hcreate (size_t __nel) __attribute__ ((__nothrow__ , __leaf__)); extern void hdestroy (void) __attribute__ ((__nothrow__ , __leaf__)); # 118 "/usr/include/search.h" 3 4 typedef enum { preorder, postorder, endorder, leaf } VISIT; extern void *tsearch (const void *__key, void **__rootp, __compar_fn_t __compar); extern void *tfind (const void *__key, void *const *__rootp, __compar_fn_t __compar); extern void *tdelete (const void *__restrict __key, void **__restrict __rootp, __compar_fn_t __compar); typedef void (*__action_fn_t) (const void *__nodep, VISIT __value, int __level); extern void twalk (const void *__root, __action_fn_t __action); # 164 "/usr/include/search.h" 3 4 extern void *lfind (const void *__key, const void *__base, size_t *__nmemb, size_t __size, __compar_fn_t __compar); extern void *lsearch (const void *__key, void *__base, size_t *__nmemb, size_t __size, __compar_fn_t __compar); # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: 
Defined "HAVE_SEARCH_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: mathimf.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Possible ERROR while running preprocessor: exit code 256 stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2stderr: /tmp/petsc-KvGRNM/config.headers/conftest.c:3:21: fatal error: mathimf.h: No such file or directory #include <mathimf.h> ^ compilation terminated. Source: #include "confdefs.h" #include "conffix.h" #include <mathimf.h> Preprocess stderr before filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:21: fatal error: mathimf.h: No such file or directory #include <mathimf.h> ^ compilation terminated.
: Preprocess stderr after filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:21: fatal error: mathimf.h: No such file or directory #include <mathimf.h> ^compilation terminated.: ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: sys/procfs.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/sys/procfs.h" 1 3 4 # 30 "/usr/include/sys/procfs.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 31 "/usr/include/sys/procfs.h" 2 3 4 # 1 "/usr/include/sys/time.h" 1 3 4 # 23 "/usr/include/sys/time.h" 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 # 30 "/usr/include/bits/types.h" 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned
int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 24 "/usr/include/sys/time.h" 2 3 4 # 1 "/usr/include/time.h" 1 3 4 # 73 "/usr/include/time.h" 3 4 typedef __time_t time_t; # 26 "/usr/include/sys/time.h" 2 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 30 "/usr/include/bits/time.h" 3 4 struct timeval { __time_t tv_sec; 
__suseconds_t tv_usec; }; # 28 "/usr/include/sys/time.h" 2 3 4 # 1 "/usr/include/sys/select.h" 1 3 4 # 30 "/usr/include/sys/select.h" 3 4 # 1 "/usr/include/bits/select.h" 1 3 4 # 22 "/usr/include/bits/select.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 23 "/usr/include/bits/select.h" 2 3 4 # 31 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/sigset.h" 1 3 4 # 22 "/usr/include/bits/sigset.h" 3 4 typedef int __sig_atomic_t; typedef struct { unsigned long int __val[(1024 / (8 * sizeof (unsigned long int)))]; } __sigset_t; # 34 "/usr/include/sys/select.h" 2 3 4 typedef __sigset_t sigset_t; # 1 "/usr/include/time.h" 1 3 4 # 120 "/usr/include/time.h" 3 4 struct timespec { __time_t tv_sec; __syscall_slong_t tv_nsec; }; # 46 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 48 "/usr/include/sys/select.h" 2 3 4 typedef __suseconds_t suseconds_t; typedef long int __fd_mask; # 66 "/usr/include/sys/select.h" 3 4 typedef struct { __fd_mask __fds_bits[1024 / (8 * (int) sizeof (__fd_mask))]; } fd_set; typedef __fd_mask fd_mask; # 98 "/usr/include/sys/select.h" 3 4 # 108 "/usr/include/sys/select.h" 3 4 extern int select (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, struct timeval *__restrict __timeout); # 120 "/usr/include/sys/select.h" 3 4 extern int pselect (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, const struct timespec *__restrict __timeout, const __sigset_t *__restrict __sigmask); # 133 "/usr/include/sys/select.h" 3 4 # 30 "/usr/include/sys/time.h" 2 3 4 # 55 "/usr/include/sys/time.h" 3 4 struct timezone { int tz_minuteswest; int tz_dsttime; }; typedef struct timezone *__restrict __timezone_ptr_t; # 71 "/usr/include/sys/time.h" 3 4 extern int gettimeofday (struct timeval *__restrict __tv, __timezone_ptr_t __tz) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int settimeofday (const 
struct timeval *__tv, const struct timezone *__tz) __attribute__ ((__nothrow__ , __leaf__)); extern int adjtime (const struct timeval *__delta, struct timeval *__olddelta) __attribute__ ((__nothrow__ , __leaf__)); enum __itimer_which { ITIMER_REAL = 0, ITIMER_VIRTUAL = 1, ITIMER_PROF = 2 }; struct itimerval { struct timeval it_interval; struct timeval it_value; }; typedef int __itimer_which_t; extern int getitimer (__itimer_which_t __which, struct itimerval *__value) __attribute__ ((__nothrow__ , __leaf__)); extern int setitimer (__itimer_which_t __which, const struct itimerval *__restrict __new, struct itimerval *__restrict __old) __attribute__ ((__nothrow__ , __leaf__)); extern int utimes (const char *__file, const struct timeval __tvp[2]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int lutimes (const char *__file, const struct timeval __tvp[2]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int futimes (int __fd, const struct timeval __tvp[2]) __attribute__ ((__nothrow__ , __leaf__)); # 189 "/usr/include/sys/time.h" 3 4 # 32 "/usr/include/sys/procfs.h" 2 3 4 # 1 "/usr/include/sys/types.h" 1 3 4 # 27 "/usr/include/sys/types.h" 3 4 typedef __u_char u_char; typedef __u_short u_short; typedef __u_int u_int; typedef __u_long u_long; typedef __quad_t quad_t; typedef __u_quad_t u_quad_t; typedef __fsid_t fsid_t; typedef __loff_t loff_t; typedef __ino_t ino_t; # 60 "/usr/include/sys/types.h" 3 4 typedef __dev_t dev_t; typedef __gid_t gid_t; typedef __mode_t mode_t; typedef __nlink_t nlink_t; typedef __uid_t uid_t; typedef __off_t off_t; # 98 "/usr/include/sys/types.h" 3 4 typedef __pid_t pid_t; typedef __id_t id_t; typedef __ssize_t ssize_t; typedef __daddr_t daddr_t; typedef __caddr_t caddr_t; typedef __key_t key_t; # 132 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/time.h" 1 3 4 # 57 "/usr/include/time.h" 3 4 typedef __clock_t clock_t; # 91 "/usr/include/time.h" 3 4 typedef __clockid_t 
clockid_t; # 103 "/usr/include/time.h" 3 4 typedef __timer_t timer_t; # 133 "/usr/include/sys/types.h" 2 3 4 # 146 "/usr/include/sys/types.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 147 "/usr/include/sys/types.h" 2 3 4 typedef unsigned long int ulong; typedef unsigned short int ushort; typedef unsigned int uint; # 194 "/usr/include/sys/types.h" 3 4 typedef int int8_t __attribute__ ((__mode__ (__QI__))); typedef int int16_t __attribute__ ((__mode__ (__HI__))); typedef int int32_t __attribute__ ((__mode__ (__SI__))); typedef int int64_t __attribute__ ((__mode__ (__DI__))); typedef unsigned int u_int8_t __attribute__ ((__mode__ (__QI__))); typedef unsigned int u_int16_t __attribute__ ((__mode__ (__HI__))); typedef unsigned int u_int32_t __attribute__ ((__mode__ (__SI__))); typedef unsigned int u_int64_t __attribute__ ((__mode__ (__DI__))); typedef int register_t __attribute__ ((__mode__ (__word__))); # 216 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/endian.h" 1 3 4 # 36 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/endian.h" 1 3 4 # 37 "/usr/include/endian.h" 2 3 4 # 60 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/byteswap.h" 1 3 4 # 28 "/usr/include/bits/byteswap.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 29 "/usr/include/bits/byteswap.h" 2 3 4 # 1 "/usr/include/bits/byteswap-16.h" 1 3 4 # 36 "/usr/include/bits/byteswap.h" 2 3 4 # 44 "/usr/include/bits/byteswap.h" 3 4 static __inline unsigned int __bswap_32 (unsigned int __bsx) { return __builtin_bswap32 (__bsx); } # 108 "/usr/include/bits/byteswap.h" 3 4 static __inline __uint64_t __bswap_64 (__uint64_t __bsx) { return __builtin_bswap64 (__bsx); } # 61 "/usr/include/endian.h" 2 3 4 # 217 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/sysmacros.h" 1 3 4 # 24 "/usr/include/sys/sysmacros.h" 3 4 __extension__ extern unsigned int gnu_dev_major (unsigned 
long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned int gnu_dev_minor (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned long long int gnu_dev_makedev (unsigned int __major, unsigned int __minor) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 58 "/usr/include/sys/sysmacros.h" 3 4 # 223 "/usr/include/sys/types.h" 2 3 4 typedef __blksize_t blksize_t; typedef __blkcnt_t blkcnt_t; typedef __fsblkcnt_t fsblkcnt_t; typedef __fsfilcnt_t fsfilcnt_t; # 270 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/bits/pthreadtypes.h" 1 3 4 # 21 "/usr/include/bits/pthreadtypes.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 22 "/usr/include/bits/pthreadtypes.h" 2 3 4 # 60 "/usr/include/bits/pthreadtypes.h" 3 4 typedef unsigned long int pthread_t; union pthread_attr_t { char __size[56]; long int __align; }; typedef union pthread_attr_t pthread_attr_t; typedef struct __pthread_internal_list { struct __pthread_internal_list *__prev; struct __pthread_internal_list *__next; } __pthread_list_t; # 90 "/usr/include/bits/pthreadtypes.h" 3 4 typedef union { struct __pthread_mutex_s { int __lock; unsigned int __count; int __owner; unsigned int __nusers; int __kind; short __spins; short __elision; __pthread_list_t __list; # 125 "/usr/include/bits/pthreadtypes.h" 3 4 } __data; char __size[40]; long int __align; } pthread_mutex_t; typedef union { char __size[4]; int __align; } pthread_mutexattr_t; typedef union { struct { int __lock; unsigned int __futex; __extension__ unsigned long long int __total_seq; __extension__ unsigned long long int __wakeup_seq; __extension__ unsigned long long int __woken_seq; void *__mutex; unsigned int __nwaiters; unsigned int __broadcast_seq; } __data; char __size[48]; __extension__ long long int __align; } pthread_cond_t; typedef union { char __size[4]; int __align; } pthread_condattr_t; 
typedef unsigned int pthread_key_t; typedef int pthread_once_t; typedef union { struct { int __lock; unsigned int __nr_readers; unsigned int __readers_wakeup; unsigned int __writer_wakeup; unsigned int __nr_readers_queued; unsigned int __nr_writers_queued; int __writer; int __shared; signed char __rwelision; unsigned char __pad1[7]; unsigned long int __pad2; unsigned int __flags; } __data; # 220 "/usr/include/bits/pthreadtypes.h" 3 4 char __size[56]; long int __align; } pthread_rwlock_t; typedef union { char __size[8]; long int __align; } pthread_rwlockattr_t; typedef volatile int pthread_spinlock_t; typedef union { char __size[32]; long int __align; } pthread_barrier_t; typedef union { char __size[4]; int __align; } pthread_barrierattr_t; # 271 "/usr/include/sys/types.h" 2 3 4 # 33 "/usr/include/sys/procfs.h" 2 3 4 # 1 "/usr/include/sys/user.h" 1 3 4 # 27 "/usr/include/sys/user.h" 3 4 struct user_fpregs_struct { unsigned short int cwd; unsigned short int swd; unsigned short int ftw; unsigned short int fop; __extension__ unsigned long long int rip; __extension__ unsigned long long int rdp; unsigned int mxcsr; unsigned int mxcr_mask; unsigned int st_space[32]; unsigned int xmm_space[64]; unsigned int padding[24]; }; struct user_regs_struct { __extension__ unsigned long long int r15; __extension__ unsigned long long int r14; __extension__ unsigned long long int r13; __extension__ unsigned long long int r12; __extension__ unsigned long long int rbp; __extension__ unsigned long long int rbx; __extension__ unsigned long long int r11; __extension__ unsigned long long int r10; __extension__ unsigned long long int r9; __extension__ unsigned long long int r8; __extension__ unsigned long long int rax; __extension__ unsigned long long int rcx; __extension__ unsigned long long int rdx; __extension__ unsigned long long int rsi; __extension__ unsigned long long int rdi; __extension__ unsigned long long int orig_rax; __extension__ unsigned long long int rip; __extension__ 
unsigned long long int cs; __extension__ unsigned long long int eflags; __extension__ unsigned long long int rsp; __extension__ unsigned long long int ss; __extension__ unsigned long long int fs_base; __extension__ unsigned long long int gs_base; __extension__ unsigned long long int ds; __extension__ unsigned long long int es; __extension__ unsigned long long int fs; __extension__ unsigned long long int gs; }; struct user { struct user_regs_struct regs; int u_fpvalid; struct user_fpregs_struct i387; __extension__ unsigned long long int u_tsize; __extension__ unsigned long long int u_dsize; __extension__ unsigned long long int u_ssize; __extension__ unsigned long long int start_code; __extension__ unsigned long long int start_stack; __extension__ long long int signal; int reserved; __extension__ union { struct user_regs_struct* u_ar0; __extension__ unsigned long long int __u_ar0_word; }; __extension__ union { struct user_fpregs_struct* u_fpstate; __extension__ unsigned long long int __u_fpstate_word; }; __extension__ unsigned long long int magic; char u_comm [32]; __extension__ unsigned long long int u_debugreg [8]; }; # 34 "/usr/include/sys/procfs.h" 2 3 4 __extension__ typedef unsigned long long elf_greg_t; # 49 "/usr/include/sys/procfs.h" 3 4 typedef elf_greg_t elf_gregset_t[(sizeof (struct user_regs_struct) / sizeof(elf_greg_t))]; # 63 "/usr/include/sys/procfs.h" 3 4 typedef struct user_fpregs_struct elf_fpregset_t; struct elf_siginfo { int si_signo; int si_code; int si_errno; }; # 82 "/usr/include/sys/procfs.h" 3 4 struct elf_prstatus { struct elf_siginfo pr_info; short int pr_cursig; unsigned long int pr_sigpend; unsigned long int pr_sighold; __pid_t pr_pid; __pid_t pr_ppid; __pid_t pr_pgrp; __pid_t pr_sid; struct timeval pr_utime; struct timeval pr_stime; struct timeval pr_cutime; struct timeval pr_cstime; elf_gregset_t pr_reg; int pr_fpvalid; }; struct elf_prpsinfo { char pr_state; char pr_sname; char pr_zomb; char pr_nice; unsigned long int pr_flag; 
unsigned int pr_uid; unsigned int pr_gid; int pr_pid, pr_ppid, pr_pgrp, pr_sid; char pr_fname[16]; char pr_psargs[(80)]; }; typedef void *psaddr_t; typedef elf_gregset_t prgregset_t; typedef elf_fpregset_t prfpregset_t; typedef __pid_t lwpid_t; typedef struct elf_prstatus prstatus_t; typedef struct elf_prpsinfo prpsinfo_t; # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_SYS_PROCFS_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: sys/resource.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/sys/resource.h" 1 3 4 # 21 "/usr/include/sys/resource.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 22 "/usr/include/sys/resource.h" 2 3 4 # 1
"/usr/include/bits/resource.h" 1 3 4 # 23 "/usr/include/bits/resource.h" 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 # 30 "/usr/include/bits/types.h" 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char 
*__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 24 "/usr/include/bits/resource.h" 2 3 4 enum __rlimit_resource { RLIMIT_CPU = 0, RLIMIT_FSIZE = 1, RLIMIT_DATA = 2, RLIMIT_STACK = 3, RLIMIT_CORE = 4, __RLIMIT_RSS = 5, RLIMIT_NOFILE = 7, __RLIMIT_OFILE = RLIMIT_NOFILE, RLIMIT_AS = 9, __RLIMIT_NPROC = 6, __RLIMIT_MEMLOCK = 8, __RLIMIT_LOCKS = 10, __RLIMIT_SIGPENDING = 11, __RLIMIT_MSGQUEUE = 12, __RLIMIT_NICE = 13, __RLIMIT_RTPRIO = 14, __RLIMIT_RTTIME = 15, __RLIMIT_NLIMITS = 16, __RLIM_NLIMITS = __RLIMIT_NLIMITS }; # 131 "/usr/include/bits/resource.h" 3 4 typedef __rlim_t rlim_t; struct rlimit { rlim_t rlim_cur; rlim_t rlim_max; }; # 158 "/usr/include/bits/resource.h" 3 4 enum __rusage_who { RUSAGE_SELF = 0, RUSAGE_CHILDREN = -1 # 176 "/usr/include/bits/resource.h" 3 4 }; # 1 "/usr/include/bits/time.h" 1 3 4 # 30 "/usr/include/bits/time.h" 3 4 struct timeval { __time_t tv_sec; __suseconds_t tv_usec; }; # 180 "/usr/include/bits/resource.h" 2 3 4 struct rusage { struct timeval ru_utime; struct timeval ru_stime; __extension__ union { long int ru_maxrss; __syscall_slong_t __ru_maxrss_word; }; __extension__ union { long int ru_ixrss; __syscall_slong_t __ru_ixrss_word; }; __extension__ union { long int ru_idrss; __syscall_slong_t __ru_idrss_word; }; __extension__ union { long int ru_isrss; __syscall_slong_t __ru_isrss_word; }; __extension__ union { long int ru_minflt; __syscall_slong_t __ru_minflt_word; }; __extension__ union { long int ru_majflt; __syscall_slong_t __ru_majflt_word; }; __extension__ union { long int ru_nswap; __syscall_slong_t __ru_nswap_word; }; __extension__ union { long int ru_inblock; __syscall_slong_t __ru_inblock_word; }; __extension__ union { long int ru_oublock; __syscall_slong_t __ru_oublock_word; }; __extension__ union { long int ru_msgsnd; __syscall_slong_t __ru_msgsnd_word; }; __extension__ union { long int ru_msgrcv; __syscall_slong_t __ru_msgrcv_word; }; __extension__ union { long int ru_nsignals; 
__syscall_slong_t __ru_nsignals_word; }; __extension__ union { long int ru_nvcsw; __syscall_slong_t __ru_nvcsw_word; }; __extension__ union { long int ru_nivcsw; __syscall_slong_t __ru_nivcsw_word; }; }; enum __priority_which { PRIO_PROCESS = 0, PRIO_PGRP = 1, PRIO_USER = 2 }; # 328 "/usr/include/bits/resource.h" 3 4 # 25 "/usr/include/sys/resource.h" 2 3 4 typedef __id_t id_t; # 42 "/usr/include/sys/resource.h" 3 4 typedef int __rlimit_resource_t; typedef int __rusage_who_t; typedef int __priority_which_t; extern int getrlimit (__rlimit_resource_t __resource, struct rlimit *__rlimits) __attribute__ ((__nothrow__ , __leaf__)); # 69 "/usr/include/sys/resource.h" 3 4 extern int setrlimit (__rlimit_resource_t __resource, const struct rlimit *__rlimits) __attribute__ ((__nothrow__ , __leaf__)); # 87 "/usr/include/sys/resource.h" 3 4 extern int getrusage (__rusage_who_t __who, struct rusage *__usage) __attribute__ ((__nothrow__ , __leaf__)); extern int getpriority (__priority_which_t __which, id_t __who) __attribute__ ((__nothrow__ , __leaf__)); extern int setpriority (__priority_which_t __which, id_t __who, int __prio) __attribute__ ((__nothrow__ , __leaf__)); # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_SYS_RESOURCE_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: unistd.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c"
# 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/unistd.h" 1 3 4 # 25 "/usr/include/unistd.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 26 "/usr/include/unistd.h" 2 3 4 # 205 "/usr/include/unistd.h" 3 4 # 1 "/usr/include/bits/posix_opt.h" 1 3 4 # 206 "/usr/include/unistd.h" 2 3 4 # 1 "/usr/include/bits/environments.h" 1 3 4 # 22 "/usr/include/bits/environments.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 23 "/usr/include/bits/environments.h" 2 3 4 # 210 "/usr/include/unistd.h" 2 3 4 # 220 "/usr/include/unistd.h" 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 # 30 "/usr/include/bits/types.h" 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int 
__uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 221 "/usr/include/unistd.h" 2 3 4 typedef __ssize_t ssize_t; # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 230 "/usr/include/unistd.h" 2 3 4 typedef __gid_t gid_t; typedef __uid_t uid_t; typedef __off_t off_t; # 258 "/usr/include/unistd.h" 3 4 typedef __useconds_t useconds_t; typedef __pid_t pid_t; typedef __intptr_t intptr_t; typedef __socklen_t socklen_t; # 290 "/usr/include/unistd.h" 3 4 extern int access (const char *__name, int __type) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 307 "/usr/include/unistd.h" 3 4 extern int faccessat (int __fd, const char *__file, int __type, int __flag) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))) ; # 337 
"/usr/include/unistd.h" 3 4 extern __off_t lseek (int __fd, __off_t __offset, int __whence) __attribute__ ((__nothrow__ , __leaf__)); # 356 "/usr/include/unistd.h" 3 4 extern int close (int __fd); extern ssize_t read (int __fd, void *__buf, size_t __nbytes) ; extern ssize_t write (int __fd, const void *__buf, size_t __n) ; # 379 "/usr/include/unistd.h" 3 4 extern ssize_t pread (int __fd, void *__buf, size_t __nbytes, __off_t __offset) ; extern ssize_t pwrite (int __fd, const void *__buf, size_t __n, __off_t __offset) ; # 420 "/usr/include/unistd.h" 3 4 extern int pipe (int __pipedes[2]) __attribute__ ((__nothrow__ , __leaf__)) ; # 435 "/usr/include/unistd.h" 3 4 extern unsigned int alarm (unsigned int __seconds) __attribute__ ((__nothrow__ , __leaf__)); # 447 "/usr/include/unistd.h" 3 4 extern unsigned int sleep (unsigned int __seconds); extern __useconds_t ualarm (__useconds_t __value, __useconds_t __interval) __attribute__ ((__nothrow__ , __leaf__)); extern int usleep (__useconds_t __useconds); # 472 "/usr/include/unistd.h" 3 4 extern int pause (void); extern int chown (const char *__file, __uid_t __owner, __gid_t __group) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; extern int fchown (int __fd, __uid_t __owner, __gid_t __group) __attribute__ ((__nothrow__ , __leaf__)) ; extern int lchown (const char *__file, __uid_t __owner, __gid_t __group) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; extern int fchownat (int __fd, const char *__file, __uid_t __owner, __gid_t __group, int __flag) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))) ; extern int chdir (const char *__path) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; extern int fchdir (int __fd) __attribute__ ((__nothrow__ , __leaf__)) ; # 514 "/usr/include/unistd.h" 3 4 extern char *getcwd (char *__buf, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) ; # 528 "/usr/include/unistd.h" 3 4 
extern char *getwd (char *__buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) __attribute__ ((__deprecated__)) ; extern int dup (int __fd) __attribute__ ((__nothrow__ , __leaf__)) ; extern int dup2 (int __fd, int __fd2) __attribute__ ((__nothrow__ , __leaf__)); # 546 "/usr/include/unistd.h" 3 4 extern char **__environ; extern int execve (const char *__path, char *const __argv[], char *const __envp[]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int fexecve (int __fd, char *const __argv[], char *const __envp[]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern int execv (const char *__path, char *const __argv[]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int execle (const char *__path, const char *__arg, ...) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int execl (const char *__path, const char *__arg, ...) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int execvp (const char *__file, char *const __argv[]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int execlp (const char *__file, const char *__arg, ...) 
__attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); # 601 "/usr/include/unistd.h" 3 4 extern int nice (int __inc) __attribute__ ((__nothrow__ , __leaf__)) ; extern void _exit (int __status) __attribute__ ((__noreturn__)); # 1 "/usr/include/bits/confname.h" 1 3 4 # 24 "/usr/include/bits/confname.h" 3 4 enum { _PC_LINK_MAX, _PC_MAX_CANON, _PC_MAX_INPUT, _PC_NAME_MAX, _PC_PATH_MAX, _PC_PIPE_BUF, _PC_CHOWN_RESTRICTED, _PC_NO_TRUNC, _PC_VDISABLE, _PC_SYNC_IO, _PC_ASYNC_IO, _PC_PRIO_IO, _PC_SOCK_MAXBUF, _PC_FILESIZEBITS, _PC_REC_INCR_XFER_SIZE, _PC_REC_MAX_XFER_SIZE, _PC_REC_MIN_XFER_SIZE, _PC_REC_XFER_ALIGN, _PC_ALLOC_SIZE_MIN, _PC_SYMLINK_MAX, _PC_2_SYMLINKS }; enum { _SC_ARG_MAX, _SC_CHILD_MAX, _SC_CLK_TCK, _SC_NGROUPS_MAX, _SC_OPEN_MAX, _SC_STREAM_MAX, _SC_TZNAME_MAX, _SC_JOB_CONTROL, _SC_SAVED_IDS, _SC_REALTIME_SIGNALS, _SC_PRIORITY_SCHEDULING, _SC_TIMERS, _SC_ASYNCHRONOUS_IO, _SC_PRIORITIZED_IO, _SC_SYNCHRONIZED_IO, _SC_FSYNC, _SC_MAPPED_FILES, _SC_MEMLOCK, _SC_MEMLOCK_RANGE, _SC_MEMORY_PROTECTION, _SC_MESSAGE_PASSING, _SC_SEMAPHORES, _SC_SHARED_MEMORY_OBJECTS, _SC_AIO_LISTIO_MAX, _SC_AIO_MAX, _SC_AIO_PRIO_DELTA_MAX, _SC_DELAYTIMER_MAX, _SC_MQ_OPEN_MAX, _SC_MQ_PRIO_MAX, _SC_VERSION, _SC_PAGESIZE, _SC_RTSIG_MAX, _SC_SEM_NSEMS_MAX, _SC_SEM_VALUE_MAX, _SC_SIGQUEUE_MAX, _SC_TIMER_MAX, _SC_BC_BASE_MAX, _SC_BC_DIM_MAX, _SC_BC_SCALE_MAX, _SC_BC_STRING_MAX, _SC_COLL_WEIGHTS_MAX, _SC_EQUIV_CLASS_MAX, _SC_EXPR_NEST_MAX, _SC_LINE_MAX, _SC_RE_DUP_MAX, _SC_CHARCLASS_NAME_MAX, _SC_2_VERSION, _SC_2_C_BIND, _SC_2_C_DEV, _SC_2_FORT_DEV, _SC_2_FORT_RUN, _SC_2_SW_DEV, _SC_2_LOCALEDEF, _SC_PII, _SC_PII_XTI, _SC_PII_SOCKET, _SC_PII_INTERNET, _SC_PII_OSI, _SC_POLL, _SC_SELECT, _SC_UIO_MAXIOV, _SC_IOV_MAX = _SC_UIO_MAXIOV, _SC_PII_INTERNET_STREAM, _SC_PII_INTERNET_DGRAM, _SC_PII_OSI_COTS, _SC_PII_OSI_CLTS, _SC_PII_OSI_M, _SC_T_IOV_MAX, _SC_THREADS, _SC_THREAD_SAFE_FUNCTIONS, _SC_GETGR_R_SIZE_MAX, _SC_GETPW_R_SIZE_MAX, _SC_LOGIN_NAME_MAX, _SC_TTY_NAME_MAX, 
_SC_THREAD_DESTRUCTOR_ITERATIONS, _SC_THREAD_KEYS_MAX, _SC_THREAD_STACK_MIN, _SC_THREAD_THREADS_MAX, _SC_THREAD_ATTR_STACKADDR, _SC_THREAD_ATTR_STACKSIZE, _SC_THREAD_PRIORITY_SCHEDULING, _SC_THREAD_PRIO_INHERIT, _SC_THREAD_PRIO_PROTECT, _SC_THREAD_PROCESS_SHARED, _SC_NPROCESSORS_CONF, _SC_NPROCESSORS_ONLN, _SC_PHYS_PAGES, _SC_AVPHYS_PAGES, _SC_ATEXIT_MAX, _SC_PASS_MAX, _SC_XOPEN_VERSION, _SC_XOPEN_XCU_VERSION, _SC_XOPEN_UNIX, _SC_XOPEN_CRYPT, _SC_XOPEN_ENH_I18N, _SC_XOPEN_SHM, _SC_2_CHAR_TERM, _SC_2_C_VERSION, _SC_2_UPE, _SC_XOPEN_XPG2, _SC_XOPEN_XPG3, _SC_XOPEN_XPG4, _SC_CHAR_BIT, _SC_CHAR_MAX, _SC_CHAR_MIN, _SC_INT_MAX, _SC_INT_MIN, _SC_LONG_BIT, _SC_WORD_BIT, _SC_MB_LEN_MAX, _SC_NZERO, _SC_SSIZE_MAX, _SC_SCHAR_MAX, _SC_SCHAR_MIN, _SC_SHRT_MAX, _SC_SHRT_MIN, _SC_UCHAR_MAX, _SC_UINT_MAX, _SC_ULONG_MAX, _SC_USHRT_MAX, _SC_NL_ARGMAX, _SC_NL_LANGMAX, _SC_NL_MSGMAX, _SC_NL_NMAX, _SC_NL_SETMAX, _SC_NL_TEXTMAX, _SC_XBS5_ILP32_OFF32, _SC_XBS5_ILP32_OFFBIG, _SC_XBS5_LP64_OFF64, _SC_XBS5_LPBIG_OFFBIG, _SC_XOPEN_LEGACY, _SC_XOPEN_REALTIME, _SC_XOPEN_REALTIME_THREADS, _SC_ADVISORY_INFO, _SC_BARRIERS, _SC_BASE, _SC_C_LANG_SUPPORT, _SC_C_LANG_SUPPORT_R, _SC_CLOCK_SELECTION, _SC_CPUTIME, _SC_THREAD_CPUTIME, _SC_DEVICE_IO, _SC_DEVICE_SPECIFIC, _SC_DEVICE_SPECIFIC_R, _SC_FD_MGMT, _SC_FIFO, _SC_PIPE, _SC_FILE_ATTRIBUTES, _SC_FILE_LOCKING, _SC_FILE_SYSTEM, _SC_MONOTONIC_CLOCK, _SC_MULTI_PROCESS, _SC_SINGLE_PROCESS, _SC_NETWORKING, _SC_READER_WRITER_LOCKS, _SC_SPIN_LOCKS, _SC_REGEXP, _SC_REGEX_VERSION, _SC_SHELL, _SC_SIGNALS, _SC_SPAWN, _SC_SPORADIC_SERVER, _SC_THREAD_SPORADIC_SERVER, _SC_SYSTEM_DATABASE, _SC_SYSTEM_DATABASE_R, _SC_TIMEOUTS, _SC_TYPED_MEMORY_OBJECTS, _SC_USER_GROUPS, _SC_USER_GROUPS_R, _SC_2_PBS, _SC_2_PBS_ACCOUNTING, _SC_2_PBS_LOCATE, _SC_2_PBS_MESSAGE, _SC_2_PBS_TRACK, _SC_SYMLOOP_MAX, _SC_STREAMS, _SC_2_PBS_CHECKPOINT, _SC_V6_ILP32_OFF32, _SC_V6_ILP32_OFFBIG, _SC_V6_LP64_OFF64, _SC_V6_LPBIG_OFFBIG, _SC_HOST_NAME_MAX, _SC_TRACE, _SC_TRACE_EVENT_FILTER, 
_SC_TRACE_INHERIT, _SC_TRACE_LOG, _SC_LEVEL1_ICACHE_SIZE, _SC_LEVEL1_ICACHE_ASSOC, _SC_LEVEL1_ICACHE_LINESIZE, _SC_LEVEL1_DCACHE_SIZE, _SC_LEVEL1_DCACHE_ASSOC, _SC_LEVEL1_DCACHE_LINESIZE, _SC_LEVEL2_CACHE_SIZE, _SC_LEVEL2_CACHE_ASSOC, _SC_LEVEL2_CACHE_LINESIZE, _SC_LEVEL3_CACHE_SIZE, _SC_LEVEL3_CACHE_ASSOC, _SC_LEVEL3_CACHE_LINESIZE, _SC_LEVEL4_CACHE_SIZE, _SC_LEVEL4_CACHE_ASSOC, _SC_LEVEL4_CACHE_LINESIZE, _SC_IPV6 = _SC_LEVEL1_ICACHE_SIZE + 50, _SC_RAW_SOCKETS, _SC_V7_ILP32_OFF32, _SC_V7_ILP32_OFFBIG, _SC_V7_LP64_OFF64, _SC_V7_LPBIG_OFFBIG, _SC_SS_REPL_MAX, _SC_TRACE_EVENT_NAME_MAX, _SC_TRACE_NAME_MAX, _SC_TRACE_SYS_MAX, _SC_TRACE_USER_EVENT_MAX, _SC_XOPEN_STREAMS, _SC_THREAD_ROBUST_PRIO_INHERIT, _SC_THREAD_ROBUST_PRIO_PROTECT }; enum { _CS_PATH, _CS_V6_WIDTH_RESTRICTED_ENVS, _CS_GNU_LIBC_VERSION, _CS_GNU_LIBPTHREAD_VERSION, _CS_V5_WIDTH_RESTRICTED_ENVS, _CS_V7_WIDTH_RESTRICTED_ENVS, _CS_LFS_CFLAGS = 1000, _CS_LFS_LDFLAGS, _CS_LFS_LIBS, _CS_LFS_LINTFLAGS, _CS_LFS64_CFLAGS, _CS_LFS64_LDFLAGS, _CS_LFS64_LIBS, _CS_LFS64_LINTFLAGS, _CS_XBS5_ILP32_OFF32_CFLAGS = 1100, _CS_XBS5_ILP32_OFF32_LDFLAGS, _CS_XBS5_ILP32_OFF32_LIBS, _CS_XBS5_ILP32_OFF32_LINTFLAGS, _CS_XBS5_ILP32_OFFBIG_CFLAGS, _CS_XBS5_ILP32_OFFBIG_LDFLAGS, _CS_XBS5_ILP32_OFFBIG_LIBS, _CS_XBS5_ILP32_OFFBIG_LINTFLAGS, _CS_XBS5_LP64_OFF64_CFLAGS, _CS_XBS5_LP64_OFF64_LDFLAGS, _CS_XBS5_LP64_OFF64_LIBS, _CS_XBS5_LP64_OFF64_LINTFLAGS, _CS_XBS5_LPBIG_OFFBIG_CFLAGS, _CS_XBS5_LPBIG_OFFBIG_LDFLAGS, _CS_XBS5_LPBIG_OFFBIG_LIBS, _CS_XBS5_LPBIG_OFFBIG_LINTFLAGS, _CS_POSIX_V6_ILP32_OFF32_CFLAGS, _CS_POSIX_V6_ILP32_OFF32_LDFLAGS, _CS_POSIX_V6_ILP32_OFF32_LIBS, _CS_POSIX_V6_ILP32_OFF32_LINTFLAGS, _CS_POSIX_V6_ILP32_OFFBIG_CFLAGS, _CS_POSIX_V6_ILP32_OFFBIG_LDFLAGS, _CS_POSIX_V6_ILP32_OFFBIG_LIBS, _CS_POSIX_V6_ILP32_OFFBIG_LINTFLAGS, _CS_POSIX_V6_LP64_OFF64_CFLAGS, _CS_POSIX_V6_LP64_OFF64_LDFLAGS, _CS_POSIX_V6_LP64_OFF64_LIBS, _CS_POSIX_V6_LP64_OFF64_LINTFLAGS, _CS_POSIX_V6_LPBIG_OFFBIG_CFLAGS, _CS_POSIX_V6_LPBIG_OFFBIG_LDFLAGS, 
_CS_POSIX_V6_LPBIG_OFFBIG_LIBS, _CS_POSIX_V6_LPBIG_OFFBIG_LINTFLAGS, _CS_POSIX_V7_ILP32_OFF32_CFLAGS, _CS_POSIX_V7_ILP32_OFF32_LDFLAGS, _CS_POSIX_V7_ILP32_OFF32_LIBS, _CS_POSIX_V7_ILP32_OFF32_LINTFLAGS, _CS_POSIX_V7_ILP32_OFFBIG_CFLAGS, _CS_POSIX_V7_ILP32_OFFBIG_LDFLAGS, _CS_POSIX_V7_ILP32_OFFBIG_LIBS, _CS_POSIX_V7_ILP32_OFFBIG_LINTFLAGS, _CS_POSIX_V7_LP64_OFF64_CFLAGS, _CS_POSIX_V7_LP64_OFF64_LDFLAGS, _CS_POSIX_V7_LP64_OFF64_LIBS, _CS_POSIX_V7_LP64_OFF64_LINTFLAGS, _CS_POSIX_V7_LPBIG_OFFBIG_CFLAGS, _CS_POSIX_V7_LPBIG_OFFBIG_LDFLAGS, _CS_POSIX_V7_LPBIG_OFFBIG_LIBS, _CS_POSIX_V7_LPBIG_OFFBIG_LINTFLAGS, _CS_V6_ENV, _CS_V7_ENV }; # 613 "/usr/include/unistd.h" 2 3 4 extern long int pathconf (const char *__path, int __name) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern long int fpathconf (int __fd, int __name) __attribute__ ((__nothrow__ , __leaf__)); extern long int sysconf (int __name) __attribute__ ((__nothrow__ , __leaf__)); extern size_t confstr (int __name, char *__buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)); extern __pid_t getpid (void) __attribute__ ((__nothrow__ , __leaf__)); extern __pid_t getppid (void) __attribute__ ((__nothrow__ , __leaf__)); extern __pid_t getpgrp (void) __attribute__ ((__nothrow__ , __leaf__)); extern __pid_t __getpgid (__pid_t __pid) __attribute__ ((__nothrow__ , __leaf__)); extern __pid_t getpgid (__pid_t __pid) __attribute__ ((__nothrow__ , __leaf__)); extern int setpgid (__pid_t __pid, __pid_t __pgid) __attribute__ ((__nothrow__ , __leaf__)); # 663 "/usr/include/unistd.h" 3 4 extern int setpgrp (void) __attribute__ ((__nothrow__ , __leaf__)); extern __pid_t setsid (void) __attribute__ ((__nothrow__ , __leaf__)); extern __pid_t getsid (__pid_t __pid) __attribute__ ((__nothrow__ , __leaf__)); extern __uid_t getuid (void) __attribute__ ((__nothrow__ , __leaf__)); extern __uid_t geteuid (void) __attribute__ ((__nothrow__ , __leaf__)); extern __gid_t getgid (void) __attribute__ 
((__nothrow__ , __leaf__)); extern __gid_t getegid (void) __attribute__ ((__nothrow__ , __leaf__)); extern int getgroups (int __size, __gid_t __list[]) __attribute__ ((__nothrow__ , __leaf__)) ; # 703 "/usr/include/unistd.h" 3 4 extern int setuid (__uid_t __uid) __attribute__ ((__nothrow__ , __leaf__)) ; extern int setreuid (__uid_t __ruid, __uid_t __euid) __attribute__ ((__nothrow__ , __leaf__)) ; extern int seteuid (__uid_t __uid) __attribute__ ((__nothrow__ , __leaf__)) ; extern int setgid (__gid_t __gid) __attribute__ ((__nothrow__ , __leaf__)) ; extern int setregid (__gid_t __rgid, __gid_t __egid) __attribute__ ((__nothrow__ , __leaf__)) ; extern int setegid (__gid_t __gid) __attribute__ ((__nothrow__ , __leaf__)) ; # 759 "/usr/include/unistd.h" 3 4 extern __pid_t fork (void) __attribute__ ((__nothrow__)); extern __pid_t vfork (void) __attribute__ ((__nothrow__ , __leaf__)); extern char *ttyname (int __fd) __attribute__ ((__nothrow__ , __leaf__)); extern int ttyname_r (int __fd, char *__buf, size_t __buflen) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))) ; extern int isatty (int __fd) __attribute__ ((__nothrow__ , __leaf__)); extern int ttyslot (void) __attribute__ ((__nothrow__ , __leaf__)); extern int link (const char *__from, const char *__to) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))) ; extern int linkat (int __fromfd, const char *__from, int __tofd, const char *__to, int __flags) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 4))) ; extern int symlink (const char *__from, const char *__to) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))) ; extern ssize_t readlink (const char *__restrict __path, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))) ; extern int symlinkat (const char *__from, int __tofd, const char *__to) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ 
((__nonnull__ (1, 3))) ; extern ssize_t readlinkat (int __fd, const char *__restrict __path, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 3))) ; extern int unlink (const char *__name) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int unlinkat (int __fd, const char *__name, int __flag) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern int rmdir (const char *__path) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern __pid_t tcgetpgrp (int __fd) __attribute__ ((__nothrow__ , __leaf__)); extern int tcsetpgrp (int __fd, __pid_t __pgrp_id) __attribute__ ((__nothrow__ , __leaf__)); extern char *getlogin (void); extern int getlogin_r (char *__name, size_t __name_len) __attribute__ ((__nonnull__ (1))); extern int setlogin (const char *__name) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 873 "/usr/include/unistd.h" 3 4 # 1 "/usr/include/getopt.h" 1 3 4 # 57 "/usr/include/getopt.h" 3 4 extern char *optarg; # 71 "/usr/include/getopt.h" 3 4 extern int optind; extern int opterr; extern int optopt; # 150 "/usr/include/getopt.h" 3 4 extern int getopt (int ___argc, char *const *___argv, const char *__shortopts) __attribute__ ((__nothrow__ , __leaf__)); # 874 "/usr/include/unistd.h" 2 3 4 extern int gethostname (char *__name, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int sethostname (const char *__name, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; extern int sethostid (long int __id) __attribute__ ((__nothrow__ , __leaf__)) ; extern int getdomainname (char *__name, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; extern int setdomainname (const char *__name, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) 
; extern int vhangup (void) __attribute__ ((__nothrow__ , __leaf__)); extern int revoke (const char *__file) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; extern int profil (unsigned short int *__sample_buffer, size_t __size, size_t __offset, unsigned int __scale) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int acct (const char *__name) __attribute__ ((__nothrow__ , __leaf__)); extern char *getusershell (void) __attribute__ ((__nothrow__ , __leaf__)); extern void endusershell (void) __attribute__ ((__nothrow__ , __leaf__)); extern void setusershell (void) __attribute__ ((__nothrow__ , __leaf__)); extern int daemon (int __nochdir, int __noclose) __attribute__ ((__nothrow__ , __leaf__)) ; extern int chroot (const char *__path) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; extern char *getpass (const char *__prompt) __attribute__ ((__nonnull__ (1))); extern int fsync (int __fd); # 971 "/usr/include/unistd.h" 3 4 extern long int gethostid (void); extern void sync (void) __attribute__ ((__nothrow__ , __leaf__)); extern int getpagesize (void) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern int getdtablesize (void) __attribute__ ((__nothrow__ , __leaf__)); # 995 "/usr/include/unistd.h" 3 4 extern int truncate (const char *__file, __off_t __length) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; # 1018 "/usr/include/unistd.h" 3 4 extern int ftruncate (int __fd, __off_t __length) __attribute__ ((__nothrow__ , __leaf__)) ; # 1039 "/usr/include/unistd.h" 3 4 extern int brk (void *__addr) __attribute__ ((__nothrow__ , __leaf__)) ; extern void *sbrk (intptr_t __delta) __attribute__ ((__nothrow__ , __leaf__)); # 1060 "/usr/include/unistd.h" 3 4 extern long int syscall (long int __sysno, ...) 
__attribute__ ((__nothrow__ , __leaf__)); # 1083 "/usr/include/unistd.h" 3 4 extern int lockf (int __fd, int __cmd, __off_t __len) ; # 1114 "/usr/include/unistd.h" 3 4 extern int fdatasync (int __fildes); # 1166 "/usr/include/unistd.h" 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_UNISTD_H" to "1" ================================================================================ TEST check from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:77) TESTING: check from config.headers(config/BuildSystem/config/headers.py:77) Checks for "header", and defines HAVE_"header" if found Checking for header: netinet/in.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/netinet/in.h" 1 3 4 # 21 "/usr/include/netinet/in.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 22 "/usr/include/netinet/in.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdint.h" 1 3 4 # 9 
"/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdint.h" 3 4 # 1 "/usr/include/stdint.h" 1 3 4 # 26 "/usr/include/stdint.h" 3 4 # 1 "/usr/include/bits/wchar.h" 1 3 4 # 27 "/usr/include/stdint.h" 2 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/stdint.h" 2 3 4 # 36 "/usr/include/stdint.h" 3 4 # 36 "/usr/include/stdint.h" 3 4 typedef signed char int8_t; typedef short int int16_t; typedef int int32_t; typedef long int int64_t; typedef unsigned char uint8_t; typedef unsigned short int uint16_t; typedef unsigned int uint32_t; typedef unsigned long int uint64_t; # 65 "/usr/include/stdint.h" 3 4 typedef signed char int_least8_t; typedef short int int_least16_t; typedef int int_least32_t; typedef long int int_least64_t; typedef unsigned char uint_least8_t; typedef unsigned short int uint_least16_t; typedef unsigned int uint_least32_t; typedef unsigned long int uint_least64_t; # 90 "/usr/include/stdint.h" 3 4 typedef signed char int_fast8_t; typedef long int int_fast16_t; typedef long int int_fast32_t; typedef long int int_fast64_t; # 103 "/usr/include/stdint.h" 3 4 typedef unsigned char uint_fast8_t; typedef unsigned long int uint_fast16_t; typedef unsigned long int uint_fast32_t; typedef unsigned long int uint_fast64_t; # 119 "/usr/include/stdint.h" 3 4 typedef long int intptr_t; typedef unsigned long int uintptr_t; # 134 "/usr/include/stdint.h" 3 4 typedef long int intmax_t; typedef unsigned long int uintmax_t; # 10 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdint.h" 2 3 4 # 23 "/usr/include/netinet/in.h" 2 3 4 # 1 "/usr/include/sys/socket.h" 1 3 4 # 24 "/usr/include/sys/socket.h" 3 4 # 1 "/usr/include/sys/uio.h" 1 3 4 # 23 "/usr/include/sys/uio.h" 3 4 # 1 "/usr/include/sys/types.h" 1 3 4 # 27 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; 
typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 30 "/usr/include/sys/types.h" 2 3 4 typedef __u_char u_char; typedef __u_short u_short; typedef __u_int u_int; typedef __u_long u_long; typedef __quad_t quad_t; typedef __u_quad_t u_quad_t; typedef __fsid_t fsid_t; typedef __loff_t loff_t; typedef __ino_t ino_t; 
# 60 "/usr/include/sys/types.h" 3 4 typedef __dev_t dev_t; typedef __gid_t gid_t; typedef __mode_t mode_t; typedef __nlink_t nlink_t; typedef __uid_t uid_t; typedef __off_t off_t; # 98 "/usr/include/sys/types.h" 3 4 typedef __pid_t pid_t; typedef __id_t id_t; typedef __ssize_t ssize_t; typedef __daddr_t daddr_t; typedef __caddr_t caddr_t; typedef __key_t key_t; # 132 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/time.h" 1 3 4 # 57 "/usr/include/time.h" 3 4 typedef __clock_t clock_t; # 73 "/usr/include/time.h" 3 4 typedef __time_t time_t; # 91 "/usr/include/time.h" 3 4 typedef __clockid_t clockid_t; # 103 "/usr/include/time.h" 3 4 typedef __timer_t timer_t; # 133 "/usr/include/sys/types.h" 2 3 4 # 146 "/usr/include/sys/types.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 147 "/usr/include/sys/types.h" 2 3 4 typedef unsigned long int ulong; typedef unsigned short int ushort; typedef unsigned int uint; # 200 "/usr/include/sys/types.h" 3 4 typedef unsigned int u_int8_t __attribute__ ((__mode__ (__QI__))); typedef unsigned int u_int16_t __attribute__ ((__mode__ (__HI__))); typedef unsigned int u_int32_t __attribute__ ((__mode__ (__SI__))); typedef unsigned int u_int64_t __attribute__ ((__mode__ (__DI__))); typedef int register_t __attribute__ ((__mode__ (__word__))); # 216 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/endian.h" 1 3 4 # 36 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/endian.h" 1 3 4 # 37 "/usr/include/endian.h" 2 3 4 # 60 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/byteswap.h" 1 3 4 # 28 "/usr/include/bits/byteswap.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 29 "/usr/include/bits/byteswap.h" 2 3 4 # 1 "/usr/include/bits/byteswap-16.h" 1 3 4 # 36 "/usr/include/bits/byteswap.h" 2 3 4 # 44 "/usr/include/bits/byteswap.h" 3 4 static __inline unsigned int __bswap_32 (unsigned int __bsx) { return 
__builtin_bswap32 (__bsx); } # 108 "/usr/include/bits/byteswap.h" 3 4 static __inline __uint64_t __bswap_64 (__uint64_t __bsx) { return __builtin_bswap64 (__bsx); } # 61 "/usr/include/endian.h" 2 3 4 # 217 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/select.h" 1 3 4 # 30 "/usr/include/sys/select.h" 3 4 # 1 "/usr/include/bits/select.h" 1 3 4 # 22 "/usr/include/bits/select.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 23 "/usr/include/bits/select.h" 2 3 4 # 31 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/sigset.h" 1 3 4 # 22 "/usr/include/bits/sigset.h" 3 4 typedef int __sig_atomic_t; typedef struct { unsigned long int __val[(1024 / (8 * sizeof (unsigned long int)))]; } __sigset_t; # 34 "/usr/include/sys/select.h" 2 3 4 typedef __sigset_t sigset_t; # 1 "/usr/include/time.h" 1 3 4 # 120 "/usr/include/time.h" 3 4 struct timespec { __time_t tv_sec; __syscall_slong_t tv_nsec; }; # 46 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 30 "/usr/include/bits/time.h" 3 4 struct timeval { __time_t tv_sec; __suseconds_t tv_usec; }; # 48 "/usr/include/sys/select.h" 2 3 4 typedef __suseconds_t suseconds_t; typedef long int __fd_mask; # 66 "/usr/include/sys/select.h" 3 4 typedef struct { __fd_mask __fds_bits[1024 / (8 * (int) sizeof (__fd_mask))]; } fd_set; typedef __fd_mask fd_mask; # 98 "/usr/include/sys/select.h" 3 4 # 108 "/usr/include/sys/select.h" 3 4 extern int select (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, struct timeval *__restrict __timeout); # 120 "/usr/include/sys/select.h" 3 4 extern int pselect (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, const struct timespec *__restrict __timeout, const __sigset_t *__restrict __sigmask); # 133 "/usr/include/sys/select.h" 3 4 # 220 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/sysmacros.h" 1 3 4 # 24 "/usr/include/sys/sysmacros.h" 3 4 __extension__ 
extern unsigned int gnu_dev_major (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned int gnu_dev_minor (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned long long int gnu_dev_makedev (unsigned int __major, unsigned int __minor) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 58 "/usr/include/sys/sysmacros.h" 3 4 # 223 "/usr/include/sys/types.h" 2 3 4 typedef __blksize_t blksize_t; typedef __blkcnt_t blkcnt_t; typedef __fsblkcnt_t fsblkcnt_t; typedef __fsfilcnt_t fsfilcnt_t; # 270 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/bits/pthreadtypes.h" 1 3 4 # 21 "/usr/include/bits/pthreadtypes.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 22 "/usr/include/bits/pthreadtypes.h" 2 3 4 # 60 "/usr/include/bits/pthreadtypes.h" 3 4 typedef unsigned long int pthread_t; union pthread_attr_t { char __size[56]; long int __align; }; typedef union pthread_attr_t pthread_attr_t; typedef struct __pthread_internal_list { struct __pthread_internal_list *__prev; struct __pthread_internal_list *__next; } __pthread_list_t; # 90 "/usr/include/bits/pthreadtypes.h" 3 4 typedef union { struct __pthread_mutex_s { int __lock; unsigned int __count; int __owner; unsigned int __nusers; int __kind; short __spins; short __elision; __pthread_list_t __list; # 125 "/usr/include/bits/pthreadtypes.h" 3 4 } __data; char __size[40]; long int __align; } pthread_mutex_t; typedef union { char __size[4]; int __align; } pthread_mutexattr_t; typedef union { struct { int __lock; unsigned int __futex; __extension__ unsigned long long int __total_seq; __extension__ unsigned long long int __wakeup_seq; __extension__ unsigned long long int __woken_seq; void *__mutex; unsigned int __nwaiters; unsigned int __broadcast_seq; } __data; char __size[48]; __extension__ long long int __align; } pthread_cond_t; typedef union { char 
__size[4]; int __align; } pthread_condattr_t; typedef unsigned int pthread_key_t; typedef int pthread_once_t; typedef union { struct { int __lock; unsigned int __nr_readers; unsigned int __readers_wakeup; unsigned int __writer_wakeup; unsigned int __nr_readers_queued; unsigned int __nr_writers_queued; int __writer; int __shared; signed char __rwelision; unsigned char __pad1[7]; unsigned long int __pad2; unsigned int __flags; } __data; # 220 "/usr/include/bits/pthreadtypes.h" 3 4 char __size[56]; long int __align; } pthread_rwlock_t; typedef union { char __size[8]; long int __align; } pthread_rwlockattr_t; typedef volatile int pthread_spinlock_t; typedef union { char __size[32]; long int __align; } pthread_barrier_t; typedef union { char __size[4]; int __align; } pthread_barrierattr_t; # 271 "/usr/include/sys/types.h" 2 3 4 # 24 "/usr/include/sys/uio.h" 2 3 4 # 1 "/usr/include/bits/uio.h" 1 3 4 # 43 "/usr/include/bits/uio.h" 3 4 struct iovec { void *iov_base; size_t iov_len; }; # 29 "/usr/include/sys/uio.h" 2 3 4 # 39 "/usr/include/sys/uio.h" 3 4 extern ssize_t readv (int __fd, const struct iovec *__iovec, int __count) ; # 50 "/usr/include/sys/uio.h" 3 4 extern ssize_t writev (int __fd, const struct iovec *__iovec, int __count) ; # 65 "/usr/include/sys/uio.h" 3 4 extern ssize_t preadv (int __fd, const struct iovec *__iovec, int __count, __off_t __offset) ; # 77 "/usr/include/sys/uio.h" 3 4 extern ssize_t pwritev (int __fd, const struct iovec *__iovec, int __count, __off_t __offset) ; # 120 "/usr/include/sys/uio.h" 3 4 # 27 "/usr/include/sys/socket.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 29 "/usr/include/sys/socket.h" 2 3 4 # 38 "/usr/include/sys/socket.h" 3 4 # 1 "/usr/include/bits/socket.h" 1 3 4 # 27 "/usr/include/bits/socket.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 28 "/usr/include/bits/socket.h" 2 3 4 typedef __socklen_t socklen_t; # 1 "/usr/include/bits/socket_type.h" 1 3 4 # 24 
"/usr/include/bits/socket_type.h" 3 4 enum __socket_type { SOCK_STREAM = 1, SOCK_DGRAM = 2, SOCK_RAW = 3, SOCK_RDM = 4, SOCK_SEQPACKET = 5, SOCK_DCCP = 6, SOCK_PACKET = 10, SOCK_CLOEXEC = 02000000, SOCK_NONBLOCK = 00004000 }; # 39 "/usr/include/bits/socket.h" 2 3 4 # 167 "/usr/include/bits/socket.h" 3 4 # 1 "/usr/include/bits/sockaddr.h" 1 3 4 # 28 "/usr/include/bits/sockaddr.h" 3 4 typedef unsigned short int sa_family_t; # 168 "/usr/include/bits/socket.h" 2 3 4 struct sockaddr { sa_family_t sa_family; char sa_data[14]; }; # 183 "/usr/include/bits/socket.h" 3 4 struct sockaddr_storage { sa_family_t ss_family; char __ss_padding[(128 - (sizeof (unsigned short int)) - sizeof (unsigned long int))]; unsigned long int __ss_align; }; enum { MSG_OOB = 0x01, MSG_PEEK = 0x02, MSG_DONTROUTE = 0x04, MSG_CTRUNC = 0x08, MSG_PROXY = 0x10, MSG_TRUNC = 0x20, MSG_DONTWAIT = 0x40, MSG_EOR = 0x80, MSG_WAITALL = 0x100, MSG_FIN = 0x200, MSG_SYN = 0x400, MSG_CONFIRM = 0x800, MSG_RST = 0x1000, MSG_ERRQUEUE = 0x2000, MSG_NOSIGNAL = 0x4000, MSG_MORE = 0x8000, MSG_WAITFORONE = 0x10000, MSG_BATCH = 0x40000, MSG_FASTOPEN = 0x20000000, MSG_CMSG_CLOEXEC = 0x40000000 }; struct msghdr { void *msg_name; socklen_t msg_namelen; struct iovec *msg_iov; size_t msg_iovlen; void *msg_control; size_t msg_controllen; int msg_flags; }; struct cmsghdr { size_t cmsg_len; int cmsg_level; int cmsg_type; __extension__ unsigned char __cmsg_data []; }; # 295 "/usr/include/bits/socket.h" 3 4 extern struct cmsghdr *__cmsg_nxthdr (struct msghdr *__mhdr, struct cmsghdr *__cmsg) __attribute__ ((__nothrow__ , __leaf__)); # 322 "/usr/include/bits/socket.h" 3 4 enum { SCM_RIGHTS = 0x01 }; # 368 "/usr/include/bits/socket.h" 3 4 # 1 "/usr/include/asm/socket.h" 1 3 4 # 1 "/usr/include/asm-generic/socket.h" 1 3 4 # 1 "/usr/include/asm/sockios.h" 1 3 4 # 1 "/usr/include/asm-generic/sockios.h" 1 3 4 # 1 "/usr/include/asm/sockios.h" 2 3 4 # 5 "/usr/include/asm-generic/socket.h" 2 3 4 # 1 "/usr/include/asm/socket.h" 2 3 4 # 369 
"/usr/include/bits/socket.h" 2 3 4 # 402 "/usr/include/bits/socket.h" 3 4 struct linger { int l_onoff; int l_linger; }; # 39 "/usr/include/sys/socket.h" 2 3 4 struct osockaddr { unsigned short int sa_family; unsigned char sa_data[14]; }; enum { SHUT_RD = 0, SHUT_WR, SHUT_RDWR }; # 113 "/usr/include/sys/socket.h" 3 4 extern int socket (int __domain, int __type, int __protocol) __attribute__ ((__nothrow__ , __leaf__)); extern int socketpair (int __domain, int __type, int __protocol, int __fds[2]) __attribute__ ((__nothrow__ , __leaf__)); extern int bind (int __fd, const struct sockaddr * __addr, socklen_t __len) __attribute__ ((__nothrow__ , __leaf__)); extern int getsockname (int __fd, struct sockaddr *__restrict __addr, socklen_t *__restrict __len) __attribute__ ((__nothrow__ , __leaf__)); # 137 "/usr/include/sys/socket.h" 3 4 extern int connect (int __fd, const struct sockaddr * __addr, socklen_t __len); extern int getpeername (int __fd, struct sockaddr *__restrict __addr, socklen_t *__restrict __len) __attribute__ ((__nothrow__ , __leaf__)); extern ssize_t send (int __fd, const void *__buf, size_t __n, int __flags); extern ssize_t recv (int __fd, void *__buf, size_t __n, int __flags); extern ssize_t sendto (int __fd, const void *__buf, size_t __n, int __flags, const struct sockaddr * __addr, socklen_t __addr_len); # 174 "/usr/include/sys/socket.h" 3 4 extern ssize_t recvfrom (int __fd, void *__restrict __buf, size_t __n, int __flags, struct sockaddr *__restrict __addr, socklen_t *__restrict __addr_len); extern ssize_t sendmsg (int __fd, const struct msghdr *__message, int __flags); # 202 "/usr/include/sys/socket.h" 3 4 extern ssize_t recvmsg (int __fd, struct msghdr *__message, int __flags); # 219 "/usr/include/sys/socket.h" 3 4 extern int getsockopt (int __fd, int __level, int __optname, void *__restrict __optval, socklen_t *__restrict __optlen) __attribute__ ((__nothrow__ , __leaf__)); extern int setsockopt (int __fd, int __level, int __optname, const void 
*__optval, socklen_t __optlen) __attribute__ ((__nothrow__ , __leaf__)); extern int listen (int __fd, int __n) __attribute__ ((__nothrow__ , __leaf__)); # 243 "/usr/include/sys/socket.h" 3 4 extern int accept (int __fd, struct sockaddr *__restrict __addr, socklen_t *__restrict __addr_len); # 261 "/usr/include/sys/socket.h" 3 4 extern int shutdown (int __fd, int __how) __attribute__ ((__nothrow__ , __leaf__)); extern int sockatmark (int __fd) __attribute__ ((__nothrow__ , __leaf__)); extern int isfdtype (int __fd, int __fdtype) __attribute__ ((__nothrow__ , __leaf__)); # 283 "/usr/include/sys/socket.h" 3 4 # 24 "/usr/include/netinet/in.h" 2 3 4 typedef uint32_t in_addr_t; struct in_addr { in_addr_t s_addr; }; # 1 "/usr/include/bits/in.h" 1 3 4 # 141 "/usr/include/bits/in.h" 3 4 struct ip_opts { struct in_addr ip_dst; char ip_opts[40]; }; struct ip_mreqn { struct in_addr imr_multiaddr; struct in_addr imr_address; int imr_ifindex; }; struct in_pktinfo { int ipi_ifindex; struct in_addr ipi_spec_dst; struct in_addr ipi_addr; }; # 38 "/usr/include/netinet/in.h" 2 3 4 enum { IPPROTO_IP = 0, IPPROTO_ICMP = 1, IPPROTO_IGMP = 2, IPPROTO_IPIP = 4, IPPROTO_TCP = 6, IPPROTO_EGP = 8, IPPROTO_PUP = 12, IPPROTO_UDP = 17, IPPROTO_IDP = 22, IPPROTO_TP = 29, IPPROTO_DCCP = 33, IPPROTO_IPV6 = 41, IPPROTO_RSVP = 46, IPPROTO_GRE = 47, IPPROTO_ESP = 50, IPPROTO_AH = 51, IPPROTO_MTP = 92, IPPROTO_BEETPH = 94, IPPROTO_ENCAP = 98, IPPROTO_PIM = 103, IPPROTO_COMP = 108, IPPROTO_SCTP = 132, IPPROTO_UDPLITE = 136, IPPROTO_MPLS = 137, IPPROTO_RAW = 255, IPPROTO_MAX }; enum { IPPROTO_HOPOPTS = 0, IPPROTO_ROUTING = 43, IPPROTO_FRAGMENT = 44, IPPROTO_ICMPV6 = 58, IPPROTO_NONE = 59, IPPROTO_DSTOPTS = 60, IPPROTO_MH = 135 }; typedef uint16_t in_port_t; enum { IPPORT_ECHO = 7, IPPORT_DISCARD = 9, IPPORT_SYSTAT = 11, IPPORT_DAYTIME = 13, IPPORT_NETSTAT = 15, IPPORT_FTP = 21, IPPORT_TELNET = 23, IPPORT_SMTP = 25, IPPORT_TIMESERVER = 37, IPPORT_NAMESERVER = 42, IPPORT_WHOIS = 43, IPPORT_MTP = 57, 
IPPORT_TFTP = 69, IPPORT_RJE = 77, IPPORT_FINGER = 79, IPPORT_TTYLINK = 87, IPPORT_SUPDUP = 95, IPPORT_EXECSERVER = 512, IPPORT_LOGINSERVER = 513, IPPORT_CMDSERVER = 514, IPPORT_EFSSERVER = 520, IPPORT_BIFFUDP = 512, IPPORT_WHOSERVER = 513, IPPORT_ROUTESERVER = 520, IPPORT_RESERVED = 1024, IPPORT_USERRESERVED = 5000 }; # 211 "/usr/include/netinet/in.h" 3 4 struct in6_addr { union { uint8_t __u6_addr8[16]; uint16_t __u6_addr16[8]; uint32_t __u6_addr32[4]; } __in6_u; }; extern const struct in6_addr in6addr_any; extern const struct in6_addr in6addr_loopback; # 239 "/usr/include/netinet/in.h" 3 4 struct sockaddr_in { sa_family_t sin_family; in_port_t sin_port; struct in_addr sin_addr; unsigned char sin_zero[sizeof (struct sockaddr) - (sizeof (unsigned short int)) - sizeof (in_port_t) - sizeof (struct in_addr)]; }; struct sockaddr_in6 { sa_family_t sin6_family; in_port_t sin6_port; uint32_t sin6_flowinfo; struct in6_addr sin6_addr; uint32_t sin6_scope_id; }; struct ip_mreq { struct in_addr imr_multiaddr; struct in_addr imr_interface; }; struct ip_mreq_source { struct in_addr imr_multiaddr; struct in_addr imr_interface; struct in_addr imr_sourceaddr; }; struct ipv6_mreq { struct in6_addr ipv6mr_multiaddr; unsigned int ipv6mr_interface; }; struct group_req { uint32_t gr_interface; struct sockaddr_storage gr_group; }; struct group_source_req { uint32_t gsr_interface; struct sockaddr_storage gsr_group; struct sockaddr_storage gsr_source; }; struct ip_msfilter { struct in_addr imsf_multiaddr; struct in_addr imsf_interface; uint32_t imsf_fmode; uint32_t imsf_numsrc; struct in_addr imsf_slist[1]; }; struct group_filter { uint32_t gf_interface; struct sockaddr_storage gf_group; uint32_t gf_fmode; uint32_t gf_numsrc; struct sockaddr_storage gf_slist[1]; }; # 376 "/usr/include/netinet/in.h" 3 4 extern uint32_t ntohl (uint32_t __netlong) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern uint16_t ntohs (uint16_t __netshort) __attribute__ ((__nothrow__ , 
__leaf__)) __attribute__ ((__const__)); extern uint32_t htonl (uint32_t __hostlong) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern uint16_t htons (uint16_t __hostshort) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 1 "/usr/include/bits/byteswap.h" 1 3 4 # 388 "/usr/include/netinet/in.h" 2 3 4 # 503 "/usr/include/netinet/in.h" 3 4 extern int bindresvport (int __sockfd, struct sockaddr_in *__sock_in) __attribute__ ((__nothrow__ , __leaf__)); extern int bindresvport6 (int __sockfd, struct sockaddr_in6 *__sock_in) __attribute__ ((__nothrow__ , __leaf__)); # 631 "/usr/include/netinet/in.h" 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_NETINET_IN_H" to "1" ================================================================================ TEST checkRecursiveMacros from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:218) TESTING: checkRecursiveMacros from config.headers(config/BuildSystem/config/headers.py:218) Checks that the preprocessor allows recursive macros, and if not defines HAVE_BROKEN_RECURSIVE_MACRO Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.headers/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.headers -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.headers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" void a(int i, int j) {} #define a(b) a(b,__LINE__) int main() { a(0); ; return 0; } ================================================================================ TEST configureCacheDetails from config.utilities.cacheDetails(/home/florian/software/petsc/config/BuildSystem/config/utilities/cacheDetails.py:78) TESTING: 
configureCacheDetails from config.utilities.cacheDetails(config/BuildSystem/config/utilities/cacheDetails.py:78)
Try to determine the size and associativity of the cache.
Pushing language C
All intermediate test results are stored in /tmp/petsc-KvGRNM/config.utilities.cacheDetails
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.cacheDetails/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.cacheDetails/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
#include <unistd.h>
long getconf_LEVEL1_DCACHE_SIZE() { long val = sysconf(_SC_LEVEL1_DCACHE_SIZE); return (16 <= val && val <= 2147483647) ? val : 32768; }
int main() { ; return 0; }
Popping language C
Pushing language C
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.cacheDetails/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.cacheDetails/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
#include <stdio.h>
#include <unistd.h>
long getconf_LEVEL1_DCACHE_SIZE() { long val = sysconf(_SC_LEVEL1_DCACHE_SIZE); return (16 <= val && val <= 2147483647) ? val : 32768; }
int main() { FILE *output = fopen("conftestval","w"); if (!output) return 1; fprintf(output,"%ld",getconf_LEVEL1_DCACHE_SIZE()); fclose(output);; return 0; }
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.utilities.cacheDetails/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.cacheDetails/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Testing executable /tmp/petsc-KvGRNM/config.utilities.cacheDetails/conftest to see if it can be run
Executing: /tmp/petsc-KvGRNM/config.utilities.cacheDetails/conftest
Executing: /tmp/petsc-KvGRNM/config.utilities.cacheDetails/conftest
Popping language C
Defined "LEVEL1_DCACHE_SIZE" to "32768"
Pushing language C
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.cacheDetails/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.cacheDetails/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
#include <stdio.h>
#include <unistd.h>
long getconf_LEVEL1_DCACHE_LINESIZE() { long val = sysconf(_SC_LEVEL1_DCACHE_LINESIZE); return (16 <= val && val <= 2147483647) ? val : 32; }
int main() { FILE *output = fopen("conftestval","w"); if (!output) return 1; fprintf(output,"%ld",getconf_LEVEL1_DCACHE_LINESIZE()); fclose(output);; return 0; }
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.utilities.cacheDetails/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.cacheDetails/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Testing executable /tmp/petsc-KvGRNM/config.utilities.cacheDetails/conftest to see if it can be run
Executing: /tmp/petsc-KvGRNM/config.utilities.cacheDetails/conftest
Executing: /tmp/petsc-KvGRNM/config.utilities.cacheDetails/conftest
Popping language C
Defined "LEVEL1_DCACHE_LINESIZE" to "64"
Pushing language C
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.cacheDetails/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.cacheDetails/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
#include <stdio.h>
#include <unistd.h>
long getconf_LEVEL1_DCACHE_ASSOC() { long val = sysconf(_SC_LEVEL1_DCACHE_ASSOC); return (0 <= val && val <= 2147483647) ? val : 2; }
int main() { FILE *output = fopen("conftestval","w"); if (!output) return 1; fprintf(output,"%ld",getconf_LEVEL1_DCACHE_ASSOC()); fclose(output);; return 0; }
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.utilities.cacheDetails/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.cacheDetails/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Testing executable /tmp/petsc-KvGRNM/config.utilities.cacheDetails/conftest to see if it can be run
Executing: /tmp/petsc-KvGRNM/config.utilities.cacheDetails/conftest
Executing: /tmp/petsc-KvGRNM/config.utilities.cacheDetails/conftest
Popping language C
Defined "LEVEL1_DCACHE_ASSOC" to "8"
================================================================================
TEST check_siginfo_t from config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:46)
TESTING: check_siginfo_t from config.types(config/BuildSystem/config/types.py:46)
Checks if siginfo_t exists in signal.h. This check is for windows, and C89 check.
Checking for type: siginfo_t All intermediate test results are stored in /tmp/petsc-KvGRNM/config.types Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.types/conftest.c: In function 'main': /tmp/petsc-KvGRNM/config.types/conftest.c:13:11: warning: unused variable 'a' [-Wunused-variable] siginfo_t a;; ^ Source: #include "confdefs.h" #include "conffix.h" #include #if STDC_HEADERS #include #include #include #endif int main() { siginfo_t a;; return 0; } siginfo_t found Defined "HAVE_SIGINFO_T" to "1" ================================================================================ TEST check__int64 from config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:52) TESTING: check__int64 from config.types(config/BuildSystem/config/types.py:52) Checks if __int64 exists. This is primarily for windows. 
Checking for type: __int64
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c
Possible ERROR while running compiler: exit code 256
stderr:
/tmp/petsc-KvGRNM/config.types/conftest.c: In function 'main':
/tmp/petsc-KvGRNM/config.types/conftest.c:13:1: error: unknown type name '__int64'
 __int64 a;;
 ^~~~~~~
/tmp/petsc-KvGRNM/config.types/conftest.c:13:9: warning: unused variable 'a' [-Wunused-variable]
 __int64 a;;
         ^
Source:
#include "confdefs.h"
#include "conffix.h"
#include <sys/types.h>
#if STDC_HEADERS
#include <stdlib.h>
#include <stddef.h>
#endif
int main() { __int64 a;; return 0; }
__int64 found
================================================================================
TEST checkSizeTypes from config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:58)
TESTING: checkSizeTypes from config.types(config/BuildSystem/config/types.py:58)
Checks for types associated with sizes, such as size_t.
Checking for type: size_t
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c
Possible ERROR while running compiler: stderr:
/tmp/petsc-KvGRNM/config.types/conftest.c: In function 'main':
/tmp/petsc-KvGRNM/config.types/conftest.c:13:8: warning: unused variable 'a' [-Wunused-variable]
 size_t a;;
        ^
Source:
#include "confdefs.h"
#include "conffix.h"
#include <sys/types.h>
#if STDC_HEADERS
#include <stdlib.h>
#include <stddef.h>
#endif
int main() { size_t a;; return 0; }
size_t found
================================================================================
TEST checkFileTypes from config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:68)
TESTING: checkFileTypes from config.types(config/BuildSystem/config/types.py:68)
Checks for types associated with files, such as mode_t, off_t, etc.
Checking for type: mode_t
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c
Possible ERROR while running compiler: stderr:
/tmp/petsc-KvGRNM/config.types/conftest.c: In function 'main':
/tmp/petsc-KvGRNM/config.types/conftest.c:13:8: warning: unused variable 'a' [-Wunused-variable]
 mode_t a;;
        ^
Source:
#include "confdefs.h"
#include "conffix.h"
#include <sys/types.h>
#if STDC_HEADERS
#include <stdlib.h>
#include <stddef.h>
#endif
int main() { mode_t a;; return 0; }
mode_t found
Checking for type: off_t
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c
Possible ERROR while running compiler: stderr:
/tmp/petsc-KvGRNM/config.types/conftest.c: In function 'main':
/tmp/petsc-KvGRNM/config.types/conftest.c:13:7: warning: unused variable 'a' [-Wunused-variable]
 off_t a;;
       ^
Source:
#include "confdefs.h"
#include "conffix.h"
#include <sys/types.h>
#if STDC_HEADERS
#include <stdlib.h>
#include <stddef.h>
#endif
int main() { off_t a;; return 0; }
off_t found
================================================================================
TEST checkIntegerTypes from config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:63)
TESTING: checkIntegerTypes from
config.types(config/BuildSystem/config/types.py:63)
Checks for types associated with integers, such as int32_t.
Checking for type: int32_t
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c
Possible ERROR while running compiler: stderr:
/tmp/petsc-KvGRNM/config.types/conftest.c: In function 'main':
/tmp/petsc-KvGRNM/config.types/conftest.c:13:9: warning: unused variable 'a' [-Wunused-variable]
 int32_t a;;
         ^
Source:
#include "confdefs.h"
#include "conffix.h"
#include <sys/types.h>
#if STDC_HEADERS
#include <stdlib.h>
#include <stddef.h>
#endif
int main() { int32_t a;; return 0; }
int32_t found
================================================================================
TEST checkPID from config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:74)
TESTING: checkPID from config.types(config/BuildSystem/config/types.py:74)
Checks for pid_t, and defines it if necessary
Checking for type: pid_t
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c
Possible ERROR while running compiler: stderr:
/tmp/petsc-KvGRNM/config.types/conftest.c: In function 'main':
/tmp/petsc-KvGRNM/config.types/conftest.c:13:7: warning: unused variable 'a'
[-Wunused-variable] pid_t a;; ^ Source: #include "confdefs.h" #include "conffix.h" #include #if STDC_HEADERS #include #include #endif int main() { pid_t a;; return 0; } pid_t found ================================================================================ TEST checkUID from config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:78) TESTING: checkUID from config.types(config/BuildSystem/config/types.py:78) Checks for uid_t and gid_t, and defines them if necessary Source: #include "confdefs.h" #include "conffix.h" #include Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.types/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.types/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.types/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.types/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.types/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.types/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.types/conftest.c" 2 # 1 "/usr/include/sys/types.h" 1 3 4 # 25 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 26 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 # 30 "/usr/include/bits/types.h" 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef 
unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 30 "/usr/include/sys/types.h" 2 3 4 typedef __u_char u_char; typedef __u_short u_short; typedef __u_int u_int; typedef __u_long u_long; typedef __quad_t quad_t; typedef __u_quad_t u_quad_t; typedef __fsid_t fsid_t; typedef __loff_t loff_t; typedef __ino_t ino_t; # 60 "/usr/include/sys/types.h" 3 4 
typedef __dev_t dev_t; typedef __gid_t gid_t; typedef __mode_t mode_t; typedef __nlink_t nlink_t; typedef __uid_t uid_t; typedef __off_t off_t; # 98 "/usr/include/sys/types.h" 3 4 typedef __pid_t pid_t; typedef __id_t id_t; typedef __ssize_t ssize_t; typedef __daddr_t daddr_t; typedef __caddr_t caddr_t; typedef __key_t key_t; # 132 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/time.h" 1 3 4 # 57 "/usr/include/time.h" 3 4 typedef __clock_t clock_t; # 73 "/usr/include/time.h" 3 4 typedef __time_t time_t; # 91 "/usr/include/time.h" 3 4 typedef __clockid_t clockid_t; # 103 "/usr/include/time.h" 3 4 typedef __timer_t timer_t; # 133 "/usr/include/sys/types.h" 2 3 4 # 146 "/usr/include/sys/types.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 147 "/usr/include/sys/types.h" 2 3 4 typedef unsigned long int ulong; typedef unsigned short int ushort; typedef unsigned int uint; # 194 "/usr/include/sys/types.h" 3 4 typedef int int8_t __attribute__ ((__mode__ (__QI__))); typedef int int16_t __attribute__ ((__mode__ (__HI__))); typedef int int32_t __attribute__ ((__mode__ (__SI__))); typedef int int64_t __attribute__ ((__mode__ (__DI__))); typedef unsigned int u_int8_t __attribute__ ((__mode__ (__QI__))); typedef unsigned int u_int16_t __attribute__ ((__mode__ (__HI__))); typedef unsigned int u_int32_t __attribute__ ((__mode__ (__SI__))); typedef unsigned int u_int64_t __attribute__ ((__mode__ (__DI__))); typedef int register_t __attribute__ ((__mode__ (__word__))); # 216 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/endian.h" 1 3 4 # 36 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/endian.h" 1 3 4 # 37 "/usr/include/endian.h" 2 3 4 # 60 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/byteswap.h" 1 3 4 # 28 "/usr/include/bits/byteswap.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 29 "/usr/include/bits/byteswap.h" 2 3 4 # 1 
"/usr/include/bits/byteswap-16.h" 1 3 4 # 36 "/usr/include/bits/byteswap.h" 2 3 4 # 44 "/usr/include/bits/byteswap.h" 3 4 static __inline unsigned int __bswap_32 (unsigned int __bsx) { return __builtin_bswap32 (__bsx); } # 108 "/usr/include/bits/byteswap.h" 3 4 static __inline __uint64_t __bswap_64 (__uint64_t __bsx) { return __builtin_bswap64 (__bsx); } # 61 "/usr/include/endian.h" 2 3 4 # 217 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/select.h" 1 3 4 # 30 "/usr/include/sys/select.h" 3 4 # 1 "/usr/include/bits/select.h" 1 3 4 # 22 "/usr/include/bits/select.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 23 "/usr/include/bits/select.h" 2 3 4 # 31 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/sigset.h" 1 3 4 # 22 "/usr/include/bits/sigset.h" 3 4 typedef int __sig_atomic_t; typedef struct { unsigned long int __val[(1024 / (8 * sizeof (unsigned long int)))]; } __sigset_t; # 34 "/usr/include/sys/select.h" 2 3 4 typedef __sigset_t sigset_t; # 1 "/usr/include/time.h" 1 3 4 # 120 "/usr/include/time.h" 3 4 struct timespec { __time_t tv_sec; __syscall_slong_t tv_nsec; }; # 46 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 30 "/usr/include/bits/time.h" 3 4 struct timeval { __time_t tv_sec; __suseconds_t tv_usec; }; # 48 "/usr/include/sys/select.h" 2 3 4 typedef __suseconds_t suseconds_t; typedef long int __fd_mask; # 66 "/usr/include/sys/select.h" 3 4 typedef struct { __fd_mask __fds_bits[1024 / (8 * (int) sizeof (__fd_mask))]; } fd_set; typedef __fd_mask fd_mask; # 98 "/usr/include/sys/select.h" 3 4 # 108 "/usr/include/sys/select.h" 3 4 extern int select (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, struct timeval *__restrict __timeout); # 120 "/usr/include/sys/select.h" 3 4 extern int pselect (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, const struct timespec *__restrict __timeout, const __sigset_t 
*__restrict __sigmask); # 133 "/usr/include/sys/select.h" 3 4 # 220 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/sysmacros.h" 1 3 4 # 24 "/usr/include/sys/sysmacros.h" 3 4 __extension__ extern unsigned int gnu_dev_major (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned int gnu_dev_minor (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned long long int gnu_dev_makedev (unsigned int __major, unsigned int __minor) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 58 "/usr/include/sys/sysmacros.h" 3 4 # 223 "/usr/include/sys/types.h" 2 3 4 typedef __blksize_t blksize_t; typedef __blkcnt_t blkcnt_t; typedef __fsblkcnt_t fsblkcnt_t; typedef __fsfilcnt_t fsfilcnt_t; # 270 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/bits/pthreadtypes.h" 1 3 4 # 21 "/usr/include/bits/pthreadtypes.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 22 "/usr/include/bits/pthreadtypes.h" 2 3 4 # 60 "/usr/include/bits/pthreadtypes.h" 3 4 typedef unsigned long int pthread_t; union pthread_attr_t { char __size[56]; long int __align; }; typedef union pthread_attr_t pthread_attr_t; typedef struct __pthread_internal_list { struct __pthread_internal_list *__prev; struct __pthread_internal_list *__next; } __pthread_list_t; # 90 "/usr/include/bits/pthreadtypes.h" 3 4 typedef union { struct __pthread_mutex_s { int __lock; unsigned int __count; int __owner; unsigned int __nusers; int __kind; short __spins; short __elision; __pthread_list_t __list; # 125 "/usr/include/bits/pthreadtypes.h" 3 4 } __data; char __size[40]; long int __align; } pthread_mutex_t; typedef union { char __size[4]; int __align; } pthread_mutexattr_t; typedef union { struct { int __lock; unsigned int __futex; __extension__ unsigned long long int __total_seq; __extension__ unsigned long long int __wakeup_seq; __extension__ unsigned long long 
int __woken_seq; void *__mutex; unsigned int __nwaiters; unsigned int __broadcast_seq; } __data; char __size[48]; __extension__ long long int __align; } pthread_cond_t; typedef union { char __size[4]; int __align; } pthread_condattr_t; typedef unsigned int pthread_key_t; typedef int pthread_once_t; typedef union { struct { int __lock; unsigned int __nr_readers; unsigned int __readers_wakeup; unsigned int __writer_wakeup; unsigned int __nr_readers_queued; unsigned int __nr_writers_queued; int __writer; int __shared; signed char __rwelision; unsigned char __pad1[7]; unsigned long int __pad2; unsigned int __flags; } __data; # 220 "/usr/include/bits/pthreadtypes.h" 3 4 char __size[56]; long int __align; } pthread_rwlock_t; typedef union { char __size[8]; long int __align; } pthread_rwlockattr_t; typedef volatile int pthread_spinlock_t; typedef union { char __size[32]; long int __align; } pthread_barrier_t; typedef union { char __size[4]; int __align; } pthread_barrierattr_t; # 271 "/usr/include/sys/types.h" 2 3 4 # 3 "/tmp/petsc-KvGRNM/config.types/conftest.c" 2 ================================================================================ TEST checkSignal from config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:85) TESTING: checkSignal from config.types(config/BuildSystem/config/types.py:85) Checks the return type of signal() and defines RETSIGTYPE to that type name Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #include #ifdef signal 
#undef signal
#endif
#ifdef __cplusplus
extern "C" void (*signal (int, void(*)(int)))(int);
#else
void (*signal())();
#endif
int main() { ; return 0; }
Defined "RETSIGTYPE" to "void"
================================================================================
TEST checkC99Complex from config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:106)
TESTING: checkC99Complex from config.types(config/BuildSystem/config/types.py:106)
Check for complex numbers in C99 std
Note that since PETSc source code uses _Complex we test specifically for that, not complex
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c
Possible ERROR while running compiler: stderr:
/tmp/petsc-KvGRNM/config.types/conftest.c: In function 'main':
/tmp/petsc-KvGRNM/config.types/conftest.c:6:17: warning: variable 'x' set but not used [-Wunused-but-set-variable]
 double _Complex x;
                 ^
Source:
#include "confdefs.h"
#include "conffix.h"
#include <complex.h>
int main() { double _Complex x; x = I; ; return 0; }
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c
Possible ERROR while running compiler: stderr:
/tmp/petsc-KvGRNM/config.types/conftest.c: In function 'main':
/tmp/petsc-KvGRNM/config.types/conftest.c:6:17: warning: variable 'x' set but not used [-Wunused-but-set-variable]
 double _Complex x;
                 ^
Source:
#include "confdefs.h"
#include "conffix.h"
#include <complex.h>
int main() { double _Complex x; x = I; ; return 0; }
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.types/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Defined "HAVE_C99_COMPLEX" to "1"
================================================================================
TEST checkCxxComplex from config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:117)
TESTING: checkCxxComplex from config.types(config/BuildSystem/config/types.py:117)
Check for complex numbers in namespace std
Pushing language Cxx
Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.types -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC /tmp/petsc-KvGRNM/config.types/conftest.cc
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
#include <complex>
int main() { std::complex<double> x; ; return 0; }
Pushing language CXX
Popping language CXX
Executing: mpicxx -o /tmp/petsc-KvGRNM/config.types/conftest -Wall
-Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g /tmp/petsc-KvGRNM/config.types/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_CXX_COMPLEX" to "1" Popping language Cxx ================================================================================ TEST checkFortranKind from config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:138) TESTING: checkFortranKind from config.types(config/BuildSystem/config/types.py:138) Checks whether selected_int_kind etc work USE_FORTRANKIND Pushing language FC Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.types/conftest.F Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.types/conftest.F:4:43: real(kind=selected_real_kind(10)) d 1 Warning: Unused variable 'd' declared at (1) [-Wunused-variable] /tmp/petsc-KvGRNM/config.types/conftest.F:3:45: integer(kind=selected_int_kind(10)) i 1 Warning: Unused variable 'i' declared at (1) [-Wunused-variable] Source: program main integer(kind=selected_int_kind(10)) i real(kind=selected_real_kind(10)) d end Defined "USE_FORTRANKIND" to "1" Popping language FC ================================================================================ TEST checkConst from 
config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:150) TESTING: checkConst from config.types(config/BuildSystem/config/types.py:150) Checks for working const, and if not found defines it to empty string Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.types/conftest.c: In function 'main': /tmp/petsc-KvGRNM/config.types/conftest.c:25:5: warning: this 'if' clause does not guard... [-Wmisleading-indentation] if (x[0]); ^~ /tmp/petsc-KvGRNM/config.types/conftest.c:26:5: note: ...this statement, but the latter is misleadingly indented as if it is guarded by the 'if' { /* SCO 3.2v4 cc rejects this. */ ^ /tmp/petsc-KvGRNM/config.types/conftest.c:30:9: warning: 't' is used uninitialized in this function [-Wuninitialized] *t++ = 0; ~^~ /tmp/petsc-KvGRNM/config.types/conftest.c:46:25: warning: 'b' is used uninitialized in this function [-Wuninitialized] struct s *b; b->j = 5; ~~~~~^~~ Source: #include "confdefs.h" #include "conffix.h" int main() { /* Ultrix mips cc rejects this. */ typedef int charset[2]; const charset x; /* SunOS 4.1.1 cc rejects this. */ char const *const *ccp; char **p; /* NEC SVR4.0.2 mips cc rejects this. */ struct point {int x, y;}; static struct point const zero = {0,0}; /* AIX XL C 1.02.0.0 rejects this. It does not let you subtract one const X* pointer from another in an arm of an if-expression whose if-part is not a constant expression */ const char *g = "string"; ccp = &g + (g ? g-g : 0); /* HPUX 7.0 cc rejects these. 
*/ ++ccp; p = (char**) ccp; ccp = (char const *const *) p; /* This section avoids unused variable warnings */ if (zero.x); if (x[0]); { /* SCO 3.2v4 cc rejects this. */ char *t; char const *s = 0 ? (char *) 0 : (char const *) 0; *t++ = 0; if (*s); } { /* Someone thinks the Sun supposedly-ANSI compiler will reject this. */ int x[] = {25, 17}; const int *foo = &x[0]; ++foo; } { /* Sun SC1.0 ANSI compiler rejects this -- but not the above. */ typedef const int *iptr; iptr p = 0; ++p; } { /* AIX XL C 1.02.0.0 rejects this saying "k.c", line 2.27: 1506-025 (S) Operand must be a modifiable lvalue. */ struct s { int j; const int *ap[3]; }; struct s *b; b->j = 5; } { /* ULTRIX-32 V3.1 (Rev 9) vcc rejects this */ const int foo = 10; /* Get rid of unused variable warning */ if (foo); } ; return 0; } ================================================================================ TEST checkEndian from config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:206) TESTING: checkEndian from config.types(config/BuildSystem/config/types.py:206) If the machine is big endian, defines WORDS_BIGENDIAN Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #ifdef HAVE_SYS_PARAM_H #include #endif int main() { #if !BYTE_ORDER || !BIG_ENDIAN || !LITTLE_ENDIAN bogus endian macros #endif ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers 
-I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c Possible ERROR while running compiler: exit code 256 stderr: /tmp/petsc-KvGRNM/config.types/conftest.c: In function 'main': /tmp/petsc-KvGRNM/config.types/conftest.c:11:3: error: unknown type name 'not' not big endian ^~~ /tmp/petsc-KvGRNM/config.types/conftest.c:11:11: error: expected '=', ',', ';', 'asm' or '__attribute__' before 'endian' not big endian ^~~~~~ Source: #include "confdefs.h" #include "conffix.h" #include #ifdef HAVE_SYS_PARAM_H #include #endif int main() { #if BYTE_ORDER != BIG_ENDIAN not big endian #endif ; return 0; } ================================================================================ TEST checkSizeof from config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:259) TESTING: checkSizeof from config.types(config/BuildSystem/config/types.py:259) Determines the size of type "typeName", and defines SIZEOF_"typeName" to be the size Checking for size of type: char Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #if STDC_HEADERS #include #include #include #endif int main() { FILE *f = fopen("conftestval", "w"); if (!f) exit(1); fprintf(f, "%lu\n", (unsigned long)sizeof(char)); ; return 0; } Pushing language 
C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.types/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/config.types/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.types/conftest Executing: /tmp/petsc-KvGRNM/config.types/conftest Popping language C Defined "SIZEOF_CHAR" to "1" ================================================================================ TEST checkSizeof from config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:259) TESTING: checkSizeof from config.types(config/BuildSystem/config/types.py:259) Determines the size of type "typeName", and defines SIZEOF_"typeName" to be the size Checking for size of type: void * Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #if 
STDC_HEADERS #include #include #include #endif int main() { FILE *f = fopen("conftestval", "w"); if (!f) exit(1); fprintf(f, "%lu\n", (unsigned long)sizeof(void *)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.types/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/config.types/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.types/conftest Executing: /tmp/petsc-KvGRNM/config.types/conftest Popping language C Defined "SIZEOF_VOID_P" to "8" ================================================================================ TEST checkSizeof from config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:259) TESTING: checkSizeof from config.types(config/BuildSystem/config/types.py:259) Determines the size of type "typeName", and defines SIZEOF_"typeName" to be the size Checking for size of type: short Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings 
-Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #if STDC_HEADERS #include #include #include #endif int main() { FILE *f = fopen("conftestval", "w"); if (!f) exit(1); fprintf(f, "%lu\n", (unsigned long)sizeof(short)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.types/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/config.types/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.types/conftest Executing: /tmp/petsc-KvGRNM/config.types/conftest Popping language C Defined "SIZEOF_SHORT" to "2" ================================================================================ TEST checkSizeof from config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:259) TESTING: checkSizeof from config.types(config/BuildSystem/config/types.py:259) Determines the size of type "typeName", and defines SIZEOF_"typeName" to be the size Checking for size of type: int Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers 
-I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #if STDC_HEADERS #include #include #include #endif int main() { FILE *f = fopen("conftestval", "w"); if (!f) exit(1); fprintf(f, "%lu\n", (unsigned long)sizeof(int)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.types/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/config.types/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.types/conftest Executing: /tmp/petsc-KvGRNM/config.types/conftest Popping language C Defined "SIZEOF_INT" to "4" ================================================================================ TEST checkSizeof from config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:259) TESTING: checkSizeof from config.types(config/BuildSystem/config/types.py:259) Determines the size of type "typeName", and defines SIZEOF_"typeName" to be the size Checking for size of type: long Pushing language 
C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #if STDC_HEADERS #include #include #include #endif int main() { FILE *f = fopen("conftestval", "w"); if (!f) exit(1); fprintf(f, "%lu\n", (unsigned long)sizeof(long)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.types/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/config.types/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.types/conftest Executing: /tmp/petsc-KvGRNM/config.types/conftest Popping language C Defined "SIZEOF_LONG" to "8" ================================================================================ TEST checkSizeof from config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:259) TESTING: checkSizeof from 
config.types(config/BuildSystem/config/types.py:259) Determines the size of type "typeName", and defines SIZEOF_"typeName" to be the size Checking for size of type: long long Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #if STDC_HEADERS #include #include #include #endif int main() { FILE *f = fopen("conftestval", "w"); if (!f) exit(1); fprintf(f, "%lu\n", (unsigned long)sizeof(long long)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.types/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/config.types/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.types/conftest Executing: /tmp/petsc-KvGRNM/config.types/conftest Popping language C Defined "SIZEOF_LONG_LONG" to "8" 
================================================================================ TEST checkSizeof from config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:259) TESTING: checkSizeof from config.types(config/BuildSystem/config/types.py:259) Determines the size of type "typeName", and defines SIZEOF_"typeName" to be the size Checking for size of type: float Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #if STDC_HEADERS #include #include #include #endif int main() { FILE *f = fopen("conftestval", "w"); if (!f) exit(1); fprintf(f, "%lu\n", (unsigned long)sizeof(float)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.types/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable 
/tmp/petsc-KvGRNM/config.types/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.types/conftest Executing: /tmp/petsc-KvGRNM/config.types/conftest Popping language C Defined "SIZEOF_FLOAT" to "4" ================================================================================ TEST checkSizeof from config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:259) TESTING: checkSizeof from config.types(config/BuildSystem/config/types.py:259) Determines the size of type "typeName", and defines SIZEOF_"typeName" to be the size Checking for size of type: double Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #if STDC_HEADERS #include #include #include #endif int main() { FILE *f = fopen("conftestval", "w"); if (!f) exit(1); fprintf(f, "%lu\n", (unsigned long)sizeof(double)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.types/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib 
-L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/config.types/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.types/conftest Executing: /tmp/petsc-KvGRNM/config.types/conftest Popping language C Defined "SIZEOF_DOUBLE" to "8" ================================================================================ TEST checkSizeof from config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:259) TESTING: checkSizeof from config.types(config/BuildSystem/config/types.py:259) Determines the size of type "typeName", and defines SIZEOF_"typeName" to be the size Checking for size of type: size_t Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #if STDC_HEADERS #include #include #include #endif int main() { FILE *f = fopen("conftestval", "w"); if (!f) exit(1); fprintf(f, "%lu\n", (unsigned long)sizeof(size_t)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.types/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib 
-Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/config.types/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.types/conftest Executing: /tmp/petsc-KvGRNM/config.types/conftest Popping language C Defined "SIZEOF_SIZE_T" to "8" ================================================================================ TEST checkBitsPerByte from config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:310) TESTING: checkBitsPerByte from config.types(config/BuildSystem/config/types.py:310) Determine the number of bits per byte and define BITS_PER_BYTE Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #if STDC_HEADERS #include #include #endif int main() { FILE *f = fopen("conftestval", "w"); char val[2]; int i = 0; if (!f) exit(1); val[0]='\1'; val[1]='\0'; while(val[0]) {val[0] <<= 1; i++;} fprintf(f, "%d\n", i); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.types/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.o
-Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/config.types/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.types/conftest Executing: /tmp/petsc-KvGRNM/config.types/conftest Defined "BITS_PER_BYTE" to "8" ================================================================================ TEST checkVisibility from config.types(/home/florian/software/petsc/config/BuildSystem/config/types.py:356) TESTING: checkVisibility from config.types(config/BuildSystem/config/types.py:356) Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { __attribute__((visibility ("default"))) int foo(void);; return 0; } Defined "USE_VISIBILITY_C" to "1" Popping language C Pushing language Cxx Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.types -Wall -Wwrite-strings 
-Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC /tmp/petsc-KvGRNM/config.types/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { __attribute__((visibility ("default"))) int foo(void);; return 0; } Defined "USE_VISIBILITY_CXX" to "1" Popping language Cxx ================================================================================ TEST configureMemAlign from PETSc.options.memAlign(/home/florian/software/petsc/config/PETSc/options/memAlign.py:30) TESTING: configureMemAlign from PETSc.options.memAlign(config/PETSc/options/memAlign.py:30) Choose alignment Defined "MEMALIGN" to "16" Memory alignment is 16 ================================================================================ TEST check from config.libraries(/home/florian/software/petsc/config/BuildSystem/config/libraries.py:146) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:146) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for functions [socket] in library ['socket', 'nsl'] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char socket(); static void _check_socket() { socket(); } int main() { _check_socket();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lsocket -lnsl -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lsocket collect2: error: ld returned 1 exit status Popping language C ================================================================================ TEST check from config.libraries(/home/florian/software/petsc/config/BuildSystem/config/libraries.py:146) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:146) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for functions [handle_sigfpes] in library ['fpe'] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings 
-Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char handle_sigfpes(); static void _check_handle_sigfpes() { handle_sigfpes(); } int main() { _check_handle_sigfpes();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lfpe -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lfpe collect2: error: ld returned 1 exit status Popping language C ================================================================================ TEST check from config.libraries(/home/florian/software/petsc/config/BuildSystem/config/libraries.py:146) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:146) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for functions [socket] in library ['socket', 'nsl'] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers 
-I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char socket(); static void _check_socket() { socket(); } int main() { _check_socket();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lsocket -lnsl -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lsocket collect2: error: ld returned 1 exit status Popping language C ================================================================================ TEST check from config.libraries(/home/florian/software/petsc/config/BuildSystem/config/libraries.py:146) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:146) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories 
- libName may be a list of library names Checking for functions [handle_sigfpes] in library ['fpe'] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char handle_sigfpes(); static void _check_handle_sigfpes() { handle_sigfpes(); } int main() { _check_handle_sigfpes();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lfpe -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lfpe collect2: error: ld returned 1 exit status Popping language C ================================================================================ TEST checkMath from 
config.libraries(/home/florian/software/petsc/config/BuildSystem/config/libraries.py:251) TESTING: checkMath from config.libraries(config/BuildSystem/config/libraries.py:251) Check for sin() in libm, the math library Checking for functions [sin floor log10 pow] in library [''] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.c: In function '_check_sin': /tmp/petsc-KvGRNM/config.libraries/conftest.c:5:41: warning: variable 'y' set but not used [-Wunused-but-set-variable] static void _check_sin() { double x = 0,y; y = sin(x); ^ /tmp/petsc-KvGRNM/config.libraries/conftest.c: In function '_check_floor': /tmp/petsc-KvGRNM/config.libraries/conftest.c:8:43: warning: variable 'y' set but not used [-Wunused-but-set-variable] static void _check_floor() { double x = 0,y; y = floor(x); ^ /tmp/petsc-KvGRNM/config.libraries/conftest.c: In function '_check_log10': /tmp/petsc-KvGRNM/config.libraries/conftest.c:11:43: warning: variable 'y' set but not used [-Wunused-but-set-variable] static void _check_log10() { double x = 0,y; y = log10(x); ^ /tmp/petsc-KvGRNM/config.libraries/conftest.c: In function '_check_pow': /tmp/petsc-KvGRNM/config.libraries/conftest.c:14:41: warning: variable 'y' set but not used [-Wunused-but-set-variable] static void _check_pow() { double x = 0,y ; y = pow(x, x); ^ Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ double sin(double); static void _check_sin() { double x = 0,y; y = sin(x); ; } double floor(double); static void _check_floor() { double x = 0,y; y = floor(x); ; } double log10(double); static void _check_log10() { double x = 0,y; y = log10(x); ; } double pow(double, double); static void _check_pow() { double x = 0,y ; y = pow(x, x); ; } int main() { _check_sin(); _check_floor(); _check_log10(); _check_pow();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: /tmp/petsc-KvGRNM/config.libraries/conftest.o: undefined reference to symbol 'floor@@GLIBC_2.2.5' /usr/lib/libm.so.6: error adding symbols: DSO missing from command line collect2: error: ld returned 1 exit status Popping language C Checking for functions [sin floor log10 pow] in library ['m'] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall 
-Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.c: In function '_check_sin': /tmp/petsc-KvGRNM/config.libraries/conftest.c:5:41: warning: variable 'y' set but not used [-Wunused-but-set-variable] static void _check_sin() { double x = 0,y; y = sin(x); ^ /tmp/petsc-KvGRNM/config.libraries/conftest.c: In function '_check_floor': /tmp/petsc-KvGRNM/config.libraries/conftest.c:8:43: warning: variable 'y' set but not used [-Wunused-but-set-variable] static void _check_floor() { double x = 0,y; y = floor(x); ^ /tmp/petsc-KvGRNM/config.libraries/conftest.c: In function '_check_log10': /tmp/petsc-KvGRNM/config.libraries/conftest.c:11:43: warning: variable 'y' set but not used [-Wunused-but-set-variable] static void _check_log10() { double x = 0,y; y = log10(x); ^ /tmp/petsc-KvGRNM/config.libraries/conftest.c: In function '_check_pow': /tmp/petsc-KvGRNM/config.libraries/conftest.c:14:41: warning: variable 'y' set but not used [-Wunused-but-set-variable] static void _check_pow() { double x = 0,y ; y = pow(x, x); ^ Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ double sin(double); static void _check_sin() { double x = 0,y; y = sin(x); ; } double floor(double); static void _check_floor() { double x = 0,y; y = floor(x); ; } double log10(double); static void _check_log10() { double x = 0,y; y = log10(x); ; } double pow(double, double); static void _check_pow() { double x = 0,y ; y = pow(x, x); ; } int main() { _check_sin(); _check_floor(); _check_log10(); _check_pow();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_LIBM" to "1" Popping language C Using libm for the math library ================================================================================ TEST checkMathErf from config.libraries(/home/florian/software/petsc/config/BuildSystem/config/libraries.py:267) TESTING: checkMathErf from config.libraries(config/BuildSystem/config/libraries.py:267) Check for erf() in libm, the math library Checking for functions [erf] in library ['libm.a'] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers 
-I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.c: In function '_check_erf': /tmp/petsc-KvGRNM/config.libraries/conftest.c:5:41: warning: variable 'y' set but not used [-Wunused-but-set-variable] static void _check_erf() { double x = 0,y; y = erf(x); ^ Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ double erf(double); static void _check_erf() { double x = 0,y; y = erf(x); ; } int main() { _check_erf();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_LIBM" to "1" Popping language C erf() found Defined "HAVE_ERF" to "1" ================================================================================ TEST checkMathTgamma from config.libraries(/home/florian/software/petsc/config/BuildSystem/config/libraries.py:276) TESTING: checkMathTgamma from config.libraries(config/BuildSystem/config/libraries.py:276) Check for tgama() 
in libm, the math library Checking for functions [tgamma] in library ['libm.a'] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.c: In function '_check_tgamma': /tmp/petsc-KvGRNM/config.libraries/conftest.c:5:44: warning: variable 'y' set but not used [-Wunused-but-set-variable] static void _check_tgamma() { double x = 0,y; y = tgamma(x); ^ Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ double tgamma(double); static void _check_tgamma() { double x = 0,y; y = tgamma(x); ; } int main() { _check_tgamma();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_LIBM" to "1" Popping language C tgamma() found Defined "HAVE_TGAMMA" to "1" ================================================================================ TEST checkMathFenv from config.libraries(/home/florian/software/petsc/config/BuildSystem/config/libraries.py:285) TESTING: checkMathFenv from config.libraries(config/BuildSystem/config/libraries.py:285) Checks if <fenv.h> can be used with FE_DFL_ENV Checking for functions [fesetenv] in library ['libm.a'] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include
"conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ #include <fenv.h> static void _check_fesetenv() { fesetenv(FE_DFL_ENV);; } int main() { _check_fesetenv();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_LIBM" to "1" Popping language C Defined "HAVE_FENV_H" to "1" ================================================================================ TEST checkMathLog2 from config.libraries(/home/florian/software/petsc/config/BuildSystem/config/libraries.py:293) TESTING: checkMathLog2 from config.libraries(config/BuildSystem/config/libraries.py:293) Check for log2() in libm, the math library Checking for functions [log2] in library ['libm.a'] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Possible ERROR while running
compiler: stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.c: In function '_check_log2': /tmp/petsc-KvGRNM/config.libraries/conftest.c:5:42: warning: variable 'y' set but not used [-Wunused-but-set-variable] static void _check_log2() { double x = 1,y; y = log2(x); ^ Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ double log2(double); static void _check_log2() { double x = 1,y; y = log2(x); ; } int main() { _check_log2();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_LIBM" to "1" Popping language C log2() found Defined "HAVE_LOG2" to "1" ================================================================================ TEST checkCompression from config.libraries(/home/florian/software/petsc/config/BuildSystem/config/libraries.py:302) TESTING: checkCompression from config.libraries(config/BuildSystem/config/libraries.py:302) Check for libz, the compression library Checking for functions [compress uncompress] in library [''] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers 
-I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.c: In function '_check_compress': /tmp/petsc-KvGRNM/config.libraries/conftest.c:5:119: warning: variable 'ret' set but not used [-Wunused-but-set-variable] static void _check_compress() { char *dest = 0; const char *source = 0; unsigned long destLen = 0, sourceLen = 0; int ret = 0; ret = compress(dest, &destLen, source, sourceLen); ^~~ /tmp/petsc-KvGRNM/config.libraries/conftest.c: In function '_check_uncompress': /tmp/petsc-KvGRNM/config.libraries/conftest.c:8:121: warning: variable 'ret' set but not used [-Wunused-but-set-variable] static void _check_uncompress() { char *dest = 0; const char *source = 0; unsigned long destLen = 0, sourceLen = 0; int ret = 0; ret = uncompress(dest, &destLen, source, sourceLen); ^~~ Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ int compress(char *dest, unsigned long *destLen, const char *source, unsigned long sourceLen); static void _check_compress() { char *dest = 0; const char *source = 0; unsigned long destLen = 0, sourceLen = 0; int ret = 0; ret = compress(dest, &destLen, source, sourceLen); ; } int uncompress(char *dest, unsigned long *destLen, const char *source, unsigned long sourceLen); static void _check_uncompress() { char *dest = 0; const char *source = 0; unsigned long destLen = 0, sourceLen = 0; int ret = 0; ret = uncompress(dest, &destLen, source, sourceLen); ; } int main() { _check_compress(); _check_uncompress();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.o: In function `_check_compress': /tmp/petsc-KvGRNM/config.libraries/conftest.c:5: undefined reference to `compress' /tmp/petsc-KvGRNM/config.libraries/conftest.o: In function `_check_uncompress': /tmp/petsc-KvGRNM/config.libraries/conftest.c:8: undefined reference to `uncompress' collect2: error: ld returned 1 exit status Popping language C Checking for functions [compress uncompress] in library ['z'] [] Pushing language C Executing: mpicc -c 
-o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.c: In function '_check_compress': /tmp/petsc-KvGRNM/config.libraries/conftest.c:5:119: warning: variable 'ret' set but not used [-Wunused-but-set-variable] static void _check_compress() { char *dest = 0; const char *source = 0; unsigned long destLen = 0, sourceLen = 0; int ret = 0; ret = compress(dest, &destLen, source, sourceLen); ^~~ /tmp/petsc-KvGRNM/config.libraries/conftest.c: In function '_check_uncompress': /tmp/petsc-KvGRNM/config.libraries/conftest.c:8:121: warning: variable 'ret' set but not used [-Wunused-but-set-variable] static void _check_uncompress() { char *dest = 0; const char *source = 0; unsigned long destLen = 0, sourceLen = 0; int ret = 0; ret = uncompress(dest, &destLen, source, sourceLen); ^~~ Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ int compress(char *dest, unsigned long *destLen, const char *source, unsigned long sourceLen); static void _check_compress() { char *dest = 0; const char *source = 0; unsigned long destLen = 0, sourceLen = 0; int ret = 0; ret = compress(dest, &destLen, source, sourceLen); ; } int uncompress(char *dest, unsigned long *destLen, const char *source, unsigned long sourceLen); static void _check_uncompress() { char *dest = 0; const char *source = 0; unsigned long destLen = 0, sourceLen = 0; int ret = 0; ret = uncompress(dest, &destLen, source, sourceLen); ; } int main() { _check_compress(); _check_uncompress();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lz -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_LIBZ" to "1" Popping language C Using libz for the compression library ================================================================================ TEST checkRealtime from config.libraries(/home/florian/software/petsc/config/BuildSystem/config/libraries.py:323) TESTING: checkRealtime from config.libraries(config/BuildSystem/config/libraries.py:323) Check for presence of clock_gettime() in realtime library (POSIX Realtime extensions) Checking for functions [clock_gettime] in library [''] [] Pushing language C Executing: mpicc 
-c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ #include <time.h> static void _check_clock_gettime() { struct timespec tp; clock_gettime(CLOCK_REALTIME,&tp);; } int main() { _check_clock_gettime();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Popping language C realtime functions are linked in by default ================================================================================ TEST checkDynamic from config.libraries(/home/florian/software/petsc/config/BuildSystem/config/libraries.py:339) TESTING: checkDynamic from config.libraries(config/BuildSystem/config/libraries.py:339) Check for the header and libraries necessary for dynamic library manipulation
Checking for functions [dlopen] in library ['dl'] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char dlopen(); static void _check_dlopen() { dlopen(); } int main() { _check_dlopen();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -ldl -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_LIBDL" to "1" Popping language C Checking for header: dlfcn.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1
"/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/dlfcn.h" 1 3 4 # 22 "/usr/include/dlfcn.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 23 "/usr/include/dlfcn.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 25 "/usr/include/dlfcn.h" 2 3 4 # 1 "/usr/include/bits/dlfcn.h" 1 3 4 # 28 "/usr/include/dlfcn.h" 2 3 4 # 52 "/usr/include/dlfcn.h" 3 4 extern void *dlopen (const char *__file, int __mode) __attribute__ ((__nothrow__)); extern int dlclose (void *__handle) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern void *dlsym (void *__restrict __handle, const char *__restrict __name) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); # 82 "/usr/include/dlfcn.h" 3 4 extern char *dlerror (void) __attribute__ ((__nothrow__ , __leaf__)); # 188 "/usr/include/dlfcn.h" 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_DLFCN_H" to "1" ================================================================================ TEST configureLibraryOptions from 
PETSc.options.libraryOptions(/home/florian/software/petsc/config/PETSc/options/libraryOptions.py:37) TESTING: configureLibraryOptions from PETSc.options.libraryOptions(config/PETSc/options/libraryOptions.py:37) Sets PETSC_USE_DEBUG, PETSC_USE_INFO, PETSC_USE_LOG, PETSC_USE_CTABLE and PETSC_USE_FORTRAN_KERNELS Defined "USE_LOG" to "1" Defined "USE_DEBUG" to "1" Defined "USE_INFO" to "1" Defined "USE_CTABLE" to "1" Defined "USE_BACKWARD_LOOP" to "1" **********Checking if running on BGL/IBM detected Checking for functions [bgl_perfctr_void] in library [''] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char bgl_perfctr_void(); static void _check_bgl_perfctr_void() { bgl_perfctr_void(); } int main() { _check_bgl_perfctr_void();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.o: In function `_check_bgl_perfctr_void': /tmp/petsc-KvGRNM/config.libraries/conftest.c:5: undefined reference to `bgl_perfctr_void' collect2: error: ld returned 1 exit status Popping language C Checking for functions [ADIOI_BGL_Open] in library [''] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char ADIOI_BGL_Open(); static void _check_ADIOI_BGL_Open() { ADIOI_BGL_Open(); } int main() { _check_ADIOI_BGL_Open();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.o: In function `_check_ADIOI_BGL_Open': /tmp/petsc-KvGRNM/config.libraries/conftest.c:5: undefined reference to `ADIOI_BGL_Open' collect2: error: ld returned 1 exit status Popping language C *********BGL/IBM test failure Defined "Alignx(a,b)" to " " ================================================================================ TEST configureISColorValueType from PETSc.options.libraryOptions(/home/florian/software/petsc/config/PETSc/options/libraryOptions.py:91) TESTING: configureISColorValueType from PETSc.options.libraryOptions(config/PETSc/options/libraryOptions.py:91) Sets PETSC_IS_COLOR_VALUE_TYPE, MPIU_COLORING_VALUE, IS_COLORING_MAX required by ISColor Defined "MPIU_COLORING_VALUE" to "MPI_UNSIGNED_SHORT" Defined "IS_COLORING_MAX" to "65535" Defined "IS_COLOR_VALUE_TYPE" to "short" Defined "IS_COLOR_VALUE_TYPE_F" to "integer2" ================================================================================ TEST 
configureCPURelax from config.atomics(/home/florian/software/petsc/config/BuildSystem/config/atomics.py:17) TESTING: configureCPURelax from config.atomics(config/BuildSystem/config/atomics.py:17) Definitions for cpu relax assembly instructions All intermediate test results are stored in /tmp/petsc-KvGRNM/config.atomics Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.atomics/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.atomics/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { asm volatile("rep; nop" ::: "memory");; return 0; } Defined "CPU_RELAX()" to "asm volatile("rep; nop" ::: "memory")" ================================================================================ TEST configureMemoryBarriers from config.atomics(/home/florian/software/petsc/config/BuildSystem/config/atomics.py:36) TESTING: configureMemoryBarriers from config.atomics(config/BuildSystem/config/atomics.py:36) Definitions for memory barrier instructions Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.atomics/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.atomics/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { asm 
volatile("mfence":::"memory"); return 0; } Defined "MEMORY_BARRIER()" to "asm volatile("mfence":::"memory")" Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.atomics/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.atomics/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { asm volatile("lfence":::"memory"); return 0; } Defined "READ_MEMORY_BARRIER()" to "asm volatile("lfence":::"memory")" Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.atomics/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.atomics/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { asm volatile("sfence":::"memory"); return 0; } Defined "WRITE_MEMORY_BARRIER()" to "asm volatile("sfence":::"memory")" ================================================================================ TEST checkMemcmp from config.functions(/home/florian/software/petsc/config/BuildSystem/config/functions.py:110) TESTING: checkMemcmp from config.functions(config/BuildSystem/config/functions.py:110) Check for 8-bit clean memcmp Making executable to test memcmp() All intermediate test results are stored in /tmp/petsc-KvGRNM/config.functions Executing: mpicc -c -o 
/tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include void exit(int); int main() { char c0 = 0x40; char c1 = (char) 0x80; char c2 = (char) 0x81; exit(memcmp(&c0, &c2, 1) < 0 && memcmp(&c1, &c2, 1) < 0 ? 0 : 1); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/config.functions/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.functions/conftest Executing: /tmp/petsc-KvGRNM/config.functions/conftest ================================================================================ TEST checkSysinfo from config.functions(/home/florian/software/petsc/config/BuildSystem/config/functions.py:135) 
TESTING: checkSysinfo from config.functions(config/BuildSystem/config/functions.py:135) Check whether sysinfo takes three arguments, and if it does define HAVE_SYSINFO_3ARG Checking for functions [sysinfo] Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully no other prototypes since they would conflict with our 'char funcname()' declaration below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ #ifdef __cplusplus extern "C" { #endif /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. 
*/ char sysinfo(); #ifdef __cplusplus } #endif int main() { #if defined (__stub_sysinfo) || defined (__stub___sysinfo) sysinfo_will_always_fail_with_ENOSYS(); #else sysinfo(); #endif ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_SYSINFO" to "1" Checking for header: linux/kernel.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/linux/kernel.h" 1 3 4 # 1 "/usr/include/linux/sysinfo.h" 1 3 4 # 1 "/usr/include/linux/types.h" 1 3 4 # 1 "/usr/include/asm/types.h" 1 3 4 # 1 "/usr/include/asm-generic/types.h" 1 3 4 # 1 "/usr/include/asm-generic/int-ll64.h" 1 3 4 # 11 "/usr/include/asm-generic/int-ll64.h" 3 4 # 1 "/usr/include/asm/bitsperlong.h" 1 3 4 # 10 
"/usr/include/asm/bitsperlong.h" 3 4 # 1 "/usr/include/asm-generic/bitsperlong.h" 1 3 4 # 11 "/usr/include/asm/bitsperlong.h" 2 3 4 # 12 "/usr/include/asm-generic/int-ll64.h" 2 3 4 # 19 "/usr/include/asm-generic/int-ll64.h" 3 4 typedef __signed__ char __s8; typedef unsigned char __u8; typedef __signed__ short __s16; typedef unsigned short __u16; typedef __signed__ int __s32; typedef unsigned int __u32; __extension__ typedef __signed__ long long __s64; __extension__ typedef unsigned long long __u64; # 7 "/usr/include/asm-generic/types.h" 2 3 4 # 5 "/usr/include/asm/types.h" 2 3 4 # 5 "/usr/include/linux/types.h" 2 3 4 # 1 "/usr/include/linux/posix_types.h" 1 3 4 # 1 "/usr/include/linux/stddef.h" 1 3 4 # 5 "/usr/include/linux/posix_types.h" 2 3 4 # 24 "/usr/include/linux/posix_types.h" 3 4 typedef struct { unsigned long fds_bits[1024 / (8 * sizeof(long))]; } __kernel_fd_set; typedef void (*__kernel_sighandler_t)(int); typedef int __kernel_key_t; typedef int __kernel_mqd_t; # 1 "/usr/include/asm/posix_types.h" 1 3 4 # 1 "/usr/include/asm/posix_types_64.h" 1 3 4 # 10 "/usr/include/asm/posix_types_64.h" 3 4 typedef unsigned short __kernel_old_uid_t; typedef unsigned short __kernel_old_gid_t; typedef unsigned long __kernel_old_dev_t; # 1 "/usr/include/asm-generic/posix_types.h" 1 3 4 # 14 "/usr/include/asm-generic/posix_types.h" 3 4 typedef long __kernel_long_t; typedef unsigned long __kernel_ulong_t; typedef __kernel_ulong_t __kernel_ino_t; typedef unsigned int __kernel_mode_t; typedef int __kernel_pid_t; typedef int __kernel_ipc_pid_t; typedef unsigned int __kernel_uid_t; typedef unsigned int __kernel_gid_t; typedef __kernel_long_t __kernel_suseconds_t; typedef int __kernel_daddr_t; typedef unsigned int __kernel_uid32_t; typedef unsigned int __kernel_gid32_t; # 71 "/usr/include/asm-generic/posix_types.h" 3 4 typedef __kernel_ulong_t __kernel_size_t; typedef __kernel_long_t __kernel_ssize_t; typedef __kernel_long_t __kernel_ptrdiff_t; typedef struct { int val[2]; } 
__kernel_fsid_t; typedef __kernel_long_t __kernel_off_t; typedef long long __kernel_loff_t; typedef __kernel_long_t __kernel_time_t; typedef __kernel_long_t __kernel_clock_t; typedef int __kernel_timer_t; typedef int __kernel_clockid_t; typedef char * __kernel_caddr_t; typedef unsigned short __kernel_uid16_t; typedef unsigned short __kernel_gid16_t; # 18 "/usr/include/asm/posix_types_64.h" 2 3 4 # 7 "/usr/include/asm/posix_types.h" 2 3 4 # 36 "/usr/include/linux/posix_types.h" 2 3 4 # 9 "/usr/include/linux/types.h" 2 3 4 # 27 "/usr/include/linux/types.h" 3 4 typedef __u16 __le16; typedef __u16 __be16; typedef __u32 __le32; typedef __u32 __be32; typedef __u64 __le64; typedef __u64 __be64; typedef __u16 __sum16; typedef __u32 __wsum; # 5 "/usr/include/linux/sysinfo.h" 2 3 4 struct sysinfo { __kernel_long_t uptime; __kernel_ulong_t loads[3]; __kernel_ulong_t totalram; __kernel_ulong_t freeram; __kernel_ulong_t sharedram; __kernel_ulong_t bufferram; __kernel_ulong_t totalswap; __kernel_ulong_t freeswap; __u16 procs; __u16 pad; __kernel_ulong_t totalhigh; __kernel_ulong_t freehigh; __u32 mem_unit; char _f[20-2*sizeof(__kernel_ulong_t)-sizeof(__u32)]; }; # 5 "/usr/include/linux/kernel.h" 2 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_LINUX_KERNEL_H" to "1" Checking for header: sys/sysinfo.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/sys/sysinfo.h" 1 3 
4 # 21 "/usr/include/sys/sysinfo.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 22 "/usr/include/sys/sysinfo.h" 2 3 4 # 1 "/usr/include/linux/kernel.h" 1 3 4 # 1 "/usr/include/linux/sysinfo.h" 1 3 4 # 1 "/usr/include/linux/types.h" 1 3 4 # 1 "/usr/include/asm/types.h" 1 3 4 # 1 "/usr/include/asm-generic/types.h" 1 3 4 # 1 "/usr/include/asm-generic/int-ll64.h" 1 3 4 # 11 "/usr/include/asm-generic/int-ll64.h" 3 4 # 1 "/usr/include/asm/bitsperlong.h" 1 3 4 # 10 "/usr/include/asm/bitsperlong.h" 3 4 # 1 "/usr/include/asm-generic/bitsperlong.h" 1 3 4 # 11 "/usr/include/asm/bitsperlong.h" 2 3 4 # 12 "/usr/include/asm-generic/int-ll64.h" 2 3 4 # 19 "/usr/include/asm-generic/int-ll64.h" 3 4 typedef __signed__ char __s8; typedef unsigned char __u8; typedef __signed__ short __s16; typedef unsigned short __u16; typedef __signed__ int __s32; typedef unsigned int __u32; __extension__ typedef __signed__ long long __s64; __extension__ typedef unsigned long long __u64; # 7 "/usr/include/asm-generic/types.h" 2 3 4 # 5 "/usr/include/asm/types.h" 2 3 4 # 5 "/usr/include/linux/types.h" 2 3 4 # 1 "/usr/include/linux/posix_types.h" 1 3 4 # 1 "/usr/include/linux/stddef.h" 1 3 4 # 5 "/usr/include/linux/posix_types.h" 2 3 4 # 24 "/usr/include/linux/posix_types.h" 3 4 typedef struct { unsigned long fds_bits[1024 / (8 * sizeof(long))]; } __kernel_fd_set; typedef void (*__kernel_sighandler_t)(int); typedef int __kernel_key_t; typedef int __kernel_mqd_t; # 1 "/usr/include/asm/posix_types.h" 1 3 4 # 1 "/usr/include/asm/posix_types_64.h" 1 3 4 # 10 
"/usr/include/asm/posix_types_64.h" 3 4 typedef unsigned short __kernel_old_uid_t; typedef unsigned short __kernel_old_gid_t; typedef unsigned long __kernel_old_dev_t; # 1 "/usr/include/asm-generic/posix_types.h" 1 3 4 # 14 "/usr/include/asm-generic/posix_types.h" 3 4 typedef long __kernel_long_t; typedef unsigned long __kernel_ulong_t; typedef __kernel_ulong_t __kernel_ino_t; typedef unsigned int __kernel_mode_t; typedef int __kernel_pid_t; typedef int __kernel_ipc_pid_t; typedef unsigned int __kernel_uid_t; typedef unsigned int __kernel_gid_t; typedef __kernel_long_t __kernel_suseconds_t; typedef int __kernel_daddr_t; typedef unsigned int __kernel_uid32_t; typedef unsigned int __kernel_gid32_t; # 71 "/usr/include/asm-generic/posix_types.h" 3 4 typedef __kernel_ulong_t __kernel_size_t; typedef __kernel_long_t __kernel_ssize_t; typedef __kernel_long_t __kernel_ptrdiff_t; typedef struct { int val[2]; } __kernel_fsid_t; typedef __kernel_long_t __kernel_off_t; typedef long long __kernel_loff_t; typedef __kernel_long_t __kernel_time_t; typedef __kernel_long_t __kernel_clock_t; typedef int __kernel_timer_t; typedef int __kernel_clockid_t; typedef char * __kernel_caddr_t; typedef unsigned short __kernel_uid16_t; typedef unsigned short __kernel_gid16_t; # 18 "/usr/include/asm/posix_types_64.h" 2 3 4 # 7 "/usr/include/asm/posix_types.h" 2 3 4 # 36 "/usr/include/linux/posix_types.h" 2 3 4 # 9 "/usr/include/linux/types.h" 2 3 4 # 27 "/usr/include/linux/types.h" 3 4 typedef __u16 __le16; typedef __u16 __be16; typedef __u32 __le32; typedef __u32 __be32; typedef __u64 __le64; typedef __u64 __be64; typedef __u16 __sum16; typedef __u32 __wsum; # 5 "/usr/include/linux/sysinfo.h" 2 3 4 struct sysinfo { __kernel_long_t uptime; __kernel_ulong_t loads[3]; __kernel_ulong_t totalram; __kernel_ulong_t freeram; __kernel_ulong_t sharedram; __kernel_ulong_t bufferram; __kernel_ulong_t totalswap; __kernel_ulong_t freeswap; __u16 procs; __u16 pad; __kernel_ulong_t totalhigh; __kernel_ulong_t 
freehigh; __u32 mem_unit; char _f[20-2*sizeof(__kernel_ulong_t)-sizeof(__u32)]; }; # 5 "/usr/include/linux/kernel.h" 2 3 4 # 25 "/usr/include/sys/sysinfo.h" 2 3 4 extern int sysinfo (struct sysinfo *__info) __attribute__ ((__nothrow__ , __leaf__)); extern int get_nprocs_conf (void) __attribute__ ((__nothrow__ , __leaf__)); extern int get_nprocs (void) __attribute__ ((__nothrow__ , __leaf__)); extern long int get_phys_pages (void) __attribute__ ((__nothrow__ , __leaf__)); extern long int get_avphys_pages (void) __attribute__ ((__nothrow__ , __leaf__)); # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_SYS_SYSINFO_H" to "1" Checking for header: sys/systeminfo.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Possible ERROR while running preprocessor: exit code 256 stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2stderr: /tmp/petsc-KvGRNM/config.headers/conftest.c:3:28: fatal error: sys/systeminfo.h: No such file or directory #include <sys/systeminfo.h> ^ compilation terminated.
Source: #include "confdefs.h" #include "conffix.h" #include <sys/systeminfo.h> Preprocess stderr before filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:28: fatal error: sys/systeminfo.h: No such file or directory #include <sys/systeminfo.h> ^ compilation terminated. : Preprocess stderr after filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:28: fatal error: sys/systeminfo.h: No such file or directory #include <sys/systeminfo.h> ^compilation terminated.: Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c Possible ERROR while running compiler: exit code 256 stderr: /tmp/petsc-KvGRNM/config.functions/conftest.c:13:4: error: #error "Cannot check sysinfo without special headers" # error "Cannot check sysinfo without special headers" ^~~~~ /tmp/petsc-KvGRNM/config.functions/conftest.c: In function 'main': /tmp/petsc-KvGRNM/config.functions/conftest.c:17:30: warning: implicit declaration of function 'sysinfo' [-Wimplicit-function-declaration] char buf[10]; long count=10; sysinfo(1, buf, count); ^~~~~~~ Source: #include "confdefs.h" #include "conffix.h" #ifdef HAVE_LINUX_KERNEL_H # include <linux/kernel.h> # include # ifdef HAVE_SYS_SYSINFO_H # include <sys/sysinfo.h> # endif #elif defined(HAVE_SYS_SYSTEMINFO_H) # include <sys/systeminfo.h> #else # error "Cannot check sysinfo without special headers" #endif int main() { char buf[10]; long count=10; sysinfo(1, buf, count); ; return 0; } Compile failed inside link ================================================================================ TEST checkVPrintf from
config.functions(/home/florian/software/petsc/config/BuildSystem/config/functions.py:158) TESTING: checkVPrintf from config.functions(config/BuildSystem/config/functions.py:158) Checks whether vprintf requires a char * last argument, and if it does defines HAVE_VPRINTF_CHAR Checking for functions [vprintf] Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.functions/conftest.c:13:6: warning: conflicting types for built-in function 'vprintf' char vprintf(); ^~~~~~~ Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully no other prototypes since they would conflict with our 'char funcname()' declaration below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ #ifdef __cplusplus extern "C" { #endif /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. 
*/ char vprintf(); #ifdef __cplusplus } #endif int main() { #if defined (__stub_vprintf) || defined (__stub___vprintf) vprintf_will_always_fail_with_ENOSYS(); #else vprintf(); #endif ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_VPRINTF" to "1" Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #include int main() { va_list Argp; vprintf( "%d", Argp ); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o 
-Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl ================================================================================ TEST checkVFPrintf from config.functions(/home/florian/software/petsc/config/BuildSystem/config/functions.py:165) TESTING: checkVFPrintf from config.functions(config/BuildSystem/config/functions.py:165) Checks whether vfprintf requires a char * last argument, and if it does defines HAVE_VFPRINTF_CHAR Checking for functions [vfprintf] Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.functions/conftest.c:13:6: warning: conflicting types for built-in function 'vfprintf' char vfprintf(); ^~~~~~~~ Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully no other prototypes since they would conflict with our 'char funcname()' declaration below. 
*/ #include /* Override any gcc2 internal prototype to avoid an error. */ #ifdef __cplusplus extern "C" { #endif /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char vfprintf(); #ifdef __cplusplus } #endif int main() { #if defined (__stub_vfprintf) || defined (__stub___vfprintf) vfprintf_will_always_fail_with_ENOSYS(); #else vfprintf(); #endif ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_VFPRINTF" to "1" Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #include int main() { va_list Argp; vfprintf(stdout, "%d", Argp ); ; 
return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl ================================================================================ TEST checkVSNPrintf from config.functions(/home/florian/software/petsc/config/BuildSystem/config/functions.py:172) TESTING: checkVSNPrintf from config.functions(config/BuildSystem/config/functions.py:172) Checks whether vsnprintf requires a char * last argument, and if it does defines HAVE_VSNPRINTF_CHAR Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.functions/conftest.c: In function 'main': /tmp/petsc-KvGRNM/config.functions/conftest.c:6:1: warning: implicit declaration of function '_vsnprintf' 
[-Wimplicit-function-declaration] _vsnprintf(0,0,0,0); ^~~~~~~~~~ Source: #include "confdefs.h" #include "conffix.h" #include int main() { _vsnprintf(0,0,0,0); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/config.functions/conftest.o: In function `main': /tmp/petsc-KvGRNM/config.functions/conftest.c:6: undefined reference to `_vsnprintf' collect2: error: ld returned 1 exit status Checking for functions [vsnprintf] Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.functions/conftest.c:13:6: warning: conflicting types for built-in function 'vsnprintf' 
char vsnprintf(); ^~~~~~~~~ Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully no other prototypes since they would conflict with our 'char funcname()' declaration below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ #ifdef __cplusplus extern "C" { #endif /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char vsnprintf(); #ifdef __cplusplus } #endif int main() { #if defined (__stub_vsnprintf) || defined (__stub___vsnprintf) vsnprintf_will_always_fail_with_ENOSYS(); #else vsnprintf(); #endif ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_VSNPRINTF" to "1" Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing 
-Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #include int main() { va_list Argp;char str[6]; vsnprintf(str,5, "%d", Argp ); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl ================================================================================ TEST checkNanosleep from config.functions(/home/florian/software/petsc/config/BuildSystem/config/functions.py:206) TESTING: checkNanosleep from config.functions(config/BuildSystem/config/functions.py:206) Check for functional nanosleep() - as time.h behaves differently for different compiler flags - like -std=c89 Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 
/tmp/petsc-KvGRNM/config.functions/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { struct timespec tp; tp.tv_sec = 0; tp.tv_nsec = (long)(1e9); nanosleep(&tp,0); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_NANOSLEEP" to "1" ================================================================================ TEST checkSignalHandlerType from config.functions(/home/florian/software/petsc/config/BuildSystem/config/functions.py:182) TESTING: checkSignalHandlerType from config.functions(config/BuildSystem/config/functions.py:182) Checks the type of C++ signals handlers, and defines SIGNAL_CAST to the correct value Pushing language Cxx Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.functions -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC /tmp/petsc-KvGRNM/config.functions/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include static void myhandler(int sig) {} int main() { 
signal(SIGFPE,myhandler); ; return 0; } Pushing language CXX Popping language CXX Executing: mpicxx -o /tmp/petsc-KvGRNM/config.functions/conftest -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "SIGNAL_CAST" to " " Popping language Cxx ================================================================================ TEST checkFreeReturnType from config.functions(/home/florian/software/petsc/config/BuildSystem/config/functions.py:192) TESTING: checkFreeReturnType from config.functions(config/BuildSystem/config/functions.py:192) Checks whether free returns void or int, and defines HAVE_FREE_RETURN_INT Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c Possible ERROR while running compiler: exit code 256 stderr: /tmp/petsc-KvGRNM/config.functions/conftest.c: In function 'main': 
/tmp/petsc-KvGRNM/config.functions/conftest.c:6:25: error: void value not ignored as it ought to be int ierr; void *p; ierr = free(p); return 0; ^ /tmp/petsc-KvGRNM/config.functions/conftest.c:6:5: warning: variable 'ierr' set but not used [-Wunused-but-set-variable] int ierr; void *p; ierr = free(p); return 0; ^~~~ Source: #include "confdefs.h" #include "conffix.h" #include int main() { int ierr; void *p; ierr = free(p); return 0; ; return 0; } Compile failed inside link ================================================================================ TEST checkVariableArgumentLists from config.functions(/home/florian/software/petsc/config/BuildSystem/config/functions.py:198) TESTING: checkVariableArgumentLists from config.functions(config/BuildSystem/config/functions.py:198) Checks whether the variable argument list functionality is working Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { va_list l1, l2; va_copy(l1, l2); return 0; ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib 
-Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Defined "HAVE_VA_COPY" to "1"
================================================================================
TEST checkClassify from config.functions(/home/florian/software/petsc/config/BuildSystem/config/functions.py:89)
TESTING: checkClassify from config.functions(config/BuildSystem/config/functions.py:89)
  Recursive decompose to rapidly classify functions as found or missing.
  To confirm that a function is missing, we require a compile/link failure with only that function in a compilation unit. In contrast, we can confirm that many functions are present by compiling them all together in a large compilation unit. We optimistically compile everything together, then trim all functions that were named in the error message and bisect the result. The trimming is only an optimization to increase the likelihood of a big-batch compile succeeding; we do not rely on the compiler naming missing functions.
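The docstring above describes a batch-then-bisect strategy. The sketch below is a simplified model of that idea, not the actual BuildSystem code: the names `classify` and the toy `links` callback are hypothetical, the real implementation invokes the compiler/linker instead of a set lookup, and the error-message trimming optimization is omitted.

```python
def classify(links, candidates):
    """Classify each candidate function as found (True) or missing (False).

    `links` takes a list of function names and returns True when a test
    program referencing all of them compiles and links. A function is only
    declared missing after it fails in a compilation unit by itself;
    successful batches confirm many functions at once.
    """
    found = {}

    def recurse(names):
        if not names:
            return
        if links(names):
            # The whole batch linked: every name in it is present.
            for n in names:
                found[n] = True
        elif len(names) == 1:
            # Failure with a single function confirms it is missing.
            found[names[0]] = False
        else:
            # Otherwise bisect and classify each half independently.
            mid = len(names) // 2
            recurse(names[:mid])
            recurse(names[mid:])

    recurse(list(candidates))
    return found

# Toy "linker": pretend only these symbols exist on the system.
available = {'rand', 'fork', 'dlopen'}
result = classify(lambda ns: all(n in available for n in ns),
                  ['rand', '_sleep', 'fork', 'dlopen', 'stricmp'])
```

With this toy linker the batch fails (because of `_sleep` and `stricmp`), and bisection narrows the failures down to exactly those two names while confirming the rest in small batches, mirroring how the log above retries progressively smaller function lists.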
Checking for functions [rand getdomainname _sleep snprintf realpath dlsym bzero _getcwd getwd uname _lseek sleep _access lseek usleep dlclose gethostname clock get_nprocs access _snprintf dlerror mkstemp fork getpagesize sbreak memalign sigset getcwd gethostbyname gettimeofday readlink _set_output_format PXFGETARG sigaction strcasecmp dlopen drand48 socket memmove signal popen getrusage times _mkdir time sysctlbyname stricmp] Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.functions/conftest.c:16:6: warning: conflicting types for built-in function 'snprintf' char snprintf(); ^~~~~~~~ /tmp/petsc-KvGRNM/config.functions/conftest.c:19:6: warning: conflicting types for built-in function 'bzero' char bzero(); ^~~~~ /tmp/petsc-KvGRNM/config.functions/conftest.c:36:6: warning: conflicting types for built-in function 'fork' char fork(); ^~~~ /tmp/petsc-KvGRNM/config.functions/conftest.c:48:6: warning: conflicting types for built-in function 'strcasecmp' char strcasecmp(); ^~~~~~~~~~ /tmp/petsc-KvGRNM/config.functions/conftest.c:52:6: warning: conflicting types for built-in function 'memmove' char memmove(); ^~~~~~~ Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully no other prototypes since they would conflict with our 'char funcname()' declaration below. */ #include /* Override any gcc2 internal prototype to avoid an error. 
*/ #ifdef __cplusplus extern "C" { #endif /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char rand(); char getdomainname(); char _sleep(); char snprintf(); char realpath(); char dlsym(); char bzero(); char _getcwd(); char getwd(); char uname(); char _lseek(); char sleep(); char _access(); char lseek(); char usleep(); char dlclose(); char gethostname(); char clock(); char get_nprocs(); char access(); char _snprintf(); char dlerror(); char mkstemp(); char fork(); char getpagesize(); char sbreak(); char memalign(); char sigset(); char getcwd(); char gethostbyname(); char gettimeofday(); char readlink(); char _set_output_format(); char PXFGETARG(); char sigaction(); char strcasecmp(); char dlopen(); char drand48(); char socket(); char memmove(); char signal(); char popen(); char getrusage(); char times(); char _mkdir(); char time(); char sysctlbyname(); char stricmp(); #ifdef __cplusplus } #endif int main() { #if defined (__stub_rand) || defined (__stub___rand) rand_will_always_fail_with_ENOSYS(); #else rand(); #endif #if defined (__stub_getdomainname) || defined (__stub___getdomainname) getdomainname_will_always_fail_with_ENOSYS(); #else getdomainname(); #endif #if defined (__stub__sleep) || defined (__stub____sleep) _sleep_will_always_fail_with_ENOSYS(); #else _sleep(); #endif #if defined (__stub_snprintf) || defined (__stub___snprintf) snprintf_will_always_fail_with_ENOSYS(); #else snprintf(); #endif #if defined (__stub_realpath) || defined (__stub___realpath) realpath_will_always_fail_with_ENOSYS(); #else realpath(); #endif #if defined (__stub_dlsym) || defined (__stub___dlsym) dlsym_will_always_fail_with_ENOSYS(); #else dlsym(); #endif #if defined (__stub_bzero) || defined (__stub___bzero) bzero_will_always_fail_with_ENOSYS(); #else bzero(); #endif #if defined (__stub__getcwd) || defined (__stub____getcwd) _getcwd_will_always_fail_with_ENOSYS(); #else _getcwd(); #endif #if defined 
(__stub_getwd) || defined (__stub___getwd) getwd_will_always_fail_with_ENOSYS(); #else getwd(); #endif #if defined (__stub_uname) || defined (__stub___uname) uname_will_always_fail_with_ENOSYS(); #else uname(); #endif #if defined (__stub__lseek) || defined (__stub____lseek) _lseek_will_always_fail_with_ENOSYS(); #else _lseek(); #endif #if defined (__stub_sleep) || defined (__stub___sleep) sleep_will_always_fail_with_ENOSYS(); #else sleep(); #endif #if defined (__stub__access) || defined (__stub____access) _access_will_always_fail_with_ENOSYS(); #else _access(); #endif #if defined (__stub_lseek) || defined (__stub___lseek) lseek_will_always_fail_with_ENOSYS(); #else lseek(); #endif #if defined (__stub_usleep) || defined (__stub___usleep) usleep_will_always_fail_with_ENOSYS(); #else usleep(); #endif #if defined (__stub_dlclose) || defined (__stub___dlclose) dlclose_will_always_fail_with_ENOSYS(); #else dlclose(); #endif #if defined (__stub_gethostname) || defined (__stub___gethostname) gethostname_will_always_fail_with_ENOSYS(); #else gethostname(); #endif #if defined (__stub_clock) || defined (__stub___clock) clock_will_always_fail_with_ENOSYS(); #else clock(); #endif #if defined (__stub_get_nprocs) || defined (__stub___get_nprocs) get_nprocs_will_always_fail_with_ENOSYS(); #else get_nprocs(); #endif #if defined (__stub_access) || defined (__stub___access) access_will_always_fail_with_ENOSYS(); #else access(); #endif #if defined (__stub__snprintf) || defined (__stub____snprintf) _snprintf_will_always_fail_with_ENOSYS(); #else _snprintf(); #endif #if defined (__stub_dlerror) || defined (__stub___dlerror) dlerror_will_always_fail_with_ENOSYS(); #else dlerror(); #endif #if defined (__stub_mkstemp) || defined (__stub___mkstemp) mkstemp_will_always_fail_with_ENOSYS(); #else mkstemp(); #endif #if defined (__stub_fork) || defined (__stub___fork) fork_will_always_fail_with_ENOSYS(); #else fork(); #endif #if defined (__stub_getpagesize) || defined (__stub___getpagesize) 
getpagesize_will_always_fail_with_ENOSYS(); #else getpagesize(); #endif #if defined (__stub_sbreak) || defined (__stub___sbreak) sbreak_will_always_fail_with_ENOSYS(); #else sbreak(); #endif #if defined (__stub_memalign) || defined (__stub___memalign) memalign_will_always_fail_with_ENOSYS(); #else memalign(); #endif #if defined (__stub_sigset) || defined (__stub___sigset) sigset_will_always_fail_with_ENOSYS(); #else sigset(); #endif #if defined (__stub_getcwd) || defined (__stub___getcwd) getcwd_will_always_fail_with_ENOSYS(); #else getcwd(); #endif #if defined (__stub_gethostbyname) || defined (__stub___gethostbyname) gethostbyname_will_always_fail_with_ENOSYS(); #else gethostbyname(); #endif #if defined (__stub_gettimeofday) || defined (__stub___gettimeofday) gettimeofday_will_always_fail_with_ENOSYS(); #else gettimeofday(); #endif #if defined (__stub_readlink) || defined (__stub___readlink) readlink_will_always_fail_with_ENOSYS(); #else readlink(); #endif #if defined (__stub__set_output_format) || defined (__stub____set_output_format) _set_output_format_will_always_fail_with_ENOSYS(); #else _set_output_format(); #endif #if defined (__stub_PXFGETARG) || defined (__stub___PXFGETARG) PXFGETARG_will_always_fail_with_ENOSYS(); #else PXFGETARG(); #endif #if defined (__stub_sigaction) || defined (__stub___sigaction) sigaction_will_always_fail_with_ENOSYS(); #else sigaction(); #endif #if defined (__stub_strcasecmp) || defined (__stub___strcasecmp) strcasecmp_will_always_fail_with_ENOSYS(); #else strcasecmp(); #endif #if defined (__stub_dlopen) || defined (__stub___dlopen) dlopen_will_always_fail_with_ENOSYS(); #else dlopen(); #endif #if defined (__stub_drand48) || defined (__stub___drand48) drand48_will_always_fail_with_ENOSYS(); #else drand48(); #endif #if defined (__stub_socket) || defined (__stub___socket) socket_will_always_fail_with_ENOSYS(); #else socket(); #endif #if defined (__stub_memmove) || defined (__stub___memmove) memmove_will_always_fail_with_ENOSYS(); 
#else memmove(); #endif #if defined (__stub_signal) || defined (__stub___signal) signal_will_always_fail_with_ENOSYS(); #else signal(); #endif #if defined (__stub_popen) || defined (__stub___popen) popen_will_always_fail_with_ENOSYS(); #else popen(); #endif #if defined (__stub_getrusage) || defined (__stub___getrusage) getrusage_will_always_fail_with_ENOSYS(); #else getrusage(); #endif #if defined (__stub_times) || defined (__stub___times) times_will_always_fail_with_ENOSYS(); #else times(); #endif #if defined (__stub__mkdir) || defined (__stub____mkdir) _mkdir_will_always_fail_with_ENOSYS(); #else _mkdir(); #endif #if defined (__stub_time) || defined (__stub___time) time_will_always_fail_with_ENOSYS(); #else time(); #endif #if defined (__stub_sysctlbyname) || defined (__stub___sysctlbyname) sysctlbyname_will_always_fail_with_ENOSYS(); #else sysctlbyname(); #endif #if defined (__stub_stricmp) || defined (__stub___stricmp) stricmp_will_always_fail_with_ENOSYS(); #else stricmp(); #endif ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/config.functions/conftest.o: In function `main': 
/tmp/petsc-KvGRNM/config.functions/conftest.c:119: warning: the `getwd' function is dangerous and should not be used.
/tmp/petsc-KvGRNM/config.functions/conftest.c:83: undefined reference to `_sleep'
/tmp/petsc-KvGRNM/config.functions/conftest.c:113: undefined reference to `_getcwd'
/tmp/petsc-KvGRNM/config.functions/conftest.c:131: undefined reference to `_lseek'
/tmp/petsc-KvGRNM/config.functions/conftest.c:143: undefined reference to `_access'
/tmp/petsc-KvGRNM/config.functions/conftest.c:191: undefined reference to `_snprintf'
/tmp/petsc-KvGRNM/config.functions/conftest.c:221: undefined reference to `sbreak'
/tmp/petsc-KvGRNM/config.functions/conftest.c:263: undefined reference to `_set_output_format'
/tmp/petsc-KvGRNM/config.functions/conftest.c:269: undefined reference to `PXFGETARG'
/tmp/petsc-KvGRNM/config.functions/conftest.c:335: undefined reference to `_mkdir'
/tmp/petsc-KvGRNM/config.functions/conftest.c:347: undefined reference to `sysctlbyname'
/tmp/petsc-KvGRNM/config.functions/conftest.c:353: undefined reference to `stricmp'
collect2: error: ld returned 1 exit status
Checking for functions [rand getdomainname realpath dlsym bzero uname usleep dlclose gethostname clock get_nprocs dlerror mkstemp fork getpagesize]
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c
Possible ERROR while running compiler: stderr:
/tmp/petsc-KvGRNM/config.functions/conftest.c:17:6: warning: conflicting types for built-in function 'bzero'
 char bzero();
      ^~~~~
/tmp/petsc-KvGRNM/config.functions/conftest.c:26:6: warning: conflicting types for built-in function 'fork' char fork(); ^~~~ Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully no other prototypes since they would conflict with our 'char funcname()' declaration below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ #ifdef __cplusplus extern "C" { #endif /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char rand(); char getdomainname(); char realpath(); char dlsym(); char bzero(); char uname(); char usleep(); char dlclose(); char gethostname(); char clock(); char get_nprocs(); char dlerror(); char mkstemp(); char fork(); char getpagesize(); #ifdef __cplusplus } #endif int main() { #if defined (__stub_rand) || defined (__stub___rand) rand_will_always_fail_with_ENOSYS(); #else rand(); #endif #if defined (__stub_getdomainname) || defined (__stub___getdomainname) getdomainname_will_always_fail_with_ENOSYS(); #else getdomainname(); #endif #if defined (__stub_realpath) || defined (__stub___realpath) realpath_will_always_fail_with_ENOSYS(); #else realpath(); #endif #if defined (__stub_dlsym) || defined (__stub___dlsym) dlsym_will_always_fail_with_ENOSYS(); #else dlsym(); #endif #if defined (__stub_bzero) || defined (__stub___bzero) bzero_will_always_fail_with_ENOSYS(); #else bzero(); #endif #if defined (__stub_uname) || defined (__stub___uname) uname_will_always_fail_with_ENOSYS(); #else uname(); #endif #if defined (__stub_usleep) || defined (__stub___usleep) usleep_will_always_fail_with_ENOSYS(); #else usleep(); #endif #if defined (__stub_dlclose) || defined (__stub___dlclose) dlclose_will_always_fail_with_ENOSYS(); #else dlclose(); #endif #if defined (__stub_gethostname) || defined (__stub___gethostname) gethostname_will_always_fail_with_ENOSYS(); #else gethostname(); #endif #if defined (__stub_clock) || defined 
(__stub___clock) clock_will_always_fail_with_ENOSYS(); #else clock(); #endif #if defined (__stub_get_nprocs) || defined (__stub___get_nprocs) get_nprocs_will_always_fail_with_ENOSYS(); #else get_nprocs(); #endif #if defined (__stub_dlerror) || defined (__stub___dlerror) dlerror_will_always_fail_with_ENOSYS(); #else dlerror(); #endif #if defined (__stub_mkstemp) || defined (__stub___mkstemp) mkstemp_will_always_fail_with_ENOSYS(); #else mkstemp(); #endif #if defined (__stub_fork) || defined (__stub___fork) fork_will_always_fail_with_ENOSYS(); #else fork(); #endif #if defined (__stub_getpagesize) || defined (__stub___getpagesize) getpagesize_will_always_fail_with_ENOSYS(); #else getpagesize(); #endif ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_RAND" to "1" Defined "HAVE_GETDOMAINNAME" to "1" Defined "HAVE_REALPATH" to "1" Defined "HAVE_DLSYM" to "1" Defined "HAVE_BZERO" to "1" Defined "HAVE_UNAME" to "1" Defined "HAVE_USLEEP" to "1" Defined "HAVE_DLCLOSE" to "1" Defined "HAVE_GETHOSTNAME" to "1" Defined "HAVE_CLOCK" to "1" Defined "HAVE_GET_NPROCS" to "1" Defined "HAVE_DLERROR" to "1" Defined "HAVE_MKSTEMP" to "1" Defined "HAVE_FORK" to "1" Defined "HAVE_GETPAGESIZE" to "1" 
Checking for functions [memalign sigset gethostbyname gettimeofday readlink sigaction strcasecmp dlopen drand48 socket memmove signal popen getrusage times time]
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c
Possible ERROR while running compiler: stderr:
/tmp/petsc-KvGRNM/config.functions/conftest.c:19:6: warning: conflicting types for built-in function 'strcasecmp'
 char strcasecmp();
      ^~~~~~~~~~
/tmp/petsc-KvGRNM/config.functions/conftest.c:23:6: warning: conflicting types for built-in function 'memmove'
 char memmove();
      ^~~~~~~
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully no other prototypes
since they would conflict with our 'char funcname()' declaration below. */
#include 
/* Override any gcc2 internal prototype to avoid an error. */
#ifdef __cplusplus
extern "C" {
#endif
/* We use char because int might match the return type of a gcc2 builtin
and then its argument prototype would still apply. */
char memalign();
char sigset();
char gethostbyname();
char gettimeofday();
char readlink();
char sigaction();
char strcasecmp();
char dlopen();
char drand48();
char socket();
char memmove();
char signal();
char popen();
char getrusage();
char times();
char time();
#ifdef __cplusplus
}
#endif
int main() {
#if defined (__stub_memalign) || defined (__stub___memalign)
memalign_will_always_fail_with_ENOSYS();
#else
memalign();
#endif
#if defined (__stub_sigset) || defined (__stub___sigset)
sigset_will_always_fail_with_ENOSYS();
#else
sigset();
#endif
#if defined (__stub_gethostbyname) || defined (__stub___gethostbyname)
gethostbyname_will_always_fail_with_ENOSYS();
#else
gethostbyname();
#endif
#if defined (__stub_gettimeofday) || defined (__stub___gettimeofday)
gettimeofday_will_always_fail_with_ENOSYS();
#else
gettimeofday();
#endif
#if defined (__stub_readlink) || defined (__stub___readlink)
readlink_will_always_fail_with_ENOSYS();
#else
readlink();
#endif
#if defined (__stub_sigaction) || defined (__stub___sigaction)
sigaction_will_always_fail_with_ENOSYS();
#else
sigaction();
#endif
#if defined (__stub_strcasecmp) || defined (__stub___strcasecmp)
strcasecmp_will_always_fail_with_ENOSYS();
#else
strcasecmp();
#endif
#if defined (__stub_dlopen) || defined (__stub___dlopen)
dlopen_will_always_fail_with_ENOSYS();
#else
dlopen();
#endif
#if defined (__stub_drand48) || defined (__stub___drand48)
drand48_will_always_fail_with_ENOSYS();
#else
drand48();
#endif
#if defined (__stub_socket) || defined (__stub___socket)
socket_will_always_fail_with_ENOSYS();
#else
socket();
#endif
#if defined (__stub_memmove) || defined (__stub___memmove)
memmove_will_always_fail_with_ENOSYS();
#else
memmove();
#endif
#if defined (__stub_signal) || defined (__stub___signal)
signal_will_always_fail_with_ENOSYS();
#else
signal();
#endif
#if defined (__stub_popen) || defined (__stub___popen)
popen_will_always_fail_with_ENOSYS();
#else
popen();
#endif
#if defined (__stub_getrusage) || defined (__stub___getrusage)
getrusage_will_always_fail_with_ENOSYS();
#else
getrusage();
#endif
#if defined (__stub_times) || defined (__stub___times)
times_will_always_fail_with_ENOSYS();
#else
times();
#endif
#if defined (__stub_time) || defined (__stub___time)
time_will_always_fail_with_ENOSYS();
#else
time();
#endif
;
return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Defined "HAVE_MEMALIGN" to "1"
Defined "HAVE_SIGSET" to "1"
Defined "HAVE_GETHOSTBYNAME" to "1"
Defined "HAVE_GETTIMEOFDAY" to "1"
Defined "HAVE_READLINK" to "1"
Defined "HAVE_SIGACTION" to "1"
Defined "HAVE_STRCASECMP" to "1"
Defined "HAVE_DLOPEN" to "1"
Defined "HAVE_DRAND48" to "1"
Defined "HAVE_SOCKET" to "1"
Defined "HAVE_MEMMOVE" to "1"
Defined "HAVE_SIGNAL" to "1"
Defined "HAVE_POPEN" to "1"
Defined "HAVE_GETRUSAGE" to "1"
Defined "HAVE_TIMES" to "1"
Defined "HAVE_TIME" to "1"
Checking for functions [_sleep]
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully no other prototypes
since they would conflict with our 'char funcname()' declaration below. */
#include 
/* Override any gcc2 internal prototype to avoid an error. */
#ifdef __cplusplus
extern "C" {
#endif
/* We use char because int might match the return type of a gcc2 builtin
and then its argument prototype would still apply. */
char _sleep();
#ifdef __cplusplus
}
#endif
int main() {
#if defined (__stub__sleep) || defined (__stub____sleep)
_sleep_will_always_fail_with_ENOSYS();
#else
_sleep();
#endif
;
return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Possible ERROR while running linker: exit code 256
stderr:
/tmp/petsc-KvGRNM/config.functions/conftest.o: In function `main':
/tmp/petsc-KvGRNM/config.functions/conftest.c:24: undefined reference to `_sleep'
collect2: error: ld returned 1 exit status
Checking for functions [snprintf]
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c
Possible ERROR while running compiler: stderr:
/tmp/petsc-KvGRNM/config.functions/conftest.c:13:6: warning: conflicting types for built-in function 'snprintf'
 char snprintf();
      ^~~~~~~~
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully no other prototypes
since they would conflict with our 'char funcname()' declaration below. */
#include 
/* Override any gcc2 internal prototype to avoid an error. */
#ifdef __cplusplus
extern "C" {
#endif
/* We use char because int might match the return type of a gcc2 builtin
and then its argument prototype would still apply. */
char snprintf();
#ifdef __cplusplus
}
#endif
int main() {
#if defined (__stub_snprintf) || defined (__stub___snprintf)
snprintf_will_always_fail_with_ENOSYS();
#else
snprintf();
#endif
;
return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Defined "HAVE_SNPRINTF" to "1"
Checking for functions [_getcwd]
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully no other prototypes
since they would conflict with our 'char funcname()' declaration below. */
#include 
/* Override any gcc2 internal prototype to avoid an error. */
#ifdef __cplusplus
extern "C" {
#endif
/* We use char because int might match the return type of a gcc2 builtin
and then its argument prototype would still apply. */
char _getcwd();
#ifdef __cplusplus
}
#endif
int main() {
#if defined (__stub__getcwd) || defined (__stub____getcwd)
_getcwd_will_always_fail_with_ENOSYS();
#else
_getcwd();
#endif
;
return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Possible ERROR while running linker: exit code 256
stderr:
/tmp/petsc-KvGRNM/config.functions/conftest.o: In function `main':
/tmp/petsc-KvGRNM/config.functions/conftest.c:24: undefined reference to `_getcwd'
collect2: error: ld returned 1 exit status
Checking for functions [getwd]
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully no other prototypes
since they would conflict with our 'char funcname()' declaration below. */
#include 
/* Override any gcc2 internal prototype to avoid an error. */
#ifdef __cplusplus
extern "C" {
#endif
/* We use char because int might match the return type of a gcc2 builtin
and then its argument prototype would still apply. */
char getwd();
#ifdef __cplusplus
}
#endif
int main() {
#if defined (__stub_getwd) || defined (__stub___getwd)
getwd_will_always_fail_with_ENOSYS();
#else
getwd();
#endif
;
return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Possible ERROR while running linker: stderr:
/tmp/petsc-KvGRNM/config.functions/conftest.o: In function `main':
/tmp/petsc-KvGRNM/config.functions/conftest.c:24: warning: the `getwd' function is dangerous and should not be used.
Defined "HAVE_GETWD" to "1"
Checking for functions [_lseek]
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully no other prototypes
since they would conflict with our 'char funcname()' declaration below. */
#include 
/* Override any gcc2 internal prototype to avoid an error. */
#ifdef __cplusplus
extern "C" {
#endif
/* We use char because int might match the return type of a gcc2 builtin
and then its argument prototype would still apply. */
char _lseek();
#ifdef __cplusplus
}
#endif
int main() {
#if defined (__stub__lseek) || defined (__stub____lseek)
_lseek_will_always_fail_with_ENOSYS();
#else
_lseek();
#endif
;
return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Possible ERROR while running linker: exit code 256
stderr:
/tmp/petsc-KvGRNM/config.functions/conftest.o: In function `main':
/tmp/petsc-KvGRNM/config.functions/conftest.c:24: undefined reference to `_lseek'
collect2: error: ld returned 1 exit status
Checking for functions [sleep]
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully no other prototypes
since they would conflict with our 'char funcname()' declaration below. */
#include 
/* Override any gcc2 internal prototype to avoid an error. */
#ifdef __cplusplus
extern "C" {
#endif
/* We use char because int might match the return type of a gcc2 builtin
and then its argument prototype would still apply. */
char sleep();
#ifdef __cplusplus
}
#endif
int main() {
#if defined (__stub_sleep) || defined (__stub___sleep)
sleep_will_always_fail_with_ENOSYS();
#else
sleep();
#endif
;
return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Defined "HAVE_SLEEP" to "1"
Checking for functions [_access]
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully no other prototypes
since they would conflict with our 'char funcname()' declaration below. */
#include 
/* Override any gcc2 internal prototype to avoid an error. */
#ifdef __cplusplus
extern "C" {
#endif
/* We use char because int might match the return type of a gcc2 builtin
and then its argument prototype would still apply. */
char _access();
#ifdef __cplusplus
}
#endif
int main() {
#if defined (__stub__access) || defined (__stub____access)
_access_will_always_fail_with_ENOSYS();
#else
_access();
#endif
;
return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Possible ERROR while running linker: exit code 256
stderr:
/tmp/petsc-KvGRNM/config.functions/conftest.o: In function `main':
/tmp/petsc-KvGRNM/config.functions/conftest.c:24: undefined reference to `_access'
collect2: error: ld returned 1 exit status
Checking for functions [lseek]
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully no other prototypes
since they would conflict with our 'char funcname()' declaration below. */
#include 
/* Override any gcc2 internal prototype to avoid an error. */
#ifdef __cplusplus
extern "C" {
#endif
/* We use char because int might match the return type of a gcc2 builtin
and then its argument prototype would still apply. */
char lseek();
#ifdef __cplusplus
}
#endif
int main() {
#if defined (__stub_lseek) || defined (__stub___lseek)
lseek_will_always_fail_with_ENOSYS();
#else
lseek();
#endif
;
return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Defined "HAVE_LSEEK" to "1"
Checking for functions [access]
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully no other prototypes
since they would conflict with our 'char funcname()' declaration below. */
#include 
/* Override any gcc2 internal prototype to avoid an error. */
#ifdef __cplusplus
extern "C" {
#endif
/* We use char because int might match the return type of a gcc2 builtin
and then its argument prototype would still apply. */
char access();
#ifdef __cplusplus
}
#endif
int main() {
#if defined (__stub_access) || defined (__stub___access)
access_will_always_fail_with_ENOSYS();
#else
access();
#endif
;
return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Defined "HAVE_ACCESS" to "1"
Checking for functions [_snprintf]
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully no other prototypes
since they would conflict with our 'char funcname()' declaration below. */
#include 
/* Override any gcc2 internal prototype to avoid an error. */
#ifdef __cplusplus
extern "C" {
#endif
/* We use char because int might match the return type of a gcc2 builtin
and then its argument prototype would still apply. */
char _snprintf();
#ifdef __cplusplus
}
#endif
int main() {
#if defined (__stub__snprintf) || defined (__stub____snprintf)
_snprintf_will_always_fail_with_ENOSYS();
#else
_snprintf();
#endif
;
return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Possible ERROR while running linker: exit code 256
stderr:
/tmp/petsc-KvGRNM/config.functions/conftest.o: In function `main':
/tmp/petsc-KvGRNM/config.functions/conftest.c:24: undefined reference to `_snprintf'
collect2: error: ld returned 1 exit status
Checking for functions [sbreak]
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully no other prototypes
since they would conflict with our 'char funcname()' declaration below. */
#include 
/* Override any gcc2 internal prototype to avoid an error. */
#ifdef __cplusplus
extern "C" {
#endif
/* We use char because int might match the return type of a gcc2 builtin
and then its argument prototype would still apply. */
char sbreak();
#ifdef __cplusplus
}
#endif
int main() {
#if defined (__stub_sbreak) || defined (__stub___sbreak)
sbreak_will_always_fail_with_ENOSYS();
#else
sbreak();
#endif
;
return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Possible ERROR while running linker: exit code 256
stderr:
/tmp/petsc-KvGRNM/config.functions/conftest.o: In function `main':
/tmp/petsc-KvGRNM/config.functions/conftest.c:24: undefined reference to `sbreak'
collect2: error: ld returned 1 exit status
Checking for functions [getcwd]
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully no other prototypes
since they would conflict with our 'char funcname()' declaration below. */
#include 
/* Override any gcc2 internal prototype to avoid an error. */
#ifdef __cplusplus
extern "C" {
#endif
/* We use char because int might match the return type of a gcc2 builtin
and then its argument prototype would still apply. */
char getcwd();
#ifdef __cplusplus
}
#endif
int main() {
#if defined (__stub_getcwd) || defined (__stub___getcwd)
getcwd_will_always_fail_with_ENOSYS();
#else
getcwd();
#endif
;
return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Defined "HAVE_GETCWD" to "1"
Checking for functions [_set_output_format]
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully no other prototypes
since they would conflict with our 'char funcname()' declaration below. */
#include 
/* Override any gcc2 internal prototype to avoid an error. */
#ifdef __cplusplus
extern "C" {
#endif
/* We use char because int might match the return type of a gcc2 builtin
and then its argument prototype would still apply. */
char _set_output_format();
#ifdef __cplusplus
}
#endif
int main() {
#if defined (__stub__set_output_format) || defined (__stub____set_output_format)
_set_output_format_will_always_fail_with_ENOSYS();
#else
_set_output_format();
#endif
;
return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Possible ERROR while running linker: exit code 256
stderr:
/tmp/petsc-KvGRNM/config.functions/conftest.o: In function `main':
/tmp/petsc-KvGRNM/config.functions/conftest.c:24: undefined reference to `_set_output_format'
collect2: error: ld returned 1 exit status
Checking for functions [PXFGETARG]
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully no other prototypes
since they would conflict with our 'char funcname()' declaration below. */
#include 
/* Override any gcc2 internal prototype to avoid an error. */
#ifdef __cplusplus
extern "C" {
#endif
/* We use char because int might match the return type of a gcc2 builtin
and then its argument prototype would still apply. */
char PXFGETARG();
#ifdef __cplusplus
}
#endif
int main() {
#if defined (__stub_PXFGETARG) || defined (__stub___PXFGETARG)
PXFGETARG_will_always_fail_with_ENOSYS();
#else
PXFGETARG();
#endif
;
return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Possible ERROR while running linker: exit code 256
stderr:
/tmp/petsc-KvGRNM/config.functions/conftest.o: In function `main':
/tmp/petsc-KvGRNM/config.functions/conftest.c:24: undefined reference to `PXFGETARG'
collect2: error: ld returned 1 exit status
Checking for functions [_mkdir]
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully no other prototypes
since they would conflict with our 'char funcname()' declaration below. */
#include 
/* Override any gcc2 internal prototype to avoid an error. */
#ifdef __cplusplus
extern "C" {
#endif
/* We use char because int might match the return type of a gcc2 builtin
and then its argument prototype would still apply. */
char _mkdir();
#ifdef __cplusplus
}
#endif
int main() {
#if defined (__stub__mkdir) || defined (__stub____mkdir)
_mkdir_will_always_fail_with_ENOSYS();
#else
_mkdir();
#endif
;
return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Possible ERROR while running linker: exit code 256
stderr:
/tmp/petsc-KvGRNM/config.functions/conftest.o: In function `main':
/tmp/petsc-KvGRNM/config.functions/conftest.c:24: undefined reference to `_mkdir'
collect2: error: ld returned 1 exit status
Checking for functions [sysctlbyname]
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully no other
prototypes since they would conflict with our 'char funcname()' declaration below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ #ifdef __cplusplus extern "C" { #endif /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char sysctlbyname(); #ifdef __cplusplus } #endif int main() { #if defined (__stub_sysctlbyname) || defined (__stub___sysctlbyname) sysctlbyname_will_always_fail_with_ENOSYS(); #else sysctlbyname(); #endif ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/config.functions/conftest.o: In function `main': /tmp/petsc-KvGRNM/config.functions/conftest.c:24: undefined reference to `sysctlbyname' collect2: error: ld returned 1 exit status Checking for functions [stricmp] Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types 
-I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully no other prototypes since they would conflict with our 'char funcname()' declaration below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ #ifdef __cplusplus extern "C" { #endif /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */ char stricmp(); #ifdef __cplusplus } #endif int main() { #if defined (__stub_stricmp) || defined (__stub___stricmp) stricmp_will_always_fail_with_ENOSYS(); #else stricmp(); #endif ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/config.functions/conftest.o: In function `main': /tmp/petsc-KvGRNM/config.functions/conftest.c:24: undefined reference to `stricmp' collect2: error: ld returned 1 exit status 
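[Aside: every "Checking for functions [...]" probe above follows the same autoconf-style pattern: declare the symbol with a deliberately wrong `char` prototype, call it from `main`, and let the link step decide whether the C library provides it. A minimal standalone sketch of that probe, assuming only that a C compiler is reachable as `cc`; the file name `probe.c` is scratch, not from the log:]

```shell
# Probe whether the C library provides stricmp(), the same way the
# configure log above does: declare a dummy prototype, call it, and
# let the linker resolve it or fail.
cat > probe.c <<'EOF'
/* char is used so the declaration cannot silently match a compiler
   builtin with the real return type. */
char stricmp();
int main() { stricmp(); return 0; }
EOF
if cc probe.c -o probe 2>/dev/null; then
    echo "stricmp: found"
else
    echo "stricmp: missing"   # expected on glibc, which only has strcasecmp
fi
```

[On the glibc system in this log the link fails with `undefined reference to stricmp`, which configure records as the function being absent.]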
================================================================================
TEST configureMemorySize from config.utilities.getResidentSetSize(/home/florian/software/petsc/config/BuildSystem/config/utilities/getResidentSetSize.py:31)
TESTING: configureMemorySize from config.utilities.getResidentSetSize(config/BuildSystem/config/utilities/getResidentSetSize.py:31)
  Try to determine how to measure the memory usage
Defined "USE_PROC_FOR_SIZE" to "1"
Using /proc for PetscMemoryGetCurrentUsage()
================================================================================
TEST configureFortranCommandLine from config.utilities.fortranCommandLine(/home/florian/software/petsc/config/BuildSystem/config/utilities/fortranCommandLine.py:27)
TESTING: configureFortranCommandLine from config.utilities.fortranCommandLine(config/BuildSystem/config/utilities/fortranCommandLine.py:27)
  Check for the mechanism to retrieve command line arguments in Fortran
Defined "HAVE_FORTRAN_GET_COMMAND_ARGUMENT" to "1"
Pushing language FC
Checking for functions [] in library [''] []
Pushing language FC
Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.libraries/conftest.F
Successful compile:
Source:
      program main
      integer i
      character*(80) arg
      i = command_argument_count()
      call get_command_argument(i,arg)
      end
Pushing language FC
Popping language FC
Executing: mpif90 -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib
-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Popping language FC Popping language FC Pushing language C Defined "HAVE_GFORTRAN_IARGC" to "1" Popping language C Checking for functions [get_command_argument_] in library [''] ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm'] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to 
avoid an error. */ char get_command_argument_(); static void _check_get_command_argument_() { get_command_argument_(); } int main() { _check_get_command_argument_();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.o: In function `_check_get_command_argument_': /tmp/petsc-KvGRNM/config.libraries/conftest.c:5: undefined reference to `get_command_argument_' collect2: error: ld returned 1 exit status 
Popping language C Checking for functions [getarg_] in library [''] ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm'] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char getarg_(); static void _check_getarg_() { getarg_(); } int main() { _check_getarg_();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.o: In function `_check_getarg_': /tmp/petsc-KvGRNM/config.libraries/conftest.c:5: undefined reference to `getarg_' collect2: error: ld returned 1 exit status Popping language C Pushing language C Popping language C Pushing language C Popping language C 
Pushing language C Popping language C Pushing language C Popping language C Pushing language C Popping language C Checking for functions [ipxfargc_] Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully no other prototypes since they would conflict with our 'char funcname()' declaration below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ #ifdef __cplusplus extern "C" { #endif /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. 
*/ char ipxfargc_(); #ifdef __cplusplus } #endif int main() { #if defined (__stub_ipxfargc_) || defined (__stub___ipxfargc_) ipxfargc__will_always_fail_with_ENOSYS(); #else ipxfargc_(); #endif ; return 0; } Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/config.functions/conftest.o: In function `main': /tmp/petsc-KvGRNM/config.functions/conftest.c:24: undefined reference to `ipxfargc_' collect2: error: ld returned 1 exit status Checking for functions [f90_unix_MP_iargc] 
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully no other prototypes since they would conflict with our 'char funcname()' declaration below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ #ifdef __cplusplus extern "C" { #endif /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. 
*/ char f90_unix_MP_iargc(); #ifdef __cplusplus } #endif int main() { #if defined (__stub_f90_unix_MP_iargc) || defined (__stub___f90_unix_MP_iargc) f90_unix_MP_iargc_will_always_fail_with_ENOSYS(); #else f90_unix_MP_iargc(); #endif ; return 0; } Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/config.functions/conftest.o: In function `main': /tmp/petsc-KvGRNM/config.functions/conftest.c:24: undefined reference to `f90_unix_MP_iargc' collect2: error: ld returned 1 exit 
status Checking for functions [PXFGETARG] Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully no other prototypes since they would conflict with our 'char funcname()' declaration below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ #ifdef __cplusplus extern "C" { #endif /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. 
*/ char PXFGETARG(); #ifdef __cplusplus } #endif int main() { #if defined (__stub_PXFGETARG) || defined (__stub___PXFGETARG) PXFGETARG_will_always_fail_with_ENOSYS(); #else PXFGETARG(); #endif ; return 0; } Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/config.functions/conftest.o: In function `main': /tmp/petsc-KvGRNM/config.functions/conftest.c:24: undefined reference to `PXFGETARG' collect2: error: ld returned 1 exit status Checking for functions [iargc_] Executing: 
mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* System header to define __stub macros and hopefully no other prototypes since they would conflict with our 'char funcname()' declaration below. */ #include /* Override any gcc2 internal prototype to avoid an error. */ #ifdef __cplusplus extern "C" { #endif /* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. 
*/
char iargc_();
#ifdef __cplusplus
}
#endif
int main() {
#if defined (__stub_iargc_) || defined (__stub___iargc_)
iargc__will_always_fail_with_ENOSYS();
#else
iargc_();
#endif
;
  return 0;
}
Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm
Possible ERROR while running linker: exit code 256
stderr:
/tmp/petsc-KvGRNM/config.functions/conftest.o: In function `main':
/tmp/petsc-KvGRNM/config.functions/conftest.c:24: undefined reference to `iargc_'
collect2: error: ld returned 1 exit status
Checking for functions [GETARG@16]
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c
Possible ERROR while running compiler: exit code 256
stderr:
/tmp/petsc-KvGRNM/config.functions/conftest.c:13:12: error: stray '@' in program
 char GETARG@16();
            ^
/tmp/petsc-KvGRNM/config.functions/conftest.c:13:13: error: expected '=', ',', ';', 'asm' or '__attribute__' before numeric constant
 char GETARG@16();
             ^~
/tmp/petsc-KvGRNM/config.functions/conftest.c: In function 'main':
/tmp/petsc-KvGRNM/config.functions/conftest.c:21:27: error: missing ')' after "defined"
 #if defined (__stub_GETARG@16) || defined (__stub___GETARG@16)
                           ^
/tmp/petsc-KvGRNM/config.functions/conftest.c:21:28: error: missing binary operator before token "16"
 #if defined (__stub_GETARG@16) || defined (__stub___GETARG@16)
                            ^~
/tmp/petsc-KvGRNM/config.functions/conftest.c:24:7: error: stray '@' in program
 GETARG@16();
       ^
/tmp/petsc-KvGRNM/config.functions/conftest.c:24:1: error: 'GETARG' undeclared (first use in this function)
 GETARG@16();
 ^~~~~~
/tmp/petsc-KvGRNM/config.functions/conftest.c:24:1: note: each undeclared identifier is reported only once for each function it appears in
/tmp/petsc-KvGRNM/config.functions/conftest.c:24:8: error: expected ';' before numeric constant
 GETARG@16();
        ^~
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully no other prototypes since they would conflict with our 'char funcname()' declaration below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
#ifdef __cplusplus
extern "C" {
#endif
/* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply. */
char GETARG@16();
#ifdef __cplusplus
}
#endif
int main() {
#if defined (__stub_GETARG@16) || defined (__stub___GETARG@16)
GETARG@16_will_always_fail_with_ENOSYS();
#else
GETARG@16();
#endif
;
  return 0;
}
Compile failed inside link
Checking for functions [_gfortran_iargc]
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.functions/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* System header to define __stub macros and hopefully no other prototypes since they would conflict with our 'char funcname()' declaration below. */
#include <assert.h>
/* Override any gcc2 internal prototype to avoid an error. */
#ifdef __cplusplus
extern "C" {
#endif
/* We use char because int might match the return type of a gcc2 builtin and then its argument prototype would still apply.
*/ char _gfortran_iargc(); #ifdef __cplusplus } #endif int main() { #if defined (__stub__gfortran_iargc) || defined (__stub____gfortran_iargc) _gfortran_iargc_will_always_fail_with_ENOSYS(); #else _gfortran_iargc(); #endif ; return 0; } Executing: mpicc -o /tmp/petsc-KvGRNM/config.functions/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.functions/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm Defined "HAVE__GFORTRAN_IARGC" to "1" ================================================================================ TEST configureFeatureTestMacros from 
config.utilities.featureTestMacros(/home/florian/software/petsc/config/BuildSystem/config/utilities/featureTestMacros.py:13) TESTING: configureFeatureTestMacros from config.utilities.featureTestMacros(config/BuildSystem/config/utilities/featureTestMacros.py:13) Checks if certain feature test macros are support All intermediate test results are stored in /tmp/petsc-KvGRNM/config.utilities.featureTestMacros Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.featureTestMacros/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.featureTestMacros/conftest.c Possible ERROR while running compiler: exit code 256 stderr: /tmp/petsc-KvGRNM/config.utilities.featureTestMacros/conftest.c:4:20: fatal error: sysctl.h: No such file or directory #include ^ compilation terminated. 
Source: #include "confdefs.h" #include "conffix.h" #define _POSIX_C_SOURCE 200112L #include int main() { ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.featureTestMacros/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.featureTestMacros/conftest.c Possible ERROR while running compiler: stderr: In file included from /usr/include/stdlib.h:24:0, from /tmp/petsc-KvGRNM/config.utilities.featureTestMacros/conftest.c:4: /usr/include/features.h:148:3: warning: #warning "_BSD_SOURCE and _SVID_SOURCE are deprecated, use _DEFAULT_SOURCE" [-Wcpp] # warning "_BSD_SOURCE and _SVID_SOURCE are deprecated, use _DEFAULT_SOURCE" ^~~~~~~ Source: #include "confdefs.h" #include "conffix.h" #define _BSD_SOURCE #include int main() { ; return 0; } Defined "_BSD_SOURCE" to "1" Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.featureTestMacros/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.featureTestMacros/conftest.c Successful compile: Source: #include "confdefs.h" #include 
"conffix.h" #define _DEFAULT_SOURCE #include int main() { ; return 0; } Defined "_DEFAULT_SOURCE" to "1" Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.featureTestMacros/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.featureTestMacros/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #define _GNU_SOURCE #include int main() { cpu_set_t mset; CPU_ZERO(&mset);; return 0; } Defined "_GNU_SOURCE" to "1" ================================================================================ TEST configureMissingDefines from config.utilities.missing(/home/florian/software/petsc/config/BuildSystem/config/utilities/missing.py:57) TESTING: configureMissingDefines from config.utilities.missing(config/BuildSystem/config/utilities/missing.py:57) Checks for limits All intermediate test results are stored in /tmp/petsc-KvGRNM/config.utilities.missing Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden 
-g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #ifdef PETSC_HAVE_LIMITS_H #include #endif int main() { int i=INT_MAX; if (i); ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #ifdef PETSC_HAVE_FLOAT_H #include #endif int main() { double d=DBL_MAX; if (d); ; return 0; } ================================================================================ TEST configureMissingUtypeTypedefs from config.utilities.missing(/home/florian/software/petsc/config/BuildSystem/config/utilities/missing.py:67) TESTING: configureMissingUtypeTypedefs from config.utilities.missing(config/BuildSystem/config/utilities/missing.py:67) Checks if u_short is undefined Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings 
-Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c: In function 'main': /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c:6:9: warning: unused variable 'foo' [-Wunused-variable] u_short foo; ^~~ Source: #include "confdefs.h" #include "conffix.h" #include int main() { u_short foo; ; return 0; } ================================================================================ TEST configureMissingFunctions from config.utilities.missing(/home/florian/software/petsc/config/BuildSystem/config/utilities/missing.py:73) TESTING: configureMissingFunctions from config.utilities.missing(config/BuildSystem/config/utilities/missing.py:73) Checks for SOCKETS ================================================================================ TEST configureMissingSignals from config.utilities.missing(/home/florian/software/petsc/config/BuildSystem/config/utilities/missing.py:93) TESTING: configureMissingSignals from config.utilities.missing(config/BuildSystem/config/utilities/missing.py:93) Check for missing signals, and define MISSING_ if necessary Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { int 
i=SIGABRT; if (i); ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { int i=SIGALRM; if (i); ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { int i=SIGBUS; if (i); ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics 
-I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { int i=SIGCHLD; if (i); ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { int i=SIGCONT; if (i); ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: #include "confdefs.h" #include 
"conffix.h" #include int main() { int i=SIGFPE; if (i); ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { int i=SIGHUP; if (i); ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { int i=SIGILL; if (i); ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types 
-I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { int i=SIGINT; if (i); ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { int i=SIGKILL; if (i); ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: 
#include "confdefs.h" #include "conffix.h" #include int main() { int i=SIGPIPE; if (i); ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { int i=SIGQUIT; if (i); ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { int i=SIGSEGV; if (i); ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails 
-I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { int i=SIGSTOP; if (i); ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { int i=SIGSYS; if (i); ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c 
Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { int i=SIGTERM; if (i); ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { int i=SIGTRAP; if (i); ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { int i=SIGTSTP; if (i); ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers 
-I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { int i=SIGURG; if (i); ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { int i=SIGUSR1; if (i); ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 
/tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { int i=SIGUSR2; if (i); ; return 0; } ================================================================================ TEST configureMissingGetdomainnamePrototype from config.utilities.missing(/home/florian/software/petsc/config/BuildSystem/config/utilities/missing.py:110) TESTING: configureMissingGetdomainnamePrototype from config.utilities.missing(config/BuildSystem/config/utilities/missing.py:110) Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #if !defined(_BSD_SOURCE) #define _BSD_SOURCE #endif #if !defined(_DEFAULT_SOURCE) #define _DEFAULT_SOURCE #endif #if !defined(_GNU_SOURCE) #define _GNU_SOURCE #endif #ifdef PETSC_HAVE_UNISTD_H #include #endif #ifdef PETSC_HAVE_NETDB_H #include #endif int main() { int (*getdomainname_ptr)(char*,size_t) = getdomainname; char test[10]; if (getdomainname_ptr(test,10)) return 1; ; return 0; } Pushing language Cxx Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.missing -Wall -Wwrite-strings 
-Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC /tmp/petsc-KvGRNM/config.utilities.missing/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" #if !defined(_BSD_SOURCE) #define _BSD_SOURCE #endif #if !defined(_DEFAULT_SOURCE) #define _DEFAULT_SOURCE #endif #if !defined(_GNU_SOURCE) #define _GNU_SOURCE #endif #ifdef PETSC_HAVE_UNISTD_H #include #endif #ifdef PETSC_HAVE_NETDB_H #include #endif int main() { int (*getdomainname_ptr)(char*,size_t) = getdomainname; char test[10]; if (getdomainname_ptr(test,10)) return 1; ; return 0; } Pushing language CXX Popping language CXX Executing: mpicxx -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Popping language Cxx ================================================================================ TEST configureMissingSrandPrototype from config.utilities.missing(/home/florian/software/petsc/config/BuildSystem/config/utilities/missing.py:135) TESTING: configureMissingSrandPrototype from config.utilities.missing(config/BuildSystem/config/utilities/missing.py:135) Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure 
-I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.utilities.missing/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #if !defined(_BSD_SOURCE) #define _BSD_SOURCE #endif #if !defined(_DEFAULT_SOURCE) #define _DEFAULT_SOURCE #endif #if !defined(_GNU_SOURCE) #define _GNU_SOURCE #endif #ifdef PETSC_HAVE_STDLIB_H #include #endif int main() { double (*drand48_ptr)(void) = drand48; void (*srand48_ptr)(long int) = srand48; long int seed=10; srand48_ptr(seed); if (drand48_ptr() > 0.5) return 1; ; return 0; } Pushing language Cxx Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.missing -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC /tmp/petsc-KvGRNM/config.utilities.missing/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" #if !defined(_BSD_SOURCE) #define _BSD_SOURCE #endif #if !defined(_DEFAULT_SOURCE) #define _DEFAULT_SOURCE #endif #if !defined(_GNU_SOURCE) #define _GNU_SOURCE #endif #ifdef PETSC_HAVE_STDLIB_H #include #endif int main() { double (*drand48_ptr)(void) = drand48; void (*srand48_ptr)(long int) = srand48; long int seed=10; srand48_ptr(seed); if (drand48_ptr() > 0.5) return 1; ; return 0; } Pushing language CXX Popping language CXX Executing: mpicxx -o /tmp/petsc-KvGRNM/config.utilities.missing/conftest -Wall -Wwrite-strings -Wno-strict-aliasing 
-Wno-unknown-pragmas -fvisibility=hidden -g /tmp/petsc-KvGRNM/config.utilities.missing/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Popping language Cxx ================================================================================ TEST configureFPTrap from config.utilities.FPTrap(/home/florian/software/petsc/config/BuildSystem/config/utilities/FPTrap.py:27) TESTING: configureFPTrap from config.utilities.FPTrap(config/BuildSystem/config/utilities/FPTrap.py:27) Checking the handling of floating point traps Checking for header: sigfpe.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Possible ERROR while running preprocessor: exit code 256 stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 
"/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2stderr: /tmp/petsc-KvGRNM/config.headers/conftest.c:3:20: fatal error: sigfpe.h: No such file or directory #include <sigfpe.h> ^ compilation terminated. Source: #include "confdefs.h" #include "conffix.h" #include <sigfpe.h> Preprocess stderr before filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:20: fatal error: sigfpe.h: No such file or directory #include <sigfpe.h> ^ compilation terminated. : Preprocess stderr after filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:20: fatal error: sigfpe.h: No such file or directory #include <sigfpe.h> ^compilation terminated.: Checking for header: fpxcp.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Possible ERROR while running preprocessor: exit code 256 stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2stderr: /tmp/petsc-KvGRNM/config.headers/conftest.c:3:19: fatal error: fpxcp.h: No such file or directory #include <fpxcp.h> ^ compilation terminated.
Source: #include "confdefs.h" #include "conffix.h" #include <fpxcp.h> Preprocess stderr before filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:19: fatal error: fpxcp.h: No such file or directory #include <fpxcp.h> ^ compilation terminated. : Preprocess stderr after filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:19: fatal error: fpxcp.h: No such file or directory #include <fpxcp.h> ^compilation terminated.: Checking for header: floatingpoint.h Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Possible ERROR while running preprocessor: exit code 256 stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2stderr: /tmp/petsc-KvGRNM/config.headers/conftest.c:3:27: fatal error: floatingpoint.h: No such file or directory #include <floatingpoint.h> ^ compilation terminated. Source: #include "confdefs.h" #include "conffix.h" #include <floatingpoint.h> Preprocess stderr before filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:27: fatal error: floatingpoint.h: No such file or directory #include <floatingpoint.h> ^ compilation terminated.
: Preprocess stderr after filtering:/tmp/petsc-KvGRNM/config.headers/conftest.c:3:27: fatal error: floatingpoint.h: No such file or directory #include <floatingpoint.h> ^compilation terminated.: ================================================================================ TEST configureMkdir from config.programs(/home/florian/software/petsc/config/BuildSystem/config/programs.py:23) TESTING: configureMkdir from config.programs(config/BuildSystem/config/programs.py:23) Make sure we can have mkdir automatically make intermediate directories Checking for program /home/florian/software/bin/mkdir...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/mkdir...not found Checking for program /usr/local/sbin/mkdir...not found Checking for program /usr/local/bin/mkdir...not found Checking for program /usr/bin/mkdir...found Executing: /usr/bin/mkdir -p .conftest/tmp Adding -p flag to /usr/bin/mkdir -p to automatically create directories Defined make macro "MKDIR" to "/usr/bin/mkdir -p" ================================================================================ TEST configureAutoreconf from config.programs(/home/florian/software/petsc/config/BuildSystem/config/programs.py:45) TESTING: configureAutoreconf from config.programs(config/BuildSystem/config/programs.py:45) Check for autoreconf Checking for program /home/florian/software/bin/autoreconf...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/autoreconf...not found Checking for program /usr/local/sbin/autoreconf...not found Checking for program /usr/local/bin/autoreconf...not found Checking for program /usr/bin/autoreconf...found All intermediate test results are stored in /tmp/petsc-KvGRNM/config.programs Executing: cd /tmp/petsc-KvGRNM/config.programs/autoconfdir&&/usr/bin/autoreconf autoreconf test successful!
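[Editor's note: the repeated "Checking for program ...not found / ...found" lines above come from configure probing a list of directories for an executable, one candidate path at a time. A minimal sketch of that pattern follows; `find_program` is a hypothetical helper written for illustration, not BuildSystem's actual code, and it is demonstrated against throwaway directories so the result does not depend on what is installed.]

```python
import os
import stat
import tempfile

def find_program(name, dirs):
    """Return the first entry of dirs holding an executable `name`, else None,
    logging each candidate the way configure.log does."""
    for d in dirs:
        candidate = os.path.join(d, name)
        if os.path.isfile(candidate) and os.access(candidate, os.X_OK):
            print('Checking for program %s...found' % candidate)
            return candidate
        print('Checking for program %s...not found' % candidate)
    return None

# Two throwaway directories: the first is empty, the second holds a
# fake executable, so the search must skip one candidate then succeed.
empty = tempfile.mkdtemp()
full = tempfile.mkdtemp()
prog = os.path.join(full, 'mkdir')
with open(prog, 'w') as f:
    f.write('#!/bin/sh\n')
os.chmod(prog, stat.S_IRWXU)
print(find_program('mkdir', [empty, full]) == prog)
```

The real configure additionally records the winner as a make macro (e.g. `Defined make macro "MKDIR" to "/usr/bin/mkdir -p"` above) once the candidate passes its functional test.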
Checking for program /home/florian/software/bin/libtoolize...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/libtoolize...not found Checking for program /usr/local/sbin/libtoolize...not found Checking for program /usr/local/bin/libtoolize...not found Checking for program /usr/bin/libtoolize...found ================================================================================ TEST configurePrograms from config.programs(/home/florian/software/petsc/config/BuildSystem/config/programs.py:72) TESTING: configurePrograms from config.programs(config/BuildSystem/config/programs.py:72) Check for the programs needed to build and run PETSc Checking for program /home/florian/software/bin/sh...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/sh...not found Checking for program /usr/local/sbin/sh...not found Checking for program /usr/local/bin/sh...not found Checking for program /usr/bin/sh...found Defined make macro "SHELL" to "/usr/bin/sh" Checking for program /home/florian/software/bin/sed...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/sed...not found Checking for program /usr/local/sbin/sed...not found Checking for program /usr/local/bin/sed...not found Checking for program /usr/bin/sed...found Defined make macro "SED" to "/usr/bin/sed" Executing: /usr/bin/sed -i s/sed/sd/g "/tmp/petsc-KvGRNM/config.programs/sed1" Adding SEDINPLACE cmd: /usr/bin/sed -i Defined make macro "SEDINPLACE" to "/usr/bin/sed -i" Checking for program /home/florian/software/bin/mv...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/mv...not found Checking for program /usr/local/sbin/mv...not found Checking for program /usr/local/bin/mv...not found Checking for program /usr/bin/mv...found Defined make macro "MV" to "/usr/bin/mv" Checking for program /home/florian/software/bin/cp...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/cp...not found Checking for program /usr/local/sbin/cp...not found Checking for program 
/usr/local/bin/cp...not found Checking for program /usr/bin/cp...found Defined make macro "CP" to "/usr/bin/cp" Checking for program /home/florian/software/bin/grep...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/grep...not found Checking for program /usr/local/sbin/grep...not found Checking for program /usr/local/bin/grep...not found Checking for program /usr/bin/grep...found Defined make macro "GREP" to "/usr/bin/grep" Checking for program /home/florian/software/bin/rm...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/rm...not found Checking for program /usr/local/sbin/rm...not found Checking for program /usr/local/bin/rm...not found Checking for program /usr/bin/rm...found Defined make macro "RM" to "/usr/bin/rm -f" Checking for program /home/florian/software/bin/diff...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/diff...not found Checking for program /usr/local/sbin/diff...not found Checking for program /usr/local/bin/diff...not found Checking for program /usr/bin/diff...found Executing: "/usr/bin/diff" -w "/tmp/petsc-KvGRNM/config.programs/diff1" "/tmp/petsc-KvGRNM/config.programs/diff2" Defined make macro "DIFF" to "/usr/bin/diff -w" Checking for program /usr/ucb/ps...not found Checking for program /usr/usb/ps...not found Checking for program /home/florian/ps...not found Checking for program /home/florian/software/petsc/bin/win32fe/ps...not found Checking for program /home/florian/software/bin/gzip...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/gzip...not found Checking for program /usr/local/sbin/gzip...not found Checking for program /usr/local/bin/gzip...not found Checking for program /usr/bin/gzip...found Defined make macro "GZIP" to "/usr/bin/gzip" Defined "HAVE_GZIP" to "1" Defined make macro "PYTHON" to "/usr/bin/python2" ================================================================================ TEST configureMake from 
config.packages.make(/home/florian/software/petsc/config/BuildSystem/config/packages/make.py:83) TESTING: configureMake from config.packages.make(config/BuildSystem/config/packages/make.py:83) Check for user specified make - or gmake, make Checking for program /home/florian/software/bin/gmake...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/gmake...not found Checking for program /usr/local/sbin/gmake...not found Checking for program /usr/local/bin/gmake...not found Checking for program /usr/bin/gmake...not found Checking for program /usr/lib/jvm/default/bin/gmake...not found Checking for program /opt/paraview/bin/gmake...not found Checking for program /usr/bin/site_perl/gmake...not found Checking for program /usr/bin/vendor_perl/gmake...not found Checking for program /usr/bin/core_perl/gmake...not found Checking for program /home/florian/gmake...not found Checking for program /home/florian/software/petsc/bin/win32fe/gmake...not found Checking for program /home/florian/software/bin/make...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/make...not found Checking for program /usr/local/sbin/make...not found Checking for program /usr/local/bin/make...not found Checking for program /usr/bin/make...found Defined make macro "MAKE" to "/usr/bin/make" Executing: /usr/bin/make --version stdout: GNU Make 4.2.1 Built for x86_64-pc-linux-gnu Copyright (C) 1988-2016 Free Software Foundation, Inc. License GPLv3+: GNU GPL version 3 or later This is free software: you are free to change and redistribute it. There is NO WARRANTY, to the extent permitted by law. 
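[Editor's note: after locating /usr/bin/make, configure runs `make --version` and keys on the banner captured above to decide whether the found make is GNU make. A toy check over that captured banner; `is_gnu_make` is a hypothetical name for illustration, not the code in make.py, and no `make` binary is needed to run it.]

```python
# Version banner as captured in the log above.
banner = """GNU Make 4.2.1
Built for x86_64-pc-linux-gnu"""

def is_gnu_make(version_output):
    """Decide GNU-ness from the first line of `make --version` output."""
    first_line = version_output.splitlines()[0]
    return first_line.startswith('GNU Make')

print(is_gnu_make(banner))
```

A positive result is what lets configure define `MAKE_IS_GNUMAKE` and use GNU-only flags such as `--print-directory` and `--output-sync=recurse` in the macros that follow.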
================================================================================ TEST configureCheckGNUMake from config.packages.make(/home/florian/software/petsc/config/BuildSystem/config/packages/make.py:120) TESTING: configureCheckGNUMake from config.packages.make(config/BuildSystem/config/packages/make.py:120) Setup other GNU make stuff Executing: /usr/bin/make --version stdout: GNU Make 4.2.1 Built for x86_64-pc-linux-gnu Copyright (C) 1988-2016 Free Software Foundation, Inc. License GPLv3+: GNU GPL version 3 or later This is free software: you are free to change and redistribute it. There is NO WARRANTY, to the extent permitted by law. Executing: uname -s stdout: Linux Executing: uname -s stdout: Linux Defined make macro "MAKE_IS_GNUMAKE" to "1" Defined make rule "libc" with dependencies "${LIBNAME}(${OBJSC})" and code [] Defined make rule "libcxx" with dependencies "${LIBNAME}(${OBJSCXX})" and code [] Defined make rule "libcu" with dependencies "${LIBNAME}(${OBJSCU})" and code [] Defined make rule "libf" with dependencies "${OBJSF}" and code -${AR} ${AR_FLAGS} ${LIBNAME} ${OBJSF} ================================================================================ TEST configureMakeNP from config.packages.make(/home/florian/software/petsc/config/BuildSystem/config/packages/make.py:158) TESTING: configureMakeNP from config.packages.make(config/BuildSystem/config/packages/make.py:158) check no of cores on the build machine [perhaps to do make '-j ncores'] module multiprocessing found 4 cores: using make_np = 4 Defined make macro "MAKE_NP" to "4" Defined make macro "NPMAX" to "4" Defined make macro "OMAKE_PRINTDIR " to "/usr/bin/make --print-directory" Defined make macro "OMAKE" to "/usr/bin/make --no-print-directory" Defined make macro "MAKE_PAR_OUT_FLG" to "--output-sync=recurse" ================================================================================ TEST alternateConfigureLibrary from 
config.packages.OpenMPI(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.OpenMPI(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default Executing: uname -s stdout: Linux Executing: uname -s stdout: Linux ================================================================================ TEST alternateConfigureLibrary from config.packages.MPICH(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.MPICH(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default Pushing language C ================================================================================ TEST configureLibrary from config.packages.MPI(/home/florian/software/petsc/config/BuildSystem/config/packages/MPI.py:486) TESTING: configureLibrary from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:486) Calls the regular package configureLibrary and then does an additional test needed by MPI ================================================================================== Checking for a functional MPI Checking for library in Compiler specific search MPI: [] ================================================================================ TEST check from config.libraries(/home/florian/software/petsc/config/BuildSystem/config/libraries.py:146) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:146) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for functions [MPI_Init MPI_Comm_create] in library [] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure 
-I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Init(); static void _check_MPI_Init() { MPI_Init(); } char MPI_Comm_create(); static void _check_MPI_Comm_create() { MPI_Comm_create(); } int main() { _check_MPI_Init(); _check_MPI_Comm_create();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Popping language C Checking for headers Compiler specific search MPI: ['/usr/include', '/usr/lib/openmpi'] Pushing language C ================================================================================ TEST checkInclude from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:86) TESTING: checkInclude from 
config.headers(config/BuildSystem/config/headers.py:86) Checks if a particular include file can be found along particular include paths Checking for header files ['mpi.h'] in ['/usr/include', '/usr/lib/openmpi'] Checking include with compiler flags var CPPFLAGS ['/usr/include', '/usr/lib/openmpi'] Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.headers -I/usr/include -I/usr/lib/openmpi /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/mpi.h" 1 3 4 # 225 "/usr/include/mpi.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 149 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 # 149 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long int ptrdiff_t; # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 328 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef int wchar_t; # 426 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef struct { long long __max_align_ll __attribute__((__aligned__(__alignof__(long long)))); long double __max_align_ld __attribute__((__aligned__(__alignof__(long double)))); } max_align_t; # 226 "/usr/include/mpi.h" 2 3 4 # 258 "/usr/include/mpi.h" 3 4 # 1 "/usr/include/mpi_portable_platform.h" 1 3 4 # 259 "/usr/include/mpi.h" 2 3 4 # 323 "/usr/include/mpi.h" 3 4 typedef ptrdiff_t MPI_Aint; typedef long long MPI_Offset; typedef long long MPI_Count; typedef struct ompi_communicator_t *MPI_Comm; typedef struct ompi_datatype_t *MPI_Datatype; 
typedef struct ompi_errhandler_t *MPI_Errhandler; typedef struct ompi_file_t *MPI_File; typedef struct ompi_group_t *MPI_Group; typedef struct ompi_info_t *MPI_Info; typedef struct ompi_op_t *MPI_Op; typedef struct ompi_request_t *MPI_Request; typedef struct ompi_message_t *MPI_Message; typedef struct ompi_status_public_t MPI_Status; typedef struct ompi_win_t *MPI_Win; typedef struct mca_base_var_enum_t *MPI_T_enum; typedef struct ompi_mpit_cvar_handle_t *MPI_T_cvar_handle; typedef struct mca_base_pvar_handle_t *MPI_T_pvar_handle; typedef struct mca_base_pvar_session_t *MPI_T_pvar_session; struct ompi_status_public_t { int MPI_SOURCE; int MPI_TAG; int MPI_ERROR; int _cancelled; size_t _ucount; }; typedef struct ompi_status_public_t ompi_status_public_t; # 370 "/usr/include/mpi.h" 3 4 typedef int (MPI_Copy_function)(MPI_Comm, int, void *, void *, void *, int *); typedef int (MPI_Delete_function)(MPI_Comm, int, void *, void *); typedef int (MPI_Datarep_extent_function)(MPI_Datatype, MPI_Aint *, void *); typedef int (MPI_Datarep_conversion_function)(void *, MPI_Datatype, int, void *, MPI_Offset, void *); typedef void (MPI_Comm_errhandler_function)(MPI_Comm *, int *, ...); typedef MPI_Comm_errhandler_function MPI_Comm_errhandler_fn ; typedef void (ompi_file_errhandler_fn)(MPI_File *, int *, ...); typedef ompi_file_errhandler_fn MPI_File_errhandler_fn ; typedef ompi_file_errhandler_fn MPI_File_errhandler_function; typedef void (MPI_Win_errhandler_function)(MPI_Win *, int *, ...); typedef MPI_Win_errhandler_function MPI_Win_errhandler_fn ; typedef void (MPI_Handler_function)(MPI_Comm *, int *, ...); typedef void (MPI_User_function)(void *, void *, int *, MPI_Datatype *); typedef int (MPI_Comm_copy_attr_function)(MPI_Comm, int, void *, void *, void *, int *); typedef int (MPI_Comm_delete_attr_function)(MPI_Comm, int, void *, void *); typedef int (MPI_Type_copy_attr_function)(MPI_Datatype, int, void *, void *, void *, int *); typedef int 
(MPI_Type_delete_attr_function)(MPI_Datatype, int, void *, void *); typedef int (MPI_Win_copy_attr_function)(MPI_Win, int, void *, void *, void *, int *); typedef int (MPI_Win_delete_attr_function)(MPI_Win, int, void *, void *); typedef int (MPI_Grequest_query_function)(void *, MPI_Status *); typedef int (MPI_Grequest_free_function)(void *); typedef int (MPI_Grequest_cancel_function)(void *, int); # 506 "/usr/include/mpi.h" 3 4 enum { MPI_TAG_UB, MPI_HOST, MPI_IO, MPI_WTIME_IS_GLOBAL, MPI_APPNUM, MPI_LASTUSEDCODE, MPI_UNIVERSE_SIZE, MPI_WIN_BASE, MPI_WIN_SIZE, MPI_WIN_DISP_UNIT, MPI_WIN_CREATE_FLAVOR, MPI_WIN_MODEL, IMPI_CLIENT_SIZE, IMPI_CLIENT_COLOR, IMPI_HOST_SIZE, IMPI_HOST_COLOR }; # 623 "/usr/include/mpi.h" 3 4 enum { MPI_IDENT, MPI_CONGRUENT, MPI_SIMILAR, MPI_UNEQUAL }; enum { MPI_THREAD_SINGLE, MPI_THREAD_FUNNELED, MPI_THREAD_SERIALIZED, MPI_THREAD_MULTIPLE }; enum { MPI_COMBINER_NAMED, MPI_COMBINER_DUP, MPI_COMBINER_CONTIGUOUS, MPI_COMBINER_VECTOR, MPI_COMBINER_HVECTOR_INTEGER, MPI_COMBINER_HVECTOR, MPI_COMBINER_INDEXED, MPI_COMBINER_HINDEXED_INTEGER, MPI_COMBINER_HINDEXED, MPI_COMBINER_INDEXED_BLOCK, MPI_COMBINER_STRUCT_INTEGER, MPI_COMBINER_STRUCT, MPI_COMBINER_SUBARRAY, MPI_COMBINER_DARRAY, MPI_COMBINER_F90_REAL, MPI_COMBINER_F90_COMPLEX, MPI_COMBINER_F90_INTEGER, MPI_COMBINER_RESIZED, MPI_COMBINER_HINDEXED_BLOCK }; enum { MPI_COMM_TYPE_SHARED }; enum { MPI_T_VERBOSITY_USER_BASIC, MPI_T_VERBOSITY_USER_DETAIL, MPI_T_VERBOSITY_USER_ALL, MPI_T_VERBOSITY_TUNER_BASIC, MPI_T_VERBOSITY_TUNER_DETAIL, MPI_T_VERBOSITY_TUNER_ALL, MPI_T_VERBOSITY_MPIDEV_BASIC, MPI_T_VERBOSITY_MPIDEV_DETAIL, MPI_T_VERBOSITY_MPIDEV_ALL }; enum { MPI_T_SCOPE_CONSTANT, MPI_T_SCOPE_READONLY, MPI_T_SCOPE_LOCAL, MPI_T_SCOPE_GROUP, MPI_T_SCOPE_GROUP_EQ, MPI_T_SCOPE_ALL, MPI_T_SCOPE_ALL_EQ }; enum { MPI_T_BIND_NO_OBJECT, MPI_T_BIND_MPI_COMM, MPI_T_BIND_MPI_DATATYPE, MPI_T_BIND_MPI_ERRHANDLER, MPI_T_BIND_MPI_FILE, MPI_T_BIND_MPI_GROUP, MPI_T_BIND_MPI_OP, MPI_T_BIND_MPI_REQUEST, 
MPI_T_BIND_MPI_WIN, MPI_T_BIND_MPI_MESSAGE, MPI_T_BIND_MPI_INFO }; enum { MPI_T_PVAR_CLASS_STATE, MPI_T_PVAR_CLASS_LEVEL, MPI_T_PVAR_CLASS_SIZE, MPI_T_PVAR_CLASS_PERCENTAGE, MPI_T_PVAR_CLASS_HIGHWATERMARK, MPI_T_PVAR_CLASS_LOWWATERMARK, MPI_T_PVAR_CLASS_COUNTER, MPI_T_PVAR_CLASS_AGGREGATE, MPI_T_PVAR_CLASS_TIMER, MPI_T_PVAR_CLASS_GENERIC }; # 812 "/usr/include/mpi.h" 3 4 __attribute__((visibility("default"))) int OMPI_C_MPI_TYPE_NULL_DELETE_FN( MPI_Datatype datatype, int type_keyval, void* attribute_val_out, void* extra_state ); __attribute__((visibility("default"))) int OMPI_C_MPI_TYPE_NULL_COPY_FN( MPI_Datatype datatype, int type_keyval, void* extra_state, void* attribute_val_in, void* attribute_val_out, int* flag ); __attribute__((visibility("default"))) int OMPI_C_MPI_TYPE_DUP_FN( MPI_Datatype datatype, int type_keyval, void* extra_state, void* attribute_val_in, void* attribute_val_out, int* flag ); __attribute__((visibility("default"))) int OMPI_C_MPI_COMM_NULL_DELETE_FN( MPI_Comm comm, int comm_keyval, void* attribute_val_out, void* extra_state ); __attribute__((visibility("default"))) int OMPI_C_MPI_COMM_NULL_COPY_FN( MPI_Comm comm, int comm_keyval, void* extra_state, void* attribute_val_in, void* attribute_val_out, int* flag ); __attribute__((visibility("default"))) int OMPI_C_MPI_COMM_DUP_FN( MPI_Comm comm, int comm_keyval, void* extra_state, void* attribute_val_in, void* attribute_val_out, int* flag ); __attribute__((visibility("default"))) int OMPI_C_MPI_NULL_DELETE_FN( MPI_Comm comm, int comm_keyval, void* attribute_val_out, void* extra_state ) ; __attribute__((visibility("default"))) int OMPI_C_MPI_NULL_COPY_FN( MPI_Comm comm, int comm_keyval, void* extra_state, void* attribute_val_in, void* attribute_val_out, int* flag ) ; __attribute__((visibility("default"))) int OMPI_C_MPI_DUP_FN( MPI_Comm comm, int comm_keyval, void* extra_state, void* attribute_val_in, void* attribute_val_out, int* flag ) ; __attribute__((visibility("default"))) int 
OMPI_C_MPI_WIN_NULL_DELETE_FN( MPI_Win window, int win_keyval, void* attribute_val_out, void* extra_state ); __attribute__((visibility("default"))) int OMPI_C_MPI_WIN_NULL_COPY_FN( MPI_Win window, int win_keyval, void* extra_state, void* attribute_val_in, void* attribute_val_out, int* flag ); __attribute__((visibility("default"))) int OMPI_C_MPI_WIN_DUP_FN( MPI_Win window, int win_keyval, void* extra_state, void* attribute_val_in, void* attribute_val_out, int* flag ); # 882 "/usr/include/mpi.h" 3 4 __attribute__((visibility("default"))) extern struct ompi_predefined_communicator_t ompi_mpi_comm_world; __attribute__((visibility("default"))) extern struct ompi_predefined_communicator_t ompi_mpi_comm_self; __attribute__((visibility("default"))) extern struct ompi_predefined_communicator_t ompi_mpi_comm_null; __attribute__((visibility("default"))) extern struct ompi_predefined_group_t ompi_mpi_group_empty; __attribute__((visibility("default"))) extern struct ompi_predefined_group_t ompi_mpi_group_null; __attribute__((visibility("default"))) extern struct ompi_predefined_request_t ompi_request_null; __attribute__((visibility("default"))) extern struct ompi_predefined_message_t ompi_message_null; __attribute__((visibility("default"))) extern struct ompi_predefined_message_t ompi_message_no_proc; __attribute__((visibility("default"))) extern struct ompi_predefined_op_t ompi_mpi_op_null; __attribute__((visibility("default"))) extern struct ompi_predefined_op_t ompi_mpi_op_min; __attribute__((visibility("default"))) extern struct ompi_predefined_op_t ompi_mpi_op_max; __attribute__((visibility("default"))) extern struct ompi_predefined_op_t ompi_mpi_op_sum; __attribute__((visibility("default"))) extern struct ompi_predefined_op_t ompi_mpi_op_prod; __attribute__((visibility("default"))) extern struct ompi_predefined_op_t ompi_mpi_op_land; __attribute__((visibility("default"))) extern struct ompi_predefined_op_t ompi_mpi_op_band; __attribute__((visibility("default"))) extern 
[Attachment residue: a preprocessor-expanded fragment of Open MPI's /usr/include/mpi.h
(line marker "# 1180 \"/usr/include/mpi.h\" 3 4"), comprising the extern declarations of
the predefined ompi_mpi_op_*, ompi_mpi_* datatype, errhandler, win, file, and info
symbols, followed by the MPI_* function prototypes (MPI_Abort through MPI_Recv_init),
each marked __attribute__((visibility("default"))). The fragment begins and ends
mid-declaration; the full prototypes are in the installed Open MPI mpi.h.]
MPI_Recv(void *buf, int count, MPI_Datatype datatype, int source, int tag, MPI_Comm comm, MPI_Status *status); __attribute__((visibility("default"))) int MPI_Reduce(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, int root, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Ireduce(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, int root, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Reduce_local(const void *inbuf, void *inoutbuf, int count, MPI_Datatype datatype, MPI_Op op); __attribute__((visibility("default"))) int MPI_Reduce_scatter(const void *sendbuf, void *recvbuf, const int recvcounts[], MPI_Datatype datatype, MPI_Op op, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Ireduce_scatter(const void *sendbuf, void *recvbuf, const int recvcounts[], MPI_Datatype datatype, MPI_Op op, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Reduce_scatter_block(const void *sendbuf, void *recvbuf, int recvcount, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Ireduce_scatter_block(const void *sendbuf, void *recvbuf, int recvcount, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Register_datarep(const char *datarep, MPI_Datarep_conversion_function *read_conversion_fn, MPI_Datarep_conversion_function *write_conversion_fn, MPI_Datarep_extent_function *dtype_file_extent_fn, void *extra_state); __attribute__((visibility("default"))) int MPI_Request_c2f(MPI_Request request); __attribute__((visibility("default"))) MPI_Request MPI_Request_f2c(int request); __attribute__((visibility("default"))) int MPI_Request_free(MPI_Request *request); __attribute__((visibility("default"))) int MPI_Request_get_status(MPI_Request request, int *flag, MPI_Status *status); __attribute__((visibility("default"))) int 
MPI_Rget(void *origin_addr, int origin_count, MPI_Datatype origin_datatype, int target_rank, MPI_Aint target_disp, int target_count, MPI_Datatype target_datatype, MPI_Win win, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Rget_accumulate(const void *origin_addr, int origin_count, MPI_Datatype origin_datatype, void *result_addr, int result_count, MPI_Datatype result_datatype, int target_rank, MPI_Aint target_disp, int target_count, MPI_Datatype target_datatype, MPI_Op op, MPI_Win win, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Rput(const void *origin_addr, int origin_count, MPI_Datatype origin_datatype, int target_rank, MPI_Aint target_disp, int target_count, MPI_Datatype target_datatype, MPI_Win win, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Rsend(const void *ibuf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Rsend_init(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Scan(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Iscan(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Scatter(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, int root, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Iscatter(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, int root, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Scatterv(const void *sendbuf, const int sendcounts[], const int displs[], MPI_Datatype sendtype, void *recvbuf, int recvcount, 
MPI_Datatype recvtype, int root, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Iscatterv(const void *sendbuf, const int sendcounts[], const int displs[], MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, int root, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Send_init(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Send(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Sendrecv(const void *sendbuf, int sendcount, MPI_Datatype sendtype, int dest, int sendtag, void *recvbuf, int recvcount, MPI_Datatype recvtype, int source, int recvtag, MPI_Comm comm, MPI_Status *status); __attribute__((visibility("default"))) int MPI_Sendrecv_replace(void * buf, int count, MPI_Datatype datatype, int dest, int sendtag, int source, int recvtag, MPI_Comm comm, MPI_Status *status); __attribute__((visibility("default"))) int MPI_Ssend_init(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Ssend(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Start(MPI_Request *request); __attribute__((visibility("default"))) int MPI_Startall(int count, MPI_Request array_of_requests[]); __attribute__((visibility("default"))) int MPI_Status_c2f(const MPI_Status *c_status, int *f_status); __attribute__((visibility("default"))) int MPI_Status_f2c(const int *f_status, MPI_Status *c_status); __attribute__((visibility("default"))) int MPI_Status_set_cancelled(MPI_Status *status, int flag); __attribute__((visibility("default"))) int MPI_Status_set_elements(MPI_Status *status, MPI_Datatype datatype, int count); __attribute__((visibility("default"))) int 
MPI_Status_set_elements_x(MPI_Status *status, MPI_Datatype datatype, MPI_Count count); __attribute__((visibility("default"))) int MPI_Testall(int count, MPI_Request array_of_requests[], int *flag, MPI_Status array_of_statuses[]); __attribute__((visibility("default"))) int MPI_Testany(int count, MPI_Request array_of_requests[], int *index, int *flag, MPI_Status *status); __attribute__((visibility("default"))) int MPI_Test(MPI_Request *request, int *flag, MPI_Status *status); __attribute__((visibility("default"))) int MPI_Test_cancelled(const MPI_Status *status, int *flag); __attribute__((visibility("default"))) int MPI_Testsome(int incount, MPI_Request array_of_requests[], int *outcount, int array_of_indices[], MPI_Status array_of_statuses[]); __attribute__((visibility("default"))) int MPI_Topo_test(MPI_Comm comm, int *status); __attribute__((visibility("default"))) int MPI_Type_c2f(MPI_Datatype datatype); __attribute__((visibility("default"))) int MPI_Type_commit(MPI_Datatype *type); __attribute__((visibility("default"))) int MPI_Type_contiguous(int count, MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_create_darray(int size, int rank, int ndims, const int gsize_array[], const int distrib_array[], const int darg_array[], const int psize_array[], int order, MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_create_f90_complex(int p, int r, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_create_f90_integer(int r, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_create_f90_real(int p, int r, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_create_hindexed_block(int count, int blocklength, const MPI_Aint array_of_displacements[], MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_create_hindexed(int count, const int array_of_blocklengths[], const 
MPI_Aint array_of_displacements[], MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_create_hvector(int count, int blocklength, MPI_Aint stride, MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_create_keyval(MPI_Type_copy_attr_function *type_copy_attr_fn, MPI_Type_delete_attr_function *type_delete_attr_fn, int *type_keyval, void *extra_state); __attribute__((visibility("default"))) int MPI_Type_create_indexed_block(int count, int blocklength, const int array_of_displacements[], MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_create_struct(int count, const int array_of_block_lengths[], const MPI_Aint array_of_displacements[], const MPI_Datatype array_of_types[], MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_create_subarray(int ndims, const int size_array[], const int subsize_array[], const int start_array[], int order, MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_create_resized(MPI_Datatype oldtype, MPI_Aint lb, MPI_Aint extent, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_delete_attr(MPI_Datatype type, int type_keyval); __attribute__((visibility("default"))) int MPI_Type_dup(MPI_Datatype type, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_extent(MPI_Datatype type, MPI_Aint *extent) ; __attribute__((visibility("default"))) int MPI_Type_free(MPI_Datatype *type); __attribute__((visibility("default"))) int MPI_Type_free_keyval(int *type_keyval); __attribute__((visibility("default"))) MPI_Datatype MPI_Type_f2c(int datatype); __attribute__((visibility("default"))) int MPI_Type_get_attr(MPI_Datatype type, int type_keyval, void *attribute_val, int *flag); __attribute__((visibility("default"))) int MPI_Type_get_contents(MPI_Datatype mtype, int max_integers, int max_addresses, int 
max_datatypes, int array_of_integers[], MPI_Aint array_of_addresses[], MPI_Datatype array_of_datatypes[]); __attribute__((visibility("default"))) int MPI_Type_get_envelope(MPI_Datatype type, int *num_integers, int *num_addresses, int *num_datatypes, int *combiner); __attribute__((visibility("default"))) int MPI_Type_get_extent(MPI_Datatype type, MPI_Aint *lb, MPI_Aint *extent); __attribute__((visibility("default"))) int MPI_Type_get_extent_x(MPI_Datatype type, MPI_Count *lb, MPI_Count *extent); __attribute__((visibility("default"))) int MPI_Type_get_name(MPI_Datatype type, char *type_name, int *resultlen); __attribute__((visibility("default"))) int MPI_Type_get_true_extent(MPI_Datatype datatype, MPI_Aint *true_lb, MPI_Aint *true_extent); __attribute__((visibility("default"))) int MPI_Type_get_true_extent_x(MPI_Datatype datatype, MPI_Count *true_lb, MPI_Count *true_extent); __attribute__((visibility("default"))) int MPI_Type_hindexed(int count, int array_of_blocklengths[], MPI_Aint array_of_displacements[], MPI_Datatype oldtype, MPI_Datatype *newtype) ; __attribute__((visibility("default"))) int MPI_Type_hvector(int count, int blocklength, MPI_Aint stride, MPI_Datatype oldtype, MPI_Datatype *newtype) ; __attribute__((visibility("default"))) int MPI_Type_indexed(int count, const int array_of_blocklengths[], const int array_of_displacements[], MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_lb(MPI_Datatype type, MPI_Aint *lb) ; __attribute__((visibility("default"))) int MPI_Type_match_size(int typeclass, int size, MPI_Datatype *type); __attribute__((visibility("default"))) int MPI_Type_set_attr(MPI_Datatype type, int type_keyval, void *attr_val); __attribute__((visibility("default"))) int MPI_Type_set_name(MPI_Datatype type, const char *type_name); __attribute__((visibility("default"))) int MPI_Type_size(MPI_Datatype type, int *size); __attribute__((visibility("default"))) int MPI_Type_size_x(MPI_Datatype type, 
MPI_Count *size); __attribute__((visibility("default"))) int MPI_Type_struct(int count, int array_of_blocklengths[], MPI_Aint array_of_displacements[], MPI_Datatype array_of_types[], MPI_Datatype *newtype) ; __attribute__((visibility("default"))) int MPI_Type_ub(MPI_Datatype mtype, MPI_Aint *ub) ; __attribute__((visibility("default"))) int MPI_Type_vector(int count, int blocklength, int stride, MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Unpack(const void *inbuf, int insize, int *position, void *outbuf, int outcount, MPI_Datatype datatype, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Unpublish_name(const char *service_name, MPI_Info info, const char *port_name); __attribute__((visibility("default"))) int MPI_Unpack_external (const char datarep[], const void *inbuf, MPI_Aint insize, MPI_Aint *position, void *outbuf, int outcount, MPI_Datatype datatype); __attribute__((visibility("default"))) int MPI_Waitall(int count, MPI_Request array_of_requests[], MPI_Status *array_of_statuses); __attribute__((visibility("default"))) int MPI_Waitany(int count, MPI_Request array_of_requests[], int *index, MPI_Status *status); __attribute__((visibility("default"))) int MPI_Wait(MPI_Request *request, MPI_Status *status); __attribute__((visibility("default"))) int MPI_Waitsome(int incount, MPI_Request array_of_requests[], int *outcount, int array_of_indices[], MPI_Status array_of_statuses[]); __attribute__((visibility("default"))) int MPI_Win_allocate(MPI_Aint size, int disp_unit, MPI_Info info, MPI_Comm comm, void *baseptr, MPI_Win *win); __attribute__((visibility("default"))) int MPI_Win_allocate_shared(MPI_Aint size, int disp_unit, MPI_Info info, MPI_Comm comm, void *baseptr, MPI_Win *win); __attribute__((visibility("default"))) int MPI_Win_attach(MPI_Win win, void *base, MPI_Aint size); __attribute__((visibility("default"))) int MPI_Win_c2f(MPI_Win win); __attribute__((visibility("default"))) int 
MPI_Win_call_errhandler(MPI_Win win, int errorcode); __attribute__((visibility("default"))) int MPI_Win_complete(MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_create(void *base, MPI_Aint size, int disp_unit, MPI_Info info, MPI_Comm comm, MPI_Win *win); __attribute__((visibility("default"))) int MPI_Win_create_dynamic(MPI_Info info, MPI_Comm comm, MPI_Win *win); __attribute__((visibility("default"))) int MPI_Win_create_errhandler(MPI_Win_errhandler_function *function, MPI_Errhandler *errhandler); __attribute__((visibility("default"))) int MPI_Win_create_keyval(MPI_Win_copy_attr_function *win_copy_attr_fn, MPI_Win_delete_attr_function *win_delete_attr_fn, int *win_keyval, void *extra_state); __attribute__((visibility("default"))) int MPI_Win_delete_attr(MPI_Win win, int win_keyval); __attribute__((visibility("default"))) int MPI_Win_detach(MPI_Win win, const void *base); __attribute__((visibility("default"))) MPI_Win MPI_Win_f2c(int win); __attribute__((visibility("default"))) int MPI_Win_fence(int assert, MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_flush(int rank, MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_flush_all(MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_flush_local(int rank, MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_flush_local_all(MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_free(MPI_Win *win); __attribute__((visibility("default"))) int MPI_Win_free_keyval(int *win_keyval); __attribute__((visibility("default"))) int MPI_Win_get_attr(MPI_Win win, int win_keyval, void *attribute_val, int *flag); __attribute__((visibility("default"))) int MPI_Win_get_errhandler(MPI_Win win, MPI_Errhandler *errhandler); __attribute__((visibility("default"))) int MPI_Win_get_group(MPI_Win win, MPI_Group *group); __attribute__((visibility("default"))) int MPI_Win_get_info(MPI_Win win, MPI_Info *info_used); __attribute__((visibility("default"))) int 
MPI_Win_get_name(MPI_Win win, char *win_name, int *resultlen); __attribute__((visibility("default"))) int MPI_Win_lock(int lock_type, int rank, int assert, MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_lock_all(int assert, MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_post(MPI_Group group, int assert, MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_set_attr(MPI_Win win, int win_keyval, void *attribute_val); __attribute__((visibility("default"))) int MPI_Win_set_errhandler(MPI_Win win, MPI_Errhandler errhandler); __attribute__((visibility("default"))) int MPI_Win_set_info(MPI_Win win, MPI_Info info); __attribute__((visibility("default"))) int MPI_Win_set_name(MPI_Win win, const char *win_name); __attribute__((visibility("default"))) int MPI_Win_shared_query(MPI_Win win, int rank, MPI_Aint *size, int *disp_unit, void *baseptr); __attribute__((visibility("default"))) int MPI_Win_start(MPI_Group group, int assert, MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_sync(MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_test(MPI_Win win, int *flag); __attribute__((visibility("default"))) int MPI_Win_unlock(int rank, MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_unlock_all(MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_wait(MPI_Win win); __attribute__((visibility("default"))) double MPI_Wtick(void); __attribute__((visibility("default"))) double MPI_Wtime(void); __attribute__((visibility("default"))) int PMPI_Abort(MPI_Comm comm, int errorcode); __attribute__((visibility("default"))) int PMPI_Accumulate(const void *origin_addr, int origin_count, MPI_Datatype origin_datatype, int target_rank, MPI_Aint target_disp, int target_count, MPI_Datatype target_datatype, MPI_Op op, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Add_error_class(int *errorclass); __attribute__((visibility("default"))) int PMPI_Add_error_code(int errorclass, int 
*errorcode); __attribute__((visibility("default"))) int PMPI_Add_error_string(int errorcode, const char *string); __attribute__((visibility("default"))) int PMPI_Address(void *location, MPI_Aint *address) ; __attribute__((visibility("default"))) int PMPI_Allgather(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Iallgather(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Allgatherv(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int displs[], MPI_Datatype recvtype, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Iallgatherv(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int displs[], MPI_Datatype recvtype, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Alloc_mem(MPI_Aint size, MPI_Info info, void *baseptr); __attribute__((visibility("default"))) int PMPI_Allreduce(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Iallreduce(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Alltoall(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ialltoall(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Alltoallv(const void *sendbuf, const int sendcounts[], const int 
sdispls[], MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int rdispls[], MPI_Datatype recvtype, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ialltoallv(const void *sendbuf, const int sendcounts[], const int sdispls[], MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int rdispls[], MPI_Datatype recvtype, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Alltoallw(const void *sendbuf, const int sendcounts[], const int sdispls[], const MPI_Datatype sendtypes[], void *recvbuf, const int recvcounts[], const int rdispls[], const MPI_Datatype recvtypes[], MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ialltoallw(const void *sendbuf, const int sendcounts[], const int sdispls[], const MPI_Datatype sendtypes[], void *recvbuf, const int recvcounts[], const int rdispls[], const MPI_Datatype recvtypes[], MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Attr_delete(MPI_Comm comm, int keyval) ; __attribute__((visibility("default"))) int PMPI_Attr_get(MPI_Comm comm, int keyval, void *attribute_val, int *flag) ; __attribute__((visibility("default"))) int PMPI_Dist_graph_create(MPI_Comm comm_old, int n, const int nodes[], const int degrees[], const int targets[], const int weights[], MPI_Info info, int reorder, MPI_Comm * newcomm); __attribute__((visibility("default"))) int PMPI_Dist_graph_create_adjacent(MPI_Comm comm_old, int indegree, const int sources[], const int sourceweights[], int outdegree, const int destinations[], const int destweights[], MPI_Info info, int reorder, MPI_Comm *comm_dist_graph); __attribute__((visibility("default"))) int PMPI_Dist_graph_neighbors(MPI_Comm comm, int maxindegree, int sources[], int sourceweights[], int maxoutdegree, int destinations[], int destweights[]); __attribute__((visibility("default"))) int PMPI_Dist_graph_neighbors_count(MPI_Comm comm, int *inneighbors, int *outneighbors, int *weighted); 
__attribute__((visibility("default"))) int PMPI_Attr_put(MPI_Comm comm, int keyval, void *attribute_val) ; __attribute__((visibility("default"))) int PMPI_Barrier(MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ibarrier(MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Bcast(void *buffer, int count, MPI_Datatype datatype, int root, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ibcast(void *buffer, int count, MPI_Datatype datatype, int root, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Bsend(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Bsend_init(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Buffer_attach(void *buffer, int size); __attribute__((visibility("default"))) int PMPI_Buffer_detach(void *buffer, int *size); __attribute__((visibility("default"))) int PMPI_Cancel(MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Cart_coords(MPI_Comm comm, int rank, int maxdims, int coords[]); __attribute__((visibility("default"))) int PMPI_Cart_create(MPI_Comm old_comm, int ndims, const int dims[], const int periods[], int reorder, MPI_Comm *comm_cart); __attribute__((visibility("default"))) int PMPI_Cart_get(MPI_Comm comm, int maxdims, int dims[], int periods[], int coords[]); __attribute__((visibility("default"))) int PMPI_Cart_map(MPI_Comm comm, int ndims, const int dims[], const int periods[], int *newrank); __attribute__((visibility("default"))) int PMPI_Cart_rank(MPI_Comm comm, const int coords[], int *rank); __attribute__((visibility("default"))) int PMPI_Cart_shift(MPI_Comm comm, int direction, int disp, int *rank_source, int *rank_dest); __attribute__((visibility("default"))) int PMPI_Cart_sub(MPI_Comm comm, const int remain_dims[], MPI_Comm 
*new_comm); __attribute__((visibility("default"))) int PMPI_Cartdim_get(MPI_Comm comm, int *ndims); __attribute__((visibility("default"))) int PMPI_Close_port(const char *port_name); __attribute__((visibility("default"))) int PMPI_Comm_accept(const char *port_name, MPI_Info info, int root, MPI_Comm comm, MPI_Comm *newcomm); __attribute__((visibility("default"))) int PMPI_Comm_c2f(MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Comm_call_errhandler(MPI_Comm comm, int errorcode); __attribute__((visibility("default"))) int PMPI_Comm_compare(MPI_Comm comm1, MPI_Comm comm2, int *result); __attribute__((visibility("default"))) int PMPI_Comm_connect(const char *port_name, MPI_Info info, int root, MPI_Comm comm, MPI_Comm *newcomm); __attribute__((visibility("default"))) int PMPI_Comm_create_errhandler(MPI_Comm_errhandler_function *function, MPI_Errhandler *errhandler); __attribute__((visibility("default"))) int PMPI_Comm_create_keyval(MPI_Comm_copy_attr_function *comm_copy_attr_fn, MPI_Comm_delete_attr_function *comm_delete_attr_fn, int *comm_keyval, void *extra_state); __attribute__((visibility("default"))) int PMPI_Comm_create_group(MPI_Comm comm, MPI_Group group, int tag, MPI_Comm *newcomm); __attribute__((visibility("default"))) int PMPI_Comm_create(MPI_Comm comm, MPI_Group group, MPI_Comm *newcomm); __attribute__((visibility("default"))) int PMPI_Comm_delete_attr(MPI_Comm comm, int comm_keyval); __attribute__((visibility("default"))) int PMPI_Comm_disconnect(MPI_Comm *comm); __attribute__((visibility("default"))) int PMPI_Comm_dup(MPI_Comm comm, MPI_Comm *newcomm); __attribute__((visibility("default"))) int PMPI_Comm_idup(MPI_Comm comm, MPI_Comm *newcomm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Comm_dup_with_info(MPI_Comm comm, MPI_Info info, MPI_Comm *newcomm); __attribute__((visibility("default"))) MPI_Comm PMPI_Comm_f2c(int comm); __attribute__((visibility("default"))) int PMPI_Comm_free_keyval(int *comm_keyval); 
__attribute__((visibility("default"))) int PMPI_Comm_free(MPI_Comm *comm); __attribute__((visibility("default"))) int PMPI_Comm_get_attr(MPI_Comm comm, int comm_keyval, void *attribute_val, int *flag); __attribute__((visibility("default"))) int PMPI_Comm_get_errhandler(MPI_Comm comm, MPI_Errhandler *erhandler); __attribute__((visibility("default"))) int PMPI_Comm_get_info(MPI_Comm comm, MPI_Info *info_used); __attribute__((visibility("default"))) int PMPI_Comm_get_name(MPI_Comm comm, char *comm_name, int *resultlen); __attribute__((visibility("default"))) int PMPI_Comm_get_parent(MPI_Comm *parent); __attribute__((visibility("default"))) int PMPI_Comm_group(MPI_Comm comm, MPI_Group *group); __attribute__((visibility("default"))) int PMPI_Comm_join(int fd, MPI_Comm *intercomm); __attribute__((visibility("default"))) int PMPI_Comm_rank(MPI_Comm comm, int *rank); __attribute__((visibility("default"))) int PMPI_Comm_remote_group(MPI_Comm comm, MPI_Group *group); __attribute__((visibility("default"))) int PMPI_Comm_remote_size(MPI_Comm comm, int *size); __attribute__((visibility("default"))) int PMPI_Comm_set_attr(MPI_Comm comm, int comm_keyval, void *attribute_val); __attribute__((visibility("default"))) int PMPI_Comm_set_errhandler(MPI_Comm comm, MPI_Errhandler errhandler); __attribute__((visibility("default"))) int PMPI_Comm_set_info(MPI_Comm comm, MPI_Info info); __attribute__((visibility("default"))) int PMPI_Comm_set_name(MPI_Comm comm, const char *comm_name); __attribute__((visibility("default"))) int PMPI_Comm_size(MPI_Comm comm, int *size); __attribute__((visibility("default"))) int PMPI_Comm_spawn(const char *command, char *argv[], int maxprocs, MPI_Info info, int root, MPI_Comm comm, MPI_Comm *intercomm, int array_of_errcodes[]); __attribute__((visibility("default"))) int PMPI_Comm_spawn_multiple(int count, char *array_of_commands[], char **array_of_argv[], const int array_of_maxprocs[], const MPI_Info array_of_info[], int root, MPI_Comm comm, MPI_Comm 
*intercomm, int array_of_errcodes[]); __attribute__((visibility("default"))) int PMPI_Comm_split(MPI_Comm comm, int color, int key, MPI_Comm *newcomm); __attribute__((visibility("default"))) int PMPI_Comm_split_type(MPI_Comm comm, int split_type, int key, MPI_Info info, MPI_Comm *newcomm); __attribute__((visibility("default"))) int PMPI_Comm_test_inter(MPI_Comm comm, int *flag); __attribute__((visibility("default"))) int PMPI_Compare_and_swap(const void *origin_addr, const void *compare_addr, void *result_addr, MPI_Datatype datatype, int target_rank, MPI_Aint target_disp, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Dims_create(int nnodes, int ndims, int dims[]); __attribute__((visibility("default"))) int PMPI_Errhandler_c2f(MPI_Errhandler errhandler); __attribute__((visibility("default"))) int PMPI_Errhandler_create(MPI_Handler_function *function, MPI_Errhandler *errhandler) ; __attribute__((visibility("default"))) MPI_Errhandler PMPI_Errhandler_f2c(int errhandler); __attribute__((visibility("default"))) int PMPI_Errhandler_free(MPI_Errhandler *errhandler); __attribute__((visibility("default"))) int PMPI_Errhandler_get(MPI_Comm comm, MPI_Errhandler *errhandler) ; __attribute__((visibility("default"))) int PMPI_Errhandler_set(MPI_Comm comm, MPI_Errhandler errhandler) ; __attribute__((visibility("default"))) int PMPI_Error_class(int errorcode, int *errorclass); __attribute__((visibility("default"))) int PMPI_Error_string(int errorcode, char *string, int *resultlen); __attribute__((visibility("default"))) int PMPI_Exscan(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Fetch_and_op(const void *origin_addr, void *result_addr, MPI_Datatype datatype, int target_rank, MPI_Aint target_disp, MPI_Op op, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Iexscan(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, MPI_Comm 
comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_File_c2f(MPI_File file); __attribute__((visibility("default"))) MPI_File PMPI_File_f2c(int file); __attribute__((visibility("default"))) int PMPI_File_call_errhandler(MPI_File fh, int errorcode); __attribute__((visibility("default"))) int PMPI_File_create_errhandler(MPI_File_errhandler_function *function, MPI_Errhandler *errhandler); __attribute__((visibility("default"))) int PMPI_File_set_errhandler( MPI_File file, MPI_Errhandler errhandler); __attribute__((visibility("default"))) int PMPI_File_get_errhandler( MPI_File file, MPI_Errhandler *errhandler); __attribute__((visibility("default"))) int PMPI_File_open(MPI_Comm comm, const char *filename, int amode, MPI_Info info, MPI_File *fh); __attribute__((visibility("default"))) int PMPI_File_close(MPI_File *fh); __attribute__((visibility("default"))) int PMPI_File_delete(const char *filename, MPI_Info info); __attribute__((visibility("default"))) int PMPI_File_set_size(MPI_File fh, MPI_Offset size); __attribute__((visibility("default"))) int PMPI_File_preallocate(MPI_File fh, MPI_Offset size); __attribute__((visibility("default"))) int PMPI_File_get_size(MPI_File fh, MPI_Offset *size); __attribute__((visibility("default"))) int PMPI_File_get_group(MPI_File fh, MPI_Group *group); __attribute__((visibility("default"))) int PMPI_File_get_amode(MPI_File fh, int *amode); __attribute__((visibility("default"))) int PMPI_File_set_info(MPI_File fh, MPI_Info info); __attribute__((visibility("default"))) int PMPI_File_get_info(MPI_File fh, MPI_Info *info_used); __attribute__((visibility("default"))) int PMPI_File_set_view(MPI_File fh, MPI_Offset disp, MPI_Datatype etype, MPI_Datatype filetype, const char *datarep, MPI_Info info); __attribute__((visibility("default"))) int PMPI_File_get_view(MPI_File fh, MPI_Offset *disp, MPI_Datatype *etype, MPI_Datatype *filetype, char *datarep); __attribute__((visibility("default"))) int PMPI_File_read_at(MPI_File 
fh, MPI_Offset offset, void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_read_at_all(MPI_File fh, MPI_Offset offset, void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_write_at(MPI_File fh, MPI_Offset offset, const void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_write_at_all(MPI_File fh, MPI_Offset offset, const void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_iread_at(MPI_File fh, MPI_Offset offset, void *buf, int count, MPI_Datatype datatype, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_File_iwrite_at(MPI_File fh, MPI_Offset offset, const void *buf, int count, MPI_Datatype datatype, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_File_read(MPI_File fh, void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_read_all(MPI_File fh, void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_write(MPI_File fh, const void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_write_all(MPI_File fh, const void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_iread(MPI_File fh, void *buf, int count, MPI_Datatype datatype, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_File_iwrite(MPI_File fh, const void *buf, int count, MPI_Datatype datatype, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_File_seek(MPI_File fh, MPI_Offset offset, int whence); __attribute__((visibility("default"))) int PMPI_File_get_position(MPI_File fh, MPI_Offset *offset); 
__attribute__((visibility("default"))) int PMPI_File_get_byte_offset(MPI_File fh, MPI_Offset offset, MPI_Offset *disp); __attribute__((visibility("default"))) int PMPI_File_read_shared(MPI_File fh, void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_write_shared(MPI_File fh, const void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_iread_shared(MPI_File fh, void *buf, int count, MPI_Datatype datatype, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_File_iwrite_shared(MPI_File fh, const void *buf, int count, MPI_Datatype datatype, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_File_read_ordered(MPI_File fh, void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_write_ordered(MPI_File fh, const void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_seek_shared(MPI_File fh, MPI_Offset offset, int whence); __attribute__((visibility("default"))) int PMPI_File_get_position_shared(MPI_File fh, MPI_Offset *offset); __attribute__((visibility("default"))) int PMPI_File_read_at_all_begin(MPI_File fh, MPI_Offset offset, void *buf, int count, MPI_Datatype datatype); __attribute__((visibility("default"))) int PMPI_File_read_at_all_end(MPI_File fh, void *buf, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_write_at_all_begin(MPI_File fh, MPI_Offset offset, const void *buf, int count, MPI_Datatype datatype); __attribute__((visibility("default"))) int PMPI_File_write_at_all_end(MPI_File fh, const void *buf, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_read_all_begin(MPI_File fh, void *buf, int count, MPI_Datatype datatype); __attribute__((visibility("default"))) int PMPI_File_read_all_end(MPI_File fh, void *buf, MPI_Status 
*status); __attribute__((visibility("default"))) int PMPI_File_write_all_begin(MPI_File fh, const void *buf, int count, MPI_Datatype datatype); __attribute__((visibility("default"))) int PMPI_File_write_all_end(MPI_File fh, const void *buf, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_read_ordered_begin(MPI_File fh, void *buf, int count, MPI_Datatype datatype); __attribute__((visibility("default"))) int PMPI_File_read_ordered_end(MPI_File fh, void *buf, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_write_ordered_begin(MPI_File fh, const void *buf, int count, MPI_Datatype datatype); __attribute__((visibility("default"))) int PMPI_File_write_ordered_end(MPI_File fh, const void *buf, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_get_type_extent(MPI_File fh, MPI_Datatype datatype, MPI_Aint *extent); __attribute__((visibility("default"))) int PMPI_File_set_atomicity(MPI_File fh, int flag); __attribute__((visibility("default"))) int PMPI_File_get_atomicity(MPI_File fh, int *flag); __attribute__((visibility("default"))) int PMPI_File_sync(MPI_File fh); __attribute__((visibility("default"))) int PMPI_Finalize(void); __attribute__((visibility("default"))) int PMPI_Finalized(int *flag); __attribute__((visibility("default"))) int PMPI_Free_mem(void *base); __attribute__((visibility("default"))) int PMPI_Gather(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, int root, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Igather(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, int root, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Gatherv(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int displs[], MPI_Datatype recvtype, int root, MPI_Comm comm); 
__attribute__((visibility("default"))) int PMPI_Igatherv(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int displs[], MPI_Datatype recvtype, int root, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Get_address(const void *location, MPI_Aint *address); __attribute__((visibility("default"))) int PMPI_Get_count(const MPI_Status *status, MPI_Datatype datatype, int *count); __attribute__((visibility("default"))) int PMPI_Get_elements(const MPI_Status *status, MPI_Datatype datatype, int *count); __attribute__((visibility("default"))) int PMPI_Get_elements_x(const MPI_Status *status, MPI_Datatype datatype, MPI_Count *count); __attribute__((visibility("default"))) int PMPI_Get(void *origin_addr, int origin_count, MPI_Datatype origin_datatype, int target_rank, MPI_Aint target_disp, int target_count, MPI_Datatype target_datatype, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Get_accumulate(const void *origin_addr, int origin_count, MPI_Datatype origin_datatype, void *result_addr, int result_count, MPI_Datatype result_datatype, int target_rank, MPI_Aint target_disp, int target_count, MPI_Datatype target_datatype, MPI_Op op, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Get_library_version(char *version, int *resultlen); __attribute__((visibility("default"))) int PMPI_Get_processor_name(char *name, int *resultlen); __attribute__((visibility("default"))) int PMPI_Get_version(int *version, int *subversion); __attribute__((visibility("default"))) int PMPI_Graph_create(MPI_Comm comm_old, int nnodes, const int index[], const int edges[], int reorder, MPI_Comm *comm_graph); __attribute__((visibility("default"))) int PMPI_Graph_get(MPI_Comm comm, int maxindex, int maxedges, int index[], int edges[]); __attribute__((visibility("default"))) int PMPI_Graph_map(MPI_Comm comm, int nnodes, const int index[], const int edges[], int *newrank); 
__attribute__((visibility("default"))) int PMPI_Graph_neighbors_count(MPI_Comm comm, int rank, int *nneighbors); __attribute__((visibility("default"))) int PMPI_Graph_neighbors(MPI_Comm comm, int rank, int maxneighbors, int neighbors[]); __attribute__((visibility("default"))) int PMPI_Graphdims_get(MPI_Comm comm, int *nnodes, int *nedges); __attribute__((visibility("default"))) int PMPI_Grequest_complete(MPI_Request request); __attribute__((visibility("default"))) int PMPI_Grequest_start(MPI_Grequest_query_function *query_fn, MPI_Grequest_free_function *free_fn, MPI_Grequest_cancel_function *cancel_fn, void *extra_state, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Group_c2f(MPI_Group group); __attribute__((visibility("default"))) int PMPI_Group_compare(MPI_Group group1, MPI_Group group2, int *result); __attribute__((visibility("default"))) int PMPI_Group_difference(MPI_Group group1, MPI_Group group2, MPI_Group *newgroup); __attribute__((visibility("default"))) int PMPI_Group_excl(MPI_Group group, int n, const int ranks[], MPI_Group *newgroup); __attribute__((visibility("default"))) MPI_Group PMPI_Group_f2c(int group); __attribute__((visibility("default"))) int PMPI_Group_free(MPI_Group *group); __attribute__((visibility("default"))) int PMPI_Group_incl(MPI_Group group, int n, const int ranks[], MPI_Group *newgroup); __attribute__((visibility("default"))) int PMPI_Group_intersection(MPI_Group group1, MPI_Group group2, MPI_Group *newgroup); __attribute__((visibility("default"))) int PMPI_Group_range_excl(MPI_Group group, int n, int ranges[][3], MPI_Group *newgroup); __attribute__((visibility("default"))) int PMPI_Group_range_incl(MPI_Group group, int n, int ranges[][3], MPI_Group *newgroup); __attribute__((visibility("default"))) int PMPI_Group_rank(MPI_Group group, int *rank); __attribute__((visibility("default"))) int PMPI_Group_size(MPI_Group group, int *size); __attribute__((visibility("default"))) int 
PMPI_Group_translate_ranks(MPI_Group group1, int n, const int ranks1[], MPI_Group group2, int ranks2[]); __attribute__((visibility("default"))) int PMPI_Group_union(MPI_Group group1, MPI_Group group2, MPI_Group *newgroup); __attribute__((visibility("default"))) int PMPI_Ibsend(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Improbe(int source, int tag, MPI_Comm comm, int *flag, MPI_Message *message, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Imrecv(void *buf, int count, MPI_Datatype type, MPI_Message *message, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Info_c2f(MPI_Info info); __attribute__((visibility("default"))) int PMPI_Info_create(MPI_Info *info); __attribute__((visibility("default"))) int PMPI_Info_delete(MPI_Info info, const char *key); __attribute__((visibility("default"))) int PMPI_Info_dup(MPI_Info info, MPI_Info *newinfo); __attribute__((visibility("default"))) MPI_Info PMPI_Info_f2c(int info); __attribute__((visibility("default"))) int PMPI_Info_free(MPI_Info *info); __attribute__((visibility("default"))) int PMPI_Info_get(MPI_Info info, const char *key, int valuelen, char *value, int *flag); __attribute__((visibility("default"))) int PMPI_Info_get_nkeys(MPI_Info info, int *nkeys); __attribute__((visibility("default"))) int PMPI_Info_get_nthkey(MPI_Info info, int n, char *key); __attribute__((visibility("default"))) int PMPI_Info_get_valuelen(MPI_Info info, const char *key, int *valuelen, int *flag); __attribute__((visibility("default"))) int PMPI_Info_set(MPI_Info info, const char *key, const char *value); __attribute__((visibility("default"))) int PMPI_Init(int *argc, char ***argv); __attribute__((visibility("default"))) int PMPI_Initialized(int *flag); __attribute__((visibility("default"))) int PMPI_Init_thread(int *argc, char ***argv, int required, int *provided); 
__attribute__((visibility("default"))) int PMPI_Intercomm_create(MPI_Comm local_comm, int local_leader, MPI_Comm bridge_comm, int remote_leader, int tag, MPI_Comm *newintercomm); __attribute__((visibility("default"))) int PMPI_Intercomm_merge(MPI_Comm intercomm, int high, MPI_Comm *newintercomm); __attribute__((visibility("default"))) int PMPI_Iprobe(int source, int tag, MPI_Comm comm, int *flag, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Irecv(void *buf, int count, MPI_Datatype datatype, int source, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Irsend(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Isend(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Issend(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Is_thread_main(int *flag); __attribute__((visibility("default"))) int PMPI_Keyval_create(MPI_Copy_function *copy_fn, MPI_Delete_function *delete_fn, int *keyval, void *extra_state) ; __attribute__((visibility("default"))) int PMPI_Keyval_free(int *keyval) ; __attribute__((visibility("default"))) int PMPI_Lookup_name(const char *service_name, MPI_Info info, char *port_name); __attribute__((visibility("default"))) int PMPI_Message_c2f(MPI_Message message); __attribute__((visibility("default"))) MPI_Message PMPI_Message_f2c(int message); __attribute__((visibility("default"))) int PMPI_Mprobe(int source, int tag, MPI_Comm comm, MPI_Message *message, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Mrecv(void *buf, int count, MPI_Datatype type, MPI_Message *message, MPI_Status *status); __attribute__((visibility("default"))) int 
PMPI_Neighbor_allgather(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ineighbor_allgather(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Neighbor_allgatherv(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int displs[], MPI_Datatype recvtype, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ineighbor_allgatherv(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int displs[], MPI_Datatype recvtype, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Neighbor_alltoall(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ineighbor_alltoall(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Neighbor_alltoallv(const void *sendbuf, const int sendcounts[], const int sdispls[], MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int rdispls[], MPI_Datatype recvtype, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ineighbor_alltoallv(const void *sendbuf, const int sendcounts[], const int sdispls[], MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int rdispls[], MPI_Datatype recvtype, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Neighbor_alltoallw(const void *sendbuf, const int sendcounts[], const MPI_Aint sdispls[], const MPI_Datatype sendtypes[], void *recvbuf, const int recvcounts[], 
const MPI_Aint rdispls[], const MPI_Datatype recvtypes[], MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ineighbor_alltoallw(const void *sendbuf, const int sendcounts[], const MPI_Aint sdispls[], const MPI_Datatype sendtypes[], void *recvbuf, const int recvcounts[], const MPI_Aint rdispls[], const MPI_Datatype recvtypes[], MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Op_c2f(MPI_Op op); __attribute__((visibility("default"))) int PMPI_Op_commutative(MPI_Op op, int *commute); __attribute__((visibility("default"))) int PMPI_Op_create(MPI_User_function *function, int commute, MPI_Op *op); __attribute__((visibility("default"))) int PMPI_Open_port(MPI_Info info, char *port_name); __attribute__((visibility("default"))) MPI_Op PMPI_Op_f2c(int op); __attribute__((visibility("default"))) int PMPI_Op_free(MPI_Op *op); __attribute__((visibility("default"))) int PMPI_Pack_external(const char datarep[], const void *inbuf, int incount, MPI_Datatype datatype, void *outbuf, MPI_Aint outsize, MPI_Aint *position); __attribute__((visibility("default"))) int PMPI_Pack_external_size(const char datarep[], int incount, MPI_Datatype datatype, MPI_Aint *size); __attribute__((visibility("default"))) int PMPI_Pack(const void *inbuf, int incount, MPI_Datatype datatype, void *outbuf, int outsize, int *position, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Pack_size(int incount, MPI_Datatype datatype, MPI_Comm comm, int *size); __attribute__((visibility("default"))) int PMPI_Pcontrol(const int level, ...); __attribute__((visibility("default"))) int PMPI_Probe(int source, int tag, MPI_Comm comm, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Publish_name(const char *service_name, MPI_Info info, const char *port_name); __attribute__((visibility("default"))) int PMPI_Put(const void *origin_addr, int origin_count, MPI_Datatype origin_datatype, int target_rank, MPI_Aint target_disp, int target_count, 
MPI_Datatype target_datatype, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Query_thread(int *provided); __attribute__((visibility("default"))) int PMPI_Raccumulate(const void *origin_addr, int origin_count, MPI_Datatype origin_datatype, int target_rank, MPI_Aint target_disp, int target_count, MPI_Datatype target_datatype, MPI_Op op, MPI_Win win, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Recv_init(void *buf, int count, MPI_Datatype datatype, int source, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Recv(void *buf, int count, MPI_Datatype datatype, int source, int tag, MPI_Comm comm, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Reduce(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, int root, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ireduce(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, int root, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Reduce_local(const void *inbuf, void *inoutbuf, int count, MPI_Datatype datatype, MPI_Op); __attribute__((visibility("default"))) int PMPI_Reduce_scatter(const void *sendbuf, void *recvbuf, const int recvcounts[], MPI_Datatype datatype, MPI_Op op, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ireduce_scatter(const void *sendbuf, void *recvbuf, const int recvcounts[], MPI_Datatype datatype, MPI_Op op, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Reduce_scatter_block(const void *sendbuf, void *recvbuf, int recvcount, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ireduce_scatter_block(const void *sendbuf, void *recvbuf, int recvcount, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Register_datarep(const char 
*datarep, MPI_Datarep_conversion_function *read_conversion_fn, MPI_Datarep_conversion_function *write_conversion_fn, MPI_Datarep_extent_function *dtype_file_extent_fn, void *extra_state); __attribute__((visibility("default"))) int PMPI_Request_c2f(MPI_Request request); __attribute__((visibility("default"))) MPI_Request PMPI_Request_f2c(int request); __attribute__((visibility("default"))) int PMPI_Request_free(MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Request_get_status(MPI_Request request, int *flag, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Rget(void *origin_addr, int origin_count, MPI_Datatype origin_datatype, int target_rank, MPI_Aint target_disp, int target_count, MPI_Datatype target_datatype, MPI_Win win, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Rget_accumulate(const void *origin_addr, int origin_count, MPI_Datatype origin_datatype, void *result_addr, int result_count, MPI_Datatype result_datatype, int target_rank, MPI_Aint target_disp, int target_count, MPI_Datatype target_datatype, MPI_Op op, MPI_Win win, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Rput(const void *origin_addr, int origin_count, MPI_Datatype origin_datatype, int target_rank, MPI_Aint target_disp, int target_cout, MPI_Datatype target_datatype, MPI_Win win, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Rsend(const void *ibuf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Rsend_init(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Scan(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Iscan(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm, 
MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Scatter(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, int root, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Iscatter(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, int root, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Scatterv(const void *sendbuf, const int sendcounts[], const int displs[], MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, int root, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Iscatterv(const void *sendbuf, const int sendcounts[], const int displs[], MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, int root, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Send_init(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Send(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Sendrecv(const void *sendbuf, int sendcount, MPI_Datatype sendtype, int dest, int sendtag, void *recvbuf, int recvcount, MPI_Datatype recvtype, int source, int recvtag, MPI_Comm comm, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Sendrecv_replace(void * buf, int count, MPI_Datatype datatype, int dest, int sendtag, int source, int recvtag, MPI_Comm comm, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Ssend_init(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Ssend(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm); 
__attribute__((visibility("default"))) int PMPI_Start(MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Startall(int count, MPI_Request array_of_requests[]); __attribute__((visibility("default"))) int PMPI_Status_c2f(const MPI_Status *c_status, int *f_status); __attribute__((visibility("default"))) int PMPI_Status_f2c(const int *f_status, MPI_Status *c_status); __attribute__((visibility("default"))) int PMPI_Status_set_cancelled(MPI_Status *status, int flag); __attribute__((visibility("default"))) int PMPI_Status_set_elements(MPI_Status *status, MPI_Datatype datatype, int count); __attribute__((visibility("default"))) int PMPI_Status_set_elements_x(MPI_Status *status, MPI_Datatype datatype, MPI_Count count); __attribute__((visibility("default"))) int PMPI_Testall(int count, MPI_Request array_of_requests[], int *flag, MPI_Status array_of_statuses[]); __attribute__((visibility("default"))) int PMPI_Testany(int count, MPI_Request array_of_requests[], int *index, int *flag, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Test(MPI_Request *request, int *flag, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Test_cancelled(const MPI_Status *status, int *flag); __attribute__((visibility("default"))) int PMPI_Testsome(int incount, MPI_Request array_of_requests[], int *outcount, int array_of_indices[], MPI_Status array_of_statuses[]); __attribute__((visibility("default"))) int PMPI_Topo_test(MPI_Comm comm, int *status); __attribute__((visibility("default"))) int PMPI_Type_c2f(MPI_Datatype datatype); __attribute__((visibility("default"))) int PMPI_Type_commit(MPI_Datatype *type); __attribute__((visibility("default"))) int PMPI_Type_contiguous(int count, MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_create_darray(int size, int rank, int ndims, const int gsize_array[], const int distrib_array[], const int darg_array[], const int psize_array[], int order, 
MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_create_f90_complex(int p, int r, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_create_f90_integer(int r, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_create_f90_real(int p, int r, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_create_hindexed(int count, const int array_of_blocklengths[], const MPI_Aint array_of_displacements[], MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_create_hvector(int count, int blocklength, MPI_Aint stride, MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_create_keyval(MPI_Type_copy_attr_function *type_copy_attr_fn, MPI_Type_delete_attr_function *type_delete_attr_fn, int *type_keyval, void *extra_state); __attribute__((visibility("default"))) int PMPI_Type_create_hindexed_block(int count, int blocklength, const MPI_Aint array_of_displacements[], MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_create_indexed_block(int count, int blocklength, const int array_of_displacements[], MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_create_struct(int count, const int array_of_block_lengths[], const MPI_Aint array_of_displacements[], const MPI_Datatype array_of_types[], MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_create_subarray(int ndims, const int size_array[], const int subsize_array[], const int start_array[], int order, MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_create_resized(MPI_Datatype oldtype, MPI_Aint lb, MPI_Aint extent, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_delete_attr(MPI_Datatype type, int type_keyval); 
__attribute__((visibility("default"))) int PMPI_Type_dup(MPI_Datatype type, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_extent(MPI_Datatype type, MPI_Aint *extent) ; __attribute__((visibility("default"))) int PMPI_Type_free(MPI_Datatype *type); __attribute__((visibility("default"))) int PMPI_Type_free_keyval(int *type_keyval); __attribute__((visibility("default"))) MPI_Datatype PMPI_Type_f2c(int datatype); __attribute__((visibility("default"))) int PMPI_Type_get_attr(MPI_Datatype type, int type_keyval, void *attribute_val, int *flag); __attribute__((visibility("default"))) int PMPI_Type_get_contents(MPI_Datatype mtype, int max_integers, int max_addresses, int max_datatypes, int array_of_integers[], MPI_Aint array_of_addresses[], MPI_Datatype array_of_datatypes[]); __attribute__((visibility("default"))) int PMPI_Type_get_envelope(MPI_Datatype type, int *num_integers, int *num_addresses, int *num_datatypes, int *combiner); __attribute__((visibility("default"))) int PMPI_Type_get_extent(MPI_Datatype type, MPI_Aint *lb, MPI_Aint *extent); __attribute__((visibility("default"))) int PMPI_Type_get_extent_x(MPI_Datatype type, MPI_Count *lb, MPI_Count *extent); __attribute__((visibility("default"))) int PMPI_Type_get_name(MPI_Datatype type, char *type_name, int *resultlen); __attribute__((visibility("default"))) int PMPI_Type_get_true_extent(MPI_Datatype datatype, MPI_Aint *true_lb, MPI_Aint *true_extent); __attribute__((visibility("default"))) int PMPI_Type_get_true_extent_x(MPI_Datatype datatype, MPI_Count *true_lb, MPI_Count *true_extent); __attribute__((visibility("default"))) int PMPI_Type_hindexed(int count, int array_of_blocklengths[], MPI_Aint array_of_displacements[], MPI_Datatype oldtype, MPI_Datatype *newtype) ; __attribute__((visibility("default"))) int PMPI_Type_hvector(int count, int blocklength, MPI_Aint stride, MPI_Datatype oldtype, MPI_Datatype *newtype) ; __attribute__((visibility("default"))) int PMPI_Type_indexed(int 
count, const int array_of_blocklengths[], const int array_of_displacements[], MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_lb(MPI_Datatype type, MPI_Aint *lb) ; __attribute__((visibility("default"))) int PMPI_Type_match_size(int typeclass, int size, MPI_Datatype *type); __attribute__((visibility("default"))) int PMPI_Type_set_attr(MPI_Datatype type, int type_keyval, void *attr_val); __attribute__((visibility("default"))) int PMPI_Type_set_name(MPI_Datatype type, const char *type_name); __attribute__((visibility("default"))) int PMPI_Type_size(MPI_Datatype type, int *size); __attribute__((visibility("default"))) int PMPI_Type_size_x(MPI_Datatype type, MPI_Count *size); __attribute__((visibility("default"))) int PMPI_Type_struct(int count, int array_of_blocklengths[], MPI_Aint array_of_displacements[], MPI_Datatype array_of_types[], MPI_Datatype *newtype) ; __attribute__((visibility("default"))) int PMPI_Type_ub(MPI_Datatype mtype, MPI_Aint *ub) ; __attribute__((visibility("default"))) int PMPI_Type_vector(int count, int blocklength, int stride, MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Unpack(const void *inbuf, int insize, int *position, void *outbuf, int outcount, MPI_Datatype datatype, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Unpublish_name(const char *service_name, MPI_Info info, const char *port_name); __attribute__((visibility("default"))) int PMPI_Unpack_external (const char datarep[], const void *inbuf, MPI_Aint insize, MPI_Aint *position, void *outbuf, int outcount, MPI_Datatype datatype); __attribute__((visibility("default"))) int PMPI_Waitall(int count, MPI_Request array_of_requests[], MPI_Status array_of_statuses[]); __attribute__((visibility("default"))) int PMPI_Waitany(int count, MPI_Request array_of_requests[], int *index, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Wait(MPI_Request *request, 
MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Waitsome(int incount, MPI_Request array_of_requests[], int *outcount, int array_of_indices[], MPI_Status array_of_statuses[]); __attribute__((visibility("default"))) int PMPI_Win_allocate(MPI_Aint size, int disp_unit, MPI_Info info, MPI_Comm comm, void *baseptr, MPI_Win *win); __attribute__((visibility("default"))) int PMPI_Win_allocate_shared(MPI_Aint size, int disp_unit, MPI_Info info, MPI_Comm comm, void *baseptr, MPI_Win *win); __attribute__((visibility("default"))) int PMPI_Win_attach(MPI_Win win, void *base, MPI_Aint size); __attribute__((visibility("default"))) int PMPI_Win_c2f(MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_call_errhandler(MPI_Win win, int errorcode); __attribute__((visibility("default"))) int PMPI_Win_complete(MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_create(void *base, MPI_Aint size, int disp_unit, MPI_Info info, MPI_Comm comm, MPI_Win *win); __attribute__((visibility("default"))) int PMPI_Win_create_dynamic(MPI_Info info, MPI_Comm comm, MPI_Win *win); __attribute__((visibility("default"))) int PMPI_Win_create_errhandler(MPI_Win_errhandler_function *function, MPI_Errhandler *errhandler); __attribute__((visibility("default"))) int PMPI_Win_create_keyval(MPI_Win_copy_attr_function *win_copy_attr_fn, MPI_Win_delete_attr_function *win_delete_attr_fn, int *win_keyval, void *extra_state); __attribute__((visibility("default"))) int PMPI_Win_delete_attr(MPI_Win win, int win_keyval); __attribute__((visibility("default"))) int PMPI_Win_detach(MPI_Win win, const void *base); __attribute__((visibility("default"))) MPI_Win PMPI_Win_f2c(int win); __attribute__((visibility("default"))) int PMPI_Win_fence(int assert, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_flush(int rank, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_flush_all(MPI_Win win); __attribute__((visibility("default"))) int 
PMPI_Win_flush_local(int rank, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_flush_local_all(MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_free(MPI_Win *win); __attribute__((visibility("default"))) int PMPI_Win_free_keyval(int *win_keyval); __attribute__((visibility("default"))) int PMPI_Win_get_attr(MPI_Win win, int win_keyval, void *attribute_val, int *flag); __attribute__((visibility("default"))) int PMPI_Win_get_errhandler(MPI_Win win, MPI_Errhandler *errhandler); __attribute__((visibility("default"))) int PMPI_Win_get_group(MPI_Win win, MPI_Group *group); __attribute__((visibility("default"))) int PMPI_Win_get_info(MPI_Win win, MPI_Info *info_used); __attribute__((visibility("default"))) int PMPI_Win_get_name(MPI_Win win, char *win_name, int *resultlen); __attribute__((visibility("default"))) int PMPI_Win_lock(int lock_type, int rank, int assert, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_lock_all(int assert, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_post(MPI_Group group, int assert, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_set_attr(MPI_Win win, int win_keyval, void *attribute_val); __attribute__((visibility("default"))) int PMPI_Win_set_errhandler(MPI_Win win, MPI_Errhandler errhandler); __attribute__((visibility("default"))) int PMPI_Win_set_info(MPI_Win win, MPI_Info info); __attribute__((visibility("default"))) int PMPI_Win_set_name(MPI_Win win, const char *win_name); __attribute__((visibility("default"))) int PMPI_Win_shared_query(MPI_Win win, int rank, MPI_Aint *size, int *disp_unit, void *baseptr); __attribute__((visibility("default"))) int PMPI_Win_start(MPI_Group group, int assert, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_sync(MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_test(MPI_Win win, int *flag); __attribute__((visibility("default"))) int PMPI_Win_unlock(int rank, MPI_Win win); 
__attribute__((visibility("default"))) int PMPI_Win_unlock_all(MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_wait(MPI_Win win); __attribute__((visibility("default"))) double PMPI_Wtick(void); __attribute__((visibility("default"))) double PMPI_Wtime(void); __attribute__((visibility("default"))) int PMPI_T_init_thread (int required, int *provided); __attribute__((visibility("default"))) int PMPI_T_finalize (void); __attribute__((visibility("default"))) int PMPI_T_cvar_get_num (int *num_cvar); __attribute__((visibility("default"))) int PMPI_T_cvar_get_info (int cvar_index, char *name, int *name_len, int *verbosity, MPI_Datatype *datatype, MPI_T_enum *enumtype, char *desc, int *desc_len, int *bind, int *scope); __attribute__((visibility("default"))) int PMPI_T_cvar_get_index (const char *name, int *cvar_index); __attribute__((visibility("default"))) int PMPI_T_cvar_handle_alloc (int cvar_index, void *obj_handle, MPI_T_cvar_handle *handle, int *count); __attribute__((visibility("default"))) int PMPI_T_cvar_handle_free (MPI_T_cvar_handle *handle); __attribute__((visibility("default"))) int PMPI_T_cvar_read (MPI_T_cvar_handle handle, void *buf); __attribute__((visibility("default"))) int PMPI_T_cvar_write (MPI_T_cvar_handle handle, const void *buf); __attribute__((visibility("default"))) int PMPI_T_category_get_num(int *num_cat); __attribute__((visibility("default"))) int PMPI_T_category_get_info(int cat_index, char *name, int *name_len, char *desc, int *desc_len, int *num_cvars, int *num_pvars, int *num_categories); __attribute__((visibility("default"))) int PMPI_T_category_get_index (const char *name, int *category_index); __attribute__((visibility("default"))) int PMPI_T_category_get_cvars(int cat_index, int len, int indices[]); __attribute__((visibility("default"))) int PMPI_T_category_get_pvars(int cat_index, int len, int indices[]); __attribute__((visibility("default"))) int PMPI_T_category_get_categories(int cat_index, int len, int indices[]); 
__attribute__((visibility("default"))) int PMPI_T_category_changed(int *stamp); __attribute__((visibility("default"))) int PMPI_T_pvar_get_num(int *num_pvar); __attribute__((visibility("default"))) int PMPI_T_pvar_get_info(int pvar_index, char *name, int *name_len, int *verbosity, int *var_class, MPI_Datatype *datatype, MPI_T_enum *enumtype, char *desc, int *desc_len, int *bind, int *readonly, int *continuous, int *atomic); __attribute__((visibility("default"))) int PMPI_T_pvar_get_index (const char *name, int var_class, int *pvar_index); __attribute__((visibility("default"))) int PMPI_T_pvar_session_create(MPI_T_pvar_session *session); __attribute__((visibility("default"))) int PMPI_T_pvar_session_free(MPI_T_pvar_session *session); __attribute__((visibility("default"))) int PMPI_T_pvar_handle_alloc(MPI_T_pvar_session session, int pvar_index, void *obj_handle, MPI_T_pvar_handle *handle, int *count); __attribute__((visibility("default"))) int PMPI_T_pvar_handle_free(MPI_T_pvar_session session, MPI_T_pvar_handle *handle); __attribute__((visibility("default"))) int PMPI_T_pvar_start(MPI_T_pvar_session session, MPI_T_pvar_handle handle); __attribute__((visibility("default"))) int PMPI_T_pvar_stop(MPI_T_pvar_session session, MPI_T_pvar_handle handle); __attribute__((visibility("default"))) int PMPI_T_pvar_read(MPI_T_pvar_session session, MPI_T_pvar_handle handle, void *buf); __attribute__((visibility("default"))) int PMPI_T_pvar_write(MPI_T_pvar_session session, MPI_T_pvar_handle handle, const void *buf); __attribute__((visibility("default"))) int PMPI_T_pvar_reset(MPI_T_pvar_session session, MPI_T_pvar_handle handle); __attribute__((visibility("default"))) int PMPI_T_pvar_readreset(MPI_T_pvar_session session, MPI_T_pvar_handle handle, void *buf); __attribute__((visibility("default"))) int PMPI_T_enum_get_info(MPI_T_enum enumtype, int *num, char *name, int *name_len); __attribute__((visibility("default"))) int PMPI_T_enum_get_item(MPI_T_enum enumtype, int index, int 
*value, char *name, int *name_len); __attribute__((visibility("default"))) int MPI_T_init_thread (int required, int *provided); __attribute__((visibility("default"))) int MPI_T_finalize (void); __attribute__((visibility("default"))) int MPI_T_cvar_get_num (int *num_cvar); __attribute__((visibility("default"))) int MPI_T_cvar_get_info (int cvar_index, char *name, int *name_len, int *verbosity, MPI_Datatype *datatype, MPI_T_enum *enumtype, char *desc, int *desc_len, int *bind, int *scope); __attribute__((visibility("default"))) int MPI_T_cvar_get_index (const char *name, int *cvar_index); __attribute__((visibility("default"))) int MPI_T_cvar_handle_alloc (int cvar_index, void *obj_handle, MPI_T_cvar_handle *handle, int *count); __attribute__((visibility("default"))) int MPI_T_cvar_handle_free (MPI_T_cvar_handle *handle); __attribute__((visibility("default"))) int MPI_T_cvar_read (MPI_T_cvar_handle handle, void *buf); __attribute__((visibility("default"))) int MPI_T_cvar_write (MPI_T_cvar_handle handle, const void *buf); __attribute__((visibility("default"))) int MPI_T_category_get_num(int *num_cat); __attribute__((visibility("default"))) int MPI_T_category_get_info(int cat_index, char *name, int *name_len, char *desc, int *desc_len, int *num_cvars, int *num_pvars, int *num_categories); __attribute__((visibility("default"))) int MPI_T_category_get_index (const char *name, int *category_index); __attribute__((visibility("default"))) int MPI_T_category_get_cvars(int cat_index, int len, int indices[]); __attribute__((visibility("default"))) int MPI_T_category_get_pvars(int cat_index, int len, int indices[]); __attribute__((visibility("default"))) int MPI_T_category_get_categories(int cat_index, int len, int indices[]); __attribute__((visibility("default"))) int MPI_T_category_changed(int *stamp); __attribute__((visibility("default"))) int MPI_T_pvar_get_num(int *num_pvar); __attribute__((visibility("default"))) int MPI_T_pvar_get_info(int pvar_index, char *name, int 
*name_len, int *verbosity, int *var_class, MPI_Datatype *datatype, MPI_T_enum *enumtype, char *desc, int *desc_len, int *bind, int *readonly, int *continuous, int *atomic); __attribute__((visibility("default"))) int MPI_T_pvar_get_index (const char *name, int var_class, int *pvar_index); __attribute__((visibility("default"))) int MPI_T_pvar_session_create(MPI_T_pvar_session *session); __attribute__((visibility("default"))) int MPI_T_pvar_session_free(MPI_T_pvar_session *session); __attribute__((visibility("default"))) int MPI_T_pvar_handle_alloc(MPI_T_pvar_session session, int pvar_index, void *obj_handle, MPI_T_pvar_handle *handle, int *count); __attribute__((visibility("default"))) int MPI_T_pvar_handle_free(MPI_T_pvar_session session, MPI_T_pvar_handle *handle); __attribute__((visibility("default"))) int MPI_T_pvar_start(MPI_T_pvar_session session, MPI_T_pvar_handle handle); __attribute__((visibility("default"))) int MPI_T_pvar_stop(MPI_T_pvar_session session, MPI_T_pvar_handle handle); __attribute__((visibility("default"))) int MPI_T_pvar_read(MPI_T_pvar_session session, MPI_T_pvar_handle handle, void *buf); __attribute__((visibility("default"))) int MPI_T_pvar_write(MPI_T_pvar_session session, MPI_T_pvar_handle handle, const void *buf); __attribute__((visibility("default"))) int MPI_T_pvar_reset(MPI_T_pvar_session session, MPI_T_pvar_handle handle); __attribute__((visibility("default"))) int MPI_T_pvar_readreset(MPI_T_pvar_session session, MPI_T_pvar_handle handle, void *buf); __attribute__((visibility("default"))) int MPI_T_enum_get_info(MPI_T_enum enumtype, int *num, char *name, int *name_len); __attribute__((visibility("default"))) int MPI_T_enum_get_item(MPI_T_enum enumtype, int index, int *value, char *name, int *name_len); # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Found header files ['mpi.h'] in ['/usr/include', '/usr/lib/openmpi'] Popping language C 
================================================================================ TEST configureConversion from config.packages.MPI(/home/florian/software/petsc/config/BuildSystem/config/packages/MPI.py:217) TESTING: configureConversion from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:217) Check for the functions which convert communicators between C and Fortran - Define HAVE_MPI_COMM_F2C and HAVE_MPI_COMM_C2F if they are present - Some older MPI 1 implementations are missing these All intermediate test results are stored in /tmp/petsc-KvGRNM/config.packages.MPI Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.missing -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <mpi.h> int main() { if (MPI_Comm_f2c((MPI_Fint)0)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 
-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_MPI_COMM_F2C" to "1" Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <mpi.h> int main() { if (MPI_Comm_c2f(MPI_COMM_WORLD)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl 
Defined "HAVE_MPI_COMM_C2F" to "1" Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c: In function 'main': /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c:6:10: warning: unused variable 'a' [-Wunused-variable] MPI_Fint a; ^ Source: #include "confdefs.h" #include "conffix.h" #include int main() { MPI_Fint a; ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_MPI_FINT" to "1" 
================================================================================ TEST configureMPI2 from config.packages.MPI(/home/florian/software/petsc/config/BuildSystem/config/packages/MPI.py:180) TESTING: configureMPI2 from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:180) Check for functions added to the interface in MPI-2 Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <mpi.h> int main() { int flag;if (MPI_Finalized(&flag)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi 
-lgcc_s -lpthread -ldl Defined "HAVE_MPI_FINALIZED" to "1" Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <mpi.h> int main() { if (MPI_Allreduce(MPI_IN_PLACE,0, 1, MPI_INT, MPI_SUM, MPI_COMM_SELF)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_MPI_IN_PLACE" to "1" Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure 
-I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c: In function 'main': /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c:6:94: warning: initialization makes pointer from integer without a cast [-Wint-conversion] int count=2; int blocklens[2]={0,1}; MPI_Aint indices[2]={0,1}; MPI_Datatype old_types[2]={0,1}; MPI_Datatype *newtype = 0; ^ /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c:6:94: note: (near initialization for 'old_types[1]') Source: #include "confdefs.h" #include "conffix.h" #include <mpi.h> int main() { int count=2; int blocklens[2]={0,1}; MPI_Aint indices[2]={0,1}; MPI_Datatype old_types[2]={0,1}; MPI_Datatype *newtype = 0; if (MPI_Type_create_struct(count, blocklens, indices, old_types, newtype)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib 
-Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <mpi.h> int main() { MPI_Comm_errhandler_fn * p_err_fun = 0; MPI_Errhandler * p_errhandler = 0; if (MPI_Comm_create_errhandler(p_err_fun,p_errhandler)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o 
-I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <mpi.h> int main() { if (MPI_Comm_set_errhandler(MPI_COMM_WORLD,MPI_ERRORS_RETURN)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.atomics 
-I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <mpi.h> int main() { if (MPI_Reduce_local(0, 0, 0, MPI_INT, MPI_SUM));; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_MPI_REDUCE_LOCAL" to "1" ================================================================================ TEST configureTypes from config.packages.MPI(/home/florian/software/petsc/config/BuildSystem/config/packages/MPI.py:237) TESTING: configureTypes from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:237) Checking for MPI types Checking for size of type: MPI_Comm Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers 
-I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <sys/types.h> #if STDC_HEADERS #include <stdlib.h> #include <stdio.h> #include <stddef.h> #endif #define MPICH_IGNORE_CXX_SEEK #define MPICH_SKIP_MPICXX 1 #define OMPI_SKIP_MPICXX 1 #include <mpi.h> int main() { FILE *f = fopen("conftestval", "w"); if (!f) exit(1); fprintf(f, "%lu\n", (unsigned long)sizeof(MPI_Comm)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.types/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/config.types/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.types/conftest Executing: /tmp/petsc-KvGRNM/config.types/conftest Popping language C Defined "SIZEOF_MPI_COMM" to "8" Checking for size of type: MPI_Fint Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o 
-I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <sys/types.h> #if STDC_HEADERS #include <stdlib.h> #include <stdio.h> #include <stddef.h> #endif #define MPICH_IGNORE_CXX_SEEK #define MPICH_SKIP_MPICXX 1 #define OMPI_SKIP_MPICXX 1 #include <mpi.h> int main() { FILE *f = fopen("conftestval", "w"); if (!f) exit(1); fprintf(f, "%lu\n", (unsigned long)sizeof(MPI_Fint)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.types/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/config.types/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.types/conftest Executing: /tmp/petsc-KvGRNM/config.types/conftest Popping language C 
Defined "SIZEOF_MPI_FINT" to "4" ================================================================================ TEST configureMPITypes from config.packages.MPI(/home/florian/software/petsc/config/BuildSystem/config/packages/MPI.py:249) TESTING: configureMPITypes from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:249) Checking for MPI Datatype handles Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #ifdef PETSC_HAVE_STDLIB_H #include #endif #include int main() { MPI_Aint size; int ierr; MPI_Init(0,0); ierr = MPI_Type_extent(MPI_LONG_DOUBLE, &size); if(ierr || (size == 0)) exit(1); MPI_Finalize(); ; return 0; } Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing 
-Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #ifdef PETSC_HAVE_STDLIB_H #include #endif #include int main() { MPI_Aint size; int ierr; MPI_Init(0,0); ierr = MPI_Type_extent(MPI_LONG_DOUBLE, &size); if(ierr || (size == 0)) exit(1); MPI_Finalize(); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/config.packages.MPI/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.packages.MPI/conftest Executing: /tmp/petsc-KvGRNM/config.packages.MPI/conftest Defined "HAVE_MPI_LONG_DOUBLE" to "1" Popping language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI 
-I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #ifdef PETSC_HAVE_STDLIB_H #include #endif #include int main() { MPI_Aint size; int ierr; MPI_Init(0,0); ierr = MPI_Type_extent(MPI_INT64_T, &size); if(ierr || (size == 0)) exit(1); MPI_Finalize(); ; return 0; } Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #ifdef PETSC_HAVE_STDLIB_H #include #endif #include int main() { MPI_Aint size; int ierr; MPI_Init(0,0); ierr = MPI_Type_extent(MPI_INT64_T, &size); if(ierr || (size == 0)) exit(1); MPI_Finalize(); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 
-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/config.packages.MPI/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.packages.MPI/conftest Executing: /tmp/petsc-KvGRNM/config.packages.MPI/conftest Defined "HAVE_MPI_INT64_T" to "1" Popping language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #ifdef PETSC_HAVE_STDLIB_H #include #endif #include int main() { MPI_Aint size; int ierr; MPI_Init(0,0); ierr = MPI_Type_extent(MPI_C_DOUBLE_COMPLEX, &size); if(ierr || (size == 0)) exit(1); MPI_Finalize(); ; return 0; } Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros 
-I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #ifdef PETSC_HAVE_STDLIB_H #include #endif #include int main() { MPI_Aint size; int ierr; MPI_Init(0,0); ierr = MPI_Type_extent(MPI_C_DOUBLE_COMPLEX, &size); if(ierr || (size == 0)) exit(1); MPI_Finalize(); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/config.packages.MPI/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.packages.MPI/conftest Executing: /tmp/petsc-KvGRNM/config.packages.MPI/conftest Defined "HAVE_MPI_C_DOUBLE_COMPLEX" to "1" Popping language C ================================================================================ TEST configureMissingPrototypes from config.packages.MPI(/home/florian/software/petsc/config/BuildSystem/config/packages/MPI.py:325) TESTING: configureMissingPrototypes from 
config.packages.MPI(config/BuildSystem/config/packages/MPI.py:325) Checks for missing prototypes, which it adds to petscfix.h ================================================================================ TEST SGIMPICheck from config.packages.MPI(/home/florian/software/petsc/config/BuildSystem/config/packages/MPI.py:344) TESTING: SGIMPICheck from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:344) Returns true if SGI MPI is used Checking for functions [MPI_SGI_barrier] in library [] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_SGI_barrier(); static void _check_MPI_SGI_barrier() { MPI_SGI_barrier(); } int main() { _check_MPI_SGI_barrier();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.o: In function `_check_MPI_SGI_barrier': /tmp/petsc-KvGRNM/config.libraries/conftest.c:5: undefined reference to `MPI_SGI_barrier' collect2: error: ld returned 1 exit status Popping language C SGI MPI test failure ================================================================================ TEST CxxMPICheck from config.packages.MPI(/home/florian/software/petsc/config/BuildSystem/config/packages/MPI.py:354) TESTING: CxxMPICheck from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:354) Make sure C++ can compile and link Pushing language Cxx Checking for header mpi.h Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.utilities.missing -Wall -Wwrite-strings -Wno-strict-aliasing 
-Wno-unknown-pragmas -fvisibility=hidden -g -fPIC /tmp/petsc-KvGRNM/config.libraries/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <mpi.h> int main() { ; return 0; } Checking for C++ MPI_Finalize() Checking for functions [MPI_Finalize] in library [] [] Pushing language Cxx Executing: mpicxx -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.libraries -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC /tmp/petsc-KvGRNM/config.libraries/conftest.cc Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.cc: In function 'void _check_MPI_Finalize()': /tmp/petsc-KvGRNM/config.libraries/conftest.cc:5:41: warning: variable 'ierr' set but not used [-Wunused-but-set-variable] static void _check_MPI_Finalize() { int ierr; ^~~~ Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ #include <mpi.h> static void _check_MPI_Finalize() { int ierr; ierr = MPI_Finalize();; } int main() { _check_MPI_Finalize();; return 0; } Pushing language CXX Popping language CXX Executing: mpicxx -o /tmp/petsc-KvGRNM/config.libraries/conftest -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Popping language Cxx Popping language Cxx ================================================================================ TEST FortranMPICheck from config.packages.MPI(/home/florian/software/petsc/config/BuildSystem/config/packages/MPI.py:372) TESTING: FortranMPICheck from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:372) Make sure fortran include [mpif.h] and library symbols are found Pushing language FC Checking for header mpif.h Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.libraries/conftest.F Successful compile: Source: program main #include "mpif.h" end Checking for fortran mpi_init() Checking for functions [] in library [] [] Pushing language FC Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers 
-I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.libraries/conftest.F Successful compile: Source: program main #include "mpif.h" integer ierr call mpi_init(ierr) end Pushing language FC Popping language FC Executing: mpif90 -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Popping language FC Checking for mpi.mod Checking for functions [] in library [] [] Pushing language FC Executing: mpif90 -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.libraries/conftest.F Successful compile: Source: program main use mpi integer ierr,rank call mpi_init(ierr) call mpi_comm_rank(MPI_COMM_WORLD,rank,ierr) end Pushing language FC Popping language FC Executing: mpif90 -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib 
-L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Popping language FC Defined "HAVE_MPI_F90MODULE" to "1" Popping language FC ================================================================================ TEST configureIO from config.packages.MPI(/home/florian/software/petsc/config/BuildSystem/config/packages/MPI.py:397) TESTING: configureIO from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:397) Check for the functions in MPI/IO - Define HAVE_MPIIO if they are present - Some older MPI 1 implementations are missing these Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { MPI_Aint lb, extent; if (MPI_Type_get_extent(MPI_INT, &lb, &extent)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing 
-Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c: In function 'main': /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c:9:5: warning: 'fh' is used uninitialized in this function [-Wuninitialized] if (MPI_File_write_all(fh, buf, 1, MPI_INT, &status)); ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c:9:5: warning: 'buf' is used uninitialized in this function [-Wuninitialized] Source: #include "confdefs.h" #include "conffix.h" #include int main() { MPI_File fh; void *buf; MPI_Status status; if (MPI_File_write_all(fh, buf, 1, 
MPI_INT, &status)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c: In function 'main': /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c:9:5: warning: 'fh' is used uninitialized in this function [-Wuninitialized] if (MPI_File_read_all(fh, buf, 1, MPI_INT, &status)); ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c:9:5: warning: 'buf' is used uninitialized in 
this function [-Wuninitialized] Source: #include "confdefs.h" #include "conffix.h" #include int main() { MPI_File fh; void *buf; MPI_Status status; if (MPI_File_read_all(fh, buf, 1, MPI_INT, &status)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c: In function 'main': /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c:9:5: warning: 'fh' is used uninitialized in this function [-Wuninitialized] if 
(MPI_File_set_view(fh, disp, MPI_INT, MPI_INT, "", info)); ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c:9:5: warning: 'disp' is used uninitialized in this function [-Wuninitialized] /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c:9:5: warning: 'info' is used uninitialized in this function [-Wuninitialized] Source: #include "confdefs.h" #include "conffix.h" #include int main() { MPI_File fh; MPI_Offset disp; MPI_Info info; if (MPI_File_set_view(fh, disp, MPI_INT, MPI_INT, "", info)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing 
-Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c: In function 'main': /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c:8:5: warning: 'info' is used uninitialized in this function [-Wuninitialized] if (MPI_File_open(MPI_COMM_SELF, "", 0, info, &fh)); ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Source: #include "confdefs.h" #include "conffix.h" #include int main() { MPI_File fh; MPI_Info info; if (MPI_File_open(MPI_COMM_SELF, "", 0, info, &fh)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC 
-Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c: In function 'main': /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c:7:10: warning: unused variable 'info' [-Wunused-variable] MPI_Info info; ^~~~ Source: #include "confdefs.h" #include "conffix.h" #include int main() { MPI_File fh; MPI_Info info; if (MPI_File_close(&fh)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_MPIIO" to "1" ================================================================================ TEST findMPIInc from config.packages.MPI(/home/florian/software/petsc/config/BuildSystem/config/packages/MPI.py:460) TESTING: findMPIInc from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:460) Find MPI include paths from "mpicc -show" and use with CUDAC_FLAGS ================================================================================ TEST checkMPICHorOpenMPI from config.packages.MPI(/home/florian/software/petsc/config/BuildSystem/config/packages/MPI.py:434) TESTING: checkMPICHorOpenMPI from 
config.packages.MPI(config/BuildSystem/config/packages/MPI.py:434)
  Determine if MPICH_NUMVERSION or OMPI_MAJOR_VERSION exist in mpi.h
  Used for consistency checking of MPI installation at compile time
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c
Possible ERROR while running compiler: exit code 256
stderr:
/tmp/petsc-KvGRNM/config.packages.MPI/conftest.c:4:17: error: 'MPICH_NUMVERSION' undeclared here (not in a function)
 int mpich_ver = MPICH_NUMVERSION;
                 ^~~~~~~~~~~~~~~~
Source:
#include "confdefs.h"
#include "conffix.h"
#include <mpi.h>

int mpich_ver = MPICH_NUMVERSION;
int main() {
;
  return 0;
}
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c
Successful compile:
Source:
#include
"confdefs.h"
#include "conffix.h"
#include <mpi.h>

int ompi_major = OMPI_MAJOR_VERSION;
int ompi_minor = OMPI_MINOR_VERSION;
int ompi_release = OMPI_RELEASE_VERSION;
int main() {
;
  return 0;
}
Source:
#include "confdefs.h"
#include "conffix.h"
#include <mpi.h>

int ompi_major = OMPI_MAJOR_VERSION;
int ompi_minor = OMPI_MINOR_VERSION;
int ompi_release = OMPI_RELEASE_VERSION;
Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.packages.MPI /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c
stdout:
# 1 "/tmp/petsc-KvGRNM/config.packages.MPI/conftest.c"
# 1 ""
# 1 ""
# 31 ""
# 1 "/usr/include/stdc-predef.h" 1 3 4
# 32 "" 2
# 1 "/tmp/petsc-KvGRNM/config.packages.MPI/conftest.c"
# 1 "/tmp/petsc-KvGRNM/config.packages.MPI/confdefs.h" 1
# 2 "/tmp/petsc-KvGRNM/config.packages.MPI/conftest.c" 2
# 1 "/tmp/petsc-KvGRNM/config.packages.MPI/conffix.h" 1
# 3 "/tmp/petsc-KvGRNM/config.packages.MPI/conftest.c" 2
# 1 "/usr/include/mpi.h" 1 3 4
# 225 "/usr/include/mpi.h" 3 4
# 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4
# 149 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4
# 149 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4
typedef long int ptrdiff_t;
# 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4
typedef long unsigned int size_t;
# 328 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4
typedef int wchar_t;
# 426 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4
typedef struct { long long __max_align_ll __attribute__((__aligned__(__alignof__(long long)))); long double __max_align_ld __attribute__((__aligned__(__alignof__(long double)))); } max_align_t;
# 226 "/usr/include/mpi.h" 2 3 4
# 258 "/usr/include/mpi.h" 3 4
# 1 "/usr/include/mpi_portable_platform.h" 1 3 4
# 259 "/usr/include/mpi.h" 2 3 4
# 323 "/usr/include/mpi.h" 3 4
typedef ptrdiff_t MPI_Aint;
typedef long long MPI_Offset;
typedef long long
MPI_Count; typedef struct ompi_communicator_t *MPI_Comm; typedef struct ompi_datatype_t *MPI_Datatype; typedef struct ompi_errhandler_t *MPI_Errhandler; typedef struct ompi_file_t *MPI_File; typedef struct ompi_group_t *MPI_Group; typedef struct ompi_info_t *MPI_Info; typedef struct ompi_op_t *MPI_Op; typedef struct ompi_request_t *MPI_Request; typedef struct ompi_message_t *MPI_Message; typedef struct ompi_status_public_t MPI_Status; typedef struct ompi_win_t *MPI_Win; typedef struct mca_base_var_enum_t *MPI_T_enum; typedef struct ompi_mpit_cvar_handle_t *MPI_T_cvar_handle; typedef struct mca_base_pvar_handle_t *MPI_T_pvar_handle; typedef struct mca_base_pvar_session_t *MPI_T_pvar_session; struct ompi_status_public_t { int MPI_SOURCE; int MPI_TAG; int MPI_ERROR; int _cancelled; size_t _ucount; }; typedef struct ompi_status_public_t ompi_status_public_t; # 370 "/usr/include/mpi.h" 3 4 typedef int (MPI_Copy_function)(MPI_Comm, int, void *, void *, void *, int *); typedef int (MPI_Delete_function)(MPI_Comm, int, void *, void *); typedef int (MPI_Datarep_extent_function)(MPI_Datatype, MPI_Aint *, void *); typedef int (MPI_Datarep_conversion_function)(void *, MPI_Datatype, int, void *, MPI_Offset, void *); typedef void (MPI_Comm_errhandler_function)(MPI_Comm *, int *, ...); typedef MPI_Comm_errhandler_function MPI_Comm_errhandler_fn ; typedef void (ompi_file_errhandler_fn)(MPI_File *, int *, ...); typedef ompi_file_errhandler_fn MPI_File_errhandler_fn ; typedef ompi_file_errhandler_fn MPI_File_errhandler_function; typedef void (MPI_Win_errhandler_function)(MPI_Win *, int *, ...); typedef MPI_Win_errhandler_function MPI_Win_errhandler_fn ; typedef void (MPI_Handler_function)(MPI_Comm *, int *, ...); typedef void (MPI_User_function)(void *, void *, int *, MPI_Datatype *); typedef int (MPI_Comm_copy_attr_function)(MPI_Comm, int, void *, void *, void *, int *); typedef int (MPI_Comm_delete_attr_function)(MPI_Comm, int, void *, void *); typedef int 
(MPI_Type_copy_attr_function)(MPI_Datatype, int, void *, void *, void *, int *); typedef int (MPI_Type_delete_attr_function)(MPI_Datatype, int, void *, void *); typedef int (MPI_Win_copy_attr_function)(MPI_Win, int, void *, void *, void *, int *); typedef int (MPI_Win_delete_attr_function)(MPI_Win, int, void *, void *); typedef int (MPI_Grequest_query_function)(void *, MPI_Status *); typedef int (MPI_Grequest_free_function)(void *); typedef int (MPI_Grequest_cancel_function)(void *, int); # 506 "/usr/include/mpi.h" 3 4 enum { MPI_TAG_UB, MPI_HOST, MPI_IO, MPI_WTIME_IS_GLOBAL, MPI_APPNUM, MPI_LASTUSEDCODE, MPI_UNIVERSE_SIZE, MPI_WIN_BASE, MPI_WIN_SIZE, MPI_WIN_DISP_UNIT, MPI_WIN_CREATE_FLAVOR, MPI_WIN_MODEL, IMPI_CLIENT_SIZE, IMPI_CLIENT_COLOR, IMPI_HOST_SIZE, IMPI_HOST_COLOR }; # 623 "/usr/include/mpi.h" 3 4 enum { MPI_IDENT, MPI_CONGRUENT, MPI_SIMILAR, MPI_UNEQUAL }; enum { MPI_THREAD_SINGLE, MPI_THREAD_FUNNELED, MPI_THREAD_SERIALIZED, MPI_THREAD_MULTIPLE }; enum { MPI_COMBINER_NAMED, MPI_COMBINER_DUP, MPI_COMBINER_CONTIGUOUS, MPI_COMBINER_VECTOR, MPI_COMBINER_HVECTOR_INTEGER, MPI_COMBINER_HVECTOR, MPI_COMBINER_INDEXED, MPI_COMBINER_HINDEXED_INTEGER, MPI_COMBINER_HINDEXED, MPI_COMBINER_INDEXED_BLOCK, MPI_COMBINER_STRUCT_INTEGER, MPI_COMBINER_STRUCT, MPI_COMBINER_SUBARRAY, MPI_COMBINER_DARRAY, MPI_COMBINER_F90_REAL, MPI_COMBINER_F90_COMPLEX, MPI_COMBINER_F90_INTEGER, MPI_COMBINER_RESIZED, MPI_COMBINER_HINDEXED_BLOCK }; enum { MPI_COMM_TYPE_SHARED }; enum { MPI_T_VERBOSITY_USER_BASIC, MPI_T_VERBOSITY_USER_DETAIL, MPI_T_VERBOSITY_USER_ALL, MPI_T_VERBOSITY_TUNER_BASIC, MPI_T_VERBOSITY_TUNER_DETAIL, MPI_T_VERBOSITY_TUNER_ALL, MPI_T_VERBOSITY_MPIDEV_BASIC, MPI_T_VERBOSITY_MPIDEV_DETAIL, MPI_T_VERBOSITY_MPIDEV_ALL }; enum { MPI_T_SCOPE_CONSTANT, MPI_T_SCOPE_READONLY, MPI_T_SCOPE_LOCAL, MPI_T_SCOPE_GROUP, MPI_T_SCOPE_GROUP_EQ, MPI_T_SCOPE_ALL, MPI_T_SCOPE_ALL_EQ }; enum { MPI_T_BIND_NO_OBJECT, MPI_T_BIND_MPI_COMM, MPI_T_BIND_MPI_DATATYPE, MPI_T_BIND_MPI_ERRHANDLER, 
MPI_T_BIND_MPI_FILE, MPI_T_BIND_MPI_GROUP, MPI_T_BIND_MPI_OP, MPI_T_BIND_MPI_REQUEST, MPI_T_BIND_MPI_WIN, MPI_T_BIND_MPI_MESSAGE, MPI_T_BIND_MPI_INFO }; enum { MPI_T_PVAR_CLASS_STATE, MPI_T_PVAR_CLASS_LEVEL, MPI_T_PVAR_CLASS_SIZE, MPI_T_PVAR_CLASS_PERCENTAGE, MPI_T_PVAR_CLASS_HIGHWATERMARK, MPI_T_PVAR_CLASS_LOWWATERMARK, MPI_T_PVAR_CLASS_COUNTER, MPI_T_PVAR_CLASS_AGGREGATE, MPI_T_PVAR_CLASS_TIMER, MPI_T_PVAR_CLASS_GENERIC }; # 812 "/usr/include/mpi.h" 3 4 __attribute__((visibility("default"))) int OMPI_C_MPI_TYPE_NULL_DELETE_FN( MPI_Datatype datatype, int type_keyval, void* attribute_val_out, void* extra_state ); __attribute__((visibility("default"))) int OMPI_C_MPI_TYPE_NULL_COPY_FN( MPI_Datatype datatype, int type_keyval, void* extra_state, void* attribute_val_in, void* attribute_val_out, int* flag ); __attribute__((visibility("default"))) int OMPI_C_MPI_TYPE_DUP_FN( MPI_Datatype datatype, int type_keyval, void* extra_state, void* attribute_val_in, void* attribute_val_out, int* flag ); __attribute__((visibility("default"))) int OMPI_C_MPI_COMM_NULL_DELETE_FN( MPI_Comm comm, int comm_keyval, void* attribute_val_out, void* extra_state ); __attribute__((visibility("default"))) int OMPI_C_MPI_COMM_NULL_COPY_FN( MPI_Comm comm, int comm_keyval, void* extra_state, void* attribute_val_in, void* attribute_val_out, int* flag ); __attribute__((visibility("default"))) int OMPI_C_MPI_COMM_DUP_FN( MPI_Comm comm, int comm_keyval, void* extra_state, void* attribute_val_in, void* attribute_val_out, int* flag ); __attribute__((visibility("default"))) int OMPI_C_MPI_NULL_DELETE_FN( MPI_Comm comm, int comm_keyval, void* attribute_val_out, void* extra_state ) ; __attribute__((visibility("default"))) int OMPI_C_MPI_NULL_COPY_FN( MPI_Comm comm, int comm_keyval, void* extra_state, void* attribute_val_in, void* attribute_val_out, int* flag ) ; __attribute__((visibility("default"))) int OMPI_C_MPI_DUP_FN( MPI_Comm comm, int comm_keyval, void* extra_state, void* attribute_val_in, void* 
attribute_val_out, int* flag ) ; __attribute__((visibility("default"))) int OMPI_C_MPI_WIN_NULL_DELETE_FN( MPI_Win window, int win_keyval, void* attribute_val_out, void* extra_state ); __attribute__((visibility("default"))) int OMPI_C_MPI_WIN_NULL_COPY_FN( MPI_Win window, int win_keyval, void* extra_state, void* attribute_val_in, void* attribute_val_out, int* flag ); __attribute__((visibility("default"))) int OMPI_C_MPI_WIN_DUP_FN( MPI_Win window, int win_keyval, void* extra_state, void* attribute_val_in, void* attribute_val_out, int* flag ); # 882 "/usr/include/mpi.h" 3 4 __attribute__((visibility("default"))) extern struct ompi_predefined_communicator_t ompi_mpi_comm_world; __attribute__((visibility("default"))) extern struct ompi_predefined_communicator_t ompi_mpi_comm_self; __attribute__((visibility("default"))) extern struct ompi_predefined_communicator_t ompi_mpi_comm_null; __attribute__((visibility("default"))) extern struct ompi_predefined_group_t ompi_mpi_group_empty; __attribute__((visibility("default"))) extern struct ompi_predefined_group_t ompi_mpi_group_null; __attribute__((visibility("default"))) extern struct ompi_predefined_request_t ompi_request_null; __attribute__((visibility("default"))) extern struct ompi_predefined_message_t ompi_message_null; __attribute__((visibility("default"))) extern struct ompi_predefined_message_t ompi_message_no_proc; __attribute__((visibility("default"))) extern struct ompi_predefined_op_t ompi_mpi_op_null; __attribute__((visibility("default"))) extern struct ompi_predefined_op_t ompi_mpi_op_min; __attribute__((visibility("default"))) extern struct ompi_predefined_op_t ompi_mpi_op_max; __attribute__((visibility("default"))) extern struct ompi_predefined_op_t ompi_mpi_op_sum; __attribute__((visibility("default"))) extern struct ompi_predefined_op_t ompi_mpi_op_prod; __attribute__((visibility("default"))) extern struct ompi_predefined_op_t ompi_mpi_op_land; __attribute__((visibility("default"))) extern struct 
ompi_predefined_op_t ompi_mpi_op_band; __attribute__((visibility("default"))) extern struct ompi_predefined_op_t ompi_mpi_op_lor; __attribute__((visibility("default"))) extern struct ompi_predefined_op_t ompi_mpi_op_bor; __attribute__((visibility("default"))) extern struct ompi_predefined_op_t ompi_mpi_op_lxor; __attribute__((visibility("default"))) extern struct ompi_predefined_op_t ompi_mpi_op_bxor; __attribute__((visibility("default"))) extern struct ompi_predefined_op_t ompi_mpi_op_maxloc; __attribute__((visibility("default"))) extern struct ompi_predefined_op_t ompi_mpi_op_minloc; __attribute__((visibility("default"))) extern struct ompi_predefined_op_t ompi_mpi_op_replace; __attribute__((visibility("default"))) extern struct ompi_predefined_op_t ompi_mpi_op_no_op; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_datatype_null; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_lb ; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_ub ; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_char; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_signed_char; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_unsigned_char; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_byte; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_short; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_unsigned_short; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_int; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_unsigned; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_long; __attribute__((visibility("default"))) 
extern struct ompi_predefined_datatype_t ompi_mpi_unsigned_long; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_long_long_int; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_unsigned_long_long; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_float; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_double; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_long_double; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_wchar; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_packed; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_cxx_bool; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_cxx_cplex; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_cxx_dblcplex; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_cxx_ldblcplex; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_logical; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_character; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_integer; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_real; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_dblprec; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_cplex; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_dblcplex; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_ldblcplex; __attribute__((visibility("default"))) 
extern struct ompi_predefined_datatype_t ompi_mpi_2int; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_2integer; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_2real; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_2dblprec; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_2cplex; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_2dblcplex; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_float_int; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_double_int; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_longdbl_int; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_short_int; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_long_int; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_logical1; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_logical2; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_logical4; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_logical8; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_integer1; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_integer2; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_integer4; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_integer8; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_integer16; __attribute__((visibility("default"))) extern struct 
ompi_predefined_datatype_t ompi_mpi_real2; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_real4; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_real8; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_real16; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_complex8; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_complex16; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_complex32; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_int8_t; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_uint8_t; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_int16_t; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_uint16_t; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_int32_t; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_uint32_t; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_int64_t; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_uint64_t; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_aint; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_offset; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_count; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_c_bool; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_c_complex; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t 
ompi_mpi_c_float_complex; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_c_double_complex; __attribute__((visibility("default"))) extern struct ompi_predefined_datatype_t ompi_mpi_c_long_double_complex; __attribute__((visibility("default"))) extern struct ompi_predefined_errhandler_t ompi_mpi_errhandler_null; __attribute__((visibility("default"))) extern struct ompi_predefined_errhandler_t ompi_mpi_errors_are_fatal; __attribute__((visibility("default"))) extern struct ompi_predefined_errhandler_t ompi_mpi_errors_return; __attribute__((visibility("default"))) extern struct ompi_predefined_win_t ompi_mpi_win_null; __attribute__((visibility("default"))) extern struct ompi_predefined_file_t ompi_mpi_file_null; __attribute__((visibility("default"))) extern struct ompi_predefined_info_t ompi_mpi_info_null; __attribute__((visibility("default"))) extern struct ompi_predefined_info_t ompi_mpi_info_env; __attribute__((visibility("default"))) extern int *MPI_F_STATUS_IGNORE; __attribute__((visibility("default"))) extern int *MPI_F_STATUSES_IGNORE; # 1180 "/usr/include/mpi.h" 3 4 __attribute__((visibility("default"))) int MPI_Abort(MPI_Comm comm, int errorcode); __attribute__((visibility("default"))) int MPI_Accumulate(const void *origin_addr, int origin_count, MPI_Datatype origin_datatype, int target_rank, MPI_Aint target_disp, int target_count, MPI_Datatype target_datatype, MPI_Op op, MPI_Win win); __attribute__((visibility("default"))) int MPI_Add_error_class(int *errorclass); __attribute__((visibility("default"))) int MPI_Add_error_code(int errorclass, int *errorcode); __attribute__((visibility("default"))) int MPI_Add_error_string(int errorcode, const char *string); __attribute__((visibility("default"))) int MPI_Address(void *location, MPI_Aint *address) ; __attribute__((visibility("default"))) int MPI_Allgather(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm 
comm); __attribute__((visibility("default"))) int MPI_Iallgather(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Allgatherv(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int displs[], MPI_Datatype recvtype, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Iallgatherv(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int displs[], MPI_Datatype recvtype, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Alloc_mem(MPI_Aint size, MPI_Info info, void *baseptr); __attribute__((visibility("default"))) int MPI_Allreduce(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Iallreduce(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Alltoall(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Ialltoall(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Alltoallv(const void *sendbuf, const int sendcounts[], const int sdispls[], MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int rdispls[], MPI_Datatype recvtype, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Ialltoallv(const void *sendbuf, const int sendcounts[], const int sdispls[], MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int rdispls[], MPI_Datatype recvtype, MPI_Comm comm, MPI_Request 
*request); __attribute__((visibility("default"))) int MPI_Alltoallw(const void *sendbuf, const int sendcounts[], const int sdispls[], const MPI_Datatype sendtypes[], void *recvbuf, const int recvcounts[], const int rdispls[], const MPI_Datatype recvtypes[], MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Ialltoallw(const void *sendbuf, const int sendcounts[], const int sdispls[], const MPI_Datatype sendtypes[], void *recvbuf, const int recvcounts[], const int rdispls[], const MPI_Datatype recvtypes[], MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Attr_delete(MPI_Comm comm, int keyval) ; __attribute__((visibility("default"))) int MPI_Attr_get(MPI_Comm comm, int keyval, void *attribute_val, int *flag) ; __attribute__((visibility("default"))) int MPI_Attr_put(MPI_Comm comm, int keyval, void *attribute_val) ; __attribute__((visibility("default"))) int MPI_Barrier(MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Ibarrier(MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Bcast(void *buffer, int count, MPI_Datatype datatype, int root, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Bsend(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Ibcast(void *buffer, int count, MPI_Datatype datatype, int root, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Bsend_init(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Buffer_attach(void *buffer, int size); __attribute__((visibility("default"))) int MPI_Buffer_detach(void *buffer, int *size); __attribute__((visibility("default"))) int MPI_Cancel(MPI_Request *request); __attribute__((visibility("default"))) int MPI_Cart_coords(MPI_Comm comm, int rank, int maxdims, int coords[]); 
__attribute__((visibility("default"))) int MPI_Cart_create(MPI_Comm old_comm, int ndims, const int dims[], const int periods[], int reorder, MPI_Comm *comm_cart); __attribute__((visibility("default"))) int MPI_Cart_get(MPI_Comm comm, int maxdims, int dims[], int periods[], int coords[]); __attribute__((visibility("default"))) int MPI_Cart_map(MPI_Comm comm, int ndims, const int dims[], const int periods[], int *newrank); __attribute__((visibility("default"))) int MPI_Cart_rank(MPI_Comm comm, const int coords[], int *rank); __attribute__((visibility("default"))) int MPI_Cart_shift(MPI_Comm comm, int direction, int disp, int *rank_source, int *rank_dest); __attribute__((visibility("default"))) int MPI_Cart_sub(MPI_Comm comm, const int remain_dims[], MPI_Comm *new_comm); __attribute__((visibility("default"))) int MPI_Cartdim_get(MPI_Comm comm, int *ndims); __attribute__((visibility("default"))) int MPI_Close_port(const char *port_name); __attribute__((visibility("default"))) int MPI_Comm_accept(const char *port_name, MPI_Info info, int root, MPI_Comm comm, MPI_Comm *newcomm); __attribute__((visibility("default"))) int MPI_Comm_c2f(MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Comm_call_errhandler(MPI_Comm comm, int errorcode); __attribute__((visibility("default"))) int MPI_Comm_compare(MPI_Comm comm1, MPI_Comm comm2, int *result); __attribute__((visibility("default"))) int MPI_Comm_connect(const char *port_name, MPI_Info info, int root, MPI_Comm comm, MPI_Comm *newcomm); __attribute__((visibility("default"))) int MPI_Comm_create_errhandler(MPI_Comm_errhandler_function *function, MPI_Errhandler *errhandler); __attribute__((visibility("default"))) int MPI_Comm_create_keyval(MPI_Comm_copy_attr_function *comm_copy_attr_fn, MPI_Comm_delete_attr_function *comm_delete_attr_fn, int *comm_keyval, void *extra_state); __attribute__((visibility("default"))) int MPI_Comm_create_group(MPI_Comm comm, MPI_Group group, int tag, MPI_Comm *newcomm); 
__attribute__((visibility("default"))) int MPI_Comm_create(MPI_Comm comm, MPI_Group group, MPI_Comm *newcomm); __attribute__((visibility("default"))) int MPI_Comm_delete_attr(MPI_Comm comm, int comm_keyval); __attribute__((visibility("default"))) int MPI_Comm_disconnect(MPI_Comm *comm); __attribute__((visibility("default"))) int MPI_Comm_dup(MPI_Comm comm, MPI_Comm *newcomm); __attribute__((visibility("default"))) int MPI_Comm_idup(MPI_Comm comm, MPI_Comm *newcomm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Comm_dup_with_info(MPI_Comm comm, MPI_Info info, MPI_Comm *newcomm); __attribute__((visibility("default"))) MPI_Comm MPI_Comm_f2c(int comm); __attribute__((visibility("default"))) int MPI_Comm_free_keyval(int *comm_keyval); __attribute__((visibility("default"))) int MPI_Comm_free(MPI_Comm *comm); __attribute__((visibility("default"))) int MPI_Comm_get_attr(MPI_Comm comm, int comm_keyval, void *attribute_val, int *flag); __attribute__((visibility("default"))) int MPI_Dist_graph_create(MPI_Comm comm_old, int n, const int nodes[], const int degrees[], const int targets[], const int weights[], MPI_Info info, int reorder, MPI_Comm * newcomm); __attribute__((visibility("default"))) int MPI_Dist_graph_create_adjacent(MPI_Comm comm_old, int indegree, const int sources[], const int sourceweights[], int outdegree, const int destinations[], const int destweights[], MPI_Info info, int reorder, MPI_Comm *comm_dist_graph); __attribute__((visibility("default"))) int MPI_Dist_graph_neighbors(MPI_Comm comm, int maxindegree, int sources[], int sourceweights[], int maxoutdegree, int destinations[], int destweights[]); __attribute__((visibility("default"))) int MPI_Dist_graph_neighbors_count(MPI_Comm comm, int *inneighbors, int *outneighbors, int *weighted); __attribute__((visibility("default"))) int MPI_Comm_get_errhandler(MPI_Comm comm, MPI_Errhandler *erhandler); __attribute__((visibility("default"))) int MPI_Comm_get_info(MPI_Comm comm, MPI_Info 
*info_used); __attribute__((visibility("default"))) int MPI_Comm_get_name(MPI_Comm comm, char *comm_name, int *resultlen); __attribute__((visibility("default"))) int MPI_Comm_get_parent(MPI_Comm *parent); __attribute__((visibility("default"))) int MPI_Comm_group(MPI_Comm comm, MPI_Group *group); __attribute__((visibility("default"))) int MPI_Comm_join(int fd, MPI_Comm *intercomm); __attribute__((visibility("default"))) int MPI_Comm_rank(MPI_Comm comm, int *rank); __attribute__((visibility("default"))) int MPI_Comm_remote_group(MPI_Comm comm, MPI_Group *group); __attribute__((visibility("default"))) int MPI_Comm_remote_size(MPI_Comm comm, int *size); __attribute__((visibility("default"))) int MPI_Comm_set_attr(MPI_Comm comm, int comm_keyval, void *attribute_val); __attribute__((visibility("default"))) int MPI_Comm_set_errhandler(MPI_Comm comm, MPI_Errhandler errhandler); __attribute__((visibility("default"))) int MPI_Comm_set_info(MPI_Comm comm, MPI_Info info); __attribute__((visibility("default"))) int MPI_Comm_set_name(MPI_Comm comm, const char *comm_name); __attribute__((visibility("default"))) int MPI_Comm_size(MPI_Comm comm, int *size); __attribute__((visibility("default"))) int MPI_Comm_spawn(const char *command, char *argv[], int maxprocs, MPI_Info info, int root, MPI_Comm comm, MPI_Comm *intercomm, int array_of_errcodes[]); __attribute__((visibility("default"))) int MPI_Comm_spawn_multiple(int count, char *array_of_commands[], char **array_of_argv[], const int array_of_maxprocs[], const MPI_Info array_of_info[], int root, MPI_Comm comm, MPI_Comm *intercomm, int array_of_errcodes[]); __attribute__((visibility("default"))) int MPI_Comm_split(MPI_Comm comm, int color, int key, MPI_Comm *newcomm); __attribute__((visibility("default"))) int MPI_Comm_split_type(MPI_Comm comm, int split_type, int key, MPI_Info info, MPI_Comm *newcomm); __attribute__((visibility("default"))) int MPI_Comm_test_inter(MPI_Comm comm, int *flag); __attribute__((visibility("default"))) 
int MPI_Compare_and_swap(const void *origin_addr, const void *compare_addr, void *result_addr, MPI_Datatype datatype, int target_rank, MPI_Aint target_disp, MPI_Win win); __attribute__((visibility("default"))) int MPI_Dims_create(int nnodes, int ndims, int dims[]); __attribute__((visibility("default"))) int MPI_Errhandler_c2f(MPI_Errhandler errhandler); __attribute__((visibility("default"))) int MPI_Errhandler_create(MPI_Handler_function *function, MPI_Errhandler *errhandler) ; __attribute__((visibility("default"))) MPI_Errhandler MPI_Errhandler_f2c(int errhandler); __attribute__((visibility("default"))) int MPI_Errhandler_free(MPI_Errhandler *errhandler); __attribute__((visibility("default"))) int MPI_Errhandler_get(MPI_Comm comm, MPI_Errhandler *errhandler) ; __attribute__((visibility("default"))) int MPI_Errhandler_set(MPI_Comm comm, MPI_Errhandler errhandler) ; __attribute__((visibility("default"))) int MPI_Error_class(int errorcode, int *errorclass); __attribute__((visibility("default"))) int MPI_Error_string(int errorcode, char *string, int *resultlen); __attribute__((visibility("default"))) int MPI_Exscan(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Fetch_and_op(const void *origin_addr, void *result_addr, MPI_Datatype datatype, int target_rank, MPI_Aint target_disp, MPI_Op op, MPI_Win win); __attribute__((visibility("default"))) int MPI_Iexscan(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_File_c2f(MPI_File file); __attribute__((visibility("default"))) MPI_File MPI_File_f2c(int file); __attribute__((visibility("default"))) int MPI_File_call_errhandler(MPI_File fh, int errorcode); __attribute__((visibility("default"))) int MPI_File_create_errhandler(MPI_File_errhandler_function *function, MPI_Errhandler *errhandler); 
__attribute__((visibility("default"))) int MPI_File_set_errhandler(MPI_File file, MPI_Errhandler errhandler); __attribute__((visibility("default"))) int MPI_File_get_errhandler(MPI_File file, MPI_Errhandler *errhandler); __attribute__((visibility("default"))) int MPI_File_open(MPI_Comm comm, const char *filename, int amode, MPI_Info info, MPI_File *fh); __attribute__((visibility("default"))) int MPI_File_close(MPI_File *fh); __attribute__((visibility("default"))) int MPI_File_delete(const char *filename, MPI_Info info); __attribute__((visibility("default"))) int MPI_File_set_size(MPI_File fh, MPI_Offset size); __attribute__((visibility("default"))) int MPI_File_preallocate(MPI_File fh, MPI_Offset size); __attribute__((visibility("default"))) int MPI_File_get_size(MPI_File fh, MPI_Offset *size); __attribute__((visibility("default"))) int MPI_File_get_group(MPI_File fh, MPI_Group *group); __attribute__((visibility("default"))) int MPI_File_get_amode(MPI_File fh, int *amode); __attribute__((visibility("default"))) int MPI_File_set_info(MPI_File fh, MPI_Info info); __attribute__((visibility("default"))) int MPI_File_get_info(MPI_File fh, MPI_Info *info_used); __attribute__((visibility("default"))) int MPI_File_set_view(MPI_File fh, MPI_Offset disp, MPI_Datatype etype, MPI_Datatype filetype, const char *datarep, MPI_Info info); __attribute__((visibility("default"))) int MPI_File_get_view(MPI_File fh, MPI_Offset *disp, MPI_Datatype *etype, MPI_Datatype *filetype, char *datarep); __attribute__((visibility("default"))) int MPI_File_read_at(MPI_File fh, MPI_Offset offset, void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int MPI_File_read_at_all(MPI_File fh, MPI_Offset offset, void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int MPI_File_write_at(MPI_File fh, MPI_Offset offset, const void *buf, int count, MPI_Datatype datatype, MPI_Status *status); 
__attribute__((visibility("default"))) int MPI_File_write_at_all(MPI_File fh, MPI_Offset offset, const void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int MPI_File_iread_at(MPI_File fh, MPI_Offset offset, void *buf, int count, MPI_Datatype datatype, MPI_Request *request); __attribute__((visibility("default"))) int MPI_File_iwrite_at(MPI_File fh, MPI_Offset offset, const void *buf, int count, MPI_Datatype datatype, MPI_Request *request); __attribute__((visibility("default"))) int MPI_File_read(MPI_File fh, void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int MPI_File_read_all(MPI_File fh, void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int MPI_File_write(MPI_File fh, const void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int MPI_File_write_all(MPI_File fh, const void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int MPI_File_iread(MPI_File fh, void *buf, int count, MPI_Datatype datatype, MPI_Request *request); __attribute__((visibility("default"))) int MPI_File_iwrite(MPI_File fh, const void *buf, int count, MPI_Datatype datatype, MPI_Request *request); __attribute__((visibility("default"))) int MPI_File_seek(MPI_File fh, MPI_Offset offset, int whence); __attribute__((visibility("default"))) int MPI_File_get_position(MPI_File fh, MPI_Offset *offset); __attribute__((visibility("default"))) int MPI_File_get_byte_offset(MPI_File fh, MPI_Offset offset, MPI_Offset *disp); __attribute__((visibility("default"))) int MPI_File_read_shared(MPI_File fh, void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int MPI_File_write_shared(MPI_File fh, const void *buf, int count, MPI_Datatype datatype, MPI_Status *status); 
__attribute__((visibility("default"))) int MPI_File_iread_shared(MPI_File fh, void *buf, int count, MPI_Datatype datatype, MPI_Request *request); __attribute__((visibility("default"))) int MPI_File_iwrite_shared(MPI_File fh, const void *buf, int count, MPI_Datatype datatype, MPI_Request *request); __attribute__((visibility("default"))) int MPI_File_read_ordered(MPI_File fh, void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int MPI_File_write_ordered(MPI_File fh, const void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int MPI_File_seek_shared(MPI_File fh, MPI_Offset offset, int whence); __attribute__((visibility("default"))) int MPI_File_get_position_shared(MPI_File fh, MPI_Offset *offset); __attribute__((visibility("default"))) int MPI_File_read_at_all_begin(MPI_File fh, MPI_Offset offset, void *buf, int count, MPI_Datatype datatype); __attribute__((visibility("default"))) int MPI_File_read_at_all_end(MPI_File fh, void *buf, MPI_Status *status); __attribute__((visibility("default"))) int MPI_File_write_at_all_begin(MPI_File fh, MPI_Offset offset, const void *buf, int count, MPI_Datatype datatype); __attribute__((visibility("default"))) int MPI_File_write_at_all_end(MPI_File fh, const void *buf, MPI_Status *status); __attribute__((visibility("default"))) int MPI_File_read_all_begin(MPI_File fh, void *buf, int count, MPI_Datatype datatype); __attribute__((visibility("default"))) int MPI_File_read_all_end(MPI_File fh, void *buf, MPI_Status *status); __attribute__((visibility("default"))) int MPI_File_write_all_begin(MPI_File fh, const void *buf, int count, MPI_Datatype datatype); __attribute__((visibility("default"))) int MPI_File_write_all_end(MPI_File fh, const void *buf, MPI_Status *status); __attribute__((visibility("default"))) int MPI_File_read_ordered_begin(MPI_File fh, void *buf, int count, MPI_Datatype datatype); __attribute__((visibility("default"))) 
int MPI_File_read_ordered_end(MPI_File fh, void *buf, MPI_Status *status); __attribute__((visibility("default"))) int MPI_File_write_ordered_begin(MPI_File fh, const void *buf, int count, MPI_Datatype datatype); __attribute__((visibility("default"))) int MPI_File_write_ordered_end(MPI_File fh, const void *buf, MPI_Status *status); __attribute__((visibility("default"))) int MPI_File_get_type_extent(MPI_File fh, MPI_Datatype datatype, MPI_Aint *extent); __attribute__((visibility("default"))) int MPI_File_set_atomicity(MPI_File fh, int flag); __attribute__((visibility("default"))) int MPI_File_get_atomicity(MPI_File fh, int *flag); __attribute__((visibility("default"))) int MPI_File_sync(MPI_File fh); __attribute__((visibility("default"))) int MPI_Finalize(void); __attribute__((visibility("default"))) int MPI_Finalized(int *flag); __attribute__((visibility("default"))) int MPI_Free_mem(void *base); __attribute__((visibility("default"))) int MPI_Gather(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, int root, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Igather(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, int root, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Gatherv(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int displs[], MPI_Datatype recvtype, int root, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Igatherv(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int displs[], MPI_Datatype recvtype, int root, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Get_address(const void *location, MPI_Aint *address); __attribute__((visibility("default"))) int MPI_Get_count(const MPI_Status *status, MPI_Datatype datatype, int 
*count); __attribute__((visibility("default"))) int MPI_Get_elements(const MPI_Status *status, MPI_Datatype datatype, int *count); __attribute__((visibility("default"))) int MPI_Get_elements_x(const MPI_Status *status, MPI_Datatype datatype, MPI_Count *count); __attribute__((visibility("default"))) int MPI_Get(void *origin_addr, int origin_count, MPI_Datatype origin_datatype, int target_rank, MPI_Aint target_disp, int target_count, MPI_Datatype target_datatype, MPI_Win win); __attribute__((visibility("default"))) int MPI_Get_accumulate(const void *origin_addr, int origin_count, MPI_Datatype origin_datatype, void *result_addr, int result_count, MPI_Datatype result_datatype, int target_rank, MPI_Aint target_disp, int target_count, MPI_Datatype target_datatype, MPI_Op op, MPI_Win win); __attribute__((visibility("default"))) int MPI_Get_library_version(char *version, int *resultlen); __attribute__((visibility("default"))) int MPI_Get_processor_name(char *name, int *resultlen); __attribute__((visibility("default"))) int MPI_Get_version(int *version, int *subversion); __attribute__((visibility("default"))) int MPI_Graph_create(MPI_Comm comm_old, int nnodes, const int index[], const int edges[], int reorder, MPI_Comm *comm_graph); __attribute__((visibility("default"))) int MPI_Graph_get(MPI_Comm comm, int maxindex, int maxedges, int index[], int edges[]); __attribute__((visibility("default"))) int MPI_Graph_map(MPI_Comm comm, int nnodes, const int index[], const int edges[], int *newrank); __attribute__((visibility("default"))) int MPI_Graph_neighbors_count(MPI_Comm comm, int rank, int *nneighbors); __attribute__((visibility("default"))) int MPI_Graph_neighbors(MPI_Comm comm, int rank, int maxneighbors, int neighbors[]); __attribute__((visibility("default"))) int MPI_Graphdims_get(MPI_Comm comm, int *nnodes, int *nedges); __attribute__((visibility("default"))) int MPI_Grequest_complete(MPI_Request request); __attribute__((visibility("default"))) int 
MPI_Grequest_start(MPI_Grequest_query_function *query_fn, MPI_Grequest_free_function *free_fn, MPI_Grequest_cancel_function *cancel_fn, void *extra_state, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Group_c2f(MPI_Group group); __attribute__((visibility("default"))) int MPI_Group_compare(MPI_Group group1, MPI_Group group2, int *result); __attribute__((visibility("default"))) int MPI_Group_difference(MPI_Group group1, MPI_Group group2, MPI_Group *newgroup); __attribute__((visibility("default"))) int MPI_Group_excl(MPI_Group group, int n, const int ranks[], MPI_Group *newgroup); __attribute__((visibility("default"))) MPI_Group MPI_Group_f2c(int group); __attribute__((visibility("default"))) int MPI_Group_free(MPI_Group *group); __attribute__((visibility("default"))) int MPI_Group_incl(MPI_Group group, int n, const int ranks[], MPI_Group *newgroup); __attribute__((visibility("default"))) int MPI_Group_intersection(MPI_Group group1, MPI_Group group2, MPI_Group *newgroup); __attribute__((visibility("default"))) int MPI_Group_range_excl(MPI_Group group, int n, int ranges[][3], MPI_Group *newgroup); __attribute__((visibility("default"))) int MPI_Group_range_incl(MPI_Group group, int n, int ranges[][3], MPI_Group *newgroup); __attribute__((visibility("default"))) int MPI_Group_rank(MPI_Group group, int *rank); __attribute__((visibility("default"))) int MPI_Group_size(MPI_Group group, int *size); __attribute__((visibility("default"))) int MPI_Group_translate_ranks(MPI_Group group1, int n, const int ranks1[], MPI_Group group2, int ranks2[]); __attribute__((visibility("default"))) int MPI_Group_union(MPI_Group group1, MPI_Group group2, MPI_Group *newgroup); __attribute__((visibility("default"))) int MPI_Ibsend(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Improbe(int source, int tag, MPI_Comm comm, int *flag, MPI_Message *message, MPI_Status 
*status); __attribute__((visibility("default"))) int MPI_Imrecv(void *buf, int count, MPI_Datatype type, MPI_Message *message, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Info_c2f(MPI_Info info); __attribute__((visibility("default"))) int MPI_Info_create(MPI_Info *info); __attribute__((visibility("default"))) int MPI_Info_delete(MPI_Info info, const char *key); __attribute__((visibility("default"))) int MPI_Info_dup(MPI_Info info, MPI_Info *newinfo); __attribute__((visibility("default"))) MPI_Info MPI_Info_f2c(int info); __attribute__((visibility("default"))) int MPI_Info_free(MPI_Info *info); __attribute__((visibility("default"))) int MPI_Info_get(MPI_Info info, const char *key, int valuelen, char *value, int *flag); __attribute__((visibility("default"))) int MPI_Info_get_nkeys(MPI_Info info, int *nkeys); __attribute__((visibility("default"))) int MPI_Info_get_nthkey(MPI_Info info, int n, char *key); __attribute__((visibility("default"))) int MPI_Info_get_valuelen(MPI_Info info, const char *key, int *valuelen, int *flag); __attribute__((visibility("default"))) int MPI_Info_set(MPI_Info info, const char *key, const char *value); __attribute__((visibility("default"))) int MPI_Init(int *argc, char ***argv); __attribute__((visibility("default"))) int MPI_Initialized(int *flag); __attribute__((visibility("default"))) int MPI_Init_thread(int *argc, char ***argv, int required, int *provided); __attribute__((visibility("default"))) int MPI_Intercomm_create(MPI_Comm local_comm, int local_leader, MPI_Comm bridge_comm, int remote_leader, int tag, MPI_Comm *newintercomm); __attribute__((visibility("default"))) int MPI_Intercomm_merge(MPI_Comm intercomm, int high, MPI_Comm *newintercomm); __attribute__((visibility("default"))) int MPI_Iprobe(int source, int tag, MPI_Comm comm, int *flag, MPI_Status *status); __attribute__((visibility("default"))) int MPI_Irecv(void *buf, int count, MPI_Datatype datatype, int source, int tag, MPI_Comm comm, MPI_Request 
*request); __attribute__((visibility("default"))) int MPI_Irsend(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Isend(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Issend(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Is_thread_main(int *flag); __attribute__((visibility("default"))) int MPI_Keyval_create(MPI_Copy_function *copy_fn, MPI_Delete_function *delete_fn, int *keyval, void *extra_state) ; __attribute__((visibility("default"))) int MPI_Keyval_free(int *keyval) ; __attribute__((visibility("default"))) int MPI_Lookup_name(const char *service_name, MPI_Info info, char *port_name); __attribute__((visibility("default"))) int MPI_Message_c2f(MPI_Message message); __attribute__((visibility("default"))) MPI_Message MPI_Message_f2c(int message); __attribute__((visibility("default"))) int MPI_Mprobe(int source, int tag, MPI_Comm comm, MPI_Message *message, MPI_Status *status); __attribute__((visibility("default"))) int MPI_Mrecv(void *buf, int count, MPI_Datatype type, MPI_Message *message, MPI_Status *status); __attribute__((visibility("default"))) int MPI_Neighbor_allgather(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Ineighbor_allgather(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Neighbor_allgatherv(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int displs[], MPI_Datatype recvtype, MPI_Comm comm); 
__attribute__((visibility("default"))) int MPI_Ineighbor_allgatherv(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int displs[], MPI_Datatype recvtype, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Neighbor_alltoall(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Ineighbor_alltoall(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Neighbor_alltoallv(const void *sendbuf, const int sendcounts[], const int sdispls[], MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int rdispls[], MPI_Datatype recvtype, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Ineighbor_alltoallv(const void *sendbuf, const int sendcounts[], const int sdispls[], MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int rdispls[], MPI_Datatype recvtype, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Neighbor_alltoallw(const void *sendbuf, const int sendcounts[], const MPI_Aint sdispls[], const MPI_Datatype sendtypes[], void *recvbuf, const int recvcounts[], const MPI_Aint rdispls[], const MPI_Datatype recvtypes[], MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Ineighbor_alltoallw(const void *sendbuf, const int sendcounts[], const MPI_Aint sdispls[], const MPI_Datatype sendtypes[], void *recvbuf, const int recvcounts[], const MPI_Aint rdispls[], const MPI_Datatype recvtypes[], MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Op_c2f(MPI_Op op); __attribute__((visibility("default"))) int MPI_Op_commutative(MPI_Op op, int *commute); __attribute__((visibility("default"))) int 
MPI_Op_create(MPI_User_function *function, int commute, MPI_Op *op); __attribute__((visibility("default"))) int MPI_Open_port(MPI_Info info, char *port_name); __attribute__((visibility("default"))) MPI_Op MPI_Op_f2c(int op); __attribute__((visibility("default"))) int MPI_Op_free(MPI_Op *op); __attribute__((visibility("default"))) int MPI_Pack_external(const char datarep[], const void *inbuf, int incount, MPI_Datatype datatype, void *outbuf, MPI_Aint outsize, MPI_Aint *position); __attribute__((visibility("default"))) int MPI_Pack_external_size(const char datarep[], int incount, MPI_Datatype datatype, MPI_Aint *size); __attribute__((visibility("default"))) int MPI_Pack(const void *inbuf, int incount, MPI_Datatype datatype, void *outbuf, int outsize, int *position, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Pack_size(int incount, MPI_Datatype datatype, MPI_Comm comm, int *size); __attribute__((visibility("default"))) int MPI_Pcontrol(const int level, ...); __attribute__((visibility("default"))) int MPI_Probe(int source, int tag, MPI_Comm comm, MPI_Status *status); __attribute__((visibility("default"))) int MPI_Publish_name(const char *service_name, MPI_Info info, const char *port_name); __attribute__((visibility("default"))) int MPI_Put(const void *origin_addr, int origin_count, MPI_Datatype origin_datatype, int target_rank, MPI_Aint target_disp, int target_count, MPI_Datatype target_datatype, MPI_Win win); __attribute__((visibility("default"))) int MPI_Query_thread(int *provided); __attribute__((visibility("default"))) int MPI_Raccumulate(const void *origin_addr, int origin_count, MPI_Datatype origin_datatype, int target_rank, MPI_Aint target_disp, int target_count, MPI_Datatype target_datatype, MPI_Op op, MPI_Win win, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Recv_init(void *buf, int count, MPI_Datatype datatype, int source, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int 
MPI_Recv(void *buf, int count, MPI_Datatype datatype, int source, int tag, MPI_Comm comm, MPI_Status *status); __attribute__((visibility("default"))) int MPI_Reduce(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, int root, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Ireduce(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, int root, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Reduce_local(const void *inbuf, void *inoutbuf, int count, MPI_Datatype datatype, MPI_Op op); __attribute__((visibility("default"))) int MPI_Reduce_scatter(const void *sendbuf, void *recvbuf, const int recvcounts[], MPI_Datatype datatype, MPI_Op op, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Ireduce_scatter(const void *sendbuf, void *recvbuf, const int recvcounts[], MPI_Datatype datatype, MPI_Op op, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Reduce_scatter_block(const void *sendbuf, void *recvbuf, int recvcount, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Ireduce_scatter_block(const void *sendbuf, void *recvbuf, int recvcount, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Register_datarep(const char *datarep, MPI_Datarep_conversion_function *read_conversion_fn, MPI_Datarep_conversion_function *write_conversion_fn, MPI_Datarep_extent_function *dtype_file_extent_fn, void *extra_state); __attribute__((visibility("default"))) int MPI_Request_c2f(MPI_Request request); __attribute__((visibility("default"))) MPI_Request MPI_Request_f2c(int request); __attribute__((visibility("default"))) int MPI_Request_free(MPI_Request *request); __attribute__((visibility("default"))) int MPI_Request_get_status(MPI_Request request, int *flag, MPI_Status *status); __attribute__((visibility("default"))) int 
MPI_Rget(void *origin_addr, int origin_count, MPI_Datatype origin_datatype, int target_rank, MPI_Aint target_disp, int target_count, MPI_Datatype target_datatype, MPI_Win win, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Rget_accumulate(const void *origin_addr, int origin_count, MPI_Datatype origin_datatype, void *result_addr, int result_count, MPI_Datatype result_datatype, int target_rank, MPI_Aint target_disp, int target_count, MPI_Datatype target_datatype, MPI_Op op, MPI_Win win, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Rput(const void *origin_addr, int origin_count, MPI_Datatype origin_datatype, int target_rank, MPI_Aint target_disp, int target_count, MPI_Datatype target_datatype, MPI_Win win, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Rsend(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Rsend_init(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Scan(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Iscan(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Scatter(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, int root, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Iscatter(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, int root, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Scatterv(const void *sendbuf, const int sendcounts[], const int displs[], MPI_Datatype sendtype, void *recvbuf, int recvcount, 
MPI_Datatype recvtype, int root, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Iscatterv(const void *sendbuf, const int sendcounts[], const int displs[], MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, int root, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Send_init(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Send(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Sendrecv(const void *sendbuf, int sendcount, MPI_Datatype sendtype, int dest, int sendtag, void *recvbuf, int recvcount, MPI_Datatype recvtype, int source, int recvtag, MPI_Comm comm, MPI_Status *status); __attribute__((visibility("default"))) int MPI_Sendrecv_replace(void * buf, int count, MPI_Datatype datatype, int dest, int sendtag, int source, int recvtag, MPI_Comm comm, MPI_Status *status); __attribute__((visibility("default"))) int MPI_Ssend_init(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int MPI_Ssend(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Start(MPI_Request *request); __attribute__((visibility("default"))) int MPI_Startall(int count, MPI_Request array_of_requests[]); __attribute__((visibility("default"))) int MPI_Status_c2f(const MPI_Status *c_status, int *f_status); __attribute__((visibility("default"))) int MPI_Status_f2c(const int *f_status, MPI_Status *c_status); __attribute__((visibility("default"))) int MPI_Status_set_cancelled(MPI_Status *status, int flag); __attribute__((visibility("default"))) int MPI_Status_set_elements(MPI_Status *status, MPI_Datatype datatype, int count); __attribute__((visibility("default"))) int 
MPI_Status_set_elements_x(MPI_Status *status, MPI_Datatype datatype, MPI_Count count); __attribute__((visibility("default"))) int MPI_Testall(int count, MPI_Request array_of_requests[], int *flag, MPI_Status array_of_statuses[]); __attribute__((visibility("default"))) int MPI_Testany(int count, MPI_Request array_of_requests[], int *index, int *flag, MPI_Status *status); __attribute__((visibility("default"))) int MPI_Test(MPI_Request *request, int *flag, MPI_Status *status); __attribute__((visibility("default"))) int MPI_Test_cancelled(const MPI_Status *status, int *flag); __attribute__((visibility("default"))) int MPI_Testsome(int incount, MPI_Request array_of_requests[], int *outcount, int array_of_indices[], MPI_Status array_of_statuses[]); __attribute__((visibility("default"))) int MPI_Topo_test(MPI_Comm comm, int *status); __attribute__((visibility("default"))) int MPI_Type_c2f(MPI_Datatype datatype); __attribute__((visibility("default"))) int MPI_Type_commit(MPI_Datatype *type); __attribute__((visibility("default"))) int MPI_Type_contiguous(int count, MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_create_darray(int size, int rank, int ndims, const int gsize_array[], const int distrib_array[], const int darg_array[], const int psize_array[], int order, MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_create_f90_complex(int p, int r, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_create_f90_integer(int r, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_create_f90_real(int p, int r, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_create_hindexed_block(int count, int blocklength, const MPI_Aint array_of_displacements[], MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_create_hindexed(int count, const int array_of_blocklengths[], const 
MPI_Aint array_of_displacements[], MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_create_hvector(int count, int blocklength, MPI_Aint stride, MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_create_keyval(MPI_Type_copy_attr_function *type_copy_attr_fn, MPI_Type_delete_attr_function *type_delete_attr_fn, int *type_keyval, void *extra_state); __attribute__((visibility("default"))) int MPI_Type_create_indexed_block(int count, int blocklength, const int array_of_displacements[], MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_create_struct(int count, const int array_of_block_lengths[], const MPI_Aint array_of_displacements[], const MPI_Datatype array_of_types[], MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_create_subarray(int ndims, const int size_array[], const int subsize_array[], const int start_array[], int order, MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_create_resized(MPI_Datatype oldtype, MPI_Aint lb, MPI_Aint extent, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_delete_attr(MPI_Datatype type, int type_keyval); __attribute__((visibility("default"))) int MPI_Type_dup(MPI_Datatype type, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_extent(MPI_Datatype type, MPI_Aint *extent) ; __attribute__((visibility("default"))) int MPI_Type_free(MPI_Datatype *type); __attribute__((visibility("default"))) int MPI_Type_free_keyval(int *type_keyval); __attribute__((visibility("default"))) MPI_Datatype MPI_Type_f2c(int datatype); __attribute__((visibility("default"))) int MPI_Type_get_attr(MPI_Datatype type, int type_keyval, void *attribute_val, int *flag); __attribute__((visibility("default"))) int MPI_Type_get_contents(MPI_Datatype mtype, int max_integers, int max_addresses, int 
max_datatypes, int array_of_integers[], MPI_Aint array_of_addresses[], MPI_Datatype array_of_datatypes[]); __attribute__((visibility("default"))) int MPI_Type_get_envelope(MPI_Datatype type, int *num_integers, int *num_addresses, int *num_datatypes, int *combiner); __attribute__((visibility("default"))) int MPI_Type_get_extent(MPI_Datatype type, MPI_Aint *lb, MPI_Aint *extent); __attribute__((visibility("default"))) int MPI_Type_get_extent_x(MPI_Datatype type, MPI_Count *lb, MPI_Count *extent); __attribute__((visibility("default"))) int MPI_Type_get_name(MPI_Datatype type, char *type_name, int *resultlen); __attribute__((visibility("default"))) int MPI_Type_get_true_extent(MPI_Datatype datatype, MPI_Aint *true_lb, MPI_Aint *true_extent); __attribute__((visibility("default"))) int MPI_Type_get_true_extent_x(MPI_Datatype datatype, MPI_Count *true_lb, MPI_Count *true_extent); __attribute__((visibility("default"))) int MPI_Type_hindexed(int count, int array_of_blocklengths[], MPI_Aint array_of_displacements[], MPI_Datatype oldtype, MPI_Datatype *newtype) ; __attribute__((visibility("default"))) int MPI_Type_hvector(int count, int blocklength, MPI_Aint stride, MPI_Datatype oldtype, MPI_Datatype *newtype) ; __attribute__((visibility("default"))) int MPI_Type_indexed(int count, const int array_of_blocklengths[], const int array_of_displacements[], MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Type_lb(MPI_Datatype type, MPI_Aint *lb) ; __attribute__((visibility("default"))) int MPI_Type_match_size(int typeclass, int size, MPI_Datatype *type); __attribute__((visibility("default"))) int MPI_Type_set_attr(MPI_Datatype type, int type_keyval, void *attr_val); __attribute__((visibility("default"))) int MPI_Type_set_name(MPI_Datatype type, const char *type_name); __attribute__((visibility("default"))) int MPI_Type_size(MPI_Datatype type, int *size); __attribute__((visibility("default"))) int MPI_Type_size_x(MPI_Datatype type, 
MPI_Count *size); __attribute__((visibility("default"))) int MPI_Type_struct(int count, int array_of_blocklengths[], MPI_Aint array_of_displacements[], MPI_Datatype array_of_types[], MPI_Datatype *newtype) ; __attribute__((visibility("default"))) int MPI_Type_ub(MPI_Datatype mtype, MPI_Aint *ub) ; __attribute__((visibility("default"))) int MPI_Type_vector(int count, int blocklength, int stride, MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int MPI_Unpack(const void *inbuf, int insize, int *position, void *outbuf, int outcount, MPI_Datatype datatype, MPI_Comm comm); __attribute__((visibility("default"))) int MPI_Unpublish_name(const char *service_name, MPI_Info info, const char *port_name); __attribute__((visibility("default"))) int MPI_Unpack_external (const char datarep[], const void *inbuf, MPI_Aint insize, MPI_Aint *position, void *outbuf, int outcount, MPI_Datatype datatype); __attribute__((visibility("default"))) int MPI_Waitall(int count, MPI_Request array_of_requests[], MPI_Status *array_of_statuses); __attribute__((visibility("default"))) int MPI_Waitany(int count, MPI_Request array_of_requests[], int *index, MPI_Status *status); __attribute__((visibility("default"))) int MPI_Wait(MPI_Request *request, MPI_Status *status); __attribute__((visibility("default"))) int MPI_Waitsome(int incount, MPI_Request array_of_requests[], int *outcount, int array_of_indices[], MPI_Status array_of_statuses[]); __attribute__((visibility("default"))) int MPI_Win_allocate(MPI_Aint size, int disp_unit, MPI_Info info, MPI_Comm comm, void *baseptr, MPI_Win *win); __attribute__((visibility("default"))) int MPI_Win_allocate_shared(MPI_Aint size, int disp_unit, MPI_Info info, MPI_Comm comm, void *baseptr, MPI_Win *win); __attribute__((visibility("default"))) int MPI_Win_attach(MPI_Win win, void *base, MPI_Aint size); __attribute__((visibility("default"))) int MPI_Win_c2f(MPI_Win win); __attribute__((visibility("default"))) int 
MPI_Win_call_errhandler(MPI_Win win, int errorcode); __attribute__((visibility("default"))) int MPI_Win_complete(MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_create(void *base, MPI_Aint size, int disp_unit, MPI_Info info, MPI_Comm comm, MPI_Win *win); __attribute__((visibility("default"))) int MPI_Win_create_dynamic(MPI_Info info, MPI_Comm comm, MPI_Win *win); __attribute__((visibility("default"))) int MPI_Win_create_errhandler(MPI_Win_errhandler_function *function, MPI_Errhandler *errhandler); __attribute__((visibility("default"))) int MPI_Win_create_keyval(MPI_Win_copy_attr_function *win_copy_attr_fn, MPI_Win_delete_attr_function *win_delete_attr_fn, int *win_keyval, void *extra_state); __attribute__((visibility("default"))) int MPI_Win_delete_attr(MPI_Win win, int win_keyval); __attribute__((visibility("default"))) int MPI_Win_detach(MPI_Win win, const void *base); __attribute__((visibility("default"))) MPI_Win MPI_Win_f2c(int win); __attribute__((visibility("default"))) int MPI_Win_fence(int assert, MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_flush(int rank, MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_flush_all(MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_flush_local(int rank, MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_flush_local_all(MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_free(MPI_Win *win); __attribute__((visibility("default"))) int MPI_Win_free_keyval(int *win_keyval); __attribute__((visibility("default"))) int MPI_Win_get_attr(MPI_Win win, int win_keyval, void *attribute_val, int *flag); __attribute__((visibility("default"))) int MPI_Win_get_errhandler(MPI_Win win, MPI_Errhandler *errhandler); __attribute__((visibility("default"))) int MPI_Win_get_group(MPI_Win win, MPI_Group *group); __attribute__((visibility("default"))) int MPI_Win_get_info(MPI_Win win, MPI_Info *info_used); __attribute__((visibility("default"))) int 
MPI_Win_get_name(MPI_Win win, char *win_name, int *resultlen); __attribute__((visibility("default"))) int MPI_Win_lock(int lock_type, int rank, int assert, MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_lock_all(int assert, MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_post(MPI_Group group, int assert, MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_set_attr(MPI_Win win, int win_keyval, void *attribute_val); __attribute__((visibility("default"))) int MPI_Win_set_errhandler(MPI_Win win, MPI_Errhandler errhandler); __attribute__((visibility("default"))) int MPI_Win_set_info(MPI_Win win, MPI_Info info); __attribute__((visibility("default"))) int MPI_Win_set_name(MPI_Win win, const char *win_name); __attribute__((visibility("default"))) int MPI_Win_shared_query(MPI_Win win, int rank, MPI_Aint *size, int *disp_unit, void *baseptr); __attribute__((visibility("default"))) int MPI_Win_start(MPI_Group group, int assert, MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_sync(MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_test(MPI_Win win, int *flag); __attribute__((visibility("default"))) int MPI_Win_unlock(int rank, MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_unlock_all(MPI_Win win); __attribute__((visibility("default"))) int MPI_Win_wait(MPI_Win win); __attribute__((visibility("default"))) double MPI_Wtick(void); __attribute__((visibility("default"))) double MPI_Wtime(void); __attribute__((visibility("default"))) int PMPI_Abort(MPI_Comm comm, int errorcode); __attribute__((visibility("default"))) int PMPI_Accumulate(const void *origin_addr, int origin_count, MPI_Datatype origin_datatype, int target_rank, MPI_Aint target_disp, int target_count, MPI_Datatype target_datatype, MPI_Op op, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Add_error_class(int *errorclass); __attribute__((visibility("default"))) int PMPI_Add_error_code(int errorclass, int 
*errorcode); __attribute__((visibility("default"))) int PMPI_Add_error_string(int errorcode, const char *string); __attribute__((visibility("default"))) int PMPI_Address(void *location, MPI_Aint *address) ; __attribute__((visibility("default"))) int PMPI_Allgather(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Iallgather(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Allgatherv(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int displs[], MPI_Datatype recvtype, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Iallgatherv(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int displs[], MPI_Datatype recvtype, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Alloc_mem(MPI_Aint size, MPI_Info info, void *baseptr); __attribute__((visibility("default"))) int PMPI_Allreduce(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Iallreduce(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Alltoall(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ialltoall(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Alltoallv(const void *sendbuf, const int sendcounts[], const int 
sdispls[], MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int rdispls[], MPI_Datatype recvtype, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ialltoallv(const void *sendbuf, const int sendcounts[], const int sdispls[], MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int rdispls[], MPI_Datatype recvtype, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Alltoallw(const void *sendbuf, const int sendcounts[], const int sdispls[], const MPI_Datatype sendtypes[], void *recvbuf, const int recvcounts[], const int rdispls[], const MPI_Datatype recvtypes[], MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ialltoallw(const void *sendbuf, const int sendcounts[], const int sdispls[], const MPI_Datatype sendtypes[], void *recvbuf, const int recvcounts[], const int rdispls[], const MPI_Datatype recvtypes[], MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Attr_delete(MPI_Comm comm, int keyval) ; __attribute__((visibility("default"))) int PMPI_Attr_get(MPI_Comm comm, int keyval, void *attribute_val, int *flag) ; __attribute__((visibility("default"))) int PMPI_Dist_graph_create(MPI_Comm comm_old, int n, const int nodes[], const int degrees[], const int targets[], const int weights[], MPI_Info info, int reorder, MPI_Comm * newcomm); __attribute__((visibility("default"))) int PMPI_Dist_graph_create_adjacent(MPI_Comm comm_old, int indegree, const int sources[], const int sourceweights[], int outdegree, const int destinations[], const int destweights[], MPI_Info info, int reorder, MPI_Comm *comm_dist_graph); __attribute__((visibility("default"))) int PMPI_Dist_graph_neighbors(MPI_Comm comm, int maxindegree, int sources[], int sourceweights[], int maxoutdegree, int destinations[], int destweights[]); __attribute__((visibility("default"))) int PMPI_Dist_graph_neighbors_count(MPI_Comm comm, int *inneighbors, int *outneighbors, int *weighted); 
__attribute__((visibility("default"))) int PMPI_Attr_put(MPI_Comm comm, int keyval, void *attribute_val) ; __attribute__((visibility("default"))) int PMPI_Barrier(MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ibarrier(MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Bcast(void *buffer, int count, MPI_Datatype datatype, int root, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ibcast(void *buffer, int count, MPI_Datatype datatype, int root, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Bsend(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Bsend_init(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Buffer_attach(void *buffer, int size); __attribute__((visibility("default"))) int PMPI_Buffer_detach(void *buffer, int *size); __attribute__((visibility("default"))) int PMPI_Cancel(MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Cart_coords(MPI_Comm comm, int rank, int maxdims, int coords[]); __attribute__((visibility("default"))) int PMPI_Cart_create(MPI_Comm old_comm, int ndims, const int dims[], const int periods[], int reorder, MPI_Comm *comm_cart); __attribute__((visibility("default"))) int PMPI_Cart_get(MPI_Comm comm, int maxdims, int dims[], int periods[], int coords[]); __attribute__((visibility("default"))) int PMPI_Cart_map(MPI_Comm comm, int ndims, const int dims[], const int periods[], int *newrank); __attribute__((visibility("default"))) int PMPI_Cart_rank(MPI_Comm comm, const int coords[], int *rank); __attribute__((visibility("default"))) int PMPI_Cart_shift(MPI_Comm comm, int direction, int disp, int *rank_source, int *rank_dest); __attribute__((visibility("default"))) int PMPI_Cart_sub(MPI_Comm comm, const int remain_dims[], MPI_Comm 
*new_comm); __attribute__((visibility("default"))) int PMPI_Cartdim_get(MPI_Comm comm, int *ndims); __attribute__((visibility("default"))) int PMPI_Close_port(const char *port_name); __attribute__((visibility("default"))) int PMPI_Comm_accept(const char *port_name, MPI_Info info, int root, MPI_Comm comm, MPI_Comm *newcomm); __attribute__((visibility("default"))) int PMPI_Comm_c2f(MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Comm_call_errhandler(MPI_Comm comm, int errorcode); __attribute__((visibility("default"))) int PMPI_Comm_compare(MPI_Comm comm1, MPI_Comm comm2, int *result); __attribute__((visibility("default"))) int PMPI_Comm_connect(const char *port_name, MPI_Info info, int root, MPI_Comm comm, MPI_Comm *newcomm); __attribute__((visibility("default"))) int PMPI_Comm_create_errhandler(MPI_Comm_errhandler_function *function, MPI_Errhandler *errhandler); __attribute__((visibility("default"))) int PMPI_Comm_create_keyval(MPI_Comm_copy_attr_function *comm_copy_attr_fn, MPI_Comm_delete_attr_function *comm_delete_attr_fn, int *comm_keyval, void *extra_state); __attribute__((visibility("default"))) int PMPI_Comm_create_group(MPI_Comm comm, MPI_Group group, int tag, MPI_Comm *newcomm); __attribute__((visibility("default"))) int PMPI_Comm_create(MPI_Comm comm, MPI_Group group, MPI_Comm *newcomm); __attribute__((visibility("default"))) int PMPI_Comm_delete_attr(MPI_Comm comm, int comm_keyval); __attribute__((visibility("default"))) int PMPI_Comm_disconnect(MPI_Comm *comm); __attribute__((visibility("default"))) int PMPI_Comm_dup(MPI_Comm comm, MPI_Comm *newcomm); __attribute__((visibility("default"))) int PMPI_Comm_idup(MPI_Comm comm, MPI_Comm *newcomm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Comm_dup_with_info(MPI_Comm comm, MPI_Info info, MPI_Comm *newcomm); __attribute__((visibility("default"))) MPI_Comm PMPI_Comm_f2c(int comm); __attribute__((visibility("default"))) int PMPI_Comm_free_keyval(int *comm_keyval); 
__attribute__((visibility("default"))) int PMPI_Comm_free(MPI_Comm *comm); __attribute__((visibility("default"))) int PMPI_Comm_get_attr(MPI_Comm comm, int comm_keyval, void *attribute_val, int *flag); __attribute__((visibility("default"))) int PMPI_Comm_get_errhandler(MPI_Comm comm, MPI_Errhandler *erhandler); __attribute__((visibility("default"))) int PMPI_Comm_get_info(MPI_Comm comm, MPI_Info *info_used); __attribute__((visibility("default"))) int PMPI_Comm_get_name(MPI_Comm comm, char *comm_name, int *resultlen); __attribute__((visibility("default"))) int PMPI_Comm_get_parent(MPI_Comm *parent); __attribute__((visibility("default"))) int PMPI_Comm_group(MPI_Comm comm, MPI_Group *group); __attribute__((visibility("default"))) int PMPI_Comm_join(int fd, MPI_Comm *intercomm); __attribute__((visibility("default"))) int PMPI_Comm_rank(MPI_Comm comm, int *rank); __attribute__((visibility("default"))) int PMPI_Comm_remote_group(MPI_Comm comm, MPI_Group *group); __attribute__((visibility("default"))) int PMPI_Comm_remote_size(MPI_Comm comm, int *size); __attribute__((visibility("default"))) int PMPI_Comm_set_attr(MPI_Comm comm, int comm_keyval, void *attribute_val); __attribute__((visibility("default"))) int PMPI_Comm_set_errhandler(MPI_Comm comm, MPI_Errhandler errhandler); __attribute__((visibility("default"))) int PMPI_Comm_set_info(MPI_Comm comm, MPI_Info info); __attribute__((visibility("default"))) int PMPI_Comm_set_name(MPI_Comm comm, const char *comm_name); __attribute__((visibility("default"))) int PMPI_Comm_size(MPI_Comm comm, int *size); __attribute__((visibility("default"))) int PMPI_Comm_spawn(const char *command, char *argv[], int maxprocs, MPI_Info info, int root, MPI_Comm comm, MPI_Comm *intercomm, int array_of_errcodes[]); __attribute__((visibility("default"))) int PMPI_Comm_spawn_multiple(int count, char *array_of_commands[], char **array_of_argv[], const int array_of_maxprocs[], const MPI_Info array_of_info[], int root, MPI_Comm comm, MPI_Comm 
*intercomm, int array_of_errcodes[]); __attribute__((visibility("default"))) int PMPI_Comm_split(MPI_Comm comm, int color, int key, MPI_Comm *newcomm); __attribute__((visibility("default"))) int PMPI_Comm_split_type(MPI_Comm comm, int split_type, int key, MPI_Info info, MPI_Comm *newcomm); __attribute__((visibility("default"))) int PMPI_Comm_test_inter(MPI_Comm comm, int *flag); __attribute__((visibility("default"))) int PMPI_Compare_and_swap(const void *origin_addr, const void *compare_addr, void *result_addr, MPI_Datatype datatype, int target_rank, MPI_Aint target_disp, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Dims_create(int nnodes, int ndims, int dims[]); __attribute__((visibility("default"))) int PMPI_Errhandler_c2f(MPI_Errhandler errhandler); __attribute__((visibility("default"))) int PMPI_Errhandler_create(MPI_Handler_function *function, MPI_Errhandler *errhandler) ; __attribute__((visibility("default"))) MPI_Errhandler PMPI_Errhandler_f2c(int errhandler); __attribute__((visibility("default"))) int PMPI_Errhandler_free(MPI_Errhandler *errhandler); __attribute__((visibility("default"))) int PMPI_Errhandler_get(MPI_Comm comm, MPI_Errhandler *errhandler) ; __attribute__((visibility("default"))) int PMPI_Errhandler_set(MPI_Comm comm, MPI_Errhandler errhandler) ; __attribute__((visibility("default"))) int PMPI_Error_class(int errorcode, int *errorclass); __attribute__((visibility("default"))) int PMPI_Error_string(int errorcode, char *string, int *resultlen); __attribute__((visibility("default"))) int PMPI_Exscan(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Fetch_and_op(const void *origin_addr, void *result_addr, MPI_Datatype datatype, int target_rank, MPI_Aint target_disp, MPI_Op op, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Iexscan(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, MPI_Comm 
comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_File_c2f(MPI_File file); __attribute__((visibility("default"))) MPI_File PMPI_File_f2c(int file); __attribute__((visibility("default"))) int PMPI_File_call_errhandler(MPI_File fh, int errorcode); __attribute__((visibility("default"))) int PMPI_File_create_errhandler(MPI_File_errhandler_function *function, MPI_Errhandler *errhandler); __attribute__((visibility("default"))) int PMPI_File_set_errhandler( MPI_File file, MPI_Errhandler errhandler); __attribute__((visibility("default"))) int PMPI_File_get_errhandler( MPI_File file, MPI_Errhandler *errhandler); __attribute__((visibility("default"))) int PMPI_File_open(MPI_Comm comm, const char *filename, int amode, MPI_Info info, MPI_File *fh); __attribute__((visibility("default"))) int PMPI_File_close(MPI_File *fh); __attribute__((visibility("default"))) int PMPI_File_delete(const char *filename, MPI_Info info); __attribute__((visibility("default"))) int PMPI_File_set_size(MPI_File fh, MPI_Offset size); __attribute__((visibility("default"))) int PMPI_File_preallocate(MPI_File fh, MPI_Offset size); __attribute__((visibility("default"))) int PMPI_File_get_size(MPI_File fh, MPI_Offset *size); __attribute__((visibility("default"))) int PMPI_File_get_group(MPI_File fh, MPI_Group *group); __attribute__((visibility("default"))) int PMPI_File_get_amode(MPI_File fh, int *amode); __attribute__((visibility("default"))) int PMPI_File_set_info(MPI_File fh, MPI_Info info); __attribute__((visibility("default"))) int PMPI_File_get_info(MPI_File fh, MPI_Info *info_used); __attribute__((visibility("default"))) int PMPI_File_set_view(MPI_File fh, MPI_Offset disp, MPI_Datatype etype, MPI_Datatype filetype, const char *datarep, MPI_Info info); __attribute__((visibility("default"))) int PMPI_File_get_view(MPI_File fh, MPI_Offset *disp, MPI_Datatype *etype, MPI_Datatype *filetype, char *datarep); __attribute__((visibility("default"))) int PMPI_File_read_at(MPI_File 
fh, MPI_Offset offset, void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_read_at_all(MPI_File fh, MPI_Offset offset, void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_write_at(MPI_File fh, MPI_Offset offset, const void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_write_at_all(MPI_File fh, MPI_Offset offset, const void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_iread_at(MPI_File fh, MPI_Offset offset, void *buf, int count, MPI_Datatype datatype, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_File_iwrite_at(MPI_File fh, MPI_Offset offset, const void *buf, int count, MPI_Datatype datatype, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_File_read(MPI_File fh, void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_read_all(MPI_File fh, void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_write(MPI_File fh, const void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_write_all(MPI_File fh, const void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_iread(MPI_File fh, void *buf, int count, MPI_Datatype datatype, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_File_iwrite(MPI_File fh, const void *buf, int count, MPI_Datatype datatype, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_File_seek(MPI_File fh, MPI_Offset offset, int whence); __attribute__((visibility("default"))) int PMPI_File_get_position(MPI_File fh, MPI_Offset *offset); 
__attribute__((visibility("default"))) int PMPI_File_get_byte_offset(MPI_File fh, MPI_Offset offset, MPI_Offset *disp); __attribute__((visibility("default"))) int PMPI_File_read_shared(MPI_File fh, void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_write_shared(MPI_File fh, const void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_iread_shared(MPI_File fh, void *buf, int count, MPI_Datatype datatype, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_File_iwrite_shared(MPI_File fh, const void *buf, int count, MPI_Datatype datatype, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_File_read_ordered(MPI_File fh, void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_write_ordered(MPI_File fh, const void *buf, int count, MPI_Datatype datatype, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_seek_shared(MPI_File fh, MPI_Offset offset, int whence); __attribute__((visibility("default"))) int PMPI_File_get_position_shared(MPI_File fh, MPI_Offset *offset); __attribute__((visibility("default"))) int PMPI_File_read_at_all_begin(MPI_File fh, MPI_Offset offset, void *buf, int count, MPI_Datatype datatype); __attribute__((visibility("default"))) int PMPI_File_read_at_all_end(MPI_File fh, void *buf, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_write_at_all_begin(MPI_File fh, MPI_Offset offset, const void *buf, int count, MPI_Datatype datatype); __attribute__((visibility("default"))) int PMPI_File_write_at_all_end(MPI_File fh, const void *buf, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_read_all_begin(MPI_File fh, void *buf, int count, MPI_Datatype datatype); __attribute__((visibility("default"))) int PMPI_File_read_all_end(MPI_File fh, void *buf, MPI_Status 
*status); __attribute__((visibility("default"))) int PMPI_File_write_all_begin(MPI_File fh, const void *buf, int count, MPI_Datatype datatype); __attribute__((visibility("default"))) int PMPI_File_write_all_end(MPI_File fh, const void *buf, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_read_ordered_begin(MPI_File fh, void *buf, int count, MPI_Datatype datatype); __attribute__((visibility("default"))) int PMPI_File_read_ordered_end(MPI_File fh, void *buf, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_write_ordered_begin(MPI_File fh, const void *buf, int count, MPI_Datatype datatype); __attribute__((visibility("default"))) int PMPI_File_write_ordered_end(MPI_File fh, const void *buf, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_File_get_type_extent(MPI_File fh, MPI_Datatype datatype, MPI_Aint *extent); __attribute__((visibility("default"))) int PMPI_File_set_atomicity(MPI_File fh, int flag); __attribute__((visibility("default"))) int PMPI_File_get_atomicity(MPI_File fh, int *flag); __attribute__((visibility("default"))) int PMPI_File_sync(MPI_File fh); __attribute__((visibility("default"))) int PMPI_Finalize(void); __attribute__((visibility("default"))) int PMPI_Finalized(int *flag); __attribute__((visibility("default"))) int PMPI_Free_mem(void *base); __attribute__((visibility("default"))) int PMPI_Gather(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, int root, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Igather(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, int root, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Gatherv(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int displs[], MPI_Datatype recvtype, int root, MPI_Comm comm); 
__attribute__((visibility("default"))) int PMPI_Igatherv(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int displs[], MPI_Datatype recvtype, int root, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Get_address(const void *location, MPI_Aint *address); __attribute__((visibility("default"))) int PMPI_Get_count(const MPI_Status *status, MPI_Datatype datatype, int *count); __attribute__((visibility("default"))) int PMPI_Get_elements(const MPI_Status *status, MPI_Datatype datatype, int *count); __attribute__((visibility("default"))) int PMPI_Get_elements_x(const MPI_Status *status, MPI_Datatype datatype, MPI_Count *count); __attribute__((visibility("default"))) int PMPI_Get(void *origin_addr, int origin_count, MPI_Datatype origin_datatype, int target_rank, MPI_Aint target_disp, int target_count, MPI_Datatype target_datatype, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Get_accumulate(const void *origin_addr, int origin_count, MPI_Datatype origin_datatype, void *result_addr, int result_count, MPI_Datatype result_datatype, int target_rank, MPI_Aint target_disp, int target_count, MPI_Datatype target_datatype, MPI_Op op, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Get_library_version(char *version, int *resultlen); __attribute__((visibility("default"))) int PMPI_Get_processor_name(char *name, int *resultlen); __attribute__((visibility("default"))) int PMPI_Get_version(int *version, int *subversion); __attribute__((visibility("default"))) int PMPI_Graph_create(MPI_Comm comm_old, int nnodes, const int index[], const int edges[], int reorder, MPI_Comm *comm_graph); __attribute__((visibility("default"))) int PMPI_Graph_get(MPI_Comm comm, int maxindex, int maxedges, int index[], int edges[]); __attribute__((visibility("default"))) int PMPI_Graph_map(MPI_Comm comm, int nnodes, const int index[], const int edges[], int *newrank); 
__attribute__((visibility("default"))) int PMPI_Graph_neighbors_count(MPI_Comm comm, int rank, int *nneighbors); __attribute__((visibility("default"))) int PMPI_Graph_neighbors(MPI_Comm comm, int rank, int maxneighbors, int neighbors[]); __attribute__((visibility("default"))) int PMPI_Graphdims_get(MPI_Comm comm, int *nnodes, int *nedges); __attribute__((visibility("default"))) int PMPI_Grequest_complete(MPI_Request request); __attribute__((visibility("default"))) int PMPI_Grequest_start(MPI_Grequest_query_function *query_fn, MPI_Grequest_free_function *free_fn, MPI_Grequest_cancel_function *cancel_fn, void *extra_state, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Group_c2f(MPI_Group group); __attribute__((visibility("default"))) int PMPI_Group_compare(MPI_Group group1, MPI_Group group2, int *result); __attribute__((visibility("default"))) int PMPI_Group_difference(MPI_Group group1, MPI_Group group2, MPI_Group *newgroup); __attribute__((visibility("default"))) int PMPI_Group_excl(MPI_Group group, int n, const int ranks[], MPI_Group *newgroup); __attribute__((visibility("default"))) MPI_Group PMPI_Group_f2c(int group); __attribute__((visibility("default"))) int PMPI_Group_free(MPI_Group *group); __attribute__((visibility("default"))) int PMPI_Group_incl(MPI_Group group, int n, const int ranks[], MPI_Group *newgroup); __attribute__((visibility("default"))) int PMPI_Group_intersection(MPI_Group group1, MPI_Group group2, MPI_Group *newgroup); __attribute__((visibility("default"))) int PMPI_Group_range_excl(MPI_Group group, int n, int ranges[][3], MPI_Group *newgroup); __attribute__((visibility("default"))) int PMPI_Group_range_incl(MPI_Group group, int n, int ranges[][3], MPI_Group *newgroup); __attribute__((visibility("default"))) int PMPI_Group_rank(MPI_Group group, int *rank); __attribute__((visibility("default"))) int PMPI_Group_size(MPI_Group group, int *size); __attribute__((visibility("default"))) int 
PMPI_Group_translate_ranks(MPI_Group group1, int n, const int ranks1[], MPI_Group group2, int ranks2[]); __attribute__((visibility("default"))) int PMPI_Group_union(MPI_Group group1, MPI_Group group2, MPI_Group *newgroup); __attribute__((visibility("default"))) int PMPI_Ibsend(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Improbe(int source, int tag, MPI_Comm comm, int *flag, MPI_Message *message, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Imrecv(void *buf, int count, MPI_Datatype type, MPI_Message *message, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Info_c2f(MPI_Info info); __attribute__((visibility("default"))) int PMPI_Info_create(MPI_Info *info); __attribute__((visibility("default"))) int PMPI_Info_delete(MPI_Info info, const char *key); __attribute__((visibility("default"))) int PMPI_Info_dup(MPI_Info info, MPI_Info *newinfo); __attribute__((visibility("default"))) MPI_Info PMPI_Info_f2c(int info); __attribute__((visibility("default"))) int PMPI_Info_free(MPI_Info *info); __attribute__((visibility("default"))) int PMPI_Info_get(MPI_Info info, const char *key, int valuelen, char *value, int *flag); __attribute__((visibility("default"))) int PMPI_Info_get_nkeys(MPI_Info info, int *nkeys); __attribute__((visibility("default"))) int PMPI_Info_get_nthkey(MPI_Info info, int n, char *key); __attribute__((visibility("default"))) int PMPI_Info_get_valuelen(MPI_Info info, const char *key, int *valuelen, int *flag); __attribute__((visibility("default"))) int PMPI_Info_set(MPI_Info info, const char *key, const char *value); __attribute__((visibility("default"))) int PMPI_Init(int *argc, char ***argv); __attribute__((visibility("default"))) int PMPI_Initialized(int *flag); __attribute__((visibility("default"))) int PMPI_Init_thread(int *argc, char ***argv, int required, int *provided); 
__attribute__((visibility("default"))) int PMPI_Intercomm_create(MPI_Comm local_comm, int local_leader, MPI_Comm bridge_comm, int remote_leader, int tag, MPI_Comm *newintercomm); __attribute__((visibility("default"))) int PMPI_Intercomm_merge(MPI_Comm intercomm, int high, MPI_Comm *newintercomm); __attribute__((visibility("default"))) int PMPI_Iprobe(int source, int tag, MPI_Comm comm, int *flag, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Irecv(void *buf, int count, MPI_Datatype datatype, int source, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Irsend(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Isend(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Issend(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Is_thread_main(int *flag); __attribute__((visibility("default"))) int PMPI_Keyval_create(MPI_Copy_function *copy_fn, MPI_Delete_function *delete_fn, int *keyval, void *extra_state) ; __attribute__((visibility("default"))) int PMPI_Keyval_free(int *keyval) ; __attribute__((visibility("default"))) int PMPI_Lookup_name(const char *service_name, MPI_Info info, char *port_name); __attribute__((visibility("default"))) int PMPI_Message_c2f(MPI_Message message); __attribute__((visibility("default"))) MPI_Message PMPI_Message_f2c(int message); __attribute__((visibility("default"))) int PMPI_Mprobe(int source, int tag, MPI_Comm comm, MPI_Message *message, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Mrecv(void *buf, int count, MPI_Datatype type, MPI_Message *message, MPI_Status *status); __attribute__((visibility("default"))) int 
PMPI_Neighbor_allgather(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ineighbor_allgather(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Neighbor_allgatherv(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int displs[], MPI_Datatype recvtype, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ineighbor_allgatherv(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int displs[], MPI_Datatype recvtype, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Neighbor_alltoall(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ineighbor_alltoall(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Neighbor_alltoallv(const void *sendbuf, const int sendcounts[], const int sdispls[], MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int rdispls[], MPI_Datatype recvtype, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ineighbor_alltoallv(const void *sendbuf, const int sendcounts[], const int sdispls[], MPI_Datatype sendtype, void *recvbuf, const int recvcounts[], const int rdispls[], MPI_Datatype recvtype, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Neighbor_alltoallw(const void *sendbuf, const int sendcounts[], const MPI_Aint sdispls[], const MPI_Datatype sendtypes[], void *recvbuf, const int recvcounts[], 
const MPI_Aint rdispls[], const MPI_Datatype recvtypes[], MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ineighbor_alltoallw(const void *sendbuf, const int sendcounts[], const MPI_Aint sdispls[], const MPI_Datatype sendtypes[], void *recvbuf, const int recvcounts[], const MPI_Aint rdispls[], const MPI_Datatype recvtypes[], MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Op_c2f(MPI_Op op); __attribute__((visibility("default"))) int PMPI_Op_commutative(MPI_Op op, int *commute); __attribute__((visibility("default"))) int PMPI_Op_create(MPI_User_function *function, int commute, MPI_Op *op); __attribute__((visibility("default"))) int PMPI_Open_port(MPI_Info info, char *port_name); __attribute__((visibility("default"))) MPI_Op PMPI_Op_f2c(int op); __attribute__((visibility("default"))) int PMPI_Op_free(MPI_Op *op); __attribute__((visibility("default"))) int PMPI_Pack_external(const char datarep[], const void *inbuf, int incount, MPI_Datatype datatype, void *outbuf, MPI_Aint outsize, MPI_Aint *position); __attribute__((visibility("default"))) int PMPI_Pack_external_size(const char datarep[], int incount, MPI_Datatype datatype, MPI_Aint *size); __attribute__((visibility("default"))) int PMPI_Pack(const void *inbuf, int incount, MPI_Datatype datatype, void *outbuf, int outsize, int *position, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Pack_size(int incount, MPI_Datatype datatype, MPI_Comm comm, int *size); __attribute__((visibility("default"))) int PMPI_Pcontrol(const int level, ...); __attribute__((visibility("default"))) int PMPI_Probe(int source, int tag, MPI_Comm comm, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Publish_name(const char *service_name, MPI_Info info, const char *port_name); __attribute__((visibility("default"))) int PMPI_Put(const void *origin_addr, int origin_count, MPI_Datatype origin_datatype, int target_rank, MPI_Aint target_disp, int target_count, 
MPI_Datatype target_datatype, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Query_thread(int *provided); __attribute__((visibility("default"))) int PMPI_Raccumulate(const void *origin_addr, int origin_count, MPI_Datatype origin_datatype, int target_rank, MPI_Aint target_disp, int target_count, MPI_Datatype target_datatype, MPI_Op op, MPI_Win win, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Recv_init(void *buf, int count, MPI_Datatype datatype, int source, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Recv(void *buf, int count, MPI_Datatype datatype, int source, int tag, MPI_Comm comm, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Reduce(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, int root, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ireduce(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, int root, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Reduce_local(const void *inbuf, void *inoutbuf, int count, MPI_Datatype datatype, MPI_Op); __attribute__((visibility("default"))) int PMPI_Reduce_scatter(const void *sendbuf, void *recvbuf, const int recvcounts[], MPI_Datatype datatype, MPI_Op op, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ireduce_scatter(const void *sendbuf, void *recvbuf, const int recvcounts[], MPI_Datatype datatype, MPI_Op op, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Reduce_scatter_block(const void *sendbuf, void *recvbuf, int recvcount, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Ireduce_scatter_block(const void *sendbuf, void *recvbuf, int recvcount, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Register_datarep(const char 
*datarep, MPI_Datarep_conversion_function *read_conversion_fn, MPI_Datarep_conversion_function *write_conversion_fn, MPI_Datarep_extent_function *dtype_file_extent_fn, void *extra_state); __attribute__((visibility("default"))) int PMPI_Request_c2f(MPI_Request request); __attribute__((visibility("default"))) MPI_Request PMPI_Request_f2c(int request); __attribute__((visibility("default"))) int PMPI_Request_free(MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Request_get_status(MPI_Request request, int *flag, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Rget(void *origin_addr, int origin_count, MPI_Datatype origin_datatype, int target_rank, MPI_Aint target_disp, int target_count, MPI_Datatype target_datatype, MPI_Win win, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Rget_accumulate(const void *origin_addr, int origin_count, MPI_Datatype origin_datatype, void *result_addr, int result_count, MPI_Datatype result_datatype, int target_rank, MPI_Aint target_disp, int target_count, MPI_Datatype target_datatype, MPI_Op op, MPI_Win win, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Rput(const void *origin_addr, int origin_count, MPI_Datatype origin_datatype, int target_rank, MPI_Aint target_disp, int target_cout, MPI_Datatype target_datatype, MPI_Win win, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Rsend(const void *ibuf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Rsend_init(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Scan(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Iscan(const void *sendbuf, void *recvbuf, int count, MPI_Datatype datatype, MPI_Op op, MPI_Comm comm, 
MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Scatter(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, int root, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Iscatter(const void *sendbuf, int sendcount, MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, int root, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Scatterv(const void *sendbuf, const int sendcounts[], const int displs[], MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, int root, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Iscatterv(const void *sendbuf, const int sendcounts[], const int displs[], MPI_Datatype sendtype, void *recvbuf, int recvcount, MPI_Datatype recvtype, int root, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Send_init(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Send(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Sendrecv(const void *sendbuf, int sendcount, MPI_Datatype sendtype, int dest, int sendtag, void *recvbuf, int recvcount, MPI_Datatype recvtype, int source, int recvtag, MPI_Comm comm, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Sendrecv_replace(void * buf, int count, MPI_Datatype datatype, int dest, int sendtag, int source, int recvtag, MPI_Comm comm, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Ssend_init(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm, MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Ssend(const void *buf, int count, MPI_Datatype datatype, int dest, int tag, MPI_Comm comm); 
__attribute__((visibility("default"))) int PMPI_Start(MPI_Request *request); __attribute__((visibility("default"))) int PMPI_Startall(int count, MPI_Request array_of_requests[]); __attribute__((visibility("default"))) int PMPI_Status_c2f(const MPI_Status *c_status, int *f_status); __attribute__((visibility("default"))) int PMPI_Status_f2c(const int *f_status, MPI_Status *c_status); __attribute__((visibility("default"))) int PMPI_Status_set_cancelled(MPI_Status *status, int flag); __attribute__((visibility("default"))) int PMPI_Status_set_elements(MPI_Status *status, MPI_Datatype datatype, int count); __attribute__((visibility("default"))) int PMPI_Status_set_elements_x(MPI_Status *status, MPI_Datatype datatype, MPI_Count count); __attribute__((visibility("default"))) int PMPI_Testall(int count, MPI_Request array_of_requests[], int *flag, MPI_Status array_of_statuses[]); __attribute__((visibility("default"))) int PMPI_Testany(int count, MPI_Request array_of_requests[], int *index, int *flag, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Test(MPI_Request *request, int *flag, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Test_cancelled(const MPI_Status *status, int *flag); __attribute__((visibility("default"))) int PMPI_Testsome(int incount, MPI_Request array_of_requests[], int *outcount, int array_of_indices[], MPI_Status array_of_statuses[]); __attribute__((visibility("default"))) int PMPI_Topo_test(MPI_Comm comm, int *status); __attribute__((visibility("default"))) int PMPI_Type_c2f(MPI_Datatype datatype); __attribute__((visibility("default"))) int PMPI_Type_commit(MPI_Datatype *type); __attribute__((visibility("default"))) int PMPI_Type_contiguous(int count, MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_create_darray(int size, int rank, int ndims, const int gsize_array[], const int distrib_array[], const int darg_array[], const int psize_array[], int order, 
MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_create_f90_complex(int p, int r, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_create_f90_integer(int r, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_create_f90_real(int p, int r, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_create_hindexed(int count, const int array_of_blocklengths[], const MPI_Aint array_of_displacements[], MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_create_hvector(int count, int blocklength, MPI_Aint stride, MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_create_keyval(MPI_Type_copy_attr_function *type_copy_attr_fn, MPI_Type_delete_attr_function *type_delete_attr_fn, int *type_keyval, void *extra_state); __attribute__((visibility("default"))) int PMPI_Type_create_hindexed_block(int count, int blocklength, const MPI_Aint array_of_displacements[], MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_create_indexed_block(int count, int blocklength, const int array_of_displacements[], MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_create_struct(int count, const int array_of_block_lengths[], const MPI_Aint array_of_displacements[], const MPI_Datatype array_of_types[], MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_create_subarray(int ndims, const int size_array[], const int subsize_array[], const int start_array[], int order, MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_create_resized(MPI_Datatype oldtype, MPI_Aint lb, MPI_Aint extent, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_delete_attr(MPI_Datatype type, int type_keyval); 
__attribute__((visibility("default"))) int PMPI_Type_dup(MPI_Datatype type, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_extent(MPI_Datatype type, MPI_Aint *extent) ; __attribute__((visibility("default"))) int PMPI_Type_free(MPI_Datatype *type); __attribute__((visibility("default"))) int PMPI_Type_free_keyval(int *type_keyval); __attribute__((visibility("default"))) MPI_Datatype PMPI_Type_f2c(int datatype); __attribute__((visibility("default"))) int PMPI_Type_get_attr(MPI_Datatype type, int type_keyval, void *attribute_val, int *flag); __attribute__((visibility("default"))) int PMPI_Type_get_contents(MPI_Datatype mtype, int max_integers, int max_addresses, int max_datatypes, int array_of_integers[], MPI_Aint array_of_addresses[], MPI_Datatype array_of_datatypes[]); __attribute__((visibility("default"))) int PMPI_Type_get_envelope(MPI_Datatype type, int *num_integers, int *num_addresses, int *num_datatypes, int *combiner); __attribute__((visibility("default"))) int PMPI_Type_get_extent(MPI_Datatype type, MPI_Aint *lb, MPI_Aint *extent); __attribute__((visibility("default"))) int PMPI_Type_get_extent_x(MPI_Datatype type, MPI_Count *lb, MPI_Count *extent); __attribute__((visibility("default"))) int PMPI_Type_get_name(MPI_Datatype type, char *type_name, int *resultlen); __attribute__((visibility("default"))) int PMPI_Type_get_true_extent(MPI_Datatype datatype, MPI_Aint *true_lb, MPI_Aint *true_extent); __attribute__((visibility("default"))) int PMPI_Type_get_true_extent_x(MPI_Datatype datatype, MPI_Count *true_lb, MPI_Count *true_extent); __attribute__((visibility("default"))) int PMPI_Type_hindexed(int count, int array_of_blocklengths[], MPI_Aint array_of_displacements[], MPI_Datatype oldtype, MPI_Datatype *newtype) ; __attribute__((visibility("default"))) int PMPI_Type_hvector(int count, int blocklength, MPI_Aint stride, MPI_Datatype oldtype, MPI_Datatype *newtype) ; __attribute__((visibility("default"))) int PMPI_Type_indexed(int 
count, const int array_of_blocklengths[], const int array_of_displacements[], MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Type_lb(MPI_Datatype type, MPI_Aint *lb) ; __attribute__((visibility("default"))) int PMPI_Type_match_size(int typeclass, int size, MPI_Datatype *type); __attribute__((visibility("default"))) int PMPI_Type_set_attr(MPI_Datatype type, int type_keyval, void *attr_val); __attribute__((visibility("default"))) int PMPI_Type_set_name(MPI_Datatype type, const char *type_name); __attribute__((visibility("default"))) int PMPI_Type_size(MPI_Datatype type, int *size); __attribute__((visibility("default"))) int PMPI_Type_size_x(MPI_Datatype type, MPI_Count *size); __attribute__((visibility("default"))) int PMPI_Type_struct(int count, int array_of_blocklengths[], MPI_Aint array_of_displacements[], MPI_Datatype array_of_types[], MPI_Datatype *newtype) ; __attribute__((visibility("default"))) int PMPI_Type_ub(MPI_Datatype mtype, MPI_Aint *ub) ; __attribute__((visibility("default"))) int PMPI_Type_vector(int count, int blocklength, int stride, MPI_Datatype oldtype, MPI_Datatype *newtype); __attribute__((visibility("default"))) int PMPI_Unpack(const void *inbuf, int insize, int *position, void *outbuf, int outcount, MPI_Datatype datatype, MPI_Comm comm); __attribute__((visibility("default"))) int PMPI_Unpublish_name(const char *service_name, MPI_Info info, const char *port_name); __attribute__((visibility("default"))) int PMPI_Unpack_external (const char datarep[], const void *inbuf, MPI_Aint insize, MPI_Aint *position, void *outbuf, int outcount, MPI_Datatype datatype); __attribute__((visibility("default"))) int PMPI_Waitall(int count, MPI_Request array_of_requests[], MPI_Status array_of_statuses[]); __attribute__((visibility("default"))) int PMPI_Waitany(int count, MPI_Request array_of_requests[], int *index, MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Wait(MPI_Request *request, 
MPI_Status *status); __attribute__((visibility("default"))) int PMPI_Waitsome(int incount, MPI_Request array_of_requests[], int *outcount, int array_of_indices[], MPI_Status array_of_statuses[]); __attribute__((visibility("default"))) int PMPI_Win_allocate(MPI_Aint size, int disp_unit, MPI_Info info, MPI_Comm comm, void *baseptr, MPI_Win *win); __attribute__((visibility("default"))) int PMPI_Win_allocate_shared(MPI_Aint size, int disp_unit, MPI_Info info, MPI_Comm comm, void *baseptr, MPI_Win *win); __attribute__((visibility("default"))) int PMPI_Win_attach(MPI_Win win, void *base, MPI_Aint size); __attribute__((visibility("default"))) int PMPI_Win_c2f(MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_call_errhandler(MPI_Win win, int errorcode); __attribute__((visibility("default"))) int PMPI_Win_complete(MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_create(void *base, MPI_Aint size, int disp_unit, MPI_Info info, MPI_Comm comm, MPI_Win *win); __attribute__((visibility("default"))) int PMPI_Win_create_dynamic(MPI_Info info, MPI_Comm comm, MPI_Win *win); __attribute__((visibility("default"))) int PMPI_Win_create_errhandler(MPI_Win_errhandler_function *function, MPI_Errhandler *errhandler); __attribute__((visibility("default"))) int PMPI_Win_create_keyval(MPI_Win_copy_attr_function *win_copy_attr_fn, MPI_Win_delete_attr_function *win_delete_attr_fn, int *win_keyval, void *extra_state); __attribute__((visibility("default"))) int PMPI_Win_delete_attr(MPI_Win win, int win_keyval); __attribute__((visibility("default"))) int PMPI_Win_detach(MPI_Win win, const void *base); __attribute__((visibility("default"))) MPI_Win PMPI_Win_f2c(int win); __attribute__((visibility("default"))) int PMPI_Win_fence(int assert, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_flush(int rank, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_flush_all(MPI_Win win); __attribute__((visibility("default"))) int 
PMPI_Win_flush_local(int rank, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_flush_local_all(MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_free(MPI_Win *win); __attribute__((visibility("default"))) int PMPI_Win_free_keyval(int *win_keyval); __attribute__((visibility("default"))) int PMPI_Win_get_attr(MPI_Win win, int win_keyval, void *attribute_val, int *flag); __attribute__((visibility("default"))) int PMPI_Win_get_errhandler(MPI_Win win, MPI_Errhandler *errhandler); __attribute__((visibility("default"))) int PMPI_Win_get_group(MPI_Win win, MPI_Group *group); __attribute__((visibility("default"))) int PMPI_Win_get_info(MPI_Win win, MPI_Info *info_used); __attribute__((visibility("default"))) int PMPI_Win_get_name(MPI_Win win, char *win_name, int *resultlen); __attribute__((visibility("default"))) int PMPI_Win_lock(int lock_type, int rank, int assert, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_lock_all(int assert, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_post(MPI_Group group, int assert, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_set_attr(MPI_Win win, int win_keyval, void *attribute_val); __attribute__((visibility("default"))) int PMPI_Win_set_errhandler(MPI_Win win, MPI_Errhandler errhandler); __attribute__((visibility("default"))) int PMPI_Win_set_info(MPI_Win win, MPI_Info info); __attribute__((visibility("default"))) int PMPI_Win_set_name(MPI_Win win, const char *win_name); __attribute__((visibility("default"))) int PMPI_Win_shared_query(MPI_Win win, int rank, MPI_Aint *size, int *disp_unit, void *baseptr); __attribute__((visibility("default"))) int PMPI_Win_start(MPI_Group group, int assert, MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_sync(MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_test(MPI_Win win, int *flag); __attribute__((visibility("default"))) int PMPI_Win_unlock(int rank, MPI_Win win); 
__attribute__((visibility("default"))) int PMPI_Win_unlock_all(MPI_Win win); __attribute__((visibility("default"))) int PMPI_Win_wait(MPI_Win win); __attribute__((visibility("default"))) double PMPI_Wtick(void); __attribute__((visibility("default"))) double PMPI_Wtime(void); __attribute__((visibility("default"))) int PMPI_T_init_thread (int required, int *provided); __attribute__((visibility("default"))) int PMPI_T_finalize (void); __attribute__((visibility("default"))) int PMPI_T_cvar_get_num (int *num_cvar); __attribute__((visibility("default"))) int PMPI_T_cvar_get_info (int cvar_index, char *name, int *name_len, int *verbosity, MPI_Datatype *datatype, MPI_T_enum *enumtype, char *desc, int *desc_len, int *bind, int *scope); __attribute__((visibility("default"))) int PMPI_T_cvar_get_index (const char *name, int *cvar_index); __attribute__((visibility("default"))) int PMPI_T_cvar_handle_alloc (int cvar_index, void *obj_handle, MPI_T_cvar_handle *handle, int *count); __attribute__((visibility("default"))) int PMPI_T_cvar_handle_free (MPI_T_cvar_handle *handle); __attribute__((visibility("default"))) int PMPI_T_cvar_read (MPI_T_cvar_handle handle, void *buf); __attribute__((visibility("default"))) int PMPI_T_cvar_write (MPI_T_cvar_handle handle, const void *buf); __attribute__((visibility("default"))) int PMPI_T_category_get_num(int *num_cat); __attribute__((visibility("default"))) int PMPI_T_category_get_info(int cat_index, char *name, int *name_len, char *desc, int *desc_len, int *num_cvars, int *num_pvars, int *num_categories); __attribute__((visibility("default"))) int PMPI_T_category_get_index (const char *name, int *category_index); __attribute__((visibility("default"))) int PMPI_T_category_get_cvars(int cat_index, int len, int indices[]); __attribute__((visibility("default"))) int PMPI_T_category_get_pvars(int cat_index, int len, int indices[]); __attribute__((visibility("default"))) int PMPI_T_category_get_categories(int cat_index, int len, int indices[]); 
__attribute__((visibility("default"))) int PMPI_T_category_changed(int *stamp); __attribute__((visibility("default"))) int PMPI_T_pvar_get_num(int *num_pvar); __attribute__((visibility("default"))) int PMPI_T_pvar_get_info(int pvar_index, char *name, int *name_len, int *verbosity, int *var_class, MPI_Datatype *datatype, MPI_T_enum *enumtype, char *desc, int *desc_len, int *bind, int *readonly, int *continuous, int *atomic); __attribute__((visibility("default"))) int PMPI_T_pvar_get_index (const char *name, int var_class, int *pvar_index); __attribute__((visibility("default"))) int PMPI_T_pvar_session_create(MPI_T_pvar_session *session); __attribute__((visibility("default"))) int PMPI_T_pvar_session_free(MPI_T_pvar_session *session); __attribute__((visibility("default"))) int PMPI_T_pvar_handle_alloc(MPI_T_pvar_session session, int pvar_index, void *obj_handle, MPI_T_pvar_handle *handle, int *count); __attribute__((visibility("default"))) int PMPI_T_pvar_handle_free(MPI_T_pvar_session session, MPI_T_pvar_handle *handle); __attribute__((visibility("default"))) int PMPI_T_pvar_start(MPI_T_pvar_session session, MPI_T_pvar_handle handle); __attribute__((visibility("default"))) int PMPI_T_pvar_stop(MPI_T_pvar_session session, MPI_T_pvar_handle handle); __attribute__((visibility("default"))) int PMPI_T_pvar_read(MPI_T_pvar_session session, MPI_T_pvar_handle handle, void *buf); __attribute__((visibility("default"))) int PMPI_T_pvar_write(MPI_T_pvar_session session, MPI_T_pvar_handle handle, const void *buf); __attribute__((visibility("default"))) int PMPI_T_pvar_reset(MPI_T_pvar_session session, MPI_T_pvar_handle handle); __attribute__((visibility("default"))) int PMPI_T_pvar_readreset(MPI_T_pvar_session session, MPI_T_pvar_handle handle, void *buf); __attribute__((visibility("default"))) int PMPI_T_enum_get_info(MPI_T_enum enumtype, int *num, char *name, int *name_len); __attribute__((visibility("default"))) int PMPI_T_enum_get_item(MPI_T_enum enumtype, int index, int 
*value, char *name, int *name_len); __attribute__((visibility("default"))) int MPI_T_init_thread (int required, int *provided); __attribute__((visibility("default"))) int MPI_T_finalize (void); __attribute__((visibility("default"))) int MPI_T_cvar_get_num (int *num_cvar); __attribute__((visibility("default"))) int MPI_T_cvar_get_info (int cvar_index, char *name, int *name_len, int *verbosity, MPI_Datatype *datatype, MPI_T_enum *enumtype, char *desc, int *desc_len, int *bind, int *scope); __attribute__((visibility("default"))) int MPI_T_cvar_get_index (const char *name, int *cvar_index); __attribute__((visibility("default"))) int MPI_T_cvar_handle_alloc (int cvar_index, void *obj_handle, MPI_T_cvar_handle *handle, int *count); __attribute__((visibility("default"))) int MPI_T_cvar_handle_free (MPI_T_cvar_handle *handle); __attribute__((visibility("default"))) int MPI_T_cvar_read (MPI_T_cvar_handle handle, void *buf); __attribute__((visibility("default"))) int MPI_T_cvar_write (MPI_T_cvar_handle handle, const void *buf); __attribute__((visibility("default"))) int MPI_T_category_get_num(int *num_cat); __attribute__((visibility("default"))) int MPI_T_category_get_info(int cat_index, char *name, int *name_len, char *desc, int *desc_len, int *num_cvars, int *num_pvars, int *num_categories); __attribute__((visibility("default"))) int MPI_T_category_get_index (const char *name, int *category_index); __attribute__((visibility("default"))) int MPI_T_category_get_cvars(int cat_index, int len, int indices[]); __attribute__((visibility("default"))) int MPI_T_category_get_pvars(int cat_index, int len, int indices[]); __attribute__((visibility("default"))) int MPI_T_category_get_categories(int cat_index, int len, int indices[]); __attribute__((visibility("default"))) int MPI_T_category_changed(int *stamp); __attribute__((visibility("default"))) int MPI_T_pvar_get_num(int *num_pvar); __attribute__((visibility("default"))) int MPI_T_pvar_get_info(int pvar_index, char *name, int 
*name_len, int *verbosity, int *var_class, MPI_Datatype *datatype, MPI_T_enum *enumtype, char *desc, int *desc_len, int *bind, int *readonly, int *continuous, int *atomic); __attribute__((visibility("default"))) int MPI_T_pvar_get_index (const char *name, int var_class, int *pvar_index); __attribute__((visibility("default"))) int MPI_T_pvar_session_create(MPI_T_pvar_session *session); __attribute__((visibility("default"))) int MPI_T_pvar_session_free(MPI_T_pvar_session *session); __attribute__((visibility("default"))) int MPI_T_pvar_handle_alloc(MPI_T_pvar_session session, int pvar_index, void *obj_handle, MPI_T_pvar_handle *handle, int *count); __attribute__((visibility("default"))) int MPI_T_pvar_handle_free(MPI_T_pvar_session session, MPI_T_pvar_handle *handle); __attribute__((visibility("default"))) int MPI_T_pvar_start(MPI_T_pvar_session session, MPI_T_pvar_handle handle); __attribute__((visibility("default"))) int MPI_T_pvar_stop(MPI_T_pvar_session session, MPI_T_pvar_handle handle); __attribute__((visibility("default"))) int MPI_T_pvar_read(MPI_T_pvar_session session, MPI_T_pvar_handle handle, void *buf); __attribute__((visibility("default"))) int MPI_T_pvar_write(MPI_T_pvar_session session, MPI_T_pvar_handle handle, const void *buf); __attribute__((visibility("default"))) int MPI_T_pvar_reset(MPI_T_pvar_session session, MPI_T_pvar_handle handle); __attribute__((visibility("default"))) int MPI_T_pvar_readreset(MPI_T_pvar_session session, MPI_T_pvar_handle handle, void *buf); __attribute__((visibility("default"))) int MPI_T_enum_get_info(MPI_T_enum enumtype, int *num, char *name, int *name_len); __attribute__((visibility("default"))) int MPI_T_enum_get_item(MPI_T_enum enumtype, int index, int *value, char *name, int *name_len); # 4 "/tmp/petsc-KvGRNM/config.packages.MPI/conftest.c" 2 # 4 "/tmp/petsc-KvGRNM/config.packages.MPI/conftest.c" int ompi_major = # 4 "/tmp/petsc-KvGRNM/config.packages.MPI/conftest.c" 3 4 1 # 4 
"/tmp/petsc-KvGRNM/config.packages.MPI/conftest.c"
;
int ompi_minor =
# 5 "/tmp/petsc-KvGRNM/config.packages.MPI/conftest.c" 3 4
10
# 5 "/tmp/petsc-KvGRNM/config.packages.MPI/conftest.c"
;
int ompi_release =
# 6 "/tmp/petsc-KvGRNM/config.packages.MPI/conftest.c" 3 4
5
# 6 "/tmp/petsc-KvGRNM/config.packages.MPI/conftest.c"
;
Unable to parse OpenMPI version from header. Probably a buggy preprocessor
Checking for functions [MPI_Alltoallw] in library [] []
Pushing language C
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error.
*/ char MPI_Alltoallw(); static void _check_MPI_Alltoallw() { MPI_Alltoallw(); } int main() { _check_MPI_Alltoallw();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Popping language C Checking for functions [MPI_Type_create_indexed_block] in library [] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Type_create_indexed_block(); static void _check_MPI_Type_create_indexed_block() { MPI_Type_create_indexed_block(); } int main() { _check_MPI_Type_create_indexed_block();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Popping language C Defined "HAVE_MPI_ALLTOALLW" to "1" Checking for functions [MPI_Win_create] in library [] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPI_Win_create(); static void _check_MPI_Win_create() { MPI_Win_create(); } int main() { _check_MPI_Win_create();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Popping language C Defined "HAVE_MPI_WIN_CREATE" to "1" Defined "HAVE_MPI_REPLACE" to "1" Checking for functions [MPI_Comm_spawn MPI_Type_get_envelope MPI_Type_get_extent MPI_Type_dup MPI_Init_thread MPI_Iallreduce MPI_Ibarrier MPI_Finalized MPI_Exscan MPI_Reduce_scatter MPI_Reduce_scatter_block] in library [] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful 
compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPI_Comm_spawn(); static void _check_MPI_Comm_spawn() { MPI_Comm_spawn(); } char MPI_Type_get_envelope(); static void _check_MPI_Type_get_envelope() { MPI_Type_get_envelope(); } char MPI_Type_get_extent(); static void _check_MPI_Type_get_extent() { MPI_Type_get_extent(); } char MPI_Type_dup(); static void _check_MPI_Type_dup() { MPI_Type_dup(); } char MPI_Init_thread(); static void _check_MPI_Init_thread() { MPI_Init_thread(); } char MPI_Iallreduce(); static void _check_MPI_Iallreduce() { MPI_Iallreduce(); } char MPI_Ibarrier(); static void _check_MPI_Ibarrier() { MPI_Ibarrier(); } char MPI_Finalized(); static void _check_MPI_Finalized() { MPI_Finalized(); } char MPI_Exscan(); static void _check_MPI_Exscan() { MPI_Exscan(); } char MPI_Reduce_scatter(); static void _check_MPI_Reduce_scatter() { MPI_Reduce_scatter(); } char MPI_Reduce_scatter_block(); static void _check_MPI_Reduce_scatter_block() { MPI_Reduce_scatter_block(); } int main() { _check_MPI_Comm_spawn(); _check_MPI_Type_get_envelope(); _check_MPI_Type_get_extent(); _check_MPI_Type_dup(); _check_MPI_Init_thread(); _check_MPI_Iallreduce(); _check_MPI_Ibarrier(); _check_MPI_Finalized(); _check_MPI_Exscan(); _check_MPI_Reduce_scatter(); _check_MPI_Reduce_scatter_block();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 
-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Popping language C Defined "HAVE_MPI_COMM_SPAWN" to "1" Defined "HAVE_MPI_TYPE_GET_ENVELOPE" to "1" Defined "HAVE_MPI_TYPE_GET_EXTENT" to "1" Defined "HAVE_MPI_TYPE_DUP" to "1" Defined "HAVE_MPI_INIT_THREAD" to "1" Defined "HAVE_MPI_IALLREDUCE" to "1" Defined "HAVE_MPI_IBARRIER" to "1" Defined "HAVE_MPI_FINALIZED" to "1" Defined "HAVE_MPI_EXSCAN" to "1" Defined "HAVE_MPI_REDUCE_SCATTER" to "1" Defined "HAVE_MPI_REDUCE_SCATTER_BLOCK" to "1" Checking for functions [MPIX_Iallreduce] in library [] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPIX_Iallreduce(); static void _check_MPIX_Iallreduce() { MPIX_Iallreduce(); } int main() { _check_MPIX_Iallreduce();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.o: In function `_check_MPIX_Iallreduce': /tmp/petsc-KvGRNM/config.libraries/conftest.c:5: undefined reference to `MPIX_Iallreduce' collect2: error: ld returned 1 exit status Popping language C Checking for functions [MPIX_Ibarrier] in library [] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 
/tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPIX_Ibarrier(); static void _check_MPIX_Ibarrier() { MPIX_Ibarrier(); } int main() { _check_MPIX_Ibarrier();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.o: In function `_check_MPIX_Ibarrier': /tmp/petsc-KvGRNM/config.libraries/conftest.c:5: undefined reference to `MPIX_Ibarrier' collect2: error: ld returned 1 exit status Popping language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall 
-Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c: In function 'main': /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c:6:5: warning: unused variable 'combiner' [-Wunused-variable] int combiner = MPI_COMBINER_DUP;; ^~~~~~~~ Source: #include "confdefs.h" #include "conffix.h" #include <mpi.h> int main() { int combiner = MPI_COMBINER_DUP;; return 0; } Defined "HAVE_MPI_COMBINER_DUP" to "1" Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c: In function 'main': /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c:6:5: warning: unused variable 'combiner' [-Wunused-variable] int combiner = MPI_COMBINER_CONTIGUOUS;; ^~~~~~~~ Source: #include "confdefs.h" #include "conffix.h" #include <mpi.h> int main() { int combiner = MPI_COMBINER_CONTIGUOUS;; return 0; } Defined "HAVE_MPI_COMBINER_CONTIGUOUS" to "1" Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails 
-I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c: In function 'main': /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c:6:5: warning: unused variable 'combiner' [-Wunused-variable] int combiner = MPI_COMBINER_NAMED;; ^~~~~~~~ Source: #include "confdefs.h" #include "conffix.h" #include <mpi.h> int main() { int combiner = MPI_COMBINER_NAMED;; return 0; } Defined "HAVE_MPI_COMBINER_NAMED" to "1" Checking for functions [MPIDI_CH3I_sock_set] in library [] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char MPIDI_CH3I_sock_set(); static void _check_MPIDI_CH3I_sock_set() { MPIDI_CH3I_sock_set(); } int main() { _check_MPIDI_CH3I_sock_set();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.o: In function `_check_MPIDI_CH3I_sock_set': /tmp/petsc-KvGRNM/config.libraries/conftest.c:5: undefined reference to `MPIDI_CH3I_sock_set' collect2: error: ld returned 1 exit status Popping language C Checking for functions [MPIDI_CH3I_sock_fixed_nbc_progress] in library [] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing 
-Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char MPIDI_CH3I_sock_fixed_nbc_progress(); static void _check_MPIDI_CH3I_sock_fixed_nbc_progress() { MPIDI_CH3I_sock_fixed_nbc_progress(); } int main() { _check_MPIDI_CH3I_sock_fixed_nbc_progress();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.o: In function `_check_MPIDI_CH3I_sock_fixed_nbc_progress': /tmp/petsc-KvGRNM/config.libraries/conftest.c:5: undefined reference to `MPIDI_CH3I_sock_fixed_nbc_progress' collect2: error: ld returned 1 exit status Popping language C ================================================================================ TEST checkSharedLibrary from config.packages.MPI(/home/florian/software/petsc/config/BuildSystem/config/packages/MPI.py:130) TESTING: checkSharedLibrary from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:130) Sets flag indicating if MPI libraries are shared or not and determines if MPI libraries CANNOT be used by 
shared libraries ================================================================================ TEST configureMPIEXEC from config.packages.MPI(/home/florian/software/petsc/config/BuildSystem/config/packages/MPI.py:143) TESTING: configureMPIEXEC from config.packages.MPI(config/BuildSystem/config/packages/MPI.py:143) Checking for mpiexec Pushing language C Popping language C Checking for program /home/florian/software/bin/mpiexec...not found Checking for program /home/florian/software/bin/mpirun...not found Checking for program /home/florian/software/bin/mprun...not found Checking for program /home/florian/software/bin/mpiexec...not found Checking for program /home/florian/software/bin/mpirun...not found Checking for program /home/florian/software/bin/mprun...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/mpiexec...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/mpirun...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/mprun...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/mpiexec...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/mpirun...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/mprun...not found Checking for program /usr/local/sbin/mpiexec...not found Checking for program /usr/local/sbin/mpirun...not found Checking for program /usr/local/sbin/mprun...not found Checking for program /usr/local/sbin/mpiexec...not found Checking for program /usr/local/sbin/mpirun...not found Checking for program /usr/local/sbin/mprun...not found Checking for program /usr/local/bin/mpiexec...not found Checking for program /usr/local/bin/mpirun...not found Checking for program /usr/local/bin/mprun...not found Checking for program /usr/local/bin/mpiexec...not found Checking for program /usr/local/bin/mpirun...not found Checking for program /usr/local/bin/mprun...not found Checking for program /usr/bin/mpiexec...found Defined make macro "MPIEXEC" to "mpiexec" Executing: mpicc -c 
-o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <mpi.h> #ifdef __cplusplus extern "C" #endif int init(int argc, char *argv[]) { int isInitialized; MPI_Init(&argc, &argv); MPI_Initialized(&isInitialized); return (int) isInitialized; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.packages.MPI/libconftest.so -shared -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure 
-I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <mpi.h> #ifdef __cplusplus extern "C" #endif int checkInit(void) { int isInitialized; MPI_Initialized(&isInitialized); if (isInitialized) MPI_Finalize(); return (int) isInitialized; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.packages.MPI/libconftest.so -shared -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.MPI/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions 
-I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <stdio.h> #include <stdlib.h> #ifdef PETSC_HAVE_DLFCN_H #include <dlfcn.h> #endif int main() { int argc = 1; char *argv[2] = {(char *) "conftest", NULL}; void *lib; int (*init)(int, char **); int (*checkInit)(void); lib = dlopen("/tmp/petsc-KvGRNM/config.libraries/lib1.so", RTLD_LAZY); if (!lib) { fprintf(stderr, "Could not open lib1.so: %s\n", dlerror()); exit(1); } init = (int (*)(int, char **)) dlsym(lib, "init"); if (!init) { fprintf(stderr, "Could not find initialization function\n"); exit(1); } if (!(*init)(argc, argv)) { fprintf(stderr, "Could not initialize library\n"); exit(1); } lib = dlopen("/tmp/petsc-KvGRNM/config.libraries/lib2.so", RTLD_LAZY); if (!lib) { fprintf(stderr, "Could not open lib2.so: %s\n", dlerror()); exit(1); } checkInit = (int (*)(void)) dlsym(lib, "checkInit"); if (!checkInit) { fprintf(stderr, "Could not find initialization check function\n"); exit(1); } if (!(*checkInit)()) { fprintf(stderr, "Did not link with shared library\n"); exit(2); } ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 
-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl -ldl Testing executable /tmp/petsc-KvGRNM/config.libraries/conftest to see if it can be run Executing: mpiexec /tmp/petsc-KvGRNM/config.libraries/conftest Executing: mpiexec /tmp/petsc-KvGRNM/config.libraries/conftest stdout: ------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code.. Per user-direction, the job has been aborted. ------------------------------------------------------- -------------------------------------------------------------------------- mpiexec detected that one or more processes exited with non-zero status, thus causing the job to be terminated. The first process to do so was: Process name: [[42589,1],0] Exit code: 1 -------------------------------------------------------------------------- ERROR while running executable: Could not execute "mpiexec /tmp/petsc-KvGRNM/config.libraries/conftest": ------------------------------------------------------- Primary job terminated normally, but 1 process returned a non-zero exit code.. Per user-direction, the job has been aborted. ------------------------------------------------------- -------------------------------------------------------------------------- mpiexec detected that one or more processes exited with non-zero status, thus causing the job to be terminated. 
The first process to do so was: Process name: [[42589,1],0] Exit code: 1 --------------------------------------------------------------------------Could not find initialization function Library was not shared Popping language C ================================================================================ TEST alternateConfigureLibrary from config.packages.yaml(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.yaml(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default Pushing language C ================================================================================ TEST configureLibrary from config.packages.valgrind(/home/florian/software/petsc/config/BuildSystem/config/package.py:679) TESTING: configureLibrary from config.packages.valgrind(config/BuildSystem/config/package.py:679) Find an installation and check if it can work with PETSc ================================================================================== Checking for a functional valgrind Not checking for library in Compiler specific search VALGRIND: [] because no functions given to check for ================================================================================ TEST check from config.libraries(/home/florian/software/petsc/config/BuildSystem/config/libraries.py:146) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:146) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names No functions to check for in library [] [] Checking for headers Compiler specific search VALGRIND: ['/usr/include', '/usr/lib/openmpi'] Pushing language C ================================================================================ TEST checkInclude from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:86) TESTING: 
checkInclude from config.headers(config/BuildSystem/config/headers.py:86) Checks if a particular include file can be found along particular include paths Checking for header files ['valgrind/valgrind.h'] in ['/usr/include', '/usr/lib/openmpi'] Checking include with compiler flags var CPPFLAGS ['/usr/include', '/usr/lib/openmpi'] Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.headers -I/usr/include -I/usr/lib/openmpi /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/valgrind/valgrind.h" 1 3 4 # 95 "/usr/include/valgrind/valgrind.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdarg.h" 1 3 4 # 40 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdarg.h" 3 4 # 40 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdarg.h" 3 4 typedef __builtin_va_list __gnuc_va_list; # 99 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdarg.h" 3 4 typedef __gnuc_va_list va_list; # 96 "/usr/include/valgrind/valgrind.h" 2 3 4 # 402 "/usr/include/valgrind/valgrind.h" 3 4 typedef struct { unsigned long int nraddr; } OrigFn; # 6647 "/usr/include/valgrind/valgrind.h" 3 4 typedef enum { VG_USERREQ__RUNNING_ON_VALGRIND = 0x1001, VG_USERREQ__DISCARD_TRANSLATIONS = 0x1002, VG_USERREQ__CLIENT_CALL0 = 0x1101, VG_USERREQ__CLIENT_CALL1 = 0x1102, VG_USERREQ__CLIENT_CALL2 = 0x1103, VG_USERREQ__CLIENT_CALL3 = 0x1104, VG_USERREQ__COUNT_ERRORS = 0x1201, VG_USERREQ__GDB_MONITOR_COMMAND = 0x1202, VG_USERREQ__MALLOCLIKE_BLOCK = 0x1301, VG_USERREQ__RESIZEINPLACE_BLOCK = 0x130b, 
VG_USERREQ__FREELIKE_BLOCK = 0x1302, VG_USERREQ__CREATE_MEMPOOL = 0x1303, VG_USERREQ__DESTROY_MEMPOOL = 0x1304, VG_USERREQ__MEMPOOL_ALLOC = 0x1305, VG_USERREQ__MEMPOOL_FREE = 0x1306, VG_USERREQ__MEMPOOL_TRIM = 0x1307, VG_USERREQ__MOVE_MEMPOOL = 0x1308, VG_USERREQ__MEMPOOL_CHANGE = 0x1309, VG_USERREQ__MEMPOOL_EXISTS = 0x130a, # 6692 "/usr/include/valgrind/valgrind.h" 3 4 VG_USERREQ__PRINTF = 0x1401, VG_USERREQ__PRINTF_BACKTRACE = 0x1402, VG_USERREQ__PRINTF_VALIST_BY_REF = 0x1403, VG_USERREQ__PRINTF_BACKTRACE_VALIST_BY_REF = 0x1404, VG_USERREQ__STACK_REGISTER = 0x1501, VG_USERREQ__STACK_DEREGISTER = 0x1502, VG_USERREQ__STACK_CHANGE = 0x1503, VG_USERREQ__LOAD_PDB_DEBUGINFO = 0x1601, VG_USERREQ__MAP_IP_TO_SRCLOC = 0x1701, VG_USERREQ__CHANGE_ERR_DISABLEMENT = 0x1801, VG_USERREQ__VEX_INIT_FOR_IRI = 0x1901 } Vg_ClientRequest; # 6752 "/usr/include/valgrind/valgrind.h" 3 4 static int VALGRIND_PRINTF(const char *format, ...) __attribute__((format(__printf__, 1, 2), __unused__)); static int VALGRIND_PRINTF(const char *format, ...) { unsigned long _qzz_res; va_list vargs; __builtin_va_start(vargs,format); _qzz_res = __extension__ ({ volatile unsigned long int _zzq_args[6]; volatile unsigned long int _zzq_result; _zzq_args[0] = (unsigned long int)(VG_USERREQ__PRINTF_VALIST_BY_REF); _zzq_args[1] = (unsigned long int)((unsigned long)format); _zzq_args[2] = (unsigned long int)((unsigned long)&vargs); _zzq_args[3] = (unsigned long int)(0); _zzq_args[4] = (unsigned long int)(0); _zzq_args[5] = (unsigned long int)(0); __asm__ volatile("rolq $3, %%rdi ; rolq $13, %%rdi\n\t" "rolq $61, %%rdi ; rolq $51, %%rdi\n\t" "xchgq %%rbx,%%rbx" : "=d" (_zzq_result) : "a" (&_zzq_args[0]), "0" (0) : "cc", "memory" ); _zzq_result; }) ; __builtin_va_end(vargs); return (int)_qzz_res; } static int VALGRIND_PRINTF_BACKTRACE(const char *format, ...) __attribute__((format(__printf__, 1, 2), __unused__)); static int VALGRIND_PRINTF_BACKTRACE(const char *format, ...) 
{ unsigned long _qzz_res; va_list vargs; __builtin_va_start(vargs,format); _qzz_res = __extension__ ({ volatile unsigned long int _zzq_args[6]; volatile unsigned long int _zzq_result; _zzq_args[0] = (unsigned long int)(VG_USERREQ__PRINTF_BACKTRACE_VALIST_BY_REF); _zzq_args[1] = (unsigned long int)((unsigned long)format); _zzq_args[2] = (unsigned long int)((unsigned long)&vargs); _zzq_args[3] = (unsigned long int)(0); _zzq_args[4] = (unsigned long int)(0); _zzq_args[5] = (unsigned long int)(0); __asm__ volatile("rolq $3, %%rdi ; rolq $13, %%rdi\n\t" "rolq $61, %%rdi ; rolq $51, %%rdi\n\t" "xchgq %%rbx,%%rbx" : "=d" (_zzq_result) : "a" (&_zzq_args[0]), "0" (0) : "cc", "memory" ); _zzq_result; }) ; __builtin_va_end(vargs); return (int)_qzz_res; } # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Found header files ['valgrind/valgrind.h'] in ['/usr/include', '/usr/lib/openmpi'] Popping language C All intermediate test results are stored in /tmp/petsc-KvGRNM/config.packages.valgrind Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.valgrind/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.valgrind/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include int main() { RUNNING_ON_VALGRIND; ; return 0; } Popping language C 
================================================================================ TEST alternateConfigureLibrary from config.packages.tetgen(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.tetgen(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.tchem(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.tchem(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default Pushing language C ================================================================================ TEST configureLibrary from config.packages.ssl(/home/florian/software/petsc/config/BuildSystem/config/packages/ssl.py:27) TESTING: configureLibrary from config.packages.ssl(config/BuildSystem/config/packages/ssl.py:27) ================================================================================== Checking for a functional ssl Checking for library in Compiler specific search SSL: [] ================================================================================ TEST check from config.libraries(/home/florian/software/petsc/config/BuildSystem/config/libraries.py:146) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:146) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for functions [SSLv23_method] in library [] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers 
-I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char SSLv23_method(); static void _check_SSLv23_method() { SSLv23_method(); } int main() { _check_SSLv23_method();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.o: In function `_check_SSLv23_method': /tmp/petsc-KvGRNM/config.libraries/conftest.c:5: undefined reference to `SSLv23_method' collect2: error: ld returned 1 exit status Popping language C Checking for library in Compiler specific search SSL: ['libssl.a', 'libcrypto.a'] 
================================================================================ TEST check from config.libraries(/home/florian/software/petsc/config/BuildSystem/config/libraries.py:146) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:146) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for functions [SSLv23_method] in library ['libssl.a', 'libcrypto.a'] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char SSLv23_method(); static void _check_SSLv23_method() { SSLv23_method(); } int main() { _check_SSLv23_method();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lssl -lcrypto -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_LIBSSL" to "1" Defined "HAVE_LIBCRYPTO" to "1" Popping language C Checking for headers Compiler specific search SSL: ['/usr/include', '/usr/lib/openmpi'] Pushing language C ================================================================================ TEST checkInclude from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:86) TESTING: checkInclude from config.headers(config/BuildSystem/config/headers.py:86) Checks if a particular include file can be found along particular include paths Checking for header files ['openssl/ssl.h'] in ['/usr/include', '/usr/lib/openmpi'] Checking include with compiler flags var CPPFLAGS ['/usr/include', '/usr/lib/openmpi'] Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.headers -I/usr/include -I/usr/lib/openmpi /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" 
# 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/openssl/ssl.h" 1 3 4 # 146 "/usr/include/openssl/ssl.h" 3 4 # 1 "/usr/include/openssl/e_os2.h" 1 3 4 # 56 "/usr/include/openssl/e_os2.h" 3 4 # 1 "/usr/include/openssl/opensslconf.h" 1 3 4 # 57 "/usr/include/openssl/e_os2.h" 2 3 4 # 147 "/usr/include/openssl/ssl.h" 2 3 4 # 1 "/usr/include/openssl/comp.h" 1 3 4 # 1 "/usr/include/openssl/crypto.h" 1 3 4 # 120 "/usr/include/openssl/crypto.h" 3 4 # 1 "/usr/include/stdlib.h" 1 3 4 # 24 "/usr/include/stdlib.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 25 "/usr/include/stdlib.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 328 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef int wchar_t; # 33 "/usr/include/stdlib.h" 2 3 4 # 1 "/usr/include/bits/waitflags.h" 1 3 4 # 42 "/usr/include/stdlib.h" 2 3 4 # 1 "/usr/include/bits/waitstatus.h" 1 3 4 # 43 "/usr/include/stdlib.h" 2 3 4 # 56 "/usr/include/stdlib.h" 3 4 typedef struct { int quot; int rem; } div_t; typedef struct { long int quot; long int rem; } ldiv_t; __extension__ typedef struct { long 
long int quot; long long int rem; } lldiv_t; # 100 "/usr/include/stdlib.h" 3 4 extern size_t __ctype_get_mb_cur_max (void) __attribute__ ((__nothrow__ , __leaf__)) ; extern double atof (const char *__nptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; extern int atoi (const char *__nptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; extern long int atol (const char *__nptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; __extension__ extern long long int atoll (const char *__nptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; extern double strtod (const char *__restrict __nptr, char **__restrict __endptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern float strtof (const char *__restrict __nptr, char **__restrict __endptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern long double strtold (const char *__restrict __nptr, char **__restrict __endptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern long int strtol (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern unsigned long int strtoul (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); __extension__ extern long long int strtoq (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); __extension__ extern unsigned long long int strtouq (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); __extension__ 
extern long long int strtoll (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); __extension__ extern unsigned long long int strtoull (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 266 "/usr/include/stdlib.h" 3 4 extern char *l64a (long int __n) __attribute__ ((__nothrow__ , __leaf__)) ; extern long int a64l (const char *__s) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; # 1 "/usr/include/sys/types.h" 1 3 4 # 27 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; 
typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 30 "/usr/include/sys/types.h" 2 3 4 typedef __u_char u_char; typedef __u_short u_short; typedef __u_int u_int; typedef __u_long u_long; typedef __quad_t quad_t; typedef __u_quad_t u_quad_t; typedef __fsid_t fsid_t; typedef __loff_t loff_t; typedef __ino_t ino_t; # 60 "/usr/include/sys/types.h" 3 4 typedef __dev_t dev_t; typedef __gid_t gid_t; typedef __mode_t mode_t; typedef __nlink_t nlink_t; typedef __uid_t uid_t; typedef __off_t off_t; # 98 "/usr/include/sys/types.h" 3 4 typedef __pid_t pid_t; typedef __id_t id_t; typedef __ssize_t ssize_t; typedef __daddr_t daddr_t; typedef __caddr_t caddr_t; typedef __key_t key_t; # 132 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/time.h" 1 3 4 # 57 "/usr/include/time.h" 3 4 typedef __clock_t clock_t; # 73 "/usr/include/time.h" 3 4 typedef __time_t time_t; # 91 "/usr/include/time.h" 3 4 typedef __clockid_t clockid_t; # 103 "/usr/include/time.h" 3 4 typedef __timer_t timer_t; # 133 "/usr/include/sys/types.h" 2 3 4 # 146 "/usr/include/sys/types.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 147 "/usr/include/sys/types.h" 2 3 4 typedef unsigned long int ulong; typedef unsigned short int ushort; typedef unsigned int uint; # 194 "/usr/include/sys/types.h" 3 4 typedef int int8_t __attribute__ ((__mode__ (__QI__))); typedef int int16_t 
__attribute__ ((__mode__ (__HI__))); typedef int int32_t __attribute__ ((__mode__ (__SI__))); typedef int int64_t __attribute__ ((__mode__ (__DI__))); typedef unsigned int u_int8_t __attribute__ ((__mode__ (__QI__))); typedef unsigned int u_int16_t __attribute__ ((__mode__ (__HI__))); typedef unsigned int u_int32_t __attribute__ ((__mode__ (__SI__))); typedef unsigned int u_int64_t __attribute__ ((__mode__ (__DI__))); typedef int register_t __attribute__ ((__mode__ (__word__))); # 216 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/endian.h" 1 3 4 # 36 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/endian.h" 1 3 4 # 37 "/usr/include/endian.h" 2 3 4 # 60 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/byteswap.h" 1 3 4 # 28 "/usr/include/bits/byteswap.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 29 "/usr/include/bits/byteswap.h" 2 3 4 # 1 "/usr/include/bits/byteswap-16.h" 1 3 4 # 36 "/usr/include/bits/byteswap.h" 2 3 4 # 44 "/usr/include/bits/byteswap.h" 3 4 static __inline unsigned int __bswap_32 (unsigned int __bsx) { return __builtin_bswap32 (__bsx); } # 108 "/usr/include/bits/byteswap.h" 3 4 static __inline __uint64_t __bswap_64 (__uint64_t __bsx) { return __builtin_bswap64 (__bsx); } # 61 "/usr/include/endian.h" 2 3 4 # 217 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/select.h" 1 3 4 # 30 "/usr/include/sys/select.h" 3 4 # 1 "/usr/include/bits/select.h" 1 3 4 # 22 "/usr/include/bits/select.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 23 "/usr/include/bits/select.h" 2 3 4 # 31 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/sigset.h" 1 3 4 # 22 "/usr/include/bits/sigset.h" 3 4 typedef int __sig_atomic_t; typedef struct { unsigned long int __val[(1024 / (8 * sizeof (unsigned long int)))]; } __sigset_t; # 34 "/usr/include/sys/select.h" 2 3 4 typedef __sigset_t sigset_t; # 1 "/usr/include/time.h" 1 3 4 # 120 "/usr/include/time.h" 3 4 struct timespec { __time_t tv_sec; __syscall_slong_t tv_nsec; }; # 46 "/usr/include/sys/select.h" 
2 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 30 "/usr/include/bits/time.h" 3 4 struct timeval { __time_t tv_sec; __suseconds_t tv_usec; }; # 48 "/usr/include/sys/select.h" 2 3 4 typedef __suseconds_t suseconds_t; typedef long int __fd_mask; # 66 "/usr/include/sys/select.h" 3 4 typedef struct { __fd_mask __fds_bits[1024 / (8 * (int) sizeof (__fd_mask))]; } fd_set; typedef __fd_mask fd_mask; # 98 "/usr/include/sys/select.h" 3 4 # 108 "/usr/include/sys/select.h" 3 4 extern int select (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, struct timeval *__restrict __timeout); # 120 "/usr/include/sys/select.h" 3 4 extern int pselect (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, const struct timespec *__restrict __timeout, const __sigset_t *__restrict __sigmask); # 133 "/usr/include/sys/select.h" 3 4 # 220 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/sysmacros.h" 1 3 4 # 24 "/usr/include/sys/sysmacros.h" 3 4 __extension__ extern unsigned int gnu_dev_major (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned int gnu_dev_minor (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned long long int gnu_dev_makedev (unsigned int __major, unsigned int __minor) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 58 "/usr/include/sys/sysmacros.h" 3 4 # 223 "/usr/include/sys/types.h" 2 3 4 typedef __blksize_t blksize_t; typedef __blkcnt_t blkcnt_t; typedef __fsblkcnt_t fsblkcnt_t; typedef __fsfilcnt_t fsfilcnt_t; # 270 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/bits/pthreadtypes.h" 1 3 4 # 21 "/usr/include/bits/pthreadtypes.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 22 "/usr/include/bits/pthreadtypes.h" 2 3 4 # 60 "/usr/include/bits/pthreadtypes.h" 3 4 typedef unsigned long int 
pthread_t; union pthread_attr_t { char __size[56]; long int __align; }; typedef union pthread_attr_t pthread_attr_t; typedef struct __pthread_internal_list { struct __pthread_internal_list *__prev; struct __pthread_internal_list *__next; } __pthread_list_t; # 90 "/usr/include/bits/pthreadtypes.h" 3 4 typedef union { struct __pthread_mutex_s { int __lock; unsigned int __count; int __owner; unsigned int __nusers; int __kind; short __spins; short __elision; __pthread_list_t __list; # 125 "/usr/include/bits/pthreadtypes.h" 3 4 } __data; char __size[40]; long int __align; } pthread_mutex_t; typedef union { char __size[4]; int __align; } pthread_mutexattr_t; typedef union { struct { int __lock; unsigned int __futex; __extension__ unsigned long long int __total_seq; __extension__ unsigned long long int __wakeup_seq; __extension__ unsigned long long int __woken_seq; void *__mutex; unsigned int __nwaiters; unsigned int __broadcast_seq; } __data; char __size[48]; __extension__ long long int __align; } pthread_cond_t; typedef union { char __size[4]; int __align; } pthread_condattr_t; typedef unsigned int pthread_key_t; typedef int pthread_once_t; typedef union { struct { int __lock; unsigned int __nr_readers; unsigned int __readers_wakeup; unsigned int __writer_wakeup; unsigned int __nr_readers_queued; unsigned int __nr_writers_queued; int __writer; int __shared; signed char __rwelision; unsigned char __pad1[7]; unsigned long int __pad2; unsigned int __flags; } __data; # 220 "/usr/include/bits/pthreadtypes.h" 3 4 char __size[56]; long int __align; } pthread_rwlock_t; typedef union { char __size[8]; long int __align; } pthread_rwlockattr_t; typedef volatile int pthread_spinlock_t; typedef union { char __size[32]; long int __align; } pthread_barrier_t; typedef union { char __size[4]; int __align; } pthread_barrierattr_t; # 271 "/usr/include/sys/types.h" 2 3 4 # 276 "/usr/include/stdlib.h" 2 3 4 extern long int random (void) __attribute__ ((__nothrow__ , __leaf__)); extern void 
srandom (unsigned int __seed) __attribute__ ((__nothrow__ , __leaf__)); extern char *initstate (unsigned int __seed, char *__statebuf, size_t __statelen) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern char *setstate (char *__statebuf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); struct random_data { int32_t *fptr; int32_t *rptr; int32_t *state; int rand_type; int rand_deg; int rand_sep; int32_t *end_ptr; }; extern int random_r (struct random_data *__restrict __buf, int32_t *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int srandom_r (unsigned int __seed, struct random_data *__buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern int initstate_r (unsigned int __seed, char *__restrict __statebuf, size_t __statelen, struct random_data *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 4))); extern int setstate_r (char *__restrict __statebuf, struct random_data *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int rand (void) __attribute__ ((__nothrow__ , __leaf__)); extern void srand (unsigned int __seed) __attribute__ ((__nothrow__ , __leaf__)); extern int rand_r (unsigned int *__seed) __attribute__ ((__nothrow__ , __leaf__)); extern double drand48 (void) __attribute__ ((__nothrow__ , __leaf__)); extern double erand48 (unsigned short int __xsubi[3]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern long int lrand48 (void) __attribute__ ((__nothrow__ , __leaf__)); extern long int nrand48 (unsigned short int __xsubi[3]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern long int mrand48 (void) __attribute__ ((__nothrow__ , __leaf__)); extern long int jrand48 (unsigned short int __xsubi[3]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ 
((__nonnull__ (1))); extern void srand48 (long int __seedval) __attribute__ ((__nothrow__ , __leaf__)); extern unsigned short int *seed48 (unsigned short int __seed16v[3]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern void lcong48 (unsigned short int __param[7]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); struct drand48_data { unsigned short int __x[3]; unsigned short int __old_x[3]; unsigned short int __c; unsigned short int __init; __extension__ unsigned long long int __a; }; extern int drand48_r (struct drand48_data *__restrict __buffer, double *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int erand48_r (unsigned short int __xsubi[3], struct drand48_data *__restrict __buffer, double *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int lrand48_r (struct drand48_data *__restrict __buffer, long int *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int nrand48_r (unsigned short int __xsubi[3], struct drand48_data *__restrict __buffer, long int *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int mrand48_r (struct drand48_data *__restrict __buffer, long int *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int jrand48_r (unsigned short int __xsubi[3], struct drand48_data *__restrict __buffer, long int *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int srand48_r (long int __seedval, struct drand48_data *__buffer) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern int seed48_r (unsigned short int __seed16v[3], struct drand48_data *__buffer) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); 
extern int lcong48_r (unsigned short int __param[7], struct drand48_data *__buffer) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern void *malloc (size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) ; extern void *calloc (size_t __nmemb, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) ; extern void *realloc (void *__ptr, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__warn_unused_result__)); extern void free (void *__ptr) __attribute__ ((__nothrow__ , __leaf__)); extern void cfree (void *__ptr) __attribute__ ((__nothrow__ , __leaf__)); # 1 "/usr/include/alloca.h" 1 3 4 # 24 "/usr/include/alloca.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 25 "/usr/include/alloca.h" 2 3 4 extern void *alloca (size_t __size) __attribute__ ((__nothrow__ , __leaf__)); # 454 "/usr/include/stdlib.h" 2 3 4 extern void *valloc (size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) ; extern int posix_memalign (void **__memptr, size_t __alignment, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; extern void *aligned_alloc (size_t __alignment, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) __attribute__ ((__alloc_size__ (2))) ; extern void abort (void) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__noreturn__)); extern int atexit (void (*__func) (void)) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int at_quick_exit (void (*__func) (void)) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int on_exit (void (*__func) (int __status, void *__arg), void *__arg) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern void exit (int __status) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__noreturn__)); 
extern void quick_exit (int __status) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__noreturn__)); extern void _Exit (int __status) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__noreturn__)); extern char *getenv (const char *__name) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; # 539 "/usr/include/stdlib.h" 3 4 extern int putenv (char *__string) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int setenv (const char *__name, const char *__value, int __replace) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern int unsetenv (const char *__name) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int clearenv (void) __attribute__ ((__nothrow__ , __leaf__)); # 567 "/usr/include/stdlib.h" 3 4 extern char *mktemp (char *__template) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 580 "/usr/include/stdlib.h" 3 4 extern int mkstemp (char *__template) __attribute__ ((__nonnull__ (1))) ; # 602 "/usr/include/stdlib.h" 3 4 extern int mkstemps (char *__template, int __suffixlen) __attribute__ ((__nonnull__ (1))) ; # 623 "/usr/include/stdlib.h" 3 4 extern char *mkdtemp (char *__template) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; # 672 "/usr/include/stdlib.h" 3 4 extern int system (const char *__command) ; # 694 "/usr/include/stdlib.h" 3 4 extern char *realpath (const char *__restrict __name, char *__restrict __resolved) __attribute__ ((__nothrow__ , __leaf__)) ; typedef int (*__compar_fn_t) (const void *, const void *); # 712 "/usr/include/stdlib.h" 3 4 extern void *bsearch (const void *__key, const void *__base, size_t __nmemb, size_t __size, __compar_fn_t __compar) __attribute__ ((__nonnull__ (1, 2, 5))) ; extern void qsort (void *__base, size_t __nmemb, size_t __size, __compar_fn_t __compar) __attribute__ ((__nonnull__ (1, 4))); # 735 "/usr/include/stdlib.h" 3 4 
extern int abs (int __x) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; extern long int labs (long int __x) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; __extension__ extern long long int llabs (long long int __x) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; extern div_t div (int __numer, int __denom) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; extern ldiv_t ldiv (long int __numer, long int __denom) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; __extension__ extern lldiv_t lldiv (long long int __numer, long long int __denom) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; # 772 "/usr/include/stdlib.h" 3 4 extern char *ecvt (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4))) ; extern char *fcvt (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4))) ; extern char *gcvt (double __value, int __ndigit, char *__buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3))) ; extern char *qecvt (long double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4))) ; extern char *qfcvt (long double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4))) ; extern char *qgcvt (long double __value, int __ndigit, char *__buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3))) ; extern int ecvt_r (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4, 5))); extern 
int fcvt_r (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4, 5))); extern int qecvt_r (long double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4, 5))); extern int qfcvt_r (long double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4, 5))); extern int mblen (const char *__s, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); extern int mbtowc (wchar_t *__restrict __pwc, const char *__restrict __s, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); extern int wctomb (char *__s, wchar_t __wchar) __attribute__ ((__nothrow__ , __leaf__)); extern size_t mbstowcs (wchar_t *__restrict __pwcs, const char *__restrict __s, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); extern size_t wcstombs (char *__restrict __s, const wchar_t *__restrict __pwcs, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); extern int rpmatch (const char *__response) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; # 859 "/usr/include/stdlib.h" 3 4 extern int getsubopt (char **__restrict __optionp, char *const *__restrict __tokens, char **__restrict __valuep) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2, 3))) ; # 911 "/usr/include/stdlib.h" 3 4 extern int getloadavg (double __loadavg[], int __nelem) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 921 "/usr/include/stdlib.h" 3 4 # 1 "/usr/include/bits/stdlib-float.h" 1 3 4 # 922 "/usr/include/stdlib.h" 2 3 4 # 934 "/usr/include/stdlib.h" 3 4 # 121 "/usr/include/openssl/crypto.h" 2 3 4 # 1 "/usr/include/openssl/e_os2.h" 1 3 4 # 56 
"/usr/include/openssl/e_os2.h" 3 4 # 1 "/usr/include/openssl/opensslconf.h" 1 3 4 # 57 "/usr/include/openssl/e_os2.h" 2 3 4 # 123 "/usr/include/openssl/crypto.h" 2 3 4 # 1 "/usr/include/stdio.h" 1 3 4 # 29 "/usr/include/stdio.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 34 "/usr/include/stdio.h" 2 3 4 # 44 "/usr/include/stdio.h" 3 4 struct _IO_FILE; typedef struct _IO_FILE FILE; # 64 "/usr/include/stdio.h" 3 4 typedef struct _IO_FILE __FILE; # 74 "/usr/include/stdio.h" 3 4 # 1 "/usr/include/libio.h" 1 3 4 # 31 "/usr/include/libio.h" 3 4 # 1 "/usr/include/_G_config.h" 1 3 4 # 15 "/usr/include/_G_config.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 16 "/usr/include/_G_config.h" 2 3 4 # 1 "/usr/include/wchar.h" 1 3 4 # 82 "/usr/include/wchar.h" 3 4 typedef struct { int __count; union { unsigned int __wch; char __wchb[4]; } __value; } __mbstate_t; # 21 "/usr/include/_G_config.h" 2 3 4 typedef struct { __off_t __pos; __mbstate_t __state; } _G_fpos_t; typedef struct { __off64_t __pos; __mbstate_t __state; } _G_fpos64_t; # 32 "/usr/include/libio.h" 2 3 4 # 49 "/usr/include/libio.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdarg.h" 1 3 4 # 40 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdarg.h" 3 4 typedef __builtin_va_list __gnuc_va_list; # 50 "/usr/include/libio.h" 2 3 4 # 144 "/usr/include/libio.h" 3 4 struct _IO_jump_t; struct _IO_FILE; typedef void _IO_lock_t; struct _IO_marker { struct _IO_marker *_next; struct _IO_FILE *_sbuf; int _pos; # 173 "/usr/include/libio.h" 3 4 }; enum __codecvt_result { __codecvt_ok, __codecvt_partial, __codecvt_error, __codecvt_noconv }; # 241 "/usr/include/libio.h" 3 4 struct _IO_FILE { int _flags; char* _IO_read_ptr; char* _IO_read_end; char* _IO_read_base; char* _IO_write_base; char* _IO_write_ptr; char* _IO_write_end; char* _IO_buf_base; char* _IO_buf_end; char *_IO_save_base; char *_IO_backup_base; char *_IO_save_end; struct _IO_marker *_markers; 
struct _IO_FILE *_chain; int _fileno; int _flags2; __off_t _old_offset; unsigned short _cur_column; signed char _vtable_offset; char _shortbuf[1]; _IO_lock_t *_lock; # 289 "/usr/include/libio.h" 3 4 __off64_t _offset; void *__pad1; void *__pad2; void *__pad3; void *__pad4; size_t __pad5; int _mode; char _unused2[15 * sizeof (int) - 4 * sizeof (void *) - sizeof (size_t)]; }; typedef struct _IO_FILE _IO_FILE; struct _IO_FILE_plus; extern struct _IO_FILE_plus _IO_2_1_stdin_; extern struct _IO_FILE_plus _IO_2_1_stdout_; extern struct _IO_FILE_plus _IO_2_1_stderr_; # 333 "/usr/include/libio.h" 3 4 typedef __ssize_t __io_read_fn (void *__cookie, char *__buf, size_t __nbytes); typedef __ssize_t __io_write_fn (void *__cookie, const char *__buf, size_t __n); typedef int __io_seek_fn (void *__cookie, __off64_t *__pos, int __w); typedef int __io_close_fn (void *__cookie); # 385 "/usr/include/libio.h" 3 4 extern int __underflow (_IO_FILE *); extern int __uflow (_IO_FILE *); extern int __overflow (_IO_FILE *, int); # 429 "/usr/include/libio.h" 3 4 extern int _IO_getc (_IO_FILE *__fp); extern int _IO_putc (int __c, _IO_FILE *__fp); extern int _IO_feof (_IO_FILE *__fp) __attribute__ ((__nothrow__ , __leaf__)); extern int _IO_ferror (_IO_FILE *__fp) __attribute__ ((__nothrow__ , __leaf__)); extern int _IO_peekc_locked (_IO_FILE *__fp); extern void _IO_flockfile (_IO_FILE *) __attribute__ ((__nothrow__ , __leaf__)); extern void _IO_funlockfile (_IO_FILE *) __attribute__ ((__nothrow__ , __leaf__)); extern int _IO_ftrylockfile (_IO_FILE *) __attribute__ ((__nothrow__ , __leaf__)); # 459 "/usr/include/libio.h" 3 4 extern int _IO_vfscanf (_IO_FILE * __restrict, const char * __restrict, __gnuc_va_list, int *__restrict); extern int _IO_vfprintf (_IO_FILE *__restrict, const char *__restrict, __gnuc_va_list); extern __ssize_t _IO_padn (_IO_FILE *, int, __ssize_t); extern size_t _IO_sgetn (_IO_FILE *, void *, size_t); extern __off64_t _IO_seekoff (_IO_FILE *, __off64_t, int, int); extern 
__off64_t _IO_seekpos (_IO_FILE *, __off64_t, int); extern void _IO_free_backup_area (_IO_FILE *) __attribute__ ((__nothrow__ , __leaf__)); # 75 "/usr/include/stdio.h" 2 3 4 typedef __gnuc_va_list va_list; # 110 "/usr/include/stdio.h" 3 4 typedef _G_fpos_t fpos_t; # 166 "/usr/include/stdio.h" 3 4 # 1 "/usr/include/bits/stdio_lim.h" 1 3 4 # 167 "/usr/include/stdio.h" 2 3 4 extern struct _IO_FILE *stdin; extern struct _IO_FILE *stdout; extern struct _IO_FILE *stderr; extern int remove (const char *__filename) __attribute__ ((__nothrow__ , __leaf__)); extern int rename (const char *__old, const char *__new) __attribute__ ((__nothrow__ , __leaf__)); extern int renameat (int __oldfd, const char *__old, int __newfd, const char *__new) __attribute__ ((__nothrow__ , __leaf__)); extern FILE *tmpfile (void) ; # 211 "/usr/include/stdio.h" 3 4 extern char *tmpnam (char *__s) __attribute__ ((__nothrow__ , __leaf__)) ; extern char *tmpnam_r (char *__s) __attribute__ ((__nothrow__ , __leaf__)) ; # 229 "/usr/include/stdio.h" 3 4 extern char *tempnam (const char *__dir, const char *__pfx) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) ; extern int fclose (FILE *__stream); extern int fflush (FILE *__stream); # 254 "/usr/include/stdio.h" 3 4 extern int fflush_unlocked (FILE *__stream); # 268 "/usr/include/stdio.h" 3 4 extern FILE *fopen (const char *__restrict __filename, const char *__restrict __modes) ; extern FILE *freopen (const char *__restrict __filename, const char *__restrict __modes, FILE *__restrict __stream) ; # 297 "/usr/include/stdio.h" 3 4 # 308 "/usr/include/stdio.h" 3 4 extern FILE *fdopen (int __fd, const char *__modes) __attribute__ ((__nothrow__ , __leaf__)) ; # 321 "/usr/include/stdio.h" 3 4 extern FILE *fmemopen (void *__s, size_t __len, const char *__modes) __attribute__ ((__nothrow__ , __leaf__)) ; extern FILE *open_memstream (char **__bufloc, size_t *__sizeloc) __attribute__ ((__nothrow__ , __leaf__)) ; extern void setbuf (FILE 
*__restrict __stream, char *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)); extern int setvbuf (FILE *__restrict __stream, char *__restrict __buf, int __modes, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); extern void setbuffer (FILE *__restrict __stream, char *__restrict __buf, size_t __size) __attribute__ ((__nothrow__ , __leaf__)); extern void setlinebuf (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)); extern int fprintf (FILE *__restrict __stream, const char *__restrict __format, ...); extern int printf (const char *__restrict __format, ...); extern int sprintf (char *__restrict __s, const char *__restrict __format, ...) __attribute__ ((__nothrow__)); extern int vfprintf (FILE *__restrict __s, const char *__restrict __format, __gnuc_va_list __arg); extern int vprintf (const char *__restrict __format, __gnuc_va_list __arg); extern int vsprintf (char *__restrict __s, const char *__restrict __format, __gnuc_va_list __arg) __attribute__ ((__nothrow__)); extern int snprintf (char *__restrict __s, size_t __maxlen, const char *__restrict __format, ...) __attribute__ ((__nothrow__)) __attribute__ ((__format__ (__printf__, 3, 4))); extern int vsnprintf (char *__restrict __s, size_t __maxlen, const char *__restrict __format, __gnuc_va_list __arg) __attribute__ ((__nothrow__)) __attribute__ ((__format__ (__printf__, 3, 0))); # 414 "/usr/include/stdio.h" 3 4 extern int vdprintf (int __fd, const char *__restrict __fmt, __gnuc_va_list __arg) __attribute__ ((__format__ (__printf__, 2, 0))); extern int dprintf (int __fd, const char *__restrict __fmt, ...) __attribute__ ((__format__ (__printf__, 2, 3))); extern int fscanf (FILE *__restrict __stream, const char *__restrict __format, ...) ; extern int scanf (const char *__restrict __format, ...) ; extern int sscanf (const char *__restrict __s, const char *__restrict __format, ...) 
__attribute__ ((__nothrow__ , __leaf__)); # 445 "/usr/include/stdio.h" 3 4 extern int fscanf (FILE *__restrict __stream, const char *__restrict __format, ...) __asm__ ("" "__isoc99_fscanf") ; extern int scanf (const char *__restrict __format, ...) __asm__ ("" "__isoc99_scanf") ; extern int sscanf (const char *__restrict __s, const char *__restrict __format, ...) __asm__ ("" "__isoc99_sscanf") __attribute__ ((__nothrow__ , __leaf__)) ; # 465 "/usr/include/stdio.h" 3 4 extern int vfscanf (FILE *__restrict __s, const char *__restrict __format, __gnuc_va_list __arg) __attribute__ ((__format__ (__scanf__, 2, 0))) ; extern int vscanf (const char *__restrict __format, __gnuc_va_list __arg) __attribute__ ((__format__ (__scanf__, 1, 0))) ; extern int vsscanf (const char *__restrict __s, const char *__restrict __format, __gnuc_va_list __arg) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__format__ (__scanf__, 2, 0))); # 496 "/usr/include/stdio.h" 3 4 extern int vfscanf (FILE *__restrict __s, const char *__restrict __format, __gnuc_va_list __arg) __asm__ ("" "__isoc99_vfscanf") __attribute__ ((__format__ (__scanf__, 2, 0))) ; extern int vscanf (const char *__restrict __format, __gnuc_va_list __arg) __asm__ ("" "__isoc99_vscanf") __attribute__ ((__format__ (__scanf__, 1, 0))) ; extern int vsscanf (const char *__restrict __s, const char *__restrict __format, __gnuc_va_list __arg) __asm__ ("" "__isoc99_vsscanf") __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__format__ (__scanf__, 2, 0))); # 524 "/usr/include/stdio.h" 3 4 extern int fgetc (FILE *__stream); extern int getc (FILE *__stream); extern int getchar (void); # 552 "/usr/include/stdio.h" 3 4 extern int getc_unlocked (FILE *__stream); extern int getchar_unlocked (void); # 563 "/usr/include/stdio.h" 3 4 extern int fgetc_unlocked (FILE *__stream); extern int fputc (int __c, FILE *__stream); extern int putc (int __c, FILE *__stream); extern int putchar (int __c); # 596 "/usr/include/stdio.h" 3 4 extern 
int fputc_unlocked (int __c, FILE *__stream); extern int putc_unlocked (int __c, FILE *__stream); extern int putchar_unlocked (int __c); extern int getw (FILE *__stream); extern int putw (int __w, FILE *__stream); extern char *fgets (char *__restrict __s, int __n, FILE *__restrict __stream) ; # 642 "/usr/include/stdio.h" 3 4 # 667 "/usr/include/stdio.h" 3 4 extern __ssize_t __getdelim (char **__restrict __lineptr, size_t *__restrict __n, int __delimiter, FILE *__restrict __stream) ; extern __ssize_t getdelim (char **__restrict __lineptr, size_t *__restrict __n, int __delimiter, FILE *__restrict __stream) ; extern __ssize_t getline (char **__restrict __lineptr, size_t *__restrict __n, FILE *__restrict __stream) ; extern int fputs (const char *__restrict __s, FILE *__restrict __stream); extern int puts (const char *__s); extern int ungetc (int __c, FILE *__stream); extern size_t fread (void *__restrict __ptr, size_t __size, size_t __n, FILE *__restrict __stream) ; extern size_t fwrite (const void *__restrict __ptr, size_t __size, size_t __n, FILE *__restrict __s); # 739 "/usr/include/stdio.h" 3 4 extern size_t fread_unlocked (void *__restrict __ptr, size_t __size, size_t __n, FILE *__restrict __stream) ; extern size_t fwrite_unlocked (const void *__restrict __ptr, size_t __size, size_t __n, FILE *__restrict __stream); extern int fseek (FILE *__stream, long int __off, int __whence); extern long int ftell (FILE *__stream) ; extern void rewind (FILE *__stream); # 775 "/usr/include/stdio.h" 3 4 extern int fseeko (FILE *__stream, __off_t __off, int __whence); extern __off_t ftello (FILE *__stream) ; # 794 "/usr/include/stdio.h" 3 4 extern int fgetpos (FILE *__restrict __stream, fpos_t *__restrict __pos); extern int fsetpos (FILE *__stream, const fpos_t *__pos); # 817 "/usr/include/stdio.h" 3 4 # 826 "/usr/include/stdio.h" 3 4 extern void clearerr (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)); extern int feof (FILE *__stream) __attribute__ ((__nothrow__ , 
__leaf__)) ; extern int ferror (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)) ; extern void clearerr_unlocked (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)); extern int feof_unlocked (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)) ; extern int ferror_unlocked (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)) ; extern void perror (const char *__s); # 1 "/usr/include/bits/sys_errlist.h" 1 3 4 # 26 "/usr/include/bits/sys_errlist.h" 3 4 extern int sys_nerr; extern const char *const sys_errlist[]; # 856 "/usr/include/stdio.h" 2 3 4 extern int fileno (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)) ; extern int fileno_unlocked (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)) ; # 874 "/usr/include/stdio.h" 3 4 extern FILE *popen (const char *__command, const char *__modes) ; extern int pclose (FILE *__stream); extern char *ctermid (char *__s) __attribute__ ((__nothrow__ , __leaf__)); # 914 "/usr/include/stdio.h" 3 4 extern void flockfile (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)); extern int ftrylockfile (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)) ; extern void funlockfile (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)); # 944 "/usr/include/stdio.h" 3 4 # 126 "/usr/include/openssl/crypto.h" 2 3 4 # 1 "/usr/include/openssl/stack.h" 1 3 4 # 66 "/usr/include/openssl/stack.h" 3 4 typedef struct stack_st { int num; char **data; int sorted; int num_alloc; int (*comp) (const void *, const void *); } _STACK; int sk_num(const _STACK *); void *sk_value(const _STACK *, int); void *sk_set(_STACK *, int, void *); _STACK *sk_new(int (*cmp) (const void *, const void *)); _STACK *sk_new_null(void); void sk_free(_STACK *); void sk_pop_free(_STACK *st, void (*func) (void *)); _STACK *sk_deep_copy(_STACK *, void *(*)(void *), void (*)(void *)); int sk_insert(_STACK *sk, void *data, int where); void *sk_delete(_STACK *st, int loc); void *sk_delete_ptr(_STACK *st, void *p); int sk_find(_STACK *st, 
void *data); int sk_find_ex(_STACK *st, void *data); int sk_push(_STACK *st, void *data); int sk_unshift(_STACK *st, void *data); void *sk_shift(_STACK *st); void *sk_pop(_STACK *st); void sk_zero(_STACK *st); int (*sk_set_cmp_func(_STACK *sk, int (*c) (const void *, const void *))) (const void *, const void *); _STACK *sk_dup(_STACK *st); void sk_sort(_STACK *st); int sk_is_sorted(const _STACK *st); # 129 "/usr/include/openssl/crypto.h" 2 3 4 # 1 "/usr/include/openssl/safestack.h" 1 3 4 # 119 "/usr/include/openssl/safestack.h" 3 4 typedef char *OPENSSL_STRING; typedef const char *OPENSSL_CSTRING; # 131 "/usr/include/openssl/safestack.h" 3 4 struct stack_st_OPENSSL_STRING { _STACK stack; }; typedef void *OPENSSL_BLOCK; struct stack_st_OPENSSL_BLOCK { _STACK stack; }; # 130 "/usr/include/openssl/crypto.h" 2 3 4 # 1 "/usr/include/openssl/opensslv.h" 1 3 4 # 131 "/usr/include/openssl/crypto.h" 2 3 4 # 1 "/usr/include/openssl/ossl_typ.h" 1 3 4 # 62 "/usr/include/openssl/ossl_typ.h" 3 4 # 1 "/usr/include/openssl/e_os2.h" 1 3 4 # 56 "/usr/include/openssl/e_os2.h" 3 4 # 1 "/usr/include/openssl/opensslconf.h" 1 3 4 # 57 "/usr/include/openssl/e_os2.h" 2 3 4 # 63 "/usr/include/openssl/ossl_typ.h" 2 3 4 # 83 "/usr/include/openssl/ossl_typ.h" 3 4 typedef struct asn1_string_st ASN1_INTEGER; typedef struct asn1_string_st ASN1_ENUMERATED; typedef struct asn1_string_st ASN1_BIT_STRING; typedef struct asn1_string_st ASN1_OCTET_STRING; typedef struct asn1_string_st ASN1_PRINTABLESTRING; typedef struct asn1_string_st ASN1_T61STRING; typedef struct asn1_string_st ASN1_IA5STRING; typedef struct asn1_string_st ASN1_GENERALSTRING; typedef struct asn1_string_st ASN1_UNIVERSALSTRING; typedef struct asn1_string_st ASN1_BMPSTRING; typedef struct asn1_string_st ASN1_UTCTIME; typedef struct asn1_string_st ASN1_TIME; typedef struct asn1_string_st ASN1_GENERALIZEDTIME; typedef struct asn1_string_st ASN1_VISIBLESTRING; typedef struct asn1_string_st ASN1_UTF8STRING; typedef struct asn1_string_st 
ASN1_STRING; typedef int ASN1_BOOLEAN; typedef int ASN1_NULL; typedef struct asn1_object_st ASN1_OBJECT; typedef struct ASN1_ITEM_st ASN1_ITEM; typedef struct asn1_pctx_st ASN1_PCTX; # 120 "/usr/include/openssl/ossl_typ.h" 3 4 typedef struct bignum_st BIGNUM; typedef struct bignum_ctx BN_CTX; typedef struct bn_blinding_st BN_BLINDING; typedef struct bn_mont_ctx_st BN_MONT_CTX; typedef struct bn_recp_ctx_st BN_RECP_CTX; typedef struct bn_gencb_st BN_GENCB; typedef struct buf_mem_st BUF_MEM; typedef struct evp_cipher_st EVP_CIPHER; typedef struct evp_cipher_ctx_st EVP_CIPHER_CTX; typedef struct env_md_st EVP_MD; typedef struct env_md_ctx_st EVP_MD_CTX; typedef struct evp_pkey_st EVP_PKEY; typedef struct evp_pkey_asn1_method_st EVP_PKEY_ASN1_METHOD; typedef struct evp_pkey_method_st EVP_PKEY_METHOD; typedef struct evp_pkey_ctx_st EVP_PKEY_CTX; typedef struct dh_st DH; typedef struct dh_method DH_METHOD; typedef struct dsa_st DSA; typedef struct dsa_method DSA_METHOD; typedef struct rsa_st RSA; typedef struct rsa_meth_st RSA_METHOD; typedef struct rand_meth_st RAND_METHOD; typedef struct ecdh_method ECDH_METHOD; typedef struct ecdsa_method ECDSA_METHOD; typedef struct x509_st X509; typedef struct X509_algor_st X509_ALGOR; typedef struct X509_crl_st X509_CRL; typedef struct x509_crl_method_st X509_CRL_METHOD; typedef struct x509_revoked_st X509_REVOKED; typedef struct X509_name_st X509_NAME; typedef struct X509_pubkey_st X509_PUBKEY; typedef struct x509_store_st X509_STORE; typedef struct x509_store_ctx_st X509_STORE_CTX; typedef struct pkcs8_priv_key_info_st PKCS8_PRIV_KEY_INFO; typedef struct v3_ext_ctx X509V3_CTX; typedef struct conf_st CONF; typedef struct store_st STORE; typedef struct store_method_st STORE_METHOD; typedef struct ui_st UI; typedef struct ui_method_st UI_METHOD; typedef struct st_ERR_FNS ERR_FNS; typedef struct engine_st ENGINE; typedef struct ssl_st SSL; typedef struct ssl_ctx_st SSL_CTX; typedef struct comp_method_st COMP_METHOD; typedef struct 
X509_POLICY_NODE_st X509_POLICY_NODE; typedef struct X509_POLICY_LEVEL_st X509_POLICY_LEVEL; typedef struct X509_POLICY_TREE_st X509_POLICY_TREE; typedef struct X509_POLICY_CACHE_st X509_POLICY_CACHE; typedef struct AUTHORITY_KEYID_st AUTHORITY_KEYID; typedef struct DIST_POINT_st DIST_POINT; typedef struct ISSUING_DIST_POINT_st ISSUING_DIST_POINT; typedef struct NAME_CONSTRAINTS_st NAME_CONSTRAINTS; typedef struct crypto_ex_data_st CRYPTO_EX_DATA; typedef int CRYPTO_EX_new (void *parent, void *ptr, CRYPTO_EX_DATA *ad, int idx, long argl, void *argp); typedef void CRYPTO_EX_free (void *parent, void *ptr, CRYPTO_EX_DATA *ad, int idx, long argl, void *argp); typedef int CRYPTO_EX_dup (CRYPTO_EX_DATA *to, CRYPTO_EX_DATA *from, void *from_d, int idx, long argl, void *argp); typedef struct ocsp_req_ctx_st OCSP_REQ_CTX; typedef struct ocsp_response_st OCSP_RESPONSE; typedef struct ocsp_responder_id_st OCSP_RESPID; # 132 "/usr/include/openssl/crypto.h" 2 3 4 # 141 "/usr/include/openssl/crypto.h" 3 4 # 1 "/usr/include/openssl/symhacks.h" 1 3 4 # 58 "/usr/include/openssl/symhacks.h" 3 4 # 1 "/usr/include/openssl/e_os2.h" 1 3 4 # 56 "/usr/include/openssl/e_os2.h" 3 4 # 1 "/usr/include/openssl/opensslconf.h" 1 3 4 # 57 "/usr/include/openssl/e_os2.h" 2 3 4 # 59 "/usr/include/openssl/symhacks.h" 2 3 4 # 142 "/usr/include/openssl/crypto.h" 2 3 4 # 175 "/usr/include/openssl/crypto.h" 3 4 typedef struct openssl_item_st { int code; void *value; size_t value_size; size_t *value_length; } OPENSSL_ITEM; # 262 "/usr/include/openssl/crypto.h" 3 4 typedef struct { int references; struct CRYPTO_dynlock_value *data; } CRYPTO_dynlock; # 290 "/usr/include/openssl/crypto.h" 3 4 typedef struct bio_st BIO_dummy; struct crypto_ex_data_st { struct stack_st_void *sk; int dummy; }; struct stack_st_void { _STACK stack; }; typedef struct crypto_ex_data_func_st { long argl; void *argp; CRYPTO_EX_new *new_func; CRYPTO_EX_free *free_func; CRYPTO_EX_dup *dup_func; } CRYPTO_EX_DATA_FUNCS; struct 
stack_st_CRYPTO_EX_DATA_FUNCS { _STACK stack; }; # 369 "/usr/include/openssl/crypto.h" 3 4 int CRYPTO_mem_ctrl(int mode); int CRYPTO_is_mem_check_on(void); # 396 "/usr/include/openssl/crypto.h" 3 4 const char *SSLeay_version(int type); unsigned long SSLeay(void); int OPENSSL_issetugid(void); typedef struct st_CRYPTO_EX_DATA_IMPL CRYPTO_EX_DATA_IMPL; const CRYPTO_EX_DATA_IMPL *CRYPTO_get_ex_data_implementation(void); int CRYPTO_set_ex_data_implementation(const CRYPTO_EX_DATA_IMPL *i); int CRYPTO_ex_data_new_class(void); int CRYPTO_get_ex_new_index(int class_index, long argl, void *argp, CRYPTO_EX_new *new_func, CRYPTO_EX_dup *dup_func, CRYPTO_EX_free *free_func); int CRYPTO_new_ex_data(int class_index, void *obj, CRYPTO_EX_DATA *ad); int CRYPTO_dup_ex_data(int class_index, CRYPTO_EX_DATA *to, CRYPTO_EX_DATA *from); void CRYPTO_free_ex_data(int class_index, void *obj, CRYPTO_EX_DATA *ad); int CRYPTO_set_ex_data(CRYPTO_EX_DATA *ad, int idx, void *val); void *CRYPTO_get_ex_data(const CRYPTO_EX_DATA *ad, int idx); void CRYPTO_cleanup_all_ex_data(void); int CRYPTO_get_new_lockid(char *name); int CRYPTO_num_locks(void); void CRYPTO_lock(int mode, int type, const char *file, int line); void CRYPTO_set_locking_callback(void (*func) (int mode, int type, const char *file, int line)); void (*CRYPTO_get_locking_callback(void)) (int mode, int type, const char *file, int line); void CRYPTO_set_add_lock_callback(int (*func) (int *num, int mount, int type, const char *file, int line)); int (*CRYPTO_get_add_lock_callback(void)) (int *num, int mount, int type, const char *file, int line); typedef struct crypto_threadid_st { void *ptr; unsigned long val; } CRYPTO_THREADID; void CRYPTO_THREADID_set_numeric(CRYPTO_THREADID *id, unsigned long val); void CRYPTO_THREADID_set_pointer(CRYPTO_THREADID *id, void *ptr); int CRYPTO_THREADID_set_callback(void (*threadid_func) (CRYPTO_THREADID *)); void (*CRYPTO_THREADID_get_callback(void)) (CRYPTO_THREADID *); void 
CRYPTO_THREADID_current(CRYPTO_THREADID *id); int CRYPTO_THREADID_cmp(const CRYPTO_THREADID *a, const CRYPTO_THREADID *b); void CRYPTO_THREADID_cpy(CRYPTO_THREADID *dest, const CRYPTO_THREADID *src); unsigned long CRYPTO_THREADID_hash(const CRYPTO_THREADID *id); void CRYPTO_set_id_callback(unsigned long (*func) (void)); unsigned long (*CRYPTO_get_id_callback(void)) (void); unsigned long CRYPTO_thread_id(void); const char *CRYPTO_get_lock_name(int type); int CRYPTO_add_lock(int *pointer, int amount, int type, const char *file, int line); int CRYPTO_get_new_dynlockid(void); void CRYPTO_destroy_dynlockid(int i); struct CRYPTO_dynlock_value *CRYPTO_get_dynlock_value(int i); void CRYPTO_set_dynlock_create_callback(struct CRYPTO_dynlock_value *(*dyn_create_function) (const char *file, int line)); void CRYPTO_set_dynlock_lock_callback(void (*dyn_lock_function) (int mode, struct CRYPTO_dynlock_value *l, const char *file, int line)); void CRYPTO_set_dynlock_destroy_callback(void (*dyn_destroy_function) (struct CRYPTO_dynlock_value *l, const char *file, int line)); struct CRYPTO_dynlock_value *(*CRYPTO_get_dynlock_create_callback(void)) (const char *file, int line); void (*CRYPTO_get_dynlock_lock_callback(void)) (int mode, struct CRYPTO_dynlock_value *l, const char *file, int line); void (*CRYPTO_get_dynlock_destroy_callback(void)) (struct CRYPTO_dynlock_value *l, const char *file, int line); int CRYPTO_set_mem_functions(void *(*m) (size_t), void *(*r) (void *, size_t), void (*f) (void *)); int CRYPTO_set_locked_mem_functions(void *(*m) (size_t), void (*free_func) (void *)); int CRYPTO_set_mem_ex_functions(void *(*m) (size_t, const char *, int), void *(*r) (void *, size_t, const char *, int), void (*f) (void *)); int CRYPTO_set_locked_mem_ex_functions(void *(*m) (size_t, const char *, int), void (*free_func) (void *)); int CRYPTO_set_mem_debug_functions(void (*m) (void *, int, const char *, int, int), void (*r) (void *, void *, int, const char *, int, int), void (*f) (void 
*, int), void (*so) (long), long (*go) (void)); void CRYPTO_get_mem_functions(void *(**m) (size_t), void *(**r) (void *, size_t), void (**f) (void *)); void CRYPTO_get_locked_mem_functions(void *(**m) (size_t), void (**f) (void *)); void CRYPTO_get_mem_ex_functions(void *(**m) (size_t, const char *, int), void *(**r) (void *, size_t, const char *, int), void (**f) (void *)); void CRYPTO_get_locked_mem_ex_functions(void *(**m) (size_t, const char *, int), void (**f) (void *)); void CRYPTO_get_mem_debug_functions(void (**m) (void *, int, const char *, int, int), void (**r) (void *, void *, int, const char *, int, int), void (**f) (void *, int), void (**so) (long), long (**go) (void)); void *CRYPTO_malloc_locked(int num, const char *file, int line); void CRYPTO_free_locked(void *ptr); void *CRYPTO_malloc(int num, const char *file, int line); char *CRYPTO_strdup(const char *str, const char *file, int line); void CRYPTO_free(void *ptr); void *CRYPTO_realloc(void *addr, int num, const char *file, int line); void *CRYPTO_realloc_clean(void *addr, int old_num, int num, const char *file, int line); void *CRYPTO_remalloc(void *addr, int num, const char *file, int line); void OPENSSL_cleanse(void *ptr, size_t len); void CRYPTO_set_mem_debug_options(long bits); long CRYPTO_get_mem_debug_options(void); int CRYPTO_push_info_(const char *info, const char *file, int line); int CRYPTO_pop_info(void); int CRYPTO_remove_all_info(void); # 563 "/usr/include/openssl/crypto.h" 3 4 void CRYPTO_dbg_malloc(void *addr, int num, const char *file, int line, int before_p); void CRYPTO_dbg_realloc(void *addr1, void *addr2, int num, const char *file, int line, int before_p); void CRYPTO_dbg_free(void *addr, int before_p); # 577 "/usr/include/openssl/crypto.h" 3 4 void CRYPTO_dbg_set_options(long bits); long CRYPTO_dbg_get_options(void); void CRYPTO_mem_leaks_fp(FILE *); void CRYPTO_mem_leaks(struct bio_st *bio); typedef void *CRYPTO_MEM_LEAK_CB (unsigned long, const char *, int, int, void *); 
void CRYPTO_mem_leaks_cb(CRYPTO_MEM_LEAK_CB *cb); void OpenSSLDie(const char *file, int line, const char *assertion); unsigned long *OPENSSL_ia32cap_loc(void); int OPENSSL_isservice(void); int FIPS_mode(void); int FIPS_mode_set(int r); void OPENSSL_init(void); # 631 "/usr/include/openssl/crypto.h" 3 4 int CRYPTO_memcmp(const volatile void *a, const volatile void *b, size_t len); void ERR_load_CRYPTO_strings(void); # 6 "/usr/include/openssl/comp.h" 2 3 4 # 15 "/usr/include/openssl/comp.h" 3 4 typedef struct comp_ctx_st COMP_CTX; struct comp_method_st { int type; const char *name; int (*init) (COMP_CTX *ctx); void (*finish) (COMP_CTX *ctx); int (*compress) (COMP_CTX *ctx, unsigned char *out, unsigned int olen, unsigned char *in, unsigned int ilen); int (*expand) (COMP_CTX *ctx, unsigned char *out, unsigned int olen, unsigned char *in, unsigned int ilen); long (*ctrl) (void); long (*callback_ctrl) (void); }; struct comp_ctx_st { COMP_METHOD *meth; unsigned long compress_in; unsigned long compress_out; unsigned long expand_in; unsigned long expand_out; CRYPTO_EX_DATA ex_data; }; COMP_CTX *COMP_CTX_new(COMP_METHOD *meth); void COMP_CTX_free(COMP_CTX *ctx); int COMP_compress_block(COMP_CTX *ctx, unsigned char *out, int olen, unsigned char *in, int ilen); int COMP_expand_block(COMP_CTX *ctx, unsigned char *out, int olen, unsigned char *in, int ilen); COMP_METHOD *COMP_rle(void); COMP_METHOD *COMP_zlib(void); void COMP_zlib_cleanup(void); # 65 "/usr/include/openssl/comp.h" 3 4 void ERR_load_COMP_strings(void); # 150 "/usr/include/openssl/ssl.h" 2 3 4 # 1 "/usr/include/openssl/bio.h" 1 3 4 # 62 "/usr/include/openssl/bio.h" 3 4 # 1 "/usr/include/openssl/e_os2.h" 1 3 4 # 56 "/usr/include/openssl/e_os2.h" 3 4 # 1 "/usr/include/openssl/opensslconf.h" 1 3 4 # 57 "/usr/include/openssl/e_os2.h" 2 3 4 # 63 "/usr/include/openssl/bio.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdarg.h" 1 3 4 # 68 "/usr/include/openssl/bio.h" 2 3 4 # 238 "/usr/include/openssl/bio.h" 
3 4 typedef struct bio_st BIO; void BIO_set_flags(BIO *b, int flags); int BIO_test_flags(const BIO *b, int flags); void BIO_clear_flags(BIO *b, int flags); # 298 "/usr/include/openssl/bio.h" 3 4 long (*BIO_get_callback(const BIO *b)) (struct bio_st *, int, const char *, int, long, long); void BIO_set_callback(BIO *b, long (*callback) (struct bio_st *, int, const char *, int, long, long)); char *BIO_get_callback_arg(const BIO *b); void BIO_set_callback_arg(BIO *b, char *arg); const char *BIO_method_name(const BIO *b); int BIO_method_type(const BIO *b); typedef void bio_info_cb (struct bio_st *, int, const char *, int, long, long); typedef struct bio_method_st { int type; const char *name; int (*bwrite) (BIO *, const char *, int); int (*bread) (BIO *, char *, int); int (*bputs) (BIO *, const char *); int (*bgets) (BIO *, char *, int); long (*ctrl) (BIO *, int, long, void *); int (*create) (BIO *); int (*destroy) (BIO *); long (*callback_ctrl) (BIO *, int, bio_info_cb *); } BIO_METHOD; struct bio_st { BIO_METHOD *method; long (*callback) (struct bio_st *, int, const char *, int, long, long); char *cb_arg; int init; int shutdown; int flags; int retry_reason; int num; void *ptr; struct bio_st *next_bio; struct bio_st *prev_bio; int references; unsigned long num_read; unsigned long num_write; CRYPTO_EX_DATA ex_data; }; struct stack_st_BIO { _STACK stack; }; typedef struct bio_f_buffer_ctx_struct { # 359 "/usr/include/openssl/bio.h" 3 4 int ibuf_size; int obuf_size; char *ibuf; int ibuf_len; int ibuf_off; char *obuf; int obuf_len; int obuf_off; } BIO_F_BUFFER_CTX; typedef int asn1_ps_func (BIO *b, unsigned char **pbuf, int *plen, void *parg); # 594 "/usr/include/openssl/bio.h" 3 4 size_t BIO_ctrl_pending(BIO *b); size_t BIO_ctrl_wpending(BIO *b); # 613 "/usr/include/openssl/bio.h" 3 4 size_t BIO_ctrl_get_write_guarantee(BIO *b); size_t BIO_ctrl_get_read_request(BIO *b); int BIO_ctrl_reset_read_request(BIO *b); # 636 "/usr/include/openssl/bio.h" 3 4 int BIO_set_ex_data(BIO 
*bio, int idx, void *data); void *BIO_get_ex_data(BIO *bio, int idx); int BIO_get_ex_new_index(long argl, void *argp, CRYPTO_EX_new *new_func, CRYPTO_EX_dup *dup_func, CRYPTO_EX_free *free_func); unsigned long BIO_number_read(BIO *bio); unsigned long BIO_number_written(BIO *bio); int BIO_asn1_set_prefix(BIO *b, asn1_ps_func *prefix, asn1_ps_func *prefix_free); int BIO_asn1_get_prefix(BIO *b, asn1_ps_func **pprefix, asn1_ps_func **pprefix_free); int BIO_asn1_set_suffix(BIO *b, asn1_ps_func *suffix, asn1_ps_func *suffix_free); int BIO_asn1_get_suffix(BIO *b, asn1_ps_func **psuffix, asn1_ps_func **psuffix_free); BIO_METHOD *BIO_s_file(void); BIO *BIO_new_file(const char *filename, const char *mode); BIO *BIO_new_fp(FILE *stream, int close_flag); BIO *BIO_new(BIO_METHOD *type); int BIO_set(BIO *a, BIO_METHOD *type); int BIO_free(BIO *a); void BIO_vfree(BIO *a); int BIO_read(BIO *b, void *data, int len); int BIO_gets(BIO *bp, char *buf, int size); int BIO_write(BIO *b, const void *data, int len); int BIO_puts(BIO *bp, const char *buf); int BIO_indent(BIO *b, int indent, int max); long BIO_ctrl(BIO *bp, int cmd, long larg, void *parg); long BIO_callback_ctrl(BIO *b, int cmd, void (*fp) (struct bio_st *, int, const char *, int, long, long)); char *BIO_ptr_ctrl(BIO *bp, int cmd, long larg); long BIO_int_ctrl(BIO *bp, int cmd, long larg, int iarg); BIO *BIO_push(BIO *b, BIO *append); BIO *BIO_pop(BIO *b); void BIO_free_all(BIO *a); BIO *BIO_find_type(BIO *b, int bio_type); BIO *BIO_next(BIO *b); BIO *BIO_get_retry_BIO(BIO *bio, int *reason); int BIO_get_retry_reason(BIO *bio); BIO *BIO_dup_chain(BIO *in); int BIO_nread0(BIO *bio, char **buf); int BIO_nread(BIO *bio, char **buf, int num); int BIO_nwrite0(BIO *bio, char **buf); int BIO_nwrite(BIO *bio, char **buf, int num); long BIO_debug_callback(BIO *bio, int cmd, const char *argp, int argi, long argl, long ret); BIO_METHOD *BIO_s_mem(void); BIO *BIO_new_mem_buf(const void *buf, int len); BIO_METHOD *BIO_s_socket(void); 
BIO_METHOD *BIO_s_connect(void); BIO_METHOD *BIO_s_accept(void); BIO_METHOD *BIO_s_fd(void); BIO_METHOD *BIO_s_log(void); BIO_METHOD *BIO_s_bio(void); BIO_METHOD *BIO_s_null(void); BIO_METHOD *BIO_f_null(void); BIO_METHOD *BIO_f_buffer(void); BIO_METHOD *BIO_f_nbio_test(void); BIO_METHOD *BIO_s_datagram(void); int BIO_sock_should_retry(int i); int BIO_sock_non_fatal_error(int error); int BIO_dgram_non_fatal_error(int error); int BIO_fd_should_retry(int i); int BIO_fd_non_fatal_error(int error); int BIO_dump_cb(int (*cb) (const void *data, size_t len, void *u), void *u, const char *s, int len); int BIO_dump_indent_cb(int (*cb) (const void *data, size_t len, void *u), void *u, const char *s, int len, int indent); int BIO_dump(BIO *b, const char *bytes, int len); int BIO_dump_indent(BIO *b, const char *bytes, int len, int indent); int BIO_dump_fp(FILE *fp, const char *s, int len); int BIO_dump_indent_fp(FILE *fp, const char *s, int len, int indent); int BIO_hex_string(BIO *out, int indent, int width, unsigned char *data, int datalen); struct hostent *BIO_gethostbyname(const char *name); # 746 "/usr/include/openssl/bio.h" 3 4 int BIO_sock_error(int sock); int BIO_socket_ioctl(int fd, long type, void *arg); int BIO_socket_nbio(int fd, int mode); int BIO_get_port(const char *str, unsigned short *port_ptr); int BIO_get_host_ip(const char *str, unsigned char *ip); int BIO_get_accept_socket(char *host_port, int mode); int BIO_accept(int sock, char **ip_port); int BIO_sock_init(void); void BIO_sock_cleanup(void); int BIO_set_tcp_ndelay(int sock, int turn_on); BIO *BIO_new_socket(int sock, int close_flag); BIO *BIO_new_dgram(int fd, int close_flag); # 771 "/usr/include/openssl/bio.h" 3 4 BIO *BIO_new_fd(int fd, int close_flag); BIO *BIO_new_connect(const char *host_port); BIO *BIO_new_accept(const char *host_port); int BIO_new_bio_pair(BIO **bio1, size_t writebuf1, BIO **bio2, size_t writebuf2); void BIO_copy_next_retry(BIO *b); # 794 "/usr/include/openssl/bio.h" 3 4 int 
BIO_printf(BIO *bio, const char *format, ...) __attribute__((__format__(__printf__, 2, 3))); int BIO_vprintf(BIO *bio, const char *format, va_list args) __attribute__((__format__(__printf__, 2, 0))); int BIO_snprintf(char *buf, size_t n, const char *format, ...) __attribute__((__format__(__printf__, 3, 4))); int BIO_vsnprintf(char *buf, size_t n, const char *format, va_list args) __attribute__((__format__(__printf__, 3, 0))); void ERR_load_BIO_strings(void); # 153 "/usr/include/openssl/ssl.h" 2 3 4 # 1 "/usr/include/openssl/x509.h" 1 3 4 # 67 "/usr/include/openssl/x509.h" 3 4 # 1 "/usr/include/openssl/e_os2.h" 1 3 4 # 56 "/usr/include/openssl/e_os2.h" 3 4 # 1 "/usr/include/openssl/opensslconf.h" 1 3 4 # 57 "/usr/include/openssl/e_os2.h" 2 3 4 # 68 "/usr/include/openssl/x509.h" 2 3 4 # 1 "/usr/include/openssl/buffer.h" 1 3 4 # 68 "/usr/include/openssl/buffer.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 149 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long int ptrdiff_t; # 426 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef struct { long long __max_align_ll __attribute__((__aligned__(__alignof__(long long)))); long double __max_align_ld __attribute__((__aligned__(__alignof__(long double)))); } max_align_t; # 69 "/usr/include/openssl/buffer.h" 2 3 4 # 77 "/usr/include/openssl/buffer.h" 3 4 struct buf_mem_st { size_t length; char *data; size_t max; }; BUF_MEM *BUF_MEM_new(void); void BUF_MEM_free(BUF_MEM *a); int BUF_MEM_grow(BUF_MEM *str, size_t len); int BUF_MEM_grow_clean(BUF_MEM *str, size_t len); size_t BUF_strnlen(const char *str, size_t maxlen); char *BUF_strdup(const char *str); char *BUF_strndup(const char *str, size_t siz); void *BUF_memdup(const void *data, size_t siz); void BUF_reverse(unsigned char *out, const unsigned char *in, size_t siz); size_t BUF_strlcpy(char *dst, const char *src, size_t siz); size_t BUF_strlcat(char *dst, const char *src, size_t siz); void 
ERR_load_BUF_strings(void); # 71 "/usr/include/openssl/x509.h" 2 3 4 # 1 "/usr/include/openssl/evp.h" 1 3 4 # 66 "/usr/include/openssl/evp.h" 3 4 # 1 "/usr/include/openssl/opensslconf.h" 1 3 4 # 67 "/usr/include/openssl/evp.h" 2 3 4 # 94 "/usr/include/openssl/evp.h" 3 4 # 1 "/usr/include/openssl/objects.h" 1 3 4 # 65 "/usr/include/openssl/objects.h" 3 4 # 1 "/usr/include/openssl/obj_mac.h" 1 3 4 # 66 "/usr/include/openssl/objects.h" 2 3 4 # 965 "/usr/include/openssl/objects.h" 3 4 # 1 "/usr/include/openssl/asn1.h" 1 3 4 # 62 "/usr/include/openssl/asn1.h" 3 4 # 1 "/usr/include/time.h" 1 3 4 # 29 "/usr/include/time.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 38 "/usr/include/time.h" 2 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 42 "/usr/include/time.h" 2 3 4 # 131 "/usr/include/time.h" 3 4 struct tm { int tm_sec; int tm_min; int tm_hour; int tm_mday; int tm_mon; int tm_year; int tm_wday; int tm_yday; int tm_isdst; long int tm_gmtoff; const char *tm_zone; }; struct itimerspec { struct timespec it_interval; struct timespec it_value; }; struct sigevent; # 186 "/usr/include/time.h" 3 4 extern clock_t clock (void) __attribute__ ((__nothrow__ , __leaf__)); extern time_t time (time_t *__timer) __attribute__ ((__nothrow__ , __leaf__)); extern double difftime (time_t __time1, time_t __time0) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern time_t mktime (struct tm *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern size_t strftime (char *__restrict __s, size_t __maxsize, const char *__restrict __format, const struct tm *__restrict __tp) __attribute__ ((__nothrow__ , __leaf__)); # 221 "/usr/include/time.h" 3 4 # 1 "/usr/include/xlocale.h" 1 3 4 # 27 "/usr/include/xlocale.h" 3 4 typedef struct __locale_struct { struct __locale_data *__locales[13]; const unsigned short int *__ctype_b; const int *__ctype_tolower; const int *__ctype_toupper; const char *__names[13]; } *__locale_t; typedef __locale_t locale_t; # 222 
"/usr/include/time.h" 2 3 4 extern size_t strftime_l (char *__restrict __s, size_t __maxsize, const char *__restrict __format, const struct tm *__restrict __tp, __locale_t __loc) __attribute__ ((__nothrow__ , __leaf__)); # 236 "/usr/include/time.h" 3 4 extern struct tm *gmtime (const time_t *__timer) __attribute__ ((__nothrow__ , __leaf__)); extern struct tm *localtime (const time_t *__timer) __attribute__ ((__nothrow__ , __leaf__)); extern struct tm *gmtime_r (const time_t *__restrict __timer, struct tm *__restrict __tp) __attribute__ ((__nothrow__ , __leaf__)); extern struct tm *localtime_r (const time_t *__restrict __timer, struct tm *__restrict __tp) __attribute__ ((__nothrow__ , __leaf__)); extern char *asctime (const struct tm *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern char *ctime (const time_t *__timer) __attribute__ ((__nothrow__ , __leaf__)); extern char *asctime_r (const struct tm *__restrict __tp, char *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)); extern char *ctime_r (const time_t *__restrict __timer, char *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)); extern char *__tzname[2]; extern int __daylight; extern long int __timezone; extern char *tzname[2]; extern void tzset (void) __attribute__ ((__nothrow__ , __leaf__)); extern int daylight; extern long int timezone; extern int stime (const time_t *__when) __attribute__ ((__nothrow__ , __leaf__)); # 319 "/usr/include/time.h" 3 4 extern time_t timegm (struct tm *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern time_t timelocal (struct tm *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern int dysize (int __year) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 334 "/usr/include/time.h" 3 4 extern int nanosleep (const struct timespec *__requested_time, struct timespec *__remaining); extern int clock_getres (clockid_t __clock_id, struct timespec *__res) __attribute__ ((__nothrow__ , __leaf__)); extern int clock_gettime (clockid_t 
__clock_id, struct timespec *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern int clock_settime (clockid_t __clock_id, const struct timespec *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern int clock_nanosleep (clockid_t __clock_id, int __flags, const struct timespec *__req, struct timespec *__rem); extern int clock_getcpuclockid (pid_t __pid, clockid_t *__clock_id) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_create (clockid_t __clock_id, struct sigevent *__restrict __evp, timer_t *__restrict __timerid) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_delete (timer_t __timerid) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_settime (timer_t __timerid, int __flags, const struct itimerspec *__restrict __value, struct itimerspec *__restrict __ovalue) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_gettime (timer_t __timerid, struct itimerspec *__value) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_getoverrun (timer_t __timerid) __attribute__ ((__nothrow__ , __leaf__)); extern int timespec_get (struct timespec *__ts, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 430 "/usr/include/time.h" 3 4 # 63 "/usr/include/openssl/asn1.h" 2 3 4 # 1 "/usr/include/openssl/e_os2.h" 1 3 4 # 56 "/usr/include/openssl/e_os2.h" 3 4 # 1 "/usr/include/openssl/opensslconf.h" 1 3 4 # 57 "/usr/include/openssl/e_os2.h" 2 3 4 # 64 "/usr/include/openssl/asn1.h" 2 3 4 # 74 "/usr/include/openssl/asn1.h" 3 4 # 1 "/usr/include/openssl/bn.h" 1 3 4 # 128 "/usr/include/openssl/bn.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 1 3 4 # 34 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/syslimits.h" 1 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 1 3 4 # 168 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 3 4 # 1 "/usr/include/limits.h" 1 3 4 # 
143 "/usr/include/limits.h" 3 4 # 1 "/usr/include/bits/posix1_lim.h" 1 3 4 # 160 "/usr/include/bits/posix1_lim.h" 3 4 # 1 "/usr/include/bits/local_lim.h" 1 3 4 # 38 "/usr/include/bits/local_lim.h" 3 4 # 1 "/usr/include/linux/limits.h" 1 3 4 # 39 "/usr/include/bits/local_lim.h" 2 3 4 # 161 "/usr/include/bits/posix1_lim.h" 2 3 4 # 144 "/usr/include/limits.h" 2 3 4 # 1 "/usr/include/bits/posix2_lim.h" 1 3 4 # 148 "/usr/include/limits.h" 2 3 4 # 169 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 2 3 4 # 8 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/syslimits.h" 2 3 4 # 35 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 2 3 4 # 129 "/usr/include/openssl/bn.h" 2 3 4 # 1 "/usr/include/openssl/e_os2.h" 1 3 4 # 56 "/usr/include/openssl/e_os2.h" 3 4 # 1 "/usr/include/openssl/opensslconf.h" 1 3 4 # 57 "/usr/include/openssl/e_os2.h" 2 3 4 # 130 "/usr/include/openssl/bn.h" 2 3 4 # 313 "/usr/include/openssl/bn.h" 3 4 struct bignum_st { unsigned long *d; int top; int dmax; int neg; int flags; }; struct bn_mont_ctx_st { int ri; BIGNUM RR; BIGNUM N; BIGNUM Ni; unsigned long n0[2]; int flags; }; struct bn_recp_ctx_st { BIGNUM N; BIGNUM Nr; int num_bits; int shift; int flags; }; struct bn_gencb_st { unsigned int ver; void *arg; union { void (*cb_1) (int, int, void *); int (*cb_2) (int, int, BN_GENCB *); } cb; }; int BN_GENCB_call(BN_GENCB *cb, int a, int b); # 421 "/usr/include/openssl/bn.h" 3 4 const BIGNUM *BN_value_one(void); char *BN_options(void); BN_CTX *BN_CTX_new(void); void BN_CTX_init(BN_CTX *c); void BN_CTX_free(BN_CTX *c); void BN_CTX_start(BN_CTX *ctx); BIGNUM *BN_CTX_get(BN_CTX *ctx); void BN_CTX_end(BN_CTX *ctx); int BN_rand(BIGNUM *rnd, int bits, int top, int bottom); int BN_pseudo_rand(BIGNUM *rnd, int bits, int top, int bottom); int BN_rand_range(BIGNUM *rnd, const BIGNUM *range); int BN_pseudo_rand_range(BIGNUM *rnd, const BIGNUM *range); int BN_num_bits(const BIGNUM *a); int BN_num_bits_word(unsigned long); BIGNUM 
*BN_new(void); void BN_init(BIGNUM *); void BN_clear_free(BIGNUM *a); BIGNUM *BN_copy(BIGNUM *a, const BIGNUM *b); void BN_swap(BIGNUM *a, BIGNUM *b); BIGNUM *BN_bin2bn(const unsigned char *s, int len, BIGNUM *ret); int BN_bn2bin(const BIGNUM *a, unsigned char *to); BIGNUM *BN_mpi2bn(const unsigned char *s, int len, BIGNUM *ret); int BN_bn2mpi(const BIGNUM *a, unsigned char *to); int BN_sub(BIGNUM *r, const BIGNUM *a, const BIGNUM *b); int BN_usub(BIGNUM *r, const BIGNUM *a, const BIGNUM *b); int BN_uadd(BIGNUM *r, const BIGNUM *a, const BIGNUM *b); int BN_add(BIGNUM *r, const BIGNUM *a, const BIGNUM *b); int BN_mul(BIGNUM *r, const BIGNUM *a, const BIGNUM *b, BN_CTX *ctx); int BN_sqr(BIGNUM *r, const BIGNUM *a, BN_CTX *ctx); void BN_set_negative(BIGNUM *b, int n); int BN_div(BIGNUM *dv, BIGNUM *rem, const BIGNUM *m, const BIGNUM *d, BN_CTX *ctx); int BN_nnmod(BIGNUM *r, const BIGNUM *m, const BIGNUM *d, BN_CTX *ctx); int BN_mod_add(BIGNUM *r, const BIGNUM *a, const BIGNUM *b, const BIGNUM *m, BN_CTX *ctx); int BN_mod_add_quick(BIGNUM *r, const BIGNUM *a, const BIGNUM *b, const BIGNUM *m); int BN_mod_sub(BIGNUM *r, const BIGNUM *a, const BIGNUM *b, const BIGNUM *m, BN_CTX *ctx); int BN_mod_sub_quick(BIGNUM *r, const BIGNUM *a, const BIGNUM *b, const BIGNUM *m); int BN_mod_mul(BIGNUM *r, const BIGNUM *a, const BIGNUM *b, const BIGNUM *m, BN_CTX *ctx); int BN_mod_sqr(BIGNUM *r, const BIGNUM *a, const BIGNUM *m, BN_CTX *ctx); int BN_mod_lshift1(BIGNUM *r, const BIGNUM *a, const BIGNUM *m, BN_CTX *ctx); int BN_mod_lshift1_quick(BIGNUM *r, const BIGNUM *a, const BIGNUM *m); int BN_mod_lshift(BIGNUM *r, const BIGNUM *a, int n, const BIGNUM *m, BN_CTX *ctx); int BN_mod_lshift_quick(BIGNUM *r, const BIGNUM *a, int n, const BIGNUM *m); unsigned long BN_mod_word(const BIGNUM *a, unsigned long w); unsigned long BN_div_word(BIGNUM *a, unsigned long w); int BN_mul_word(BIGNUM *a, unsigned long w); int BN_add_word(BIGNUM *a, unsigned long w); int BN_sub_word(BIGNUM *a, unsigned 
long w); int BN_set_word(BIGNUM *a, unsigned long w); unsigned long BN_get_word(const BIGNUM *a); int BN_cmp(const BIGNUM *a, const BIGNUM *b); void BN_free(BIGNUM *a); int BN_is_bit_set(const BIGNUM *a, int n); int BN_lshift(BIGNUM *r, const BIGNUM *a, int n); int BN_lshift1(BIGNUM *r, const BIGNUM *a); int BN_exp(BIGNUM *r, const BIGNUM *a, const BIGNUM *p, BN_CTX *ctx); int BN_mod_exp(BIGNUM *r, const BIGNUM *a, const BIGNUM *p, const BIGNUM *m, BN_CTX *ctx); int BN_mod_exp_mont(BIGNUM *r, const BIGNUM *a, const BIGNUM *p, const BIGNUM *m, BN_CTX *ctx, BN_MONT_CTX *m_ctx); int BN_mod_exp_mont_consttime(BIGNUM *rr, const BIGNUM *a, const BIGNUM *p, const BIGNUM *m, BN_CTX *ctx, BN_MONT_CTX *in_mont); int BN_mod_exp_mont_word(BIGNUM *r, unsigned long a, const BIGNUM *p, const BIGNUM *m, BN_CTX *ctx, BN_MONT_CTX *m_ctx); int BN_mod_exp2_mont(BIGNUM *r, const BIGNUM *a1, const BIGNUM *p1, const BIGNUM *a2, const BIGNUM *p2, const BIGNUM *m, BN_CTX *ctx, BN_MONT_CTX *m_ctx); int BN_mod_exp_simple(BIGNUM *r, const BIGNUM *a, const BIGNUM *p, const BIGNUM *m, BN_CTX *ctx); int BN_mask_bits(BIGNUM *a, int n); int BN_print_fp(FILE *fp, const BIGNUM *a); int BN_print(BIO *fp, const BIGNUM *a); int BN_reciprocal(BIGNUM *r, const BIGNUM *m, int len, BN_CTX *ctx); int BN_rshift(BIGNUM *r, const BIGNUM *a, int n); int BN_rshift1(BIGNUM *r, const BIGNUM *a); void BN_clear(BIGNUM *a); BIGNUM *BN_dup(const BIGNUM *a); int BN_ucmp(const BIGNUM *a, const BIGNUM *b); int BN_set_bit(BIGNUM *a, int n); int BN_clear_bit(BIGNUM *a, int n); char *BN_bn2hex(const BIGNUM *a); char *BN_bn2dec(const BIGNUM *a); int BN_hex2bn(BIGNUM **a, const char *str); int BN_dec2bn(BIGNUM **a, const char *str); int BN_asc2bn(BIGNUM **a, const char *str); int BN_gcd(BIGNUM *r, const BIGNUM *a, const BIGNUM *b, BN_CTX *ctx); int BN_kronecker(const BIGNUM *a, const BIGNUM *b, BN_CTX *ctx); BIGNUM *BN_mod_inverse(BIGNUM *ret, const BIGNUM *a, const BIGNUM *n, BN_CTX *ctx); BIGNUM *BN_mod_sqrt(BIGNUM *ret, 
const BIGNUM *a, const BIGNUM *n, BN_CTX *ctx); void BN_consttime_swap(unsigned long swap, BIGNUM *a, BIGNUM *b, int nwords); BIGNUM *BN_generate_prime(BIGNUM *ret, int bits, int safe, const BIGNUM *add, const BIGNUM *rem, void (*callback) (int, int, void *), void *cb_arg); int BN_is_prime(const BIGNUM *p, int nchecks, void (*callback) (int, int, void *), BN_CTX *ctx, void *cb_arg); int BN_is_prime_fasttest(const BIGNUM *p, int nchecks, void (*callback) (int, int, void *), BN_CTX *ctx, void *cb_arg, int do_trial_division); int BN_generate_prime_ex(BIGNUM *ret, int bits, int safe, const BIGNUM *add, const BIGNUM *rem, BN_GENCB *cb); int BN_is_prime_ex(const BIGNUM *p, int nchecks, BN_CTX *ctx, BN_GENCB *cb); int BN_is_prime_fasttest_ex(const BIGNUM *p, int nchecks, BN_CTX *ctx, int do_trial_division, BN_GENCB *cb); int BN_X931_generate_Xpq(BIGNUM *Xp, BIGNUM *Xq, int nbits, BN_CTX *ctx); int BN_X931_derive_prime_ex(BIGNUM *p, BIGNUM *p1, BIGNUM *p2, const BIGNUM *Xp, const BIGNUM *Xp1, const BIGNUM *Xp2, const BIGNUM *e, BN_CTX *ctx, BN_GENCB *cb); int BN_X931_generate_prime_ex(BIGNUM *p, BIGNUM *p1, BIGNUM *p2, BIGNUM *Xp1, BIGNUM *Xp2, const BIGNUM *Xp, const BIGNUM *e, BN_CTX *ctx, BN_GENCB *cb); BN_MONT_CTX *BN_MONT_CTX_new(void); void BN_MONT_CTX_init(BN_MONT_CTX *ctx); int BN_mod_mul_montgomery(BIGNUM *r, const BIGNUM *a, const BIGNUM *b, BN_MONT_CTX *mont, BN_CTX *ctx); int BN_from_montgomery(BIGNUM *r, const BIGNUM *a, BN_MONT_CTX *mont, BN_CTX *ctx); void BN_MONT_CTX_free(BN_MONT_CTX *mont); int BN_MONT_CTX_set(BN_MONT_CTX *mont, const BIGNUM *mod, BN_CTX *ctx); BN_MONT_CTX *BN_MONT_CTX_copy(BN_MONT_CTX *to, BN_MONT_CTX *from); BN_MONT_CTX *BN_MONT_CTX_set_locked(BN_MONT_CTX **pmont, int lock, const BIGNUM *mod, BN_CTX *ctx); BN_BLINDING *BN_BLINDING_new(const BIGNUM *A, const BIGNUM *Ai, BIGNUM *mod); void BN_BLINDING_free(BN_BLINDING *b); int BN_BLINDING_update(BN_BLINDING *b, BN_CTX *ctx); int BN_BLINDING_convert(BIGNUM *n, BN_BLINDING *b, BN_CTX *ctx); 
int BN_BLINDING_invert(BIGNUM *n, BN_BLINDING *b, BN_CTX *ctx); int BN_BLINDING_convert_ex(BIGNUM *n, BIGNUM *r, BN_BLINDING *b, BN_CTX *); int BN_BLINDING_invert_ex(BIGNUM *n, const BIGNUM *r, BN_BLINDING *b, BN_CTX *); unsigned long BN_BLINDING_get_thread_id(const BN_BLINDING *); void BN_BLINDING_set_thread_id(BN_BLINDING *, unsigned long); CRYPTO_THREADID *BN_BLINDING_thread_id(BN_BLINDING *); unsigned long BN_BLINDING_get_flags(const BN_BLINDING *); void BN_BLINDING_set_flags(BN_BLINDING *, unsigned long); BN_BLINDING *BN_BLINDING_create_param(BN_BLINDING *b, const BIGNUM *e, BIGNUM *m, BN_CTX *ctx, int (*bn_mod_exp) (BIGNUM *r, const BIGNUM *a, const BIGNUM *p, const BIGNUM *m, BN_CTX *ctx, BN_MONT_CTX *m_ctx), BN_MONT_CTX *m_ctx); void BN_set_params(int mul, int high, int low, int mont); int BN_get_params(int which); void BN_RECP_CTX_init(BN_RECP_CTX *recp); BN_RECP_CTX *BN_RECP_CTX_new(void); void BN_RECP_CTX_free(BN_RECP_CTX *recp); int BN_RECP_CTX_set(BN_RECP_CTX *recp, const BIGNUM *rdiv, BN_CTX *ctx); int BN_mod_mul_reciprocal(BIGNUM *r, const BIGNUM *x, const BIGNUM *y, BN_RECP_CTX *recp, BN_CTX *ctx); int BN_mod_exp_recp(BIGNUM *r, const BIGNUM *a, const BIGNUM *p, const BIGNUM *m, BN_CTX *ctx); int BN_div_recp(BIGNUM *dv, BIGNUM *rem, const BIGNUM *m, BN_RECP_CTX *recp, BN_CTX *ctx); # 648 "/usr/include/openssl/bn.h" 3 4 int BN_GF2m_add(BIGNUM *r, const BIGNUM *a, const BIGNUM *b); int BN_GF2m_mod(BIGNUM *r, const BIGNUM *a, const BIGNUM *p); int BN_GF2m_mod_mul(BIGNUM *r, const BIGNUM *a, const BIGNUM *b, const BIGNUM *p, BN_CTX *ctx); int BN_GF2m_mod_sqr(BIGNUM *r, const BIGNUM *a, const BIGNUM *p, BN_CTX *ctx); int BN_GF2m_mod_inv(BIGNUM *r, const BIGNUM *b, const BIGNUM *p, BN_CTX *ctx); int BN_GF2m_mod_div(BIGNUM *r, const BIGNUM *a, const BIGNUM *b, const BIGNUM *p, BN_CTX *ctx); int BN_GF2m_mod_exp(BIGNUM *r, const BIGNUM *a, const BIGNUM *b, const BIGNUM *p, BN_CTX *ctx); int BN_GF2m_mod_sqrt(BIGNUM *r, const BIGNUM *a, const BIGNUM *p, BN_CTX 
*ctx); int BN_GF2m_mod_solve_quad(BIGNUM *r, const BIGNUM *a, const BIGNUM *p, BN_CTX *ctx); # 681 "/usr/include/openssl/bn.h" 3 4 int BN_GF2m_mod_arr(BIGNUM *r, const BIGNUM *a, const int p[]); int BN_GF2m_mod_mul_arr(BIGNUM *r, const BIGNUM *a, const BIGNUM *b, const int p[], BN_CTX *ctx); int BN_GF2m_mod_sqr_arr(BIGNUM *r, const BIGNUM *a, const int p[], BN_CTX *ctx); int BN_GF2m_mod_inv_arr(BIGNUM *r, const BIGNUM *b, const int p[], BN_CTX *ctx); int BN_GF2m_mod_div_arr(BIGNUM *r, const BIGNUM *a, const BIGNUM *b, const int p[], BN_CTX *ctx); int BN_GF2m_mod_exp_arr(BIGNUM *r, const BIGNUM *a, const BIGNUM *b, const int p[], BN_CTX *ctx); int BN_GF2m_mod_sqrt_arr(BIGNUM *r, const BIGNUM *a, const int p[], BN_CTX *ctx); int BN_GF2m_mod_solve_quad_arr(BIGNUM *r, const BIGNUM *a, const int p[], BN_CTX *ctx); int BN_GF2m_poly2arr(const BIGNUM *a, int p[], int max); int BN_GF2m_arr2poly(const int p[], BIGNUM *a); int BN_nist_mod_192(BIGNUM *r, const BIGNUM *a, const BIGNUM *p, BN_CTX *ctx); int BN_nist_mod_224(BIGNUM *r, const BIGNUM *a, const BIGNUM *p, BN_CTX *ctx); int BN_nist_mod_256(BIGNUM *r, const BIGNUM *a, const BIGNUM *p, BN_CTX *ctx); int BN_nist_mod_384(BIGNUM *r, const BIGNUM *a, const BIGNUM *p, BN_CTX *ctx); int BN_nist_mod_521(BIGNUM *r, const BIGNUM *a, const BIGNUM *p, BN_CTX *ctx); const BIGNUM *BN_get0_nist_prime_192(void); const BIGNUM *BN_get0_nist_prime_224(void); const BIGNUM *BN_get0_nist_prime_256(void); const BIGNUM *BN_get0_nist_prime_384(void); const BIGNUM *BN_get0_nist_prime_521(void); # 737 "/usr/include/openssl/bn.h" 3 4 BIGNUM *bn_expand2(BIGNUM *a, int words); BIGNUM *bn_dup_expand(const BIGNUM *a, int words); # 850 "/usr/include/openssl/bn.h" 3 4 unsigned long bn_mul_add_words(unsigned long *rp, const unsigned long *ap, int num, unsigned long w); unsigned long bn_mul_words(unsigned long *rp, const unsigned long *ap, int num, unsigned long w); void bn_sqr_words(unsigned long *rp, const unsigned long *ap, int num); unsigned long 
bn_div_words(unsigned long h, unsigned long l, unsigned long d); unsigned long bn_add_words(unsigned long *rp, const unsigned long *ap, const unsigned long *bp, int num); unsigned long bn_sub_words(unsigned long *rp, const unsigned long *ap, const unsigned long *bp, int num); BIGNUM *get_rfc2409_prime_768(BIGNUM *bn); BIGNUM *get_rfc2409_prime_1024(BIGNUM *bn); BIGNUM *get_rfc3526_prime_1536(BIGNUM *bn); BIGNUM *get_rfc3526_prime_2048(BIGNUM *bn); BIGNUM *get_rfc3526_prime_3072(BIGNUM *bn); BIGNUM *get_rfc3526_prime_4096(BIGNUM *bn); BIGNUM *get_rfc3526_prime_6144(BIGNUM *bn); BIGNUM *get_rfc3526_prime_8192(BIGNUM *bn); int BN_bntest_rand(BIGNUM *rnd, int bits, int top, int bottom); void ERR_load_BN_strings(void); # 75 "/usr/include/openssl/asn1.h" 2 3 4 # 161 "/usr/include/openssl/asn1.h" 3 4 struct X509_algor_st; struct stack_st_X509_ALGOR { _STACK stack; }; # 172 "/usr/include/openssl/asn1.h" 3 4 typedef struct asn1_ctx_st { unsigned char *p; int eos; int error; int inf; int tag; int xclass; long slen; unsigned char *max; unsigned char *q; unsigned char **pp; int line; } ASN1_CTX; typedef struct asn1_const_ctx_st { const unsigned char *p; int eos; int error; int inf; int tag; int xclass; long slen; const unsigned char *max; const unsigned char *q; const unsigned char **pp; int line; } ASN1_const_CTX; # 210 "/usr/include/openssl/asn1.h" 3 4 struct asn1_object_st { const char *sn, *ln; int nid; int length; const unsigned char *data; int flags; }; # 239 "/usr/include/openssl/asn1.h" 3 4 struct asn1_string_st { int length; int type; unsigned char *data; long flags; }; typedef struct ASN1_ENCODING_st { unsigned char *enc; long len; int modified; } ASN1_ENCODING; # 272 "/usr/include/openssl/asn1.h" 3 4 typedef struct asn1_string_table_st { int nid; long minsize; long maxsize; unsigned long mask; unsigned long flags; } ASN1_STRING_TABLE; struct stack_st_ASN1_STRING_TABLE { _STACK stack; }; # 296 "/usr/include/openssl/asn1.h" 3 4 typedef struct ASN1_TEMPLATE_st 
ASN1_TEMPLATE; typedef struct ASN1_TLC_st ASN1_TLC; typedef struct ASN1_VALUE_st ASN1_VALUE; # 363 "/usr/include/openssl/asn1.h" 3 4 typedef void *d2i_of_void(void **,const unsigned char **,long); typedef int i2d_of_void(void *,unsigned char **); # 404 "/usr/include/openssl/asn1.h" 3 4 typedef const ASN1_ITEM ASN1_ITEM_EXP; # 519 "/usr/include/openssl/asn1.h" 3 4 struct stack_st_ASN1_INTEGER { _STACK stack; }; struct stack_st_ASN1_GENERALSTRING { _STACK stack; }; typedef struct asn1_type_st { int type; union { char *ptr; ASN1_BOOLEAN boolean; ASN1_STRING *asn1_string; ASN1_OBJECT *object; ASN1_INTEGER *integer; ASN1_ENUMERATED *enumerated; ASN1_BIT_STRING *bit_string; ASN1_OCTET_STRING *octet_string; ASN1_PRINTABLESTRING *printablestring; ASN1_T61STRING *t61string; ASN1_IA5STRING *ia5string; ASN1_GENERALSTRING *generalstring; ASN1_BMPSTRING *bmpstring; ASN1_UNIVERSALSTRING *universalstring; ASN1_UTCTIME *utctime; ASN1_GENERALIZEDTIME *generalizedtime; ASN1_VISIBLESTRING *visiblestring; ASN1_UTF8STRING *utf8string; ASN1_STRING *set; ASN1_STRING *sequence; ASN1_VALUE *asn1_value; } value; } ASN1_TYPE; struct stack_st_ASN1_TYPE { _STACK stack; }; typedef struct stack_st_ASN1_TYPE ASN1_SEQUENCE_ANY; ASN1_SEQUENCE_ANY *d2i_ASN1_SEQUENCE_ANY(ASN1_SEQUENCE_ANY **a, const unsigned char **in, long len); int i2d_ASN1_SEQUENCE_ANY(const ASN1_SEQUENCE_ANY *a, unsigned char **out); extern const ASN1_ITEM ASN1_SEQUENCE_ANY_it; ASN1_SEQUENCE_ANY *d2i_ASN1_SET_ANY(ASN1_SEQUENCE_ANY **a, const unsigned char **in, long len); int i2d_ASN1_SET_ANY(const ASN1_SEQUENCE_ANY *a, unsigned char **out); extern const ASN1_ITEM ASN1_SET_ANY_it; typedef struct NETSCAPE_X509_st { ASN1_OCTET_STRING *header; X509 *cert; } NETSCAPE_X509; typedef struct BIT_STRING_BITNAME_st { int bitnum; const char *lname; const char *sname; } BIT_STRING_BITNAME; # 776 "/usr/include/openssl/asn1.h" 3 4 ASN1_TYPE *ASN1_TYPE_new(void); void ASN1_TYPE_free(ASN1_TYPE *a); ASN1_TYPE *d2i_ASN1_TYPE(ASN1_TYPE **a, const 
unsigned char **in, long len); int i2d_ASN1_TYPE(ASN1_TYPE *a, unsigned char **out); extern const ASN1_ITEM ASN1_ANY_it; int ASN1_TYPE_get(ASN1_TYPE *a); void ASN1_TYPE_set(ASN1_TYPE *a, int type, void *value); int ASN1_TYPE_set1(ASN1_TYPE *a, int type, const void *value); int ASN1_TYPE_cmp(const ASN1_TYPE *a, const ASN1_TYPE *b); ASN1_OBJECT *ASN1_OBJECT_new(void); void ASN1_OBJECT_free(ASN1_OBJECT *a); int i2d_ASN1_OBJECT(ASN1_OBJECT *a, unsigned char **pp); ASN1_OBJECT *c2i_ASN1_OBJECT(ASN1_OBJECT **a, const unsigned char **pp, long length); ASN1_OBJECT *d2i_ASN1_OBJECT(ASN1_OBJECT **a, const unsigned char **pp, long length); extern const ASN1_ITEM ASN1_OBJECT_it; struct stack_st_ASN1_OBJECT { _STACK stack; }; ASN1_STRING *ASN1_STRING_new(void); void ASN1_STRING_free(ASN1_STRING *a); void ASN1_STRING_clear_free(ASN1_STRING *a); int ASN1_STRING_copy(ASN1_STRING *dst, const ASN1_STRING *str); ASN1_STRING *ASN1_STRING_dup(const ASN1_STRING *a); ASN1_STRING *ASN1_STRING_type_new(int type); int ASN1_STRING_cmp(const ASN1_STRING *a, const ASN1_STRING *b); int ASN1_STRING_set(ASN1_STRING *str, const void *data, int len); void ASN1_STRING_set0(ASN1_STRING *str, void *data, int len); int ASN1_STRING_length(const ASN1_STRING *x); void ASN1_STRING_length_set(ASN1_STRING *x, int n); int ASN1_STRING_type(ASN1_STRING *x); unsigned char *ASN1_STRING_data(ASN1_STRING *x); ASN1_BIT_STRING *ASN1_BIT_STRING_new(void); void ASN1_BIT_STRING_free(ASN1_BIT_STRING *a); ASN1_BIT_STRING *d2i_ASN1_BIT_STRING(ASN1_BIT_STRING **a, const unsigned char **in, long len); int i2d_ASN1_BIT_STRING(ASN1_BIT_STRING *a, unsigned char **out); extern const ASN1_ITEM ASN1_BIT_STRING_it; int i2c_ASN1_BIT_STRING(ASN1_BIT_STRING *a, unsigned char **pp); ASN1_BIT_STRING *c2i_ASN1_BIT_STRING(ASN1_BIT_STRING **a, const unsigned char **pp, long length); int ASN1_BIT_STRING_set(ASN1_BIT_STRING *a, unsigned char *d, int length); int ASN1_BIT_STRING_set_bit(ASN1_BIT_STRING *a, int n, int value); int 
ASN1_BIT_STRING_get_bit(ASN1_BIT_STRING *a, int n); int ASN1_BIT_STRING_check(ASN1_BIT_STRING *a, unsigned char *flags, int flags_len); int ASN1_BIT_STRING_name_print(BIO *out, ASN1_BIT_STRING *bs, BIT_STRING_BITNAME *tbl, int indent); int ASN1_BIT_STRING_num_asc(char *name, BIT_STRING_BITNAME *tbl); int ASN1_BIT_STRING_set_asc(ASN1_BIT_STRING *bs, char *name, int value, BIT_STRING_BITNAME *tbl); int i2d_ASN1_BOOLEAN(int a, unsigned char **pp); int d2i_ASN1_BOOLEAN(int *a, const unsigned char **pp, long length); ASN1_INTEGER *ASN1_INTEGER_new(void); void ASN1_INTEGER_free(ASN1_INTEGER *a); ASN1_INTEGER *d2i_ASN1_INTEGER(ASN1_INTEGER **a, const unsigned char **in, long len); int i2d_ASN1_INTEGER(ASN1_INTEGER *a, unsigned char **out); extern const ASN1_ITEM ASN1_INTEGER_it; int i2c_ASN1_INTEGER(ASN1_INTEGER *a, unsigned char **pp); ASN1_INTEGER *c2i_ASN1_INTEGER(ASN1_INTEGER **a, const unsigned char **pp, long length); ASN1_INTEGER *d2i_ASN1_UINTEGER(ASN1_INTEGER **a, const unsigned char **pp, long length); ASN1_INTEGER *ASN1_INTEGER_dup(const ASN1_INTEGER *x); int ASN1_INTEGER_cmp(const ASN1_INTEGER *x, const ASN1_INTEGER *y); ASN1_ENUMERATED *ASN1_ENUMERATED_new(void); void ASN1_ENUMERATED_free(ASN1_ENUMERATED *a); ASN1_ENUMERATED *d2i_ASN1_ENUMERATED(ASN1_ENUMERATED **a, const unsigned char **in, long len); int i2d_ASN1_ENUMERATED(ASN1_ENUMERATED *a, unsigned char **out); extern const ASN1_ITEM ASN1_ENUMERATED_it; int ASN1_UTCTIME_check(const ASN1_UTCTIME *a); ASN1_UTCTIME *ASN1_UTCTIME_set(ASN1_UTCTIME *s, time_t t); ASN1_UTCTIME *ASN1_UTCTIME_adj(ASN1_UTCTIME *s, time_t t, int offset_day, long offset_sec); int ASN1_UTCTIME_set_string(ASN1_UTCTIME *s, const char *str); int ASN1_UTCTIME_cmp_time_t(const ASN1_UTCTIME *s, time_t t); int ASN1_GENERALIZEDTIME_check(const ASN1_GENERALIZEDTIME *a); ASN1_GENERALIZEDTIME *ASN1_GENERALIZEDTIME_set(ASN1_GENERALIZEDTIME *s, time_t t); ASN1_GENERALIZEDTIME *ASN1_GENERALIZEDTIME_adj(ASN1_GENERALIZEDTIME *s, time_t t, int 
offset_day, long offset_sec); int ASN1_GENERALIZEDTIME_set_string(ASN1_GENERALIZEDTIME *s, const char *str); int ASN1_TIME_diff(int *pday, int *psec, const ASN1_TIME *from, const ASN1_TIME *to); ASN1_OCTET_STRING *ASN1_OCTET_STRING_new(void); void ASN1_OCTET_STRING_free(ASN1_OCTET_STRING *a); ASN1_OCTET_STRING *d2i_ASN1_OCTET_STRING(ASN1_OCTET_STRING **a, const unsigned char **in, long len); int i2d_ASN1_OCTET_STRING(ASN1_OCTET_STRING *a, unsigned char **out); extern const ASN1_ITEM ASN1_OCTET_STRING_it; ASN1_OCTET_STRING *ASN1_OCTET_STRING_dup(const ASN1_OCTET_STRING *a); int ASN1_OCTET_STRING_cmp(const ASN1_OCTET_STRING *a, const ASN1_OCTET_STRING *b); int ASN1_OCTET_STRING_set(ASN1_OCTET_STRING *str, const unsigned char *data, int len); ASN1_VISIBLESTRING *ASN1_VISIBLESTRING_new(void); void ASN1_VISIBLESTRING_free(ASN1_VISIBLESTRING *a); ASN1_VISIBLESTRING *d2i_ASN1_VISIBLESTRING(ASN1_VISIBLESTRING **a, const unsigned char **in, long len); int i2d_ASN1_VISIBLESTRING(ASN1_VISIBLESTRING *a, unsigned char **out); extern const ASN1_ITEM ASN1_VISIBLESTRING_it; ASN1_UNIVERSALSTRING *ASN1_UNIVERSALSTRING_new(void); void ASN1_UNIVERSALSTRING_free(ASN1_UNIVERSALSTRING *a); ASN1_UNIVERSALSTRING *d2i_ASN1_UNIVERSALSTRING(ASN1_UNIVERSALSTRING **a, const unsigned char **in, long len); int i2d_ASN1_UNIVERSALSTRING(ASN1_UNIVERSALSTRING *a, unsigned char **out); extern const ASN1_ITEM ASN1_UNIVERSALSTRING_it; ASN1_UTF8STRING *ASN1_UTF8STRING_new(void); void ASN1_UTF8STRING_free(ASN1_UTF8STRING *a); ASN1_UTF8STRING *d2i_ASN1_UTF8STRING(ASN1_UTF8STRING **a, const unsigned char **in, long len); int i2d_ASN1_UTF8STRING(ASN1_UTF8STRING *a, unsigned char **out); extern const ASN1_ITEM ASN1_UTF8STRING_it; ASN1_NULL *ASN1_NULL_new(void); void ASN1_NULL_free(ASN1_NULL *a); ASN1_NULL *d2i_ASN1_NULL(ASN1_NULL **a, const unsigned char **in, long len); int i2d_ASN1_NULL(ASN1_NULL *a, unsigned char **out); extern const ASN1_ITEM ASN1_NULL_it; ASN1_BMPSTRING *ASN1_BMPSTRING_new(void); void 
ASN1_BMPSTRING_free(ASN1_BMPSTRING *a); ASN1_BMPSTRING *d2i_ASN1_BMPSTRING(ASN1_BMPSTRING **a, const unsigned char **in, long len); int i2d_ASN1_BMPSTRING(ASN1_BMPSTRING *a, unsigned char **out); extern const ASN1_ITEM ASN1_BMPSTRING_it; int UTF8_getc(const unsigned char *str, int len, unsigned long *val); int UTF8_putc(unsigned char *str, int len, unsigned long value); ASN1_STRING *ASN1_PRINTABLE_new(void); void ASN1_PRINTABLE_free(ASN1_STRING *a); ASN1_STRING *d2i_ASN1_PRINTABLE(ASN1_STRING **a, const unsigned char **in, long len); int i2d_ASN1_PRINTABLE(ASN1_STRING *a, unsigned char **out); extern const ASN1_ITEM ASN1_PRINTABLE_it; ASN1_STRING *DIRECTORYSTRING_new(void); void DIRECTORYSTRING_free(ASN1_STRING *a); ASN1_STRING *d2i_DIRECTORYSTRING(ASN1_STRING **a, const unsigned char **in, long len); int i2d_DIRECTORYSTRING(ASN1_STRING *a, unsigned char **out); extern const ASN1_ITEM DIRECTORYSTRING_it; ASN1_STRING *DISPLAYTEXT_new(void); void DISPLAYTEXT_free(ASN1_STRING *a); ASN1_STRING *d2i_DISPLAYTEXT(ASN1_STRING **a, const unsigned char **in, long len); int i2d_DISPLAYTEXT(ASN1_STRING *a, unsigned char **out); extern const ASN1_ITEM DISPLAYTEXT_it; ASN1_PRINTABLESTRING *ASN1_PRINTABLESTRING_new(void); void ASN1_PRINTABLESTRING_free(ASN1_PRINTABLESTRING *a); ASN1_PRINTABLESTRING *d2i_ASN1_PRINTABLESTRING(ASN1_PRINTABLESTRING **a, const unsigned char **in, long len); int i2d_ASN1_PRINTABLESTRING(ASN1_PRINTABLESTRING *a, unsigned char **out); extern const ASN1_ITEM ASN1_PRINTABLESTRING_it; ASN1_T61STRING *ASN1_T61STRING_new(void); void ASN1_T61STRING_free(ASN1_T61STRING *a); ASN1_T61STRING *d2i_ASN1_T61STRING(ASN1_T61STRING **a, const unsigned char **in, long len); int i2d_ASN1_T61STRING(ASN1_T61STRING *a, unsigned char **out); extern const ASN1_ITEM ASN1_T61STRING_it; ASN1_IA5STRING *ASN1_IA5STRING_new(void); void ASN1_IA5STRING_free(ASN1_IA5STRING *a); ASN1_IA5STRING *d2i_ASN1_IA5STRING(ASN1_IA5STRING **a, const unsigned char **in, long len); int 
i2d_ASN1_IA5STRING(ASN1_IA5STRING *a, unsigned char **out); extern const ASN1_ITEM ASN1_IA5STRING_it; ASN1_GENERALSTRING *ASN1_GENERALSTRING_new(void); void ASN1_GENERALSTRING_free(ASN1_GENERALSTRING *a); ASN1_GENERALSTRING *d2i_ASN1_GENERALSTRING(ASN1_GENERALSTRING **a, const unsigned char **in, long len); int i2d_ASN1_GENERALSTRING(ASN1_GENERALSTRING *a, unsigned char **out); extern const ASN1_ITEM ASN1_GENERALSTRING_it; ASN1_UTCTIME *ASN1_UTCTIME_new(void); void ASN1_UTCTIME_free(ASN1_UTCTIME *a); ASN1_UTCTIME *d2i_ASN1_UTCTIME(ASN1_UTCTIME **a, const unsigned char **in, long len); int i2d_ASN1_UTCTIME(ASN1_UTCTIME *a, unsigned char **out); extern const ASN1_ITEM ASN1_UTCTIME_it; ASN1_GENERALIZEDTIME *ASN1_GENERALIZEDTIME_new(void); void ASN1_GENERALIZEDTIME_free(ASN1_GENERALIZEDTIME *a); ASN1_GENERALIZEDTIME *d2i_ASN1_GENERALIZEDTIME(ASN1_GENERALIZEDTIME **a, const unsigned char **in, long len); int i2d_ASN1_GENERALIZEDTIME(ASN1_GENERALIZEDTIME *a, unsigned char **out); extern const ASN1_ITEM ASN1_GENERALIZEDTIME_it; ASN1_TIME *ASN1_TIME_new(void); void ASN1_TIME_free(ASN1_TIME *a); ASN1_TIME *d2i_ASN1_TIME(ASN1_TIME **a, const unsigned char **in, long len); int i2d_ASN1_TIME(ASN1_TIME *a, unsigned char **out); extern const ASN1_ITEM ASN1_TIME_it; extern const ASN1_ITEM ASN1_OCTET_STRING_NDEF_it; ASN1_TIME *ASN1_TIME_set(ASN1_TIME *s, time_t t); ASN1_TIME *ASN1_TIME_adj(ASN1_TIME *s, time_t t, int offset_day, long offset_sec); int ASN1_TIME_check(ASN1_TIME *t); ASN1_GENERALIZEDTIME *ASN1_TIME_to_generalizedtime(ASN1_TIME *t, ASN1_GENERALIZEDTIME **out); int ASN1_TIME_set_string(ASN1_TIME *s, const char *str); int i2d_ASN1_SET(struct stack_st_OPENSSL_BLOCK *a, unsigned char **pp, i2d_of_void *i2d, int ex_tag, int ex_class, int is_set); struct stack_st_OPENSSL_BLOCK *d2i_ASN1_SET(struct stack_st_OPENSSL_BLOCK **a, const unsigned char **pp, long length, d2i_of_void *d2i, void (*free_func) (OPENSSL_BLOCK), int ex_tag, int ex_class); int i2a_ASN1_INTEGER(BIO *bp, 
ASN1_INTEGER *a); int a2i_ASN1_INTEGER(BIO *bp, ASN1_INTEGER *bs, char *buf, int size); int i2a_ASN1_ENUMERATED(BIO *bp, ASN1_ENUMERATED *a); int a2i_ASN1_ENUMERATED(BIO *bp, ASN1_ENUMERATED *bs, char *buf, int size); int i2a_ASN1_OBJECT(BIO *bp, ASN1_OBJECT *a); int a2i_ASN1_STRING(BIO *bp, ASN1_STRING *bs, char *buf, int size); int i2a_ASN1_STRING(BIO *bp, ASN1_STRING *a, int type); int i2t_ASN1_OBJECT(char *buf, int buf_len, ASN1_OBJECT *a); int a2d_ASN1_OBJECT(unsigned char *out, int olen, const char *buf, int num); ASN1_OBJECT *ASN1_OBJECT_create(int nid, unsigned char *data, int len, const char *sn, const char *ln); int ASN1_INTEGER_set(ASN1_INTEGER *a, long v); long ASN1_INTEGER_get(const ASN1_INTEGER *a); ASN1_INTEGER *BN_to_ASN1_INTEGER(const BIGNUM *bn, ASN1_INTEGER *ai); BIGNUM *ASN1_INTEGER_to_BN(const ASN1_INTEGER *ai, BIGNUM *bn); int ASN1_ENUMERATED_set(ASN1_ENUMERATED *a, long v); long ASN1_ENUMERATED_get(ASN1_ENUMERATED *a); ASN1_ENUMERATED *BN_to_ASN1_ENUMERATED(BIGNUM *bn, ASN1_ENUMERATED *ai); BIGNUM *ASN1_ENUMERATED_to_BN(ASN1_ENUMERATED *ai, BIGNUM *bn); int ASN1_PRINTABLE_type(const unsigned char *s, int max); int i2d_ASN1_bytes(ASN1_STRING *a, unsigned char **pp, int tag, int xclass); ASN1_STRING *d2i_ASN1_bytes(ASN1_STRING **a, const unsigned char **pp, long length, int Ptag, int Pclass); unsigned long ASN1_tag2bit(int tag); ASN1_STRING *d2i_ASN1_type_bytes(ASN1_STRING **a, const unsigned char **pp, long length, int type); int asn1_Finish(ASN1_CTX *c); int asn1_const_Finish(ASN1_const_CTX *c); int ASN1_get_object(const unsigned char **pp, long *plength, int *ptag, int *pclass, long omax); int ASN1_check_infinite_end(unsigned char **p, long len); int ASN1_const_check_infinite_end(const unsigned char **p, long len); void ASN1_put_object(unsigned char **pp, int constructed, int length, int tag, int xclass); int ASN1_put_eoc(unsigned char **pp); int ASN1_object_size(int constructed, int length, int tag); void *ASN1_dup(i2d_of_void *i2d, 
d2i_of_void *d2i, void *x); # 976 "/usr/include/openssl/asn1.h" 3 4 void *ASN1_item_dup(const ASN1_ITEM *it, void *x); # 985 "/usr/include/openssl/asn1.h" 3 4 void *ASN1_d2i_fp(void *(*xnew) (void), d2i_of_void *d2i, FILE *in, void **x); void *ASN1_item_d2i_fp(const ASN1_ITEM *it, FILE *in, void *x); int ASN1_i2d_fp(i2d_of_void *i2d, FILE *out, void *x); # 1006 "/usr/include/openssl/asn1.h" 3 4 int ASN1_item_i2d_fp(const ASN1_ITEM *it, FILE *out, void *x); int ASN1_STRING_print_ex_fp(FILE *fp, ASN1_STRING *str, unsigned long flags); int ASN1_STRING_to_UTF8(unsigned char **out, ASN1_STRING *in); void *ASN1_d2i_bio(void *(*xnew) (void), d2i_of_void *d2i, BIO *in, void **x); void *ASN1_item_d2i_bio(const ASN1_ITEM *it, BIO *in, void *x); int ASN1_i2d_bio(i2d_of_void *i2d, BIO *out, unsigned char *x); # 1034 "/usr/include/openssl/asn1.h" 3 4 int ASN1_item_i2d_bio(const ASN1_ITEM *it, BIO *out, void *x); int ASN1_UTCTIME_print(BIO *fp, const ASN1_UTCTIME *a); int ASN1_GENERALIZEDTIME_print(BIO *fp, const ASN1_GENERALIZEDTIME *a); int ASN1_TIME_print(BIO *fp, const ASN1_TIME *a); int ASN1_STRING_print(BIO *bp, const ASN1_STRING *v); int ASN1_STRING_print_ex(BIO *out, ASN1_STRING *str, unsigned long flags); int ASN1_bn_print(BIO *bp, const char *number, const BIGNUM *num, unsigned char *buf, int off); int ASN1_parse(BIO *bp, const unsigned char *pp, long len, int indent); int ASN1_parse_dump(BIO *bp, const unsigned char *pp, long len, int indent, int dump); const char *ASN1_tag2str(int tag); NETSCAPE_X509 *NETSCAPE_X509_new(void); void NETSCAPE_X509_free(NETSCAPE_X509 *a); NETSCAPE_X509 *d2i_NETSCAPE_X509(NETSCAPE_X509 **a, const unsigned char **in, long len); int i2d_NETSCAPE_X509(NETSCAPE_X509 *a, unsigned char **out); extern const ASN1_ITEM NETSCAPE_X509_it; int ASN1_UNIVERSALSTRING_to_string(ASN1_UNIVERSALSTRING *s); int ASN1_TYPE_set_octetstring(ASN1_TYPE *a, unsigned char *data, int len); int ASN1_TYPE_get_octetstring(ASN1_TYPE *a, unsigned char *data, int max_len); 
int ASN1_TYPE_set_int_octetstring(ASN1_TYPE *a, long num, unsigned char *data, int len); int ASN1_TYPE_get_int_octetstring(ASN1_TYPE *a, long *num, unsigned char *data, int max_len); struct stack_st_OPENSSL_BLOCK *ASN1_seq_unpack(const unsigned char *buf, int len, d2i_of_void *d2i, void (*free_func) (OPENSSL_BLOCK)); unsigned char *ASN1_seq_pack(struct stack_st_OPENSSL_BLOCK *safes, i2d_of_void *i2d, unsigned char **buf, int *len); void *ASN1_unpack_string(ASN1_STRING *oct, d2i_of_void *d2i); void *ASN1_item_unpack(ASN1_STRING *oct, const ASN1_ITEM *it); ASN1_STRING *ASN1_pack_string(void *obj, i2d_of_void *i2d, ASN1_OCTET_STRING **oct); ASN1_STRING *ASN1_item_pack(void *obj, const ASN1_ITEM *it, ASN1_OCTET_STRING **oct); void ASN1_STRING_set_default_mask(unsigned long mask); int ASN1_STRING_set_default_mask_asc(const char *p); unsigned long ASN1_STRING_get_default_mask(void); int ASN1_mbstring_copy(ASN1_STRING **out, const unsigned char *in, int len, int inform, unsigned long mask); int ASN1_mbstring_ncopy(ASN1_STRING **out, const unsigned char *in, int len, int inform, unsigned long mask, long minsize, long maxsize); ASN1_STRING *ASN1_STRING_set_by_NID(ASN1_STRING **out, const unsigned char *in, int inlen, int inform, int nid); ASN1_STRING_TABLE *ASN1_STRING_TABLE_get(int nid); int ASN1_STRING_TABLE_add(int, long, long, unsigned long, unsigned long); void ASN1_STRING_TABLE_cleanup(void); ASN1_VALUE *ASN1_item_new(const ASN1_ITEM *it); void ASN1_item_free(ASN1_VALUE *val, const ASN1_ITEM *it); ASN1_VALUE *ASN1_item_d2i(ASN1_VALUE **val, const unsigned char **in, long len, const ASN1_ITEM *it); int ASN1_item_i2d(ASN1_VALUE *val, unsigned char **out, const ASN1_ITEM *it); int ASN1_item_ndef_i2d(ASN1_VALUE *val, unsigned char **out, const ASN1_ITEM *it); void ASN1_add_oid_module(void); ASN1_TYPE *ASN1_generate_nconf(char *str, CONF *nconf); ASN1_TYPE *ASN1_generate_v3(char *str, X509V3_CTX *cnf); # 1132 "/usr/include/openssl/asn1.h" 3 4 int ASN1_item_print(BIO *out, 
ASN1_VALUE *ifld, int indent, const ASN1_ITEM *it, const ASN1_PCTX *pctx); ASN1_PCTX *ASN1_PCTX_new(void); void ASN1_PCTX_free(ASN1_PCTX *p); unsigned long ASN1_PCTX_get_flags(ASN1_PCTX *p); void ASN1_PCTX_set_flags(ASN1_PCTX *p, unsigned long flags); unsigned long ASN1_PCTX_get_nm_flags(ASN1_PCTX *p); void ASN1_PCTX_set_nm_flags(ASN1_PCTX *p, unsigned long flags); unsigned long ASN1_PCTX_get_cert_flags(ASN1_PCTX *p); void ASN1_PCTX_set_cert_flags(ASN1_PCTX *p, unsigned long flags); unsigned long ASN1_PCTX_get_oid_flags(ASN1_PCTX *p); void ASN1_PCTX_set_oid_flags(ASN1_PCTX *p, unsigned long flags); unsigned long ASN1_PCTX_get_str_flags(ASN1_PCTX *p); void ASN1_PCTX_set_str_flags(ASN1_PCTX *p, unsigned long flags); BIO_METHOD *BIO_f_asn1(void); BIO *BIO_new_NDEF(BIO *out, ASN1_VALUE *val, const ASN1_ITEM *it); int i2d_ASN1_bio_stream(BIO *out, ASN1_VALUE *val, BIO *in, int flags, const ASN1_ITEM *it); int PEM_write_bio_ASN1_stream(BIO *out, ASN1_VALUE *val, BIO *in, int flags, const char *hdr, const ASN1_ITEM *it); int SMIME_write_ASN1(BIO *bio, ASN1_VALUE *val, BIO *data, int flags, int ctype_nid, int econt_nid, struct stack_st_X509_ALGOR *mdalgs, const ASN1_ITEM *it); ASN1_VALUE *SMIME_read_ASN1(BIO *bio, BIO **bcont, const ASN1_ITEM *it); int SMIME_crlf_copy(BIO *in, BIO *out, int flags); int SMIME_text(BIO *in, BIO *out); void ERR_load_ASN1_strings(void); # 966 "/usr/include/openssl/objects.h" 2 3 4 # 984 "/usr/include/openssl/objects.h" 3 4 typedef struct obj_name_st { int type; int alias; const char *name; const char *data; } OBJ_NAME; int OBJ_NAME_init(void); int OBJ_NAME_new_index(unsigned long (*hash_func) (const char *), int (*cmp_func) (const char *, const char *), void (*free_func) (const char *, int, const char *)); const char *OBJ_NAME_get(const char *name, int type); int OBJ_NAME_add(const char *name, int type, const char *data); int OBJ_NAME_remove(const char *name, int type); void OBJ_NAME_cleanup(int type); void OBJ_NAME_do_all(int type, void (*fn) 
(const OBJ_NAME *, void *arg), void *arg); void OBJ_NAME_do_all_sorted(int type, void (*fn) (const OBJ_NAME *, void *arg), void *arg); ASN1_OBJECT *OBJ_dup(const ASN1_OBJECT *o); ASN1_OBJECT *OBJ_nid2obj(int n); const char *OBJ_nid2ln(int n); const char *OBJ_nid2sn(int n); int OBJ_obj2nid(const ASN1_OBJECT *o); ASN1_OBJECT *OBJ_txt2obj(const char *s, int no_name); int OBJ_obj2txt(char *buf, int buf_len, const ASN1_OBJECT *a, int no_name); int OBJ_txt2nid(const char *s); int OBJ_ln2nid(const char *s); int OBJ_sn2nid(const char *s); int OBJ_cmp(const ASN1_OBJECT *a, const ASN1_OBJECT *b); const void *OBJ_bsearch_(const void *key, const void *base, int num, int size, int (*cmp) (const void *, const void *)); const void *OBJ_bsearch_ex_(const void *key, const void *base, int num, int size, int (*cmp) (const void *, const void *), int flags); # 1104 "/usr/include/openssl/objects.h" 3 4 int OBJ_new_nid(int num); int OBJ_add_object(const ASN1_OBJECT *obj); int OBJ_create(const char *oid, const char *sn, const char *ln); void OBJ_cleanup(void); int OBJ_create_objects(BIO *in); int OBJ_find_sigid_algs(int signid, int *pdig_nid, int *ppkey_nid); int OBJ_find_sigid_by_algs(int *psignid, int dig_nid, int pkey_nid); int OBJ_add_sigid(int signid, int dig_id, int pkey_id); void OBJ_sigid_free(void); extern int obj_cleanup_defer; void check_defer(int nid); void ERR_load_OBJ_strings(void); # 95 "/usr/include/openssl/evp.h" 2 3 4 # 129 "/usr/include/openssl/evp.h" 3 4 struct evp_pkey_st { int type; int save_type; int references; const EVP_PKEY_ASN1_METHOD *ameth; ENGINE *engine; union { char *ptr; struct rsa_st *rsa; struct dsa_st *dsa; struct dh_st *dh; struct ec_key_st *ec; } pkey; int save_parameters; struct stack_st_X509_ATTRIBUTE *attributes; } ; struct env_md_st { int type; int pkey_type; int md_size; unsigned long flags; int (*init) (EVP_MD_CTX *ctx); int (*update) (EVP_MD_CTX *ctx, const void *data, size_t count); int (*final) (EVP_MD_CTX *ctx, unsigned char *md); int 
(*copy) (EVP_MD_CTX *to, const EVP_MD_CTX *from); int (*cleanup) (EVP_MD_CTX *ctx); int (*sign) (int type, const unsigned char *m, unsigned int m_length, unsigned char *sigret, unsigned int *siglen, void *key); int (*verify) (int type, const unsigned char *m, unsigned int m_length, const unsigned char *sigbuf, unsigned int siglen, void *key); int required_pkey_type[5]; int block_size; int ctx_size; int (*md_ctrl) (EVP_MD_CTX *ctx, int cmd, int p1, void *p2); } ; typedef int evp_sign_method(int type, const unsigned char *m, unsigned int m_length, unsigned char *sigret, unsigned int *siglen, void *key); typedef int evp_verify_method(int type, const unsigned char *m, unsigned int m_length, const unsigned char *sigbuf, unsigned int siglen, void *key); # 268 "/usr/include/openssl/evp.h" 3 4 struct env_md_ctx_st { const EVP_MD *digest; ENGINE *engine; unsigned long flags; void *md_data; EVP_PKEY_CTX *pctx; int (*update) (EVP_MD_CTX *ctx, const void *data, size_t count); } ; # 308 "/usr/include/openssl/evp.h" 3 4 struct evp_cipher_st { int nid; int block_size; int key_len; int iv_len; unsigned long flags; int (*init) (EVP_CIPHER_CTX *ctx, const unsigned char *key, const unsigned char *iv, int enc); int (*do_cipher) (EVP_CIPHER_CTX *ctx, unsigned char *out, const unsigned char *in, size_t inl); int (*cleanup) (EVP_CIPHER_CTX *); int ctx_size; int (*set_asn1_parameters) (EVP_CIPHER_CTX *, ASN1_TYPE *); int (*get_asn1_parameters) (EVP_CIPHER_CTX *, ASN1_TYPE *); int (*ctrl) (EVP_CIPHER_CTX *, int type, int arg, void *ptr); void *app_data; } ; # 429 "/usr/include/openssl/evp.h" 3 4 typedef struct { unsigned char *out; const unsigned char *inp; size_t len; unsigned int interleave; } EVP_CTRL_TLS1_1_MULTIBLOCK_PARAM; # 444 "/usr/include/openssl/evp.h" 3 4 typedef struct evp_cipher_info_st { const EVP_CIPHER *cipher; unsigned char iv[16]; } EVP_CIPHER_INFO; struct evp_cipher_ctx_st { const EVP_CIPHER *cipher; ENGINE *engine; int encrypt; int buf_len; unsigned char oiv[16]; 
unsigned char iv[16]; unsigned char buf[32]; int num; void *app_data; int key_len; unsigned long flags; void *cipher_data; int final_used; int block_mask; unsigned char final[32]; } ; typedef struct evp_Encode_Ctx_st { int num; int length; unsigned char enc_data[80]; int line_num; int expect_nl; } EVP_ENCODE_CTX; typedef int (EVP_PBE_KEYGEN) (EVP_CIPHER_CTX *ctx, const char *pass, int passlen, ASN1_TYPE *param, const EVP_CIPHER *cipher, const EVP_MD *md, int en_de); # 516 "/usr/include/openssl/evp.h" 3 4 int EVP_MD_type(const EVP_MD *md); int EVP_MD_pkey_type(const EVP_MD *md); int EVP_MD_size(const EVP_MD *md); int EVP_MD_block_size(const EVP_MD *md); unsigned long EVP_MD_flags(const EVP_MD *md); const EVP_MD *EVP_MD_CTX_md(const EVP_MD_CTX *ctx); int EVP_CIPHER_nid(const EVP_CIPHER *cipher); int EVP_CIPHER_block_size(const EVP_CIPHER *cipher); int EVP_CIPHER_key_length(const EVP_CIPHER *cipher); int EVP_CIPHER_iv_length(const EVP_CIPHER *cipher); unsigned long EVP_CIPHER_flags(const EVP_CIPHER *cipher); const EVP_CIPHER *EVP_CIPHER_CTX_cipher(const EVP_CIPHER_CTX *ctx); int EVP_CIPHER_CTX_nid(const EVP_CIPHER_CTX *ctx); int EVP_CIPHER_CTX_block_size(const EVP_CIPHER_CTX *ctx); int EVP_CIPHER_CTX_key_length(const EVP_CIPHER_CTX *ctx); int EVP_CIPHER_CTX_iv_length(const EVP_CIPHER_CTX *ctx); int EVP_CIPHER_CTX_copy(EVP_CIPHER_CTX *out, const EVP_CIPHER_CTX *in); void *EVP_CIPHER_CTX_get_app_data(const EVP_CIPHER_CTX *ctx); void EVP_CIPHER_CTX_set_app_data(EVP_CIPHER_CTX *ctx, void *data); unsigned long EVP_CIPHER_CTX_flags(const EVP_CIPHER_CTX *ctx); # 574 "/usr/include/openssl/evp.h" 3 4 int EVP_Cipher(EVP_CIPHER_CTX *c, unsigned char *out, const unsigned char *in, unsigned int inl); # 586 "/usr/include/openssl/evp.h" 3 4 void EVP_MD_CTX_init(EVP_MD_CTX *ctx); int EVP_MD_CTX_cleanup(EVP_MD_CTX *ctx); EVP_MD_CTX *EVP_MD_CTX_create(void); void EVP_MD_CTX_destroy(EVP_MD_CTX *ctx); int EVP_MD_CTX_copy_ex(EVP_MD_CTX *out, const EVP_MD_CTX *in); void 
EVP_MD_CTX_set_flags(EVP_MD_CTX *ctx, int flags); void EVP_MD_CTX_clear_flags(EVP_MD_CTX *ctx, int flags); int EVP_MD_CTX_test_flags(const EVP_MD_CTX *ctx, int flags); int EVP_DigestInit_ex(EVP_MD_CTX *ctx, const EVP_MD *type, ENGINE *impl); int EVP_DigestUpdate(EVP_MD_CTX *ctx, const void *d, size_t cnt); int EVP_DigestFinal_ex(EVP_MD_CTX *ctx, unsigned char *md, unsigned int *s); int EVP_Digest(const void *data, size_t count, unsigned char *md, unsigned int *size, const EVP_MD *type, ENGINE *impl); int EVP_MD_CTX_copy(EVP_MD_CTX *out, const EVP_MD_CTX *in); int EVP_DigestInit(EVP_MD_CTX *ctx, const EVP_MD *type); int EVP_DigestFinal(EVP_MD_CTX *ctx, unsigned char *md, unsigned int *s); int EVP_read_pw_string(char *buf, int length, const char *prompt, int verify); int EVP_read_pw_string_min(char *buf, int minlen, int maxlen, const char *prompt, int verify); void EVP_set_pw_prompt(const char *prompt); char *EVP_get_pw_prompt(void); int EVP_BytesToKey(const EVP_CIPHER *type, const EVP_MD *md, const unsigned char *salt, const unsigned char *data, int datal, int count, unsigned char *key, unsigned char *iv); void EVP_CIPHER_CTX_set_flags(EVP_CIPHER_CTX *ctx, int flags); void EVP_CIPHER_CTX_clear_flags(EVP_CIPHER_CTX *ctx, int flags); int EVP_CIPHER_CTX_test_flags(const EVP_CIPHER_CTX *ctx, int flags); int EVP_EncryptInit(EVP_CIPHER_CTX *ctx, const EVP_CIPHER *cipher, const unsigned char *key, const unsigned char *iv); int EVP_EncryptInit_ex(EVP_CIPHER_CTX *ctx, const EVP_CIPHER *cipher, ENGINE *impl, const unsigned char *key, const unsigned char *iv); int EVP_EncryptUpdate(EVP_CIPHER_CTX *ctx, unsigned char *out, int *outl, const unsigned char *in, int inl); int EVP_EncryptFinal_ex(EVP_CIPHER_CTX *ctx, unsigned char *out, int *outl); int EVP_EncryptFinal(EVP_CIPHER_CTX *ctx, unsigned char *out, int *outl); int EVP_DecryptInit(EVP_CIPHER_CTX *ctx, const EVP_CIPHER *cipher, const unsigned char *key, const unsigned char *iv); int EVP_DecryptInit_ex(EVP_CIPHER_CTX *ctx, 
const EVP_CIPHER *cipher, ENGINE *impl, const unsigned char *key, const unsigned char *iv); int EVP_DecryptUpdate(EVP_CIPHER_CTX *ctx, unsigned char *out, int *outl, const unsigned char *in, int inl); int EVP_DecryptFinal(EVP_CIPHER_CTX *ctx, unsigned char *outm, int *outl); int EVP_DecryptFinal_ex(EVP_CIPHER_CTX *ctx, unsigned char *outm, int *outl); int EVP_CipherInit(EVP_CIPHER_CTX *ctx, const EVP_CIPHER *cipher, const unsigned char *key, const unsigned char *iv, int enc); int EVP_CipherInit_ex(EVP_CIPHER_CTX *ctx, const EVP_CIPHER *cipher, ENGINE *impl, const unsigned char *key, const unsigned char *iv, int enc); int EVP_CipherUpdate(EVP_CIPHER_CTX *ctx, unsigned char *out, int *outl, const unsigned char *in, int inl); int EVP_CipherFinal(EVP_CIPHER_CTX *ctx, unsigned char *outm, int *outl); int EVP_CipherFinal_ex(EVP_CIPHER_CTX *ctx, unsigned char *outm, int *outl); int EVP_SignFinal(EVP_MD_CTX *ctx, unsigned char *md, unsigned int *s, EVP_PKEY *pkey); int EVP_VerifyFinal(EVP_MD_CTX *ctx, const unsigned char *sigbuf, unsigned int siglen, EVP_PKEY *pkey); int EVP_DigestSignInit(EVP_MD_CTX *ctx, EVP_PKEY_CTX **pctx, const EVP_MD *type, ENGINE *e, EVP_PKEY *pkey); int EVP_DigestSignFinal(EVP_MD_CTX *ctx, unsigned char *sigret, size_t *siglen); int EVP_DigestVerifyInit(EVP_MD_CTX *ctx, EVP_PKEY_CTX **pctx, const EVP_MD *type, ENGINE *e, EVP_PKEY *pkey); int EVP_DigestVerifyFinal(EVP_MD_CTX *ctx, const unsigned char *sig, size_t siglen); int EVP_OpenInit(EVP_CIPHER_CTX *ctx, const EVP_CIPHER *type, const unsigned char *ek, int ekl, const unsigned char *iv, EVP_PKEY *priv); int EVP_OpenFinal(EVP_CIPHER_CTX *ctx, unsigned char *out, int *outl); int EVP_SealInit(EVP_CIPHER_CTX *ctx, const EVP_CIPHER *type, unsigned char **ek, int *ekl, unsigned char *iv, EVP_PKEY **pubk, int npubk); int EVP_SealFinal(EVP_CIPHER_CTX *ctx, unsigned char *out, int *outl); void EVP_EncodeInit(EVP_ENCODE_CTX *ctx); void EVP_EncodeUpdate(EVP_ENCODE_CTX *ctx, unsigned char *out, int *outl, 
const unsigned char *in, int inl); void EVP_EncodeFinal(EVP_ENCODE_CTX *ctx, unsigned char *out, int *outl); int EVP_EncodeBlock(unsigned char *t, const unsigned char *f, int n); void EVP_DecodeInit(EVP_ENCODE_CTX *ctx); int EVP_DecodeUpdate(EVP_ENCODE_CTX *ctx, unsigned char *out, int *outl, const unsigned char *in, int inl); int EVP_DecodeFinal(EVP_ENCODE_CTX *ctx, unsigned char *out, int *outl); int EVP_DecodeBlock(unsigned char *t, const unsigned char *f, int n); void EVP_CIPHER_CTX_init(EVP_CIPHER_CTX *a); int EVP_CIPHER_CTX_cleanup(EVP_CIPHER_CTX *a); EVP_CIPHER_CTX *EVP_CIPHER_CTX_new(void); void EVP_CIPHER_CTX_free(EVP_CIPHER_CTX *a); int EVP_CIPHER_CTX_set_key_length(EVP_CIPHER_CTX *x, int keylen); int EVP_CIPHER_CTX_set_padding(EVP_CIPHER_CTX *c, int pad); int EVP_CIPHER_CTX_ctrl(EVP_CIPHER_CTX *ctx, int type, int arg, void *ptr); int EVP_CIPHER_CTX_rand_key(EVP_CIPHER_CTX *ctx, unsigned char *key); BIO_METHOD *BIO_f_md(void); BIO_METHOD *BIO_f_base64(void); BIO_METHOD *BIO_f_cipher(void); BIO_METHOD *BIO_f_reliable(void); void BIO_set_cipher(BIO *b, const EVP_CIPHER *c, const unsigned char *k, const unsigned char *i, int enc); const EVP_MD *EVP_md_null(void); const EVP_MD *EVP_md4(void); const EVP_MD *EVP_md5(void); const EVP_MD *EVP_sha(void); const EVP_MD *EVP_sha1(void); const EVP_MD *EVP_dss(void); const EVP_MD *EVP_dss1(void); const EVP_MD *EVP_ecdsa(void); const EVP_MD *EVP_sha224(void); const EVP_MD *EVP_sha256(void); const EVP_MD *EVP_sha384(void); const EVP_MD *EVP_sha512(void); const EVP_MD *EVP_mdc2(void); const EVP_MD *EVP_ripemd160(void); const EVP_MD *EVP_whirlpool(void); const EVP_CIPHER *EVP_enc_null(void); const EVP_CIPHER *EVP_des_ecb(void); const EVP_CIPHER *EVP_des_ede(void); const EVP_CIPHER *EVP_des_ede3(void); const EVP_CIPHER *EVP_des_ede_ecb(void); const EVP_CIPHER *EVP_des_ede3_ecb(void); const EVP_CIPHER *EVP_des_cfb64(void); const EVP_CIPHER *EVP_des_cfb1(void); const EVP_CIPHER *EVP_des_cfb8(void); const EVP_CIPHER 
*EVP_des_ede_cfb64(void); const EVP_CIPHER *EVP_des_ede3_cfb64(void); const EVP_CIPHER *EVP_des_ede3_cfb1(void); const EVP_CIPHER *EVP_des_ede3_cfb8(void); const EVP_CIPHER *EVP_des_ofb(void); const EVP_CIPHER *EVP_des_ede_ofb(void); const EVP_CIPHER *EVP_des_ede3_ofb(void); const EVP_CIPHER *EVP_des_cbc(void); const EVP_CIPHER *EVP_des_ede_cbc(void); const EVP_CIPHER *EVP_des_ede3_cbc(void); const EVP_CIPHER *EVP_desx_cbc(void); const EVP_CIPHER *EVP_des_ede3_wrap(void); # 785 "/usr/include/openssl/evp.h" 3 4 const EVP_CIPHER *EVP_rc4(void); const EVP_CIPHER *EVP_rc4_40(void); const EVP_CIPHER *EVP_rc4_hmac_md5(void); const EVP_CIPHER *EVP_idea_ecb(void); const EVP_CIPHER *EVP_idea_cfb64(void); const EVP_CIPHER *EVP_idea_ofb(void); const EVP_CIPHER *EVP_idea_cbc(void); const EVP_CIPHER *EVP_rc2_ecb(void); const EVP_CIPHER *EVP_rc2_cbc(void); const EVP_CIPHER *EVP_rc2_40_cbc(void); const EVP_CIPHER *EVP_rc2_64_cbc(void); const EVP_CIPHER *EVP_rc2_cfb64(void); const EVP_CIPHER *EVP_rc2_ofb(void); const EVP_CIPHER *EVP_bf_ecb(void); const EVP_CIPHER *EVP_bf_cbc(void); const EVP_CIPHER *EVP_bf_cfb64(void); const EVP_CIPHER *EVP_bf_ofb(void); const EVP_CIPHER *EVP_cast5_ecb(void); const EVP_CIPHER *EVP_cast5_cbc(void); const EVP_CIPHER *EVP_cast5_cfb64(void); const EVP_CIPHER *EVP_cast5_ofb(void); # 829 "/usr/include/openssl/evp.h" 3 4 const EVP_CIPHER *EVP_aes_128_ecb(void); const EVP_CIPHER *EVP_aes_128_cbc(void); const EVP_CIPHER *EVP_aes_128_cfb1(void); const EVP_CIPHER *EVP_aes_128_cfb8(void); const EVP_CIPHER *EVP_aes_128_cfb128(void); const EVP_CIPHER *EVP_aes_128_ofb(void); const EVP_CIPHER *EVP_aes_128_ctr(void); const EVP_CIPHER *EVP_aes_128_ccm(void); const EVP_CIPHER *EVP_aes_128_gcm(void); const EVP_CIPHER *EVP_aes_128_xts(void); const EVP_CIPHER *EVP_aes_128_wrap(void); const EVP_CIPHER *EVP_aes_192_ecb(void); const EVP_CIPHER *EVP_aes_192_cbc(void); const EVP_CIPHER *EVP_aes_192_cfb1(void); const EVP_CIPHER *EVP_aes_192_cfb8(void); const EVP_CIPHER 
*EVP_aes_192_cfb128(void); const EVP_CIPHER *EVP_aes_192_ofb(void); const EVP_CIPHER *EVP_aes_192_ctr(void); const EVP_CIPHER *EVP_aes_192_ccm(void); const EVP_CIPHER *EVP_aes_192_gcm(void); const EVP_CIPHER *EVP_aes_192_wrap(void); const EVP_CIPHER *EVP_aes_256_ecb(void); const EVP_CIPHER *EVP_aes_256_cbc(void); const EVP_CIPHER *EVP_aes_256_cfb1(void); const EVP_CIPHER *EVP_aes_256_cfb8(void); const EVP_CIPHER *EVP_aes_256_cfb128(void); const EVP_CIPHER *EVP_aes_256_ofb(void); const EVP_CIPHER *EVP_aes_256_ctr(void); const EVP_CIPHER *EVP_aes_256_ccm(void); const EVP_CIPHER *EVP_aes_256_gcm(void); const EVP_CIPHER *EVP_aes_256_xts(void); const EVP_CIPHER *EVP_aes_256_wrap(void); const EVP_CIPHER *EVP_aes_128_cbc_hmac_sha1(void); const EVP_CIPHER *EVP_aes_256_cbc_hmac_sha1(void); const EVP_CIPHER *EVP_aes_128_cbc_hmac_sha256(void); const EVP_CIPHER *EVP_aes_256_cbc_hmac_sha256(void); const EVP_CIPHER *EVP_camellia_128_ecb(void); const EVP_CIPHER *EVP_camellia_128_cbc(void); const EVP_CIPHER *EVP_camellia_128_cfb1(void); const EVP_CIPHER *EVP_camellia_128_cfb8(void); const EVP_CIPHER *EVP_camellia_128_cfb128(void); const EVP_CIPHER *EVP_camellia_128_ofb(void); const EVP_CIPHER *EVP_camellia_192_ecb(void); const EVP_CIPHER *EVP_camellia_192_cbc(void); const EVP_CIPHER *EVP_camellia_192_cfb1(void); const EVP_CIPHER *EVP_camellia_192_cfb8(void); const EVP_CIPHER *EVP_camellia_192_cfb128(void); const EVP_CIPHER *EVP_camellia_192_ofb(void); const EVP_CIPHER *EVP_camellia_256_ecb(void); const EVP_CIPHER *EVP_camellia_256_cbc(void); const EVP_CIPHER *EVP_camellia_256_cfb1(void); const EVP_CIPHER *EVP_camellia_256_cfb8(void); const EVP_CIPHER *EVP_camellia_256_cfb128(void); const EVP_CIPHER *EVP_camellia_256_ofb(void); const EVP_CIPHER *EVP_seed_ecb(void); const EVP_CIPHER *EVP_seed_cbc(void); const EVP_CIPHER *EVP_seed_cfb128(void); const EVP_CIPHER *EVP_seed_ofb(void); void OPENSSL_add_all_algorithms_noconf(void); void OPENSSL_add_all_algorithms_conf(void); # 916 
"/usr/include/openssl/evp.h" 3 4 void OpenSSL_add_all_ciphers(void); void OpenSSL_add_all_digests(void); int EVP_add_cipher(const EVP_CIPHER *cipher); int EVP_add_digest(const EVP_MD *digest); const EVP_CIPHER *EVP_get_cipherbyname(const char *name); const EVP_MD *EVP_get_digestbyname(const char *name); void EVP_cleanup(void); void EVP_CIPHER_do_all(void (*fn) (const EVP_CIPHER *ciph, const char *from, const char *to, void *x), void *arg); void EVP_CIPHER_do_all_sorted(void (*fn) (const EVP_CIPHER *ciph, const char *from, const char *to, void *x), void *arg); void EVP_MD_do_all(void (*fn) (const EVP_MD *ciph, const char *from, const char *to, void *x), void *arg); void EVP_MD_do_all_sorted(void (*fn) (const EVP_MD *ciph, const char *from, const char *to, void *x), void *arg); int EVP_PKEY_decrypt_old(unsigned char *dec_key, const unsigned char *enc_key, int enc_key_len, EVP_PKEY *private_key); int EVP_PKEY_encrypt_old(unsigned char *enc_key, const unsigned char *key, int key_len, EVP_PKEY *pub_key); int EVP_PKEY_type(int type); int EVP_PKEY_id(const EVP_PKEY *pkey); int EVP_PKEY_base_id(const EVP_PKEY *pkey); int EVP_PKEY_bits(EVP_PKEY *pkey); int EVP_PKEY_size(EVP_PKEY *pkey); int EVP_PKEY_set_type(EVP_PKEY *pkey, int type); int EVP_PKEY_set_type_str(EVP_PKEY *pkey, const char *str, int len); int EVP_PKEY_assign(EVP_PKEY *pkey, int type, void *key); void *EVP_PKEY_get0(EVP_PKEY *pkey); struct rsa_st; int EVP_PKEY_set1_RSA(EVP_PKEY *pkey, struct rsa_st *key); struct rsa_st *EVP_PKEY_get1_RSA(EVP_PKEY *pkey); struct dsa_st; int EVP_PKEY_set1_DSA(EVP_PKEY *pkey, struct dsa_st *key); struct dsa_st *EVP_PKEY_get1_DSA(EVP_PKEY *pkey); struct dh_st; int EVP_PKEY_set1_DH(EVP_PKEY *pkey, struct dh_st *key); struct dh_st *EVP_PKEY_get1_DH(EVP_PKEY *pkey); struct ec_key_st; int EVP_PKEY_set1_EC_KEY(EVP_PKEY *pkey, struct ec_key_st *key); struct ec_key_st *EVP_PKEY_get1_EC_KEY(EVP_PKEY *pkey); EVP_PKEY *EVP_PKEY_new(void); void EVP_PKEY_free(EVP_PKEY *pkey); EVP_PKEY 
*d2i_PublicKey(int type, EVP_PKEY **a, const unsigned char **pp, long length); int i2d_PublicKey(EVP_PKEY *a, unsigned char **pp); EVP_PKEY *d2i_PrivateKey(int type, EVP_PKEY **a, const unsigned char **pp, long length); EVP_PKEY *d2i_AutoPrivateKey(EVP_PKEY **a, const unsigned char **pp, long length); int i2d_PrivateKey(EVP_PKEY *a, unsigned char **pp); int EVP_PKEY_copy_parameters(EVP_PKEY *to, const EVP_PKEY *from); int EVP_PKEY_missing_parameters(const EVP_PKEY *pkey); int EVP_PKEY_save_parameters(EVP_PKEY *pkey, int mode); int EVP_PKEY_cmp_parameters(const EVP_PKEY *a, const EVP_PKEY *b); int EVP_PKEY_cmp(const EVP_PKEY *a, const EVP_PKEY *b); int EVP_PKEY_print_public(BIO *out, const EVP_PKEY *pkey, int indent, ASN1_PCTX *pctx); int EVP_PKEY_print_private(BIO *out, const EVP_PKEY *pkey, int indent, ASN1_PCTX *pctx); int EVP_PKEY_print_params(BIO *out, const EVP_PKEY *pkey, int indent, ASN1_PCTX *pctx); int EVP_PKEY_get_default_digest_nid(EVP_PKEY *pkey, int *pnid); int EVP_CIPHER_type(const EVP_CIPHER *ctx); int EVP_CIPHER_param_to_asn1(EVP_CIPHER_CTX *c, ASN1_TYPE *type); int EVP_CIPHER_asn1_to_param(EVP_CIPHER_CTX *c, ASN1_TYPE *type); int EVP_CIPHER_set_asn1_iv(EVP_CIPHER_CTX *c, ASN1_TYPE *type); int EVP_CIPHER_get_asn1_iv(EVP_CIPHER_CTX *c, ASN1_TYPE *type); int PKCS5_PBE_keyivgen(EVP_CIPHER_CTX *ctx, const char *pass, int passlen, ASN1_TYPE *param, const EVP_CIPHER *cipher, const EVP_MD *md, int en_de); int PKCS5_PBKDF2_HMAC_SHA1(const char *pass, int passlen, const unsigned char *salt, int saltlen, int iter, int keylen, unsigned char *out); int PKCS5_PBKDF2_HMAC(const char *pass, int passlen, const unsigned char *salt, int saltlen, int iter, const EVP_MD *digest, int keylen, unsigned char *out); int PKCS5_v2_PBE_keyivgen(EVP_CIPHER_CTX *ctx, const char *pass, int passlen, ASN1_TYPE *param, const EVP_CIPHER *cipher, const EVP_MD *md, int en_de); void PKCS5_PBE_add(void); int EVP_PBE_CipherInit(ASN1_OBJECT *pbe_obj, const char *pass, int passlen, 
ASN1_TYPE *param, EVP_CIPHER_CTX *ctx, int en_de); # 1045 "/usr/include/openssl/evp.h" 3 4 int EVP_PBE_alg_add_type(int pbe_type, int pbe_nid, int cipher_nid, int md_nid, EVP_PBE_KEYGEN *keygen); int EVP_PBE_alg_add(int nid, const EVP_CIPHER *cipher, const EVP_MD *md, EVP_PBE_KEYGEN *keygen); int EVP_PBE_find(int type, int pbe_nid, int *pcnid, int *pmnid, EVP_PBE_KEYGEN **pkeygen); void EVP_PBE_cleanup(void); # 1064 "/usr/include/openssl/evp.h" 3 4 int EVP_PKEY_asn1_get_count(void); const EVP_PKEY_ASN1_METHOD *EVP_PKEY_asn1_get0(int idx); const EVP_PKEY_ASN1_METHOD *EVP_PKEY_asn1_find(ENGINE **pe, int type); const EVP_PKEY_ASN1_METHOD *EVP_PKEY_asn1_find_str(ENGINE **pe, const char *str, int len); int EVP_PKEY_asn1_add0(const EVP_PKEY_ASN1_METHOD *ameth); int EVP_PKEY_asn1_add_alias(int to, int from); int EVP_PKEY_asn1_get0_info(int *ppkey_id, int *pkey_base_id, int *ppkey_flags, const char **pinfo, const char **ppem_str, const EVP_PKEY_ASN1_METHOD *ameth); const EVP_PKEY_ASN1_METHOD *EVP_PKEY_get0_asn1(EVP_PKEY *pkey); EVP_PKEY_ASN1_METHOD *EVP_PKEY_asn1_new(int id, int flags, const char *pem_str, const char *info); void EVP_PKEY_asn1_copy(EVP_PKEY_ASN1_METHOD *dst, const EVP_PKEY_ASN1_METHOD *src); void EVP_PKEY_asn1_free(EVP_PKEY_ASN1_METHOD *ameth); void EVP_PKEY_asn1_set_public(EVP_PKEY_ASN1_METHOD *ameth, int (*pub_decode) (EVP_PKEY *pk, X509_PUBKEY *pub), int (*pub_encode) (X509_PUBKEY *pub, const EVP_PKEY *pk), int (*pub_cmp) (const EVP_PKEY *a, const EVP_PKEY *b), int (*pub_print) (BIO *out, const EVP_PKEY *pkey, int indent, ASN1_PCTX *pctx), int (*pkey_size) (const EVP_PKEY *pk), int (*pkey_bits) (const EVP_PKEY *pk)); void EVP_PKEY_asn1_set_private(EVP_PKEY_ASN1_METHOD *ameth, int (*priv_decode) (EVP_PKEY *pk, PKCS8_PRIV_KEY_INFO *p8inf), int (*priv_encode) (PKCS8_PRIV_KEY_INFO *p8, const EVP_PKEY *pk), int (*priv_print) (BIO *out, const EVP_PKEY *pkey, int indent, ASN1_PCTX *pctx)); void EVP_PKEY_asn1_set_param(EVP_PKEY_ASN1_METHOD *ameth, int 
(*param_decode) (EVP_PKEY *pkey, const unsigned char **pder, int derlen), int (*param_encode) (const EVP_PKEY *pkey, unsigned char **pder), int (*param_missing) (const EVP_PKEY *pk), int (*param_copy) (EVP_PKEY *to, const EVP_PKEY *from), int (*param_cmp) (const EVP_PKEY *a, const EVP_PKEY *b), int (*param_print) (BIO *out, const EVP_PKEY *pkey, int indent, ASN1_PCTX *pctx)); void EVP_PKEY_asn1_set_free(EVP_PKEY_ASN1_METHOD *ameth, void (*pkey_free) (EVP_PKEY *pkey)); void EVP_PKEY_asn1_set_ctrl(EVP_PKEY_ASN1_METHOD *ameth, int (*pkey_ctrl) (EVP_PKEY *pkey, int op, long arg1, void *arg2)); void EVP_PKEY_asn1_set_item(EVP_PKEY_ASN1_METHOD *ameth, int (*item_verify) (EVP_MD_CTX *ctx, const ASN1_ITEM *it, void *asn, X509_ALGOR *a, ASN1_BIT_STRING *sig, EVP_PKEY *pkey), int (*item_sign) (EVP_MD_CTX *ctx, const ASN1_ITEM *it, void *asn, X509_ALGOR *alg1, X509_ALGOR *alg2, ASN1_BIT_STRING *sig)); # 1204 "/usr/include/openssl/evp.h" 3 4 const EVP_PKEY_METHOD *EVP_PKEY_meth_find(int type); EVP_PKEY_METHOD *EVP_PKEY_meth_new(int id, int flags); void EVP_PKEY_meth_get0_info(int *ppkey_id, int *pflags, const EVP_PKEY_METHOD *meth); void EVP_PKEY_meth_copy(EVP_PKEY_METHOD *dst, const EVP_PKEY_METHOD *src); void EVP_PKEY_meth_free(EVP_PKEY_METHOD *pmeth); int EVP_PKEY_meth_add0(const EVP_PKEY_METHOD *pmeth); EVP_PKEY_CTX *EVP_PKEY_CTX_new(EVP_PKEY *pkey, ENGINE *e); EVP_PKEY_CTX *EVP_PKEY_CTX_new_id(int id, ENGINE *e); EVP_PKEY_CTX *EVP_PKEY_CTX_dup(EVP_PKEY_CTX *ctx); void EVP_PKEY_CTX_free(EVP_PKEY_CTX *ctx); int EVP_PKEY_CTX_ctrl(EVP_PKEY_CTX *ctx, int keytype, int optype, int cmd, int p1, void *p2); int EVP_PKEY_CTX_ctrl_str(EVP_PKEY_CTX *ctx, const char *type, const char *value); int EVP_PKEY_CTX_get_operation(EVP_PKEY_CTX *ctx); void EVP_PKEY_CTX_set0_keygen_info(EVP_PKEY_CTX *ctx, int *dat, int datlen); EVP_PKEY *EVP_PKEY_new_mac_key(int type, ENGINE *e, const unsigned char *key, int keylen); void EVP_PKEY_CTX_set_data(EVP_PKEY_CTX *ctx, void *data); void 
*EVP_PKEY_CTX_get_data(EVP_PKEY_CTX *ctx); EVP_PKEY *EVP_PKEY_CTX_get0_pkey(EVP_PKEY_CTX *ctx); EVP_PKEY *EVP_PKEY_CTX_get0_peerkey(EVP_PKEY_CTX *ctx); void EVP_PKEY_CTX_set_app_data(EVP_PKEY_CTX *ctx, void *data); void *EVP_PKEY_CTX_get_app_data(EVP_PKEY_CTX *ctx); int EVP_PKEY_sign_init(EVP_PKEY_CTX *ctx); int EVP_PKEY_sign(EVP_PKEY_CTX *ctx, unsigned char *sig, size_t *siglen, const unsigned char *tbs, size_t tbslen); int EVP_PKEY_verify_init(EVP_PKEY_CTX *ctx); int EVP_PKEY_verify(EVP_PKEY_CTX *ctx, const unsigned char *sig, size_t siglen, const unsigned char *tbs, size_t tbslen); int EVP_PKEY_verify_recover_init(EVP_PKEY_CTX *ctx); int EVP_PKEY_verify_recover(EVP_PKEY_CTX *ctx, unsigned char *rout, size_t *routlen, const unsigned char *sig, size_t siglen); int EVP_PKEY_encrypt_init(EVP_PKEY_CTX *ctx); int EVP_PKEY_encrypt(EVP_PKEY_CTX *ctx, unsigned char *out, size_t *outlen, const unsigned char *in, size_t inlen); int EVP_PKEY_decrypt_init(EVP_PKEY_CTX *ctx); int EVP_PKEY_decrypt(EVP_PKEY_CTX *ctx, unsigned char *out, size_t *outlen, const unsigned char *in, size_t inlen); int EVP_PKEY_derive_init(EVP_PKEY_CTX *ctx); int EVP_PKEY_derive_set_peer(EVP_PKEY_CTX *ctx, EVP_PKEY *peer); int EVP_PKEY_derive(EVP_PKEY_CTX *ctx, unsigned char *key, size_t *keylen); typedef int EVP_PKEY_gen_cb (EVP_PKEY_CTX *ctx); int EVP_PKEY_paramgen_init(EVP_PKEY_CTX *ctx); int EVP_PKEY_paramgen(EVP_PKEY_CTX *ctx, EVP_PKEY **ppkey); int EVP_PKEY_keygen_init(EVP_PKEY_CTX *ctx); int EVP_PKEY_keygen(EVP_PKEY_CTX *ctx, EVP_PKEY **ppkey); void EVP_PKEY_CTX_set_cb(EVP_PKEY_CTX *ctx, EVP_PKEY_gen_cb *cb); EVP_PKEY_gen_cb *EVP_PKEY_CTX_get_cb(EVP_PKEY_CTX *ctx); int EVP_PKEY_CTX_get_keygen_info(EVP_PKEY_CTX *ctx, int idx); void EVP_PKEY_meth_set_init(EVP_PKEY_METHOD *pmeth, int (*init) (EVP_PKEY_CTX *ctx)); void EVP_PKEY_meth_set_copy(EVP_PKEY_METHOD *pmeth, int (*copy) (EVP_PKEY_CTX *dst, EVP_PKEY_CTX *src)); void EVP_PKEY_meth_set_cleanup(EVP_PKEY_METHOD *pmeth, void (*cleanup) 
(EVP_PKEY_CTX *ctx)); void EVP_PKEY_meth_set_paramgen(EVP_PKEY_METHOD *pmeth, int (*paramgen_init) (EVP_PKEY_CTX *ctx), int (*paramgen) (EVP_PKEY_CTX *ctx, EVP_PKEY *pkey)); void EVP_PKEY_meth_set_keygen(EVP_PKEY_METHOD *pmeth, int (*keygen_init) (EVP_PKEY_CTX *ctx), int (*keygen) (EVP_PKEY_CTX *ctx, EVP_PKEY *pkey)); void EVP_PKEY_meth_set_sign(EVP_PKEY_METHOD *pmeth, int (*sign_init) (EVP_PKEY_CTX *ctx), int (*sign) (EVP_PKEY_CTX *ctx, unsigned char *sig, size_t *siglen, const unsigned char *tbs, size_t tbslen)); void EVP_PKEY_meth_set_verify(EVP_PKEY_METHOD *pmeth, int (*verify_init) (EVP_PKEY_CTX *ctx), int (*verify) (EVP_PKEY_CTX *ctx, const unsigned char *sig, size_t siglen, const unsigned char *tbs, size_t tbslen)); void EVP_PKEY_meth_set_verify_recover(EVP_PKEY_METHOD *pmeth, int (*verify_recover_init) (EVP_PKEY_CTX *ctx), int (*verify_recover) (EVP_PKEY_CTX *ctx, unsigned char *sig, size_t *siglen, const unsigned char *tbs, size_t tbslen)); void EVP_PKEY_meth_set_signctx(EVP_PKEY_METHOD *pmeth, int (*signctx_init) (EVP_PKEY_CTX *ctx, EVP_MD_CTX *mctx), int (*signctx) (EVP_PKEY_CTX *ctx, unsigned char *sig, size_t *siglen, EVP_MD_CTX *mctx)); void EVP_PKEY_meth_set_verifyctx(EVP_PKEY_METHOD *pmeth, int (*verifyctx_init) (EVP_PKEY_CTX *ctx, EVP_MD_CTX *mctx), int (*verifyctx) (EVP_PKEY_CTX *ctx, const unsigned char *sig, int siglen, EVP_MD_CTX *mctx)); void EVP_PKEY_meth_set_encrypt(EVP_PKEY_METHOD *pmeth, int (*encrypt_init) (EVP_PKEY_CTX *ctx), int (*encryptfn) (EVP_PKEY_CTX *ctx, unsigned char *out, size_t *outlen, const unsigned char *in, size_t inlen)); void EVP_PKEY_meth_set_decrypt(EVP_PKEY_METHOD *pmeth, int (*decrypt_init) (EVP_PKEY_CTX *ctx), int (*decrypt) (EVP_PKEY_CTX *ctx, unsigned char *out, size_t *outlen, const unsigned char *in, size_t inlen)); void EVP_PKEY_meth_set_derive(EVP_PKEY_METHOD *pmeth, int (*derive_init) (EVP_PKEY_CTX *ctx), int (*derive) (EVP_PKEY_CTX *ctx, unsigned char *key, size_t *keylen)); void 
EVP_PKEY_meth_set_ctrl(EVP_PKEY_METHOD *pmeth, int (*ctrl) (EVP_PKEY_CTX *ctx, int type, int p1, void *p2), int (*ctrl_str) (EVP_PKEY_CTX *ctx, const char *type, const char *value)); void EVP_add_alg_module(void); void ERR_load_EVP_strings(void); # 74 "/usr/include/openssl/x509.h" 2 3 4 # 83 "/usr/include/openssl/x509.h" 3 4 # 1 "/usr/include/openssl/ec.h" 1 3 4 # 79 "/usr/include/openssl/ec.h" 3 4 # 1 "/usr/include/openssl/opensslconf.h" 1 3 4 # 80 "/usr/include/openssl/ec.h" 2 3 4 # 105 "/usr/include/openssl/ec.h" 3 4 typedef enum { POINT_CONVERSION_COMPRESSED = 2, POINT_CONVERSION_UNCOMPRESSED = 4, POINT_CONVERSION_HYBRID = 6 } point_conversion_form_t; typedef struct ec_method_st EC_METHOD; typedef struct ec_group_st # 127 "/usr/include/openssl/ec.h" 3 4 EC_GROUP; typedef struct ec_point_st EC_POINT; # 139 "/usr/include/openssl/ec.h" 3 4 const EC_METHOD *EC_GFp_simple_method(void); const EC_METHOD *EC_GFp_mont_method(void); const EC_METHOD *EC_GFp_nist_method(void); const EC_METHOD *EC_GFp_nistp224_method(void); const EC_METHOD *EC_GFp_nistp256_method(void); const EC_METHOD *EC_GFp_nistp521_method(void); # 176 "/usr/include/openssl/ec.h" 3 4 const EC_METHOD *EC_GF2m_simple_method(void); # 188 "/usr/include/openssl/ec.h" 3 4 EC_GROUP *EC_GROUP_new(const EC_METHOD *meth); void EC_GROUP_free(EC_GROUP *group); void EC_GROUP_clear_free(EC_GROUP *group); int EC_GROUP_copy(EC_GROUP *dst, const EC_GROUP *src); EC_GROUP *EC_GROUP_dup(const EC_GROUP *src); const EC_METHOD *EC_GROUP_method_of(const EC_GROUP *group); int EC_METHOD_get_field_type(const EC_METHOD *meth); # 234 "/usr/include/openssl/ec.h" 3 4 int EC_GROUP_set_generator(EC_GROUP *group, const EC_POINT *generator, const BIGNUM *order, const BIGNUM *cofactor); const EC_POINT *EC_GROUP_get0_generator(const EC_GROUP *group); BN_MONT_CTX *EC_GROUP_get_mont_data(const EC_GROUP *group); int EC_GROUP_get_order(const EC_GROUP *group, BIGNUM *order, BN_CTX *ctx); int EC_GROUP_get_cofactor(const EC_GROUP *group, BIGNUM 
*cofactor, BN_CTX *ctx); void EC_GROUP_set_curve_name(EC_GROUP *group, int nid); int EC_GROUP_get_curve_name(const EC_GROUP *group); void EC_GROUP_set_asn1_flag(EC_GROUP *group, int flag); int EC_GROUP_get_asn1_flag(const EC_GROUP *group); void EC_GROUP_set_point_conversion_form(EC_GROUP *group, point_conversion_form_t form); point_conversion_form_t EC_GROUP_get_point_conversion_form(const EC_GROUP *); unsigned char *EC_GROUP_get0_seed(const EC_GROUP *x); size_t EC_GROUP_get_seed_len(const EC_GROUP *); size_t EC_GROUP_set_seed(EC_GROUP *, const unsigned char *, size_t len); # 297 "/usr/include/openssl/ec.h" 3 4 int EC_GROUP_set_curve_GFp(EC_GROUP *group, const BIGNUM *p, const BIGNUM *a, const BIGNUM *b, BN_CTX *ctx); # 308 "/usr/include/openssl/ec.h" 3 4 int EC_GROUP_get_curve_GFp(const EC_GROUP *group, BIGNUM *p, BIGNUM *a, BIGNUM *b, BN_CTX *ctx); # 320 "/usr/include/openssl/ec.h" 3 4 int EC_GROUP_set_curve_GF2m(EC_GROUP *group, const BIGNUM *p, const BIGNUM *a, const BIGNUM *b, BN_CTX *ctx); # 331 "/usr/include/openssl/ec.h" 3 4 int EC_GROUP_get_curve_GF2m(const EC_GROUP *group, BIGNUM *p, BIGNUM *a, BIGNUM *b, BN_CTX *ctx); int EC_GROUP_get_degree(const EC_GROUP *group); int EC_GROUP_check(const EC_GROUP *group, BN_CTX *ctx); int EC_GROUP_check_discriminant(const EC_GROUP *group, BN_CTX *ctx); int EC_GROUP_cmp(const EC_GROUP *a, const EC_GROUP *b, BN_CTX *ctx); # 375 "/usr/include/openssl/ec.h" 3 4 EC_GROUP *EC_GROUP_new_curve_GFp(const BIGNUM *p, const BIGNUM *a, const BIGNUM *b, BN_CTX *ctx); # 386 "/usr/include/openssl/ec.h" 3 4 EC_GROUP *EC_GROUP_new_curve_GF2m(const BIGNUM *p, const BIGNUM *a, const BIGNUM *b, BN_CTX *ctx); EC_GROUP *EC_GROUP_new_by_curve_name(int nid); typedef struct { int nid; const char *comment; } EC_builtin_curve; size_t EC_get_builtin_curves(EC_builtin_curve *r, size_t nitems); const char *EC_curve_nid2nist(int nid); int EC_curve_nist2nid(const char *name); # 424 "/usr/include/openssl/ec.h" 3 4 EC_POINT *EC_POINT_new(const EC_GROUP 
*group); void EC_POINT_free(EC_POINT *point); void EC_POINT_clear_free(EC_POINT *point); int EC_POINT_copy(EC_POINT *dst, const EC_POINT *src); EC_POINT *EC_POINT_dup(const EC_POINT *src, const EC_GROUP *group); const EC_METHOD *EC_POINT_method_of(const EC_POINT *point); int EC_POINT_set_to_infinity(const EC_GROUP *group, EC_POINT *point); # 473 "/usr/include/openssl/ec.h" 3 4 int EC_POINT_set_Jprojective_coordinates_GFp(const EC_GROUP *group, EC_POINT *p, const BIGNUM *x, const BIGNUM *y, const BIGNUM *z, BN_CTX *ctx); # 487 "/usr/include/openssl/ec.h" 3 4 int EC_POINT_get_Jprojective_coordinates_GFp(const EC_GROUP *group, const EC_POINT *p, BIGNUM *x, BIGNUM *y, BIGNUM *z, BN_CTX *ctx); # 500 "/usr/include/openssl/ec.h" 3 4 int EC_POINT_set_affine_coordinates_GFp(const EC_GROUP *group, EC_POINT *p, const BIGNUM *x, const BIGNUM *y, BN_CTX *ctx); # 512 "/usr/include/openssl/ec.h" 3 4 int EC_POINT_get_affine_coordinates_GFp(const EC_GROUP *group, const EC_POINT *p, BIGNUM *x, BIGNUM *y, BN_CTX *ctx); # 524 "/usr/include/openssl/ec.h" 3 4 int EC_POINT_set_compressed_coordinates_GFp(const EC_GROUP *group, EC_POINT *p, const BIGNUM *x, int y_bit, BN_CTX *ctx); # 536 "/usr/include/openssl/ec.h" 3 4 int EC_POINT_set_affine_coordinates_GF2m(const EC_GROUP *group, EC_POINT *p, const BIGNUM *x, const BIGNUM *y, BN_CTX *ctx); # 548 "/usr/include/openssl/ec.h" 3 4 int EC_POINT_get_affine_coordinates_GF2m(const EC_GROUP *group, const EC_POINT *p, BIGNUM *x, BIGNUM *y, BN_CTX *ctx); # 560 "/usr/include/openssl/ec.h" 3 4 int EC_POINT_set_compressed_coordinates_GF2m(const EC_GROUP *group, EC_POINT *p, const BIGNUM *x, int y_bit, BN_CTX *ctx); # 574 "/usr/include/openssl/ec.h" 3 4 size_t EC_POINT_point2oct(const EC_GROUP *group, const EC_POINT *p, point_conversion_form_t form, unsigned char *buf, size_t len, BN_CTX *ctx); # 586 "/usr/include/openssl/ec.h" 3 4 int EC_POINT_oct2point(const EC_GROUP *group, EC_POINT *p, const unsigned char *buf, size_t len, BN_CTX *ctx); BIGNUM 
*EC_POINT_point2bn(const EC_GROUP *, const EC_POINT *, point_conversion_form_t form, BIGNUM *, BN_CTX *); EC_POINT *EC_POINT_bn2point(const EC_GROUP *, const BIGNUM *, EC_POINT *, BN_CTX *); char *EC_POINT_point2hex(const EC_GROUP *, const EC_POINT *, point_conversion_form_t form, BN_CTX *); EC_POINT *EC_POINT_hex2point(const EC_GROUP *, const char *, EC_POINT *, BN_CTX *); # 611 "/usr/include/openssl/ec.h" 3 4 int EC_POINT_add(const EC_GROUP *group, EC_POINT *r, const EC_POINT *a, const EC_POINT *b, BN_CTX *ctx); # 621 "/usr/include/openssl/ec.h" 3 4 int EC_POINT_dbl(const EC_GROUP *group, EC_POINT *r, const EC_POINT *a, BN_CTX *ctx); int EC_POINT_invert(const EC_GROUP *group, EC_POINT *a, BN_CTX *ctx); int EC_POINT_is_at_infinity(const EC_GROUP *group, const EC_POINT *p); int EC_POINT_is_on_curve(const EC_GROUP *group, const EC_POINT *point, BN_CTX *ctx); # 655 "/usr/include/openssl/ec.h" 3 4 int EC_POINT_cmp(const EC_GROUP *group, const EC_POINT *a, const EC_POINT *b, BN_CTX *ctx); int EC_POINT_make_affine(const EC_GROUP *group, EC_POINT *point, BN_CTX *ctx); int EC_POINTs_make_affine(const EC_GROUP *group, size_t num, EC_POINT *points[], BN_CTX *ctx); # 672 "/usr/include/openssl/ec.h" 3 4 int EC_POINTs_mul(const EC_GROUP *group, EC_POINT *r, const BIGNUM *n, size_t num, const EC_POINT *p[], const BIGNUM *m[], BN_CTX *ctx); # 685 "/usr/include/openssl/ec.h" 3 4 int EC_POINT_mul(const EC_GROUP *group, EC_POINT *r, const BIGNUM *n, const EC_POINT *q, const BIGNUM *m, BN_CTX *ctx); int EC_GROUP_precompute_mult(EC_GROUP *group, BN_CTX *ctx); int EC_GROUP_have_precompute_mult(const EC_GROUP *group); # 709 "/usr/include/openssl/ec.h" 3 4 int EC_GROUP_get_basis_type(const EC_GROUP *); int EC_GROUP_get_trinomial_basis(const EC_GROUP *, unsigned int *k); int EC_GROUP_get_pentanomial_basis(const EC_GROUP *, unsigned int *k1, unsigned int *k2, unsigned int *k3); typedef struct ecpk_parameters_st ECPKPARAMETERS; EC_GROUP *d2i_ECPKParameters(EC_GROUP **, const unsigned char 
**in, long len); int i2d_ECPKParameters(const EC_GROUP *, unsigned char **out); # 731 "/usr/include/openssl/ec.h" 3 4 int ECPKParameters_print(BIO *bp, const EC_GROUP *x, int off); int ECPKParameters_print_fp(FILE *fp, const EC_GROUP *x, int off); typedef struct ec_key_st EC_KEY; # 754 "/usr/include/openssl/ec.h" 3 4 EC_KEY *EC_KEY_new(void); int EC_KEY_get_flags(const EC_KEY *key); void EC_KEY_set_flags(EC_KEY *key, int flags); void EC_KEY_clear_flags(EC_KEY *key, int flags); EC_KEY *EC_KEY_new_by_curve_name(int nid); void EC_KEY_free(EC_KEY *key); EC_KEY *EC_KEY_copy(EC_KEY *dst, const EC_KEY *src); EC_KEY *EC_KEY_dup(const EC_KEY *src); int EC_KEY_up_ref(EC_KEY *key); const EC_GROUP *EC_KEY_get0_group(const EC_KEY *key); int EC_KEY_set_group(EC_KEY *key, const EC_GROUP *group); const BIGNUM *EC_KEY_get0_private_key(const EC_KEY *key); int EC_KEY_set_private_key(EC_KEY *key, const BIGNUM *prv); const EC_POINT *EC_KEY_get0_public_key(const EC_KEY *key); int EC_KEY_set_public_key(EC_KEY *key, const EC_POINT *pub); unsigned EC_KEY_get_enc_flags(const EC_KEY *key); void EC_KEY_set_enc_flags(EC_KEY *eckey, unsigned int flags); point_conversion_form_t EC_KEY_get_conv_form(const EC_KEY *key); void EC_KEY_set_conv_form(EC_KEY *eckey, point_conversion_form_t cform); void *EC_KEY_get_key_method_data(EC_KEY *key, void *(*dup_func) (void *), void (*free_func) (void *), void (*clear_free_func) (void *)); # 852 "/usr/include/openssl/ec.h" 3 4 void *EC_KEY_insert_key_method_data(EC_KEY *key, void *data, void *(*dup_func) (void *), void (*free_func) (void *), void (*clear_free_func) (void *)); void EC_KEY_set_asn1_flag(EC_KEY *eckey, int asn1_flag); int EC_KEY_precompute_mult(EC_KEY *key, BN_CTX *ctx); int EC_KEY_generate_key(EC_KEY *key); int EC_KEY_check_key(const EC_KEY *key); # 886 "/usr/include/openssl/ec.h" 3 4 int EC_KEY_set_public_key_affine_coordinates(EC_KEY *key, BIGNUM *x, BIGNUM *y); # 899 "/usr/include/openssl/ec.h" 3 4 EC_KEY *d2i_ECPrivateKey(EC_KEY **key, const 
unsigned char **in, long len); int i2d_ECPrivateKey(EC_KEY *key, unsigned char **out); # 920 "/usr/include/openssl/ec.h" 3 4 EC_KEY *d2i_ECParameters(EC_KEY **key, const unsigned char **in, long len); int i2d_ECParameters(EC_KEY *key, unsigned char **out); # 942 "/usr/include/openssl/ec.h" 3 4 EC_KEY *o2i_ECPublicKey(EC_KEY **key, const unsigned char **in, long len); int i2o_ECPublicKey(EC_KEY *key, unsigned char **out); int ECParameters_print(BIO *bp, const EC_KEY *key); int EC_KEY_print(BIO *bp, const EC_KEY *key, int off); # 975 "/usr/include/openssl/ec.h" 3 4 int ECParameters_print_fp(FILE *fp, const EC_KEY *key); int EC_KEY_print_fp(FILE *fp, const EC_KEY *key, int off); # 1076 "/usr/include/openssl/ec.h" 3 4 void ERR_load_EC_strings(void); # 84 "/usr/include/openssl/x509.h" 2 3 4 # 1 "/usr/include/openssl/ecdsa.h" 1 3 4 # 62 "/usr/include/openssl/ecdsa.h" 3 4 # 1 "/usr/include/openssl/opensslconf.h" 1 3 4 # 63 "/usr/include/openssl/ecdsa.h" 2 3 4 # 78 "/usr/include/openssl/ecdsa.h" 3 4 typedef struct ECDSA_SIG_st { BIGNUM *r; BIGNUM *s; } ECDSA_SIG; ECDSA_SIG *ECDSA_SIG_new(void); void ECDSA_SIG_free(ECDSA_SIG *sig); int i2d_ECDSA_SIG(const ECDSA_SIG *sig, unsigned char **pp); # 108 "/usr/include/openssl/ecdsa.h" 3 4 ECDSA_SIG *d2i_ECDSA_SIG(ECDSA_SIG **sig, const unsigned char **pp, long len); # 117 "/usr/include/openssl/ecdsa.h" 3 4 ECDSA_SIG *ECDSA_do_sign(const unsigned char *dgst, int dgst_len, EC_KEY *eckey); # 130 "/usr/include/openssl/ecdsa.h" 3 4 ECDSA_SIG *ECDSA_do_sign_ex(const unsigned char *dgst, int dgstlen, const BIGNUM *kinv, const BIGNUM *rp, EC_KEY *eckey); # 143 "/usr/include/openssl/ecdsa.h" 3 4 int ECDSA_do_verify(const unsigned char *dgst, int dgst_len, const ECDSA_SIG *sig, EC_KEY *eckey); const ECDSA_METHOD *ECDSA_OpenSSL(void); void ECDSA_set_default_method(const ECDSA_METHOD *meth); const ECDSA_METHOD *ECDSA_get_default_method(void); int ECDSA_set_method(EC_KEY *eckey, const ECDSA_METHOD *meth); int ECDSA_size(const EC_KEY *eckey); # 
178 "/usr/include/openssl/ecdsa.h" 3 4 int ECDSA_sign_setup(EC_KEY *eckey, BN_CTX *ctx, BIGNUM **kinv, BIGNUM **rp); # 190 "/usr/include/openssl/ecdsa.h" 3 4 int ECDSA_sign(int type, const unsigned char *dgst, int dgstlen, unsigned char *sig, unsigned int *siglen, EC_KEY *eckey); # 206 "/usr/include/openssl/ecdsa.h" 3 4 int ECDSA_sign_ex(int type, const unsigned char *dgst, int dgstlen, unsigned char *sig, unsigned int *siglen, const BIGNUM *kinv, const BIGNUM *rp, EC_KEY *eckey); # 221 "/usr/include/openssl/ecdsa.h" 3 4 int ECDSA_verify(int type, const unsigned char *dgst, int dgstlen, const unsigned char *sig, int siglen, EC_KEY *eckey); int ECDSA_get_ex_new_index(long argl, void *argp, CRYPTO_EX_new *new_func, CRYPTO_EX_dup *dup_func, CRYPTO_EX_free *free_func); int ECDSA_set_ex_data(EC_KEY *d, int idx, void *arg); void *ECDSA_get_ex_data(EC_KEY *d, int idx); ECDSA_METHOD *ECDSA_METHOD_new(const ECDSA_METHOD *ecdsa_method); void ECDSA_METHOD_free(ECDSA_METHOD *ecdsa_method); void ECDSA_METHOD_set_app_data(ECDSA_METHOD *ecdsa_method, void *app); void *ECDSA_METHOD_get_app_data(ECDSA_METHOD *ecdsa_method); void ECDSA_METHOD_set_sign(ECDSA_METHOD *ecdsa_method, ECDSA_SIG *(*ecdsa_do_sign) (const unsigned char *dgst, int dgst_len, const BIGNUM *inv, const BIGNUM *rp, EC_KEY *eckey)); void ECDSA_METHOD_set_sign_setup(ECDSA_METHOD *ecdsa_method, int (*ecdsa_sign_setup) (EC_KEY *eckey, BN_CTX *ctx, BIGNUM **kinv, BIGNUM **r)); void ECDSA_METHOD_set_verify(ECDSA_METHOD *ecdsa_method, int (*ecdsa_do_verify) (const unsigned char *dgst, int dgst_len, const ECDSA_SIG *sig, EC_KEY *eckey)); void ECDSA_METHOD_set_flags(ECDSA_METHOD *ecdsa_method, int flags); void ECDSA_METHOD_set_name(ECDSA_METHOD *ecdsa_method, char *name); # 310 "/usr/include/openssl/ecdsa.h" 3 4 void ERR_load_ECDSA_strings(void); # 88 "/usr/include/openssl/x509.h" 2 3 4 # 1 "/usr/include/openssl/ecdh.h" 1 3 4 # 72 "/usr/include/openssl/ecdh.h" 3 4 # 1 "/usr/include/openssl/opensslconf.h" 1 3 4 # 73 
"/usr/include/openssl/ecdh.h" 2 3 4 # 90 "/usr/include/openssl/ecdh.h" 3 4 const ECDH_METHOD *ECDH_OpenSSL(void); void ECDH_set_default_method(const ECDH_METHOD *); const ECDH_METHOD *ECDH_get_default_method(void); int ECDH_set_method(EC_KEY *, const ECDH_METHOD *); int ECDH_compute_key(void *out, size_t outlen, const EC_POINT *pub_key, EC_KEY *ecdh, void *(*KDF) (const void *in, size_t inlen, void *out, size_t *outlen)); int ECDH_get_ex_new_index(long argl, void *argp, CRYPTO_EX_new *new_func, CRYPTO_EX_dup *dup_func, CRYPTO_EX_free *free_func); int ECDH_set_ex_data(EC_KEY *d, int idx, void *arg); void *ECDH_get_ex_data(EC_KEY *d, int idx); int ECDH_KDF_X9_62(unsigned char *out, size_t outlen, const unsigned char *Z, size_t Zlen, const unsigned char *sinfo, size_t sinfolen, const EVP_MD *md); void ERR_load_ECDH_strings(void); # 92 "/usr/include/openssl/x509.h" 2 3 4 # 1 "/usr/include/openssl/rsa.h" 1 3 4 # 85 "/usr/include/openssl/rsa.h" 3 4 struct rsa_meth_st { const char *name; int (*rsa_pub_enc) (int flen, const unsigned char *from, unsigned char *to, RSA *rsa, int padding); int (*rsa_pub_dec) (int flen, const unsigned char *from, unsigned char *to, RSA *rsa, int padding); int (*rsa_priv_enc) (int flen, const unsigned char *from, unsigned char *to, RSA *rsa, int padding); int (*rsa_priv_dec) (int flen, const unsigned char *from, unsigned char *to, RSA *rsa, int padding); int (*rsa_mod_exp) (BIGNUM *r0, const BIGNUM *I, RSA *rsa, BN_CTX *ctx); int (*bn_mod_exp) (BIGNUM *r, const BIGNUM *a, const BIGNUM *p, const BIGNUM *m, BN_CTX *ctx, BN_MONT_CTX *m_ctx); int (*init) (RSA *rsa); int (*finish) (RSA *rsa); int flags; char *app_data; # 116 "/usr/include/openssl/rsa.h" 3 4 int (*rsa_sign) (int type, const unsigned char *m, unsigned int m_length, unsigned char *sigret, unsigned int *siglen, const RSA *rsa); int (*rsa_verify) (int dtype, const unsigned char *m, unsigned int m_length, const unsigned char *sigbuf, unsigned int siglen, const RSA *rsa); int (*rsa_keygen) 
(RSA *rsa, int bits, BIGNUM *e, BN_GENCB *cb); }; struct rsa_st { int pad; long version; const RSA_METHOD *meth; ENGINE *engine; BIGNUM *n; BIGNUM *e; BIGNUM *d; BIGNUM *p; BIGNUM *q; BIGNUM *dmp1; BIGNUM *dmq1; BIGNUM *iqmp; CRYPTO_EX_DATA ex_data; int references; int flags; BN_MONT_CTX *_method_mod_n; BN_MONT_CTX *_method_mod_p; BN_MONT_CTX *_method_mod_q; char *bignum_data; BN_BLINDING *blinding; BN_BLINDING *mt_blinding; }; # 320 "/usr/include/openssl/rsa.h" 3 4 RSA *RSA_new(void); RSA *RSA_new_method(ENGINE *engine); int RSA_size(const RSA *rsa); RSA *RSA_generate_key(int bits, unsigned long e, void (*callback) (int, int, void *), void *cb_arg); int RSA_generate_key_ex(RSA *rsa, int bits, BIGNUM *e, BN_GENCB *cb); int RSA_check_key(const RSA *); int RSA_public_encrypt(int flen, const unsigned char *from, unsigned char *to, RSA *rsa, int padding); int RSA_private_encrypt(int flen, const unsigned char *from, unsigned char *to, RSA *rsa, int padding); int RSA_public_decrypt(int flen, const unsigned char *from, unsigned char *to, RSA *rsa, int padding); int RSA_private_decrypt(int flen, const unsigned char *from, unsigned char *to, RSA *rsa, int padding); void RSA_free(RSA *r); int RSA_up_ref(RSA *r); int RSA_flags(const RSA *r); void RSA_set_default_method(const RSA_METHOD *meth); const RSA_METHOD *RSA_get_default_method(void); const RSA_METHOD *RSA_get_method(const RSA *rsa); int RSA_set_method(RSA *rsa, const RSA_METHOD *meth); int RSA_memory_lock(RSA *r); const RSA_METHOD *RSA_PKCS1_SSLeay(void); const RSA_METHOD *RSA_null_method(void); RSA *d2i_RSAPublicKey(RSA **a, const unsigned char **in, long len); int i2d_RSAPublicKey(const RSA *a, unsigned char **out); extern const ASN1_ITEM RSAPublicKey_it; RSA *d2i_RSAPrivateKey(RSA **a, const unsigned char **in, long len); int i2d_RSAPrivateKey(const RSA *a, unsigned char **out); extern const ASN1_ITEM RSAPrivateKey_it; typedef struct rsa_pss_params_st { X509_ALGOR *hashAlgorithm; X509_ALGOR *maskGenAlgorithm; 
ASN1_INTEGER *saltLength; ASN1_INTEGER *trailerField; } RSA_PSS_PARAMS; RSA_PSS_PARAMS *RSA_PSS_PARAMS_new(void); void RSA_PSS_PARAMS_free(RSA_PSS_PARAMS *a); RSA_PSS_PARAMS *d2i_RSA_PSS_PARAMS(RSA_PSS_PARAMS **a, const unsigned char **in, long len); int i2d_RSA_PSS_PARAMS(RSA_PSS_PARAMS *a, unsigned char **out); extern const ASN1_ITEM RSA_PSS_PARAMS_it; typedef struct rsa_oaep_params_st { X509_ALGOR *hashFunc; X509_ALGOR *maskGenFunc; X509_ALGOR *pSourceFunc; } RSA_OAEP_PARAMS; RSA_OAEP_PARAMS *RSA_OAEP_PARAMS_new(void); void RSA_OAEP_PARAMS_free(RSA_OAEP_PARAMS *a); RSA_OAEP_PARAMS *d2i_RSA_OAEP_PARAMS(RSA_OAEP_PARAMS **a, const unsigned char **in, long len); int i2d_RSA_OAEP_PARAMS(RSA_OAEP_PARAMS *a, unsigned char **out); extern const ASN1_ITEM RSA_OAEP_PARAMS_it; int RSA_print_fp(FILE *fp, const RSA *r, int offset); int RSA_print(BIO *bp, const RSA *r, int offset); int i2d_RSA_NET(const RSA *a, unsigned char **pp, int (*cb) (char *buf, int len, const char *prompt, int verify), int sgckey); RSA *d2i_RSA_NET(RSA **a, const unsigned char **pp, long length, int (*cb) (char *buf, int len, const char *prompt, int verify), int sgckey); int i2d_Netscape_RSA(const RSA *a, unsigned char **pp, int (*cb) (char *buf, int len, const char *prompt, int verify)); RSA *d2i_Netscape_RSA(RSA **a, const unsigned char **pp, long length, int (*cb) (char *buf, int len, const char *prompt, int verify)); int RSA_sign(int type, const unsigned char *m, unsigned int m_length, unsigned char *sigret, unsigned int *siglen, RSA *rsa); int RSA_verify(int type, const unsigned char *m, unsigned int m_length, const unsigned char *sigbuf, unsigned int siglen, RSA *rsa); int RSA_sign_ASN1_OCTET_STRING(int type, const unsigned char *m, unsigned int m_length, unsigned char *sigret, unsigned int *siglen, RSA *rsa); int RSA_verify_ASN1_OCTET_STRING(int type, const unsigned char *m, unsigned int m_length, unsigned char *sigbuf, unsigned int siglen, RSA *rsa); int RSA_blinding_on(RSA *rsa, BN_CTX *ctx); 
void RSA_blinding_off(RSA *rsa); BN_BLINDING *RSA_setup_blinding(RSA *rsa, BN_CTX *ctx); int RSA_padding_add_PKCS1_type_1(unsigned char *to, int tlen, const unsigned char *f, int fl); int RSA_padding_check_PKCS1_type_1(unsigned char *to, int tlen, const unsigned char *f, int fl, int rsa_len); int RSA_padding_add_PKCS1_type_2(unsigned char *to, int tlen, const unsigned char *f, int fl); int RSA_padding_check_PKCS1_type_2(unsigned char *to, int tlen, const unsigned char *f, int fl, int rsa_len); int PKCS1_MGF1(unsigned char *mask, long len, const unsigned char *seed, long seedlen, const EVP_MD *dgst); int RSA_padding_add_PKCS1_OAEP(unsigned char *to, int tlen, const unsigned char *f, int fl, const unsigned char *p, int pl); int RSA_padding_check_PKCS1_OAEP(unsigned char *to, int tlen, const unsigned char *f, int fl, int rsa_len, const unsigned char *p, int pl); int RSA_padding_add_PKCS1_OAEP_mgf1(unsigned char *to, int tlen, const unsigned char *from, int flen, const unsigned char *param, int plen, const EVP_MD *md, const EVP_MD *mgf1md); int RSA_padding_check_PKCS1_OAEP_mgf1(unsigned char *to, int tlen, const unsigned char *from, int flen, int num, const unsigned char *param, int plen, const EVP_MD *md, const EVP_MD *mgf1md); int RSA_padding_add_SSLv23(unsigned char *to, int tlen, const unsigned char *f, int fl); int RSA_padding_check_SSLv23(unsigned char *to, int tlen, const unsigned char *f, int fl, int rsa_len); int RSA_padding_add_none(unsigned char *to, int tlen, const unsigned char *f, int fl); int RSA_padding_check_none(unsigned char *to, int tlen, const unsigned char *f, int fl, int rsa_len); int RSA_padding_add_X931(unsigned char *to, int tlen, const unsigned char *f, int fl); int RSA_padding_check_X931(unsigned char *to, int tlen, const unsigned char *f, int fl, int rsa_len); int RSA_X931_hash_id(int nid); int RSA_verify_PKCS1_PSS(RSA *rsa, const unsigned char *mHash, const EVP_MD *Hash, const unsigned char *EM, int sLen); int RSA_padding_add_PKCS1_PSS(RSA 
*rsa, unsigned char *EM, const unsigned char *mHash, const EVP_MD *Hash, int sLen); int RSA_verify_PKCS1_PSS_mgf1(RSA *rsa, const unsigned char *mHash, const EVP_MD *Hash, const EVP_MD *mgf1Hash, const unsigned char *EM, int sLen); int RSA_padding_add_PKCS1_PSS_mgf1(RSA *rsa, unsigned char *EM, const unsigned char *mHash, const EVP_MD *Hash, const EVP_MD *mgf1Hash, int sLen); int RSA_get_ex_new_index(long argl, void *argp, CRYPTO_EX_new *new_func, CRYPTO_EX_dup *dup_func, CRYPTO_EX_free *free_func); int RSA_set_ex_data(RSA *r, int idx, void *arg); void *RSA_get_ex_data(const RSA *r, int idx); RSA *RSAPublicKey_dup(RSA *rsa); RSA *RSAPrivateKey_dup(RSA *rsa); # 523 "/usr/include/openssl/rsa.h" 3 4 void ERR_load_RSA_strings(void); # 97 "/usr/include/openssl/x509.h" 2 3 4 # 1 "/usr/include/openssl/dsa.h" 1 3 4 # 68 "/usr/include/openssl/dsa.h" 3 4 # 1 "/usr/include/openssl/e_os2.h" 1 3 4 # 56 "/usr/include/openssl/e_os2.h" 3 4 # 1 "/usr/include/openssl/opensslconf.h" 1 3 4 # 57 "/usr/include/openssl/e_os2.h" 2 3 4 # 69 "/usr/include/openssl/dsa.h" 2 3 4 # 83 "/usr/include/openssl/dsa.h" 3 4 # 1 "/usr/include/openssl/dh.h" 1 3 4 # 62 "/usr/include/openssl/dh.h" 3 4 # 1 "/usr/include/openssl/e_os2.h" 1 3 4 # 56 "/usr/include/openssl/e_os2.h" 3 4 # 1 "/usr/include/openssl/opensslconf.h" 1 3 4 # 57 "/usr/include/openssl/e_os2.h" 2 3 4 # 63 "/usr/include/openssl/dh.h" 2 3 4 # 117 "/usr/include/openssl/dh.h" 3 4 struct dh_method { const char *name; int (*generate_key) (DH *dh); int (*compute_key) (unsigned char *key, const BIGNUM *pub_key, DH *dh); int (*bn_mod_exp) (const DH *dh, BIGNUM *r, const BIGNUM *a, const BIGNUM *p, const BIGNUM *m, BN_CTX *ctx, BN_MONT_CTX *m_ctx); int (*init) (DH *dh); int (*finish) (DH *dh); int flags; char *app_data; int (*generate_params) (DH *dh, int prime_len, int generator, BN_GENCB *cb); }; struct dh_st { int pad; int version; BIGNUM *p; BIGNUM *g; long length; BIGNUM *pub_key; BIGNUM *priv_key; int flags; BN_MONT_CTX *method_mont_p; 
BIGNUM *q; BIGNUM *j; unsigned char *seed; int seedlen; BIGNUM *counter; int references; CRYPTO_EX_DATA ex_data; const DH_METHOD *meth; ENGINE *engine; }; # 192 "/usr/include/openssl/dh.h" 3 4 DH *DHparams_dup(DH *); const DH_METHOD *DH_OpenSSL(void); void DH_set_default_method(const DH_METHOD *meth); const DH_METHOD *DH_get_default_method(void); int DH_set_method(DH *dh, const DH_METHOD *meth); DH *DH_new_method(ENGINE *engine); DH *DH_new(void); void DH_free(DH *dh); int DH_up_ref(DH *dh); int DH_size(const DH *dh); int DH_get_ex_new_index(long argl, void *argp, CRYPTO_EX_new *new_func, CRYPTO_EX_dup *dup_func, CRYPTO_EX_free *free_func); int DH_set_ex_data(DH *d, int idx, void *arg); void *DH_get_ex_data(DH *d, int idx); DH *DH_generate_parameters(int prime_len, int generator, void (*callback) (int, int, void *), void *cb_arg); int DH_generate_parameters_ex(DH *dh, int prime_len, int generator, BN_GENCB *cb); int DH_check(const DH *dh, int *codes); int DH_check_pub_key(const DH *dh, const BIGNUM *pub_key, int *codes); int DH_generate_key(DH *dh); int DH_compute_key(unsigned char *key, const BIGNUM *pub_key, DH *dh); int DH_compute_key_padded(unsigned char *key, const BIGNUM *pub_key, DH *dh); DH *d2i_DHparams(DH **a, const unsigned char **pp, long length); int i2d_DHparams(const DH *a, unsigned char **pp); DH *d2i_DHxparams(DH **a, const unsigned char **pp, long length); int i2d_DHxparams(const DH *a, unsigned char **pp); int DHparams_print_fp(FILE *fp, const DH *x); int DHparams_print(BIO *bp, const DH *x); DH *DH_get_1024_160(void); DH *DH_get_2048_224(void); DH *DH_get_2048_256(void); int DH_KDF_X9_42(unsigned char *out, size_t outlen, const unsigned char *Z, size_t Zlen, ASN1_OBJECT *key_oid, const unsigned char *ukm, size_t ukmlen, const EVP_MD *md); # 347 "/usr/include/openssl/dh.h" 3 4 void ERR_load_DH_strings(void); # 84 "/usr/include/openssl/dsa.h" 2 3 4 # 124 "/usr/include/openssl/dsa.h" 3 4 typedef struct DSA_SIG_st { BIGNUM *r; BIGNUM *s; } DSA_SIG; 
struct dsa_method { const char *name; DSA_SIG *(*dsa_do_sign) (const unsigned char *dgst, int dlen, DSA *dsa); int (*dsa_sign_setup) (DSA *dsa, BN_CTX *ctx_in, BIGNUM **kinvp, BIGNUM **rp); int (*dsa_do_verify) (const unsigned char *dgst, int dgst_len, DSA_SIG *sig, DSA *dsa); int (*dsa_mod_exp) (DSA *dsa, BIGNUM *rr, BIGNUM *a1, BIGNUM *p1, BIGNUM *a2, BIGNUM *p2, BIGNUM *m, BN_CTX *ctx, BN_MONT_CTX *in_mont); int (*bn_mod_exp) (DSA *dsa, BIGNUM *r, BIGNUM *a, const BIGNUM *p, const BIGNUM *m, BN_CTX *ctx, BN_MONT_CTX *m_ctx); int (*init) (DSA *dsa); int (*finish) (DSA *dsa); int flags; char *app_data; int (*dsa_paramgen) (DSA *dsa, int bits, const unsigned char *seed, int seed_len, int *counter_ret, unsigned long *h_ret, BN_GENCB *cb); int (*dsa_keygen) (DSA *dsa); }; struct dsa_st { int pad; long version; int write_params; BIGNUM *p; BIGNUM *q; BIGNUM *g; BIGNUM *pub_key; BIGNUM *priv_key; BIGNUM *kinv; BIGNUM *r; int flags; BN_MONT_CTX *method_mont_p; int references; CRYPTO_EX_DATA ex_data; const DSA_METHOD *meth; ENGINE *engine; }; # 187 "/usr/include/openssl/dsa.h" 3 4 DSA *DSAparams_dup(DSA *x); DSA_SIG *DSA_SIG_new(void); void DSA_SIG_free(DSA_SIG *a); int i2d_DSA_SIG(const DSA_SIG *a, unsigned char **pp); DSA_SIG *d2i_DSA_SIG(DSA_SIG **v, const unsigned char **pp, long length); DSA_SIG *DSA_do_sign(const unsigned char *dgst, int dlen, DSA *dsa); int DSA_do_verify(const unsigned char *dgst, int dgst_len, DSA_SIG *sig, DSA *dsa); const DSA_METHOD *DSA_OpenSSL(void); void DSA_set_default_method(const DSA_METHOD *); const DSA_METHOD *DSA_get_default_method(void); int DSA_set_method(DSA *dsa, const DSA_METHOD *); DSA *DSA_new(void); DSA *DSA_new_method(ENGINE *engine); void DSA_free(DSA *r); int DSA_up_ref(DSA *r); int DSA_size(const DSA *); int DSA_sign_setup(DSA *dsa, BN_CTX *ctx_in, BIGNUM **kinvp, BIGNUM **rp); int DSA_sign(int type, const unsigned char *dgst, int dlen, unsigned char *sig, unsigned int *siglen, DSA *dsa); int DSA_verify(int type, const 
unsigned char *dgst, int dgst_len, const unsigned char *sigbuf, int siglen, DSA *dsa); int DSA_get_ex_new_index(long argl, void *argp, CRYPTO_EX_new *new_func, CRYPTO_EX_dup *dup_func, CRYPTO_EX_free *free_func); int DSA_set_ex_data(DSA *d, int idx, void *arg); void *DSA_get_ex_data(DSA *d, int idx); DSA *d2i_DSAPublicKey(DSA **a, const unsigned char **pp, long length); DSA *d2i_DSAPrivateKey(DSA **a, const unsigned char **pp, long length); DSA *d2i_DSAparams(DSA **a, const unsigned char **pp, long length); DSA *DSA_generate_parameters(int bits, unsigned char *seed, int seed_len, int *counter_ret, unsigned long *h_ret, void (*callback) (int, int, void *), void *cb_arg); int DSA_generate_parameters_ex(DSA *dsa, int bits, const unsigned char *seed, int seed_len, int *counter_ret, unsigned long *h_ret, BN_GENCB *cb); int DSA_generate_key(DSA *a); int i2d_DSAPublicKey(const DSA *a, unsigned char **pp); int i2d_DSAPrivateKey(const DSA *a, unsigned char **pp); int i2d_DSAparams(const DSA *a, unsigned char **pp); int DSAparams_print(BIO *bp, const DSA *x); int DSA_print(BIO *bp, const DSA *x, int off); int DSAparams_print_fp(FILE *fp, const DSA *x); int DSA_print_fp(FILE *bp, const DSA *x, int off); # 265 "/usr/include/openssl/dsa.h" 3 4 DH *DSA_dup_DH(const DSA *r); # 281 "/usr/include/openssl/dsa.h" 3 4 void ERR_load_DSA_strings(void); # 100 "/usr/include/openssl/x509.h" 2 3 4 # 1 "/usr/include/openssl/sha.h" 1 3 4 # 62 "/usr/include/openssl/sha.h" 3 4 # 1 "/usr/include/openssl/e_os2.h" 1 3 4 # 56 "/usr/include/openssl/e_os2.h" 3 4 # 1 "/usr/include/openssl/opensslconf.h" 1 3 4 # 57 "/usr/include/openssl/e_os2.h" 2 3 4 # 63 "/usr/include/openssl/sha.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 64 "/usr/include/openssl/sha.h" 2 3 4 # 100 "/usr/include/openssl/sha.h" 3 4 typedef struct SHAstate_st { unsigned int h0, h1, h2, h3, h4; unsigned int Nl, Nh; unsigned int data[16]; unsigned int num; } SHA_CTX; int SHA_Init(SHA_CTX *c); int 
SHA_Update(SHA_CTX *c, const void *data, size_t len); int SHA_Final(unsigned char *md, SHA_CTX *c); unsigned char *SHA(const unsigned char *d, size_t n, unsigned char *md); void SHA_Transform(SHA_CTX *c, const unsigned char *data); int SHA1_Init(SHA_CTX *c); int SHA1_Update(SHA_CTX *c, const void *data, size_t len); int SHA1_Final(unsigned char *md, SHA_CTX *c); unsigned char *SHA1(const unsigned char *d, size_t n, unsigned char *md); void SHA1_Transform(SHA_CTX *c, const unsigned char *data); # 134 "/usr/include/openssl/sha.h" 3 4 typedef struct SHA256state_st { unsigned int h[8]; unsigned int Nl, Nh; unsigned int data[16]; unsigned int num, md_len; } SHA256_CTX; int SHA224_Init(SHA256_CTX *c); int SHA224_Update(SHA256_CTX *c, const void *data, size_t len); int SHA224_Final(unsigned char *md, SHA256_CTX *c); unsigned char *SHA224(const unsigned char *d, size_t n, unsigned char *md); int SHA256_Init(SHA256_CTX *c); int SHA256_Update(SHA256_CTX *c, const void *data, size_t len); int SHA256_Final(unsigned char *md, SHA256_CTX *c); unsigned char *SHA256(const unsigned char *d, size_t n, unsigned char *md); void SHA256_Transform(SHA256_CTX *c, const unsigned char *data); # 183 "/usr/include/openssl/sha.h" 3 4 typedef struct SHA512state_st { unsigned long long h[8]; unsigned long long Nl, Nh; union { unsigned long long d[16]; unsigned char p[(16*8)]; } u; unsigned int num, md_len; } SHA512_CTX; int SHA384_Init(SHA512_CTX *c); int SHA384_Update(SHA512_CTX *c, const void *data, size_t len); int SHA384_Final(unsigned char *md, SHA512_CTX *c); unsigned char *SHA384(const unsigned char *d, size_t n, unsigned char *md); int SHA512_Init(SHA512_CTX *c); int SHA512_Update(SHA512_CTX *c, const void *data, size_t len); int SHA512_Final(unsigned char *md, SHA512_CTX *c); unsigned char *SHA512(const unsigned char *d, size_t n, unsigned char *md); void SHA512_Transform(SHA512_CTX *c, const unsigned char *data); # 108 "/usr/include/openssl/x509.h" 2 3 4 # 137 
"/usr/include/openssl/x509.h" 3 4 typedef struct X509_objects_st { int nid; int (*a2i) (void); int (*i2a) (void); } X509_OBJECTS; struct X509_algor_st { ASN1_OBJECT *algorithm; ASN1_TYPE *parameter; } ; typedef struct stack_st_X509_ALGOR X509_ALGORS; typedef struct X509_val_st { ASN1_TIME *notBefore; ASN1_TIME *notAfter; } X509_VAL; struct X509_pubkey_st { X509_ALGOR *algor; ASN1_BIT_STRING *public_key; EVP_PKEY *pkey; }; typedef struct X509_sig_st { X509_ALGOR *algor; ASN1_OCTET_STRING *digest; } X509_SIG; typedef struct X509_name_entry_st { ASN1_OBJECT *object; ASN1_STRING *value; int set; int size; } X509_NAME_ENTRY; struct stack_st_X509_NAME_ENTRY { _STACK stack; }; struct X509_name_st { struct stack_st_X509_NAME_ENTRY *entries; int modified; BUF_MEM *bytes; unsigned char *canon_enc; int canon_enclen; } ; struct stack_st_X509_NAME { _STACK stack; }; typedef struct X509_extension_st { ASN1_OBJECT *object; ASN1_BOOLEAN critical; ASN1_OCTET_STRING *value; } X509_EXTENSION; typedef struct stack_st_X509_EXTENSION X509_EXTENSIONS; struct stack_st_X509_EXTENSION { _STACK stack; }; typedef struct x509_attributes_st { ASN1_OBJECT *object; int single; union { char *ptr; struct stack_st_ASN1_TYPE *set; ASN1_TYPE *single; } value; } X509_ATTRIBUTE; struct stack_st_X509_ATTRIBUTE { _STACK stack; }; typedef struct X509_req_info_st { ASN1_ENCODING enc; ASN1_INTEGER *version; X509_NAME *subject; X509_PUBKEY *pubkey; struct stack_st_X509_ATTRIBUTE *attributes; } X509_REQ_INFO; typedef struct X509_req_st { X509_REQ_INFO *req_info; X509_ALGOR *sig_alg; ASN1_BIT_STRING *signature; int references; } X509_REQ; typedef struct x509_cinf_st { ASN1_INTEGER *version; ASN1_INTEGER *serialNumber; X509_ALGOR *signature; X509_NAME *issuer; X509_VAL *validity; X509_NAME *subject; X509_PUBKEY *key; ASN1_BIT_STRING *issuerUID; ASN1_BIT_STRING *subjectUID; struct stack_st_X509_EXTENSION *extensions; ASN1_ENCODING enc; } X509_CINF; typedef struct x509_cert_aux_st { struct stack_st_ASN1_OBJECT 
*trust; struct stack_st_ASN1_OBJECT *reject; ASN1_UTF8STRING *alias; ASN1_OCTET_STRING *keyid; struct stack_st_X509_ALGOR *other; } X509_CERT_AUX; struct x509_st { X509_CINF *cert_info; X509_ALGOR *sig_alg; ASN1_BIT_STRING *signature; int valid; int references; char *name; CRYPTO_EX_DATA ex_data; long ex_pathlen; long ex_pcpathlen; unsigned long ex_flags; unsigned long ex_kusage; unsigned long ex_xkusage; unsigned long ex_nscert; ASN1_OCTET_STRING *skid; AUTHORITY_KEYID *akid; X509_POLICY_CACHE *policy_cache; struct stack_st_DIST_POINT *crldp; struct stack_st_GENERAL_NAME *altname; NAME_CONSTRAINTS *nc; unsigned char sha1_hash[20]; X509_CERT_AUX *aux; } ; struct stack_st_X509 { _STACK stack; }; typedef struct x509_trust_st { int trust; int flags; int (*check_trust) (struct x509_trust_st *, X509 *, int); char *name; int arg1; void *arg2; } X509_TRUST; struct stack_st_X509_TRUST { _STACK stack; }; typedef struct x509_cert_pair_st { X509 *forward; X509 *reverse; } X509_CERT_PAIR; # 427 "/usr/include/openssl/x509.h" 3 4 struct x509_revoked_st { ASN1_INTEGER *serialNumber; ASN1_TIME *revocationDate; struct stack_st_X509_EXTENSION *extensions; struct stack_st_GENERAL_NAME *issuer; int reason; int sequence; }; struct stack_st_X509_REVOKED { _STACK stack; }; typedef struct X509_crl_info_st { ASN1_INTEGER *version; X509_ALGOR *sig_alg; X509_NAME *issuer; ASN1_TIME *lastUpdate; ASN1_TIME *nextUpdate; struct stack_st_X509_REVOKED *revoked; struct stack_st_X509_EXTENSION *extensions; ASN1_ENCODING enc; } X509_CRL_INFO; struct X509_crl_st { X509_CRL_INFO *crl; X509_ALGOR *sig_alg; ASN1_BIT_STRING *signature; int references; int flags; AUTHORITY_KEYID *akid; ISSUING_DIST_POINT *idp; int idp_flags; int idp_reasons; ASN1_INTEGER *crl_number; ASN1_INTEGER *base_crl_number; unsigned char sha1_hash[20]; struct stack_st_GENERAL_NAMES *issuers; const X509_CRL_METHOD *meth; void *meth_data; } ; struct stack_st_X509_CRL { _STACK stack; }; typedef struct private_key_st { int version; 
X509_ALGOR *enc_algor; ASN1_OCTET_STRING *enc_pkey; EVP_PKEY *dec_pkey; int key_length; char *key_data; int key_free; EVP_CIPHER_INFO cipher; int references; } X509_PKEY; typedef struct X509_info_st { X509 *x509; X509_CRL *crl; X509_PKEY *x_pkey; EVP_CIPHER_INFO enc_cipher; int enc_len; char *enc_data; int references; } X509_INFO; struct stack_st_X509_INFO { _STACK stack; }; typedef struct Netscape_spkac_st { X509_PUBKEY *pubkey; ASN1_IA5STRING *challenge; } NETSCAPE_SPKAC; typedef struct Netscape_spki_st { NETSCAPE_SPKAC *spkac; X509_ALGOR *sig_algor; ASN1_BIT_STRING *signature; } NETSCAPE_SPKI; typedef struct Netscape_certificate_sequence { ASN1_OBJECT *type; struct stack_st_X509 *certs; } NETSCAPE_CERT_SEQUENCE; # 540 "/usr/include/openssl/x509.h" 3 4 typedef struct PBEPARAM_st { ASN1_OCTET_STRING *salt; ASN1_INTEGER *iter; } PBEPARAM; typedef struct PBE2PARAM_st { X509_ALGOR *keyfunc; X509_ALGOR *encryption; } PBE2PARAM; typedef struct PBKDF2PARAM_st { ASN1_TYPE *salt; ASN1_INTEGER *iter; ASN1_INTEGER *keylength; X509_ALGOR *prf; } PBKDF2PARAM; struct pkcs8_priv_key_info_st { int broken; ASN1_INTEGER *version; X509_ALGOR *pkeyalg; ASN1_TYPE *pkey; struct stack_st_X509_ATTRIBUTE *attributes; }; # 1 "/usr/include/openssl/x509_vfy.h" 1 3 4 # 70 "/usr/include/openssl/x509_vfy.h" 3 4 # 1 "/usr/include/openssl/opensslconf.h" 1 3 4 # 71 "/usr/include/openssl/x509_vfy.h" 2 3 4 # 1 "/usr/include/openssl/lhash.h" 1 3 4 # 66 "/usr/include/openssl/lhash.h" 3 4 # 1 "/usr/include/openssl/e_os2.h" 1 3 4 # 56 "/usr/include/openssl/e_os2.h" 3 4 # 1 "/usr/include/openssl/opensslconf.h" 1 3 4 # 57 "/usr/include/openssl/e_os2.h" 2 3 4 # 67 "/usr/include/openssl/lhash.h" 2 3 4 # 79 "/usr/include/openssl/lhash.h" 3 4 typedef struct lhash_node_st { void *data; struct lhash_node_st *next; unsigned long hash; } LHASH_NODE; typedef int (*LHASH_COMP_FN_TYPE) (const void *, const void *); typedef unsigned long (*LHASH_HASH_FN_TYPE) (const void *); typedef void (*LHASH_DOALL_FN_TYPE) (void 
*); typedef void (*LHASH_DOALL_ARG_FN_TYPE) (void *, void *); # 139 "/usr/include/openssl/lhash.h" 3 4 typedef struct lhash_st { LHASH_NODE **b; LHASH_COMP_FN_TYPE comp; LHASH_HASH_FN_TYPE hash; unsigned int num_nodes; unsigned int num_alloc_nodes; unsigned int p; unsigned int pmax; unsigned long up_load; unsigned long down_load; unsigned long num_items; unsigned long num_expands; unsigned long num_expand_reallocs; unsigned long num_contracts; unsigned long num_contract_reallocs; unsigned long num_hash_calls; unsigned long num_comp_calls; unsigned long num_insert; unsigned long num_replace; unsigned long num_delete; unsigned long num_no_delete; unsigned long num_retrieve; unsigned long num_retrieve_miss; unsigned long num_hash_comps; int error; } _LHASH; # 175 "/usr/include/openssl/lhash.h" 3 4 _LHASH *lh_new(LHASH_HASH_FN_TYPE h, LHASH_COMP_FN_TYPE c); void lh_free(_LHASH *lh); void *lh_insert(_LHASH *lh, void *data); void *lh_delete(_LHASH *lh, const void *data); void *lh_retrieve(_LHASH *lh, const void *data); void lh_doall(_LHASH *lh, LHASH_DOALL_FN_TYPE func); void lh_doall_arg(_LHASH *lh, LHASH_DOALL_ARG_FN_TYPE func, void *arg); unsigned long lh_strhash(const char *c); unsigned long lh_num_items(const _LHASH *lh); void lh_stats(const _LHASH *lh, FILE *out); void lh_node_stats(const _LHASH *lh, FILE *out); void lh_node_usage_stats(const _LHASH *lh, FILE *out); void lh_stats_bio(const _LHASH *lh, BIO *out); void lh_node_stats_bio(const _LHASH *lh, BIO *out); void lh_node_usage_stats_bio(const _LHASH *lh, BIO *out); # 233 "/usr/include/openssl/lhash.h" 3 4 struct lhash_st_OPENSSL_STRING { int dummy; }; struct lhash_st_OPENSSL_CSTRING { int dummy; }; # 73 "/usr/include/openssl/x509_vfy.h" 2 3 4 # 92 "/usr/include/openssl/x509_vfy.h" 3 4 typedef struct x509_file_st { int num_paths; int num_alloced; char **paths; int *path_type; } X509_CERT_FILE_CTX; # 123 "/usr/include/openssl/x509_vfy.h" 3 4 typedef struct x509_object_st { int type; union { char *ptr; X509 
*x509; X509_CRL *crl; EVP_PKEY *pkey; } data; } X509_OBJECT; typedef struct x509_lookup_st X509_LOOKUP; struct stack_st_X509_LOOKUP { _STACK stack; }; struct stack_st_X509_OBJECT { _STACK stack; }; typedef struct x509_lookup_method_st { const char *name; int (*new_item) (X509_LOOKUP *ctx); void (*free) (X509_LOOKUP *ctx); int (*init) (X509_LOOKUP *ctx); int (*shutdown) (X509_LOOKUP *ctx); int (*ctrl) (X509_LOOKUP *ctx, int cmd, const char *argc, long argl, char **ret); int (*get_by_subject) (X509_LOOKUP *ctx, int type, X509_NAME *name, X509_OBJECT *ret); int (*get_by_issuer_serial) (X509_LOOKUP *ctx, int type, X509_NAME *name, ASN1_INTEGER *serial, X509_OBJECT *ret); int (*get_by_fingerprint) (X509_LOOKUP *ctx, int type, unsigned char *bytes, int len, X509_OBJECT *ret); int (*get_by_alias) (X509_LOOKUP *ctx, int type, char *str, int len, X509_OBJECT *ret); } X509_LOOKUP_METHOD; typedef struct X509_VERIFY_PARAM_ID_st X509_VERIFY_PARAM_ID; typedef struct X509_VERIFY_PARAM_st { char *name; time_t check_time; unsigned long inh_flags; unsigned long flags; int purpose; int trust; int depth; struct stack_st_ASN1_OBJECT *policies; X509_VERIFY_PARAM_ID *id; } X509_VERIFY_PARAM; struct stack_st_X509_VERIFY_PARAM { _STACK stack; }; struct x509_store_st { int cache; struct stack_st_X509_OBJECT *objs; struct stack_st_X509_LOOKUP *get_cert_methods; X509_VERIFY_PARAM *param; int (*verify) (X509_STORE_CTX *ctx); int (*verify_cb) (int ok, X509_STORE_CTX *ctx); int (*get_issuer) (X509 **issuer, X509_STORE_CTX *ctx, X509 *x); int (*check_issued) (X509_STORE_CTX *ctx, X509 *x, X509 *issuer); int (*check_revocation) (X509_STORE_CTX *ctx); int (*get_crl) (X509_STORE_CTX *ctx, X509_CRL **crl, X509 *x); int (*check_crl) (X509_STORE_CTX *ctx, X509_CRL *crl); int (*cert_crl) (X509_STORE_CTX *ctx, X509_CRL *crl, X509 *x); struct stack_st_X509 *(*lookup_certs) (X509_STORE_CTX *ctx, X509_NAME *nm); struct stack_st_X509_CRL *(*lookup_crls) (X509_STORE_CTX *ctx, X509_NAME *nm); int (*cleanup) 
(X509_STORE_CTX *ctx); CRYPTO_EX_DATA ex_data; int references; } ; int X509_STORE_set_depth(X509_STORE *store, int depth); struct x509_lookup_st { int init; int skip; X509_LOOKUP_METHOD *method; char *method_data; X509_STORE *store_ctx; } ; struct x509_store_ctx_st { X509_STORE *ctx; int current_method; X509 *cert; struct stack_st_X509 *untrusted; struct stack_st_X509_CRL *crls; X509_VERIFY_PARAM *param; void *other_ctx; int (*verify) (X509_STORE_CTX *ctx); int (*verify_cb) (int ok, X509_STORE_CTX *ctx); int (*get_issuer) (X509 **issuer, X509_STORE_CTX *ctx, X509 *x); int (*check_issued) (X509_STORE_CTX *ctx, X509 *x, X509 *issuer); int (*check_revocation) (X509_STORE_CTX *ctx); int (*get_crl) (X509_STORE_CTX *ctx, X509_CRL **crl, X509 *x); int (*check_crl) (X509_STORE_CTX *ctx, X509_CRL *crl); int (*cert_crl) (X509_STORE_CTX *ctx, X509_CRL *crl, X509 *x); int (*check_policy) (X509_STORE_CTX *ctx); struct stack_st_X509 *(*lookup_certs) (X509_STORE_CTX *ctx, X509_NAME *nm); struct stack_st_X509_CRL *(*lookup_crls) (X509_STORE_CTX *ctx, X509_NAME *nm); int (*cleanup) (X509_STORE_CTX *ctx); int valid; int last_untrusted; struct stack_st_X509 *chain; X509_POLICY_TREE *tree; int explicit_policy; int error_depth; int error; X509 *current_cert; X509 *current_issuer; X509_CRL *current_crl; int current_crl_score; unsigned int current_reasons; X509_STORE_CTX *parent; CRYPTO_EX_DATA ex_data; } ; void X509_STORE_CTX_set_depth(X509_STORE_CTX *ctx, int depth); # 459 "/usr/include/openssl/x509_vfy.h" 3 4 int X509_OBJECT_idx_by_subject(struct stack_st_X509_OBJECT *h, int type, X509_NAME *name); X509_OBJECT *X509_OBJECT_retrieve_by_subject(struct stack_st_X509_OBJECT *h, int type, X509_NAME *name); X509_OBJECT *X509_OBJECT_retrieve_match(struct stack_st_X509_OBJECT *h, X509_OBJECT *x); void X509_OBJECT_up_ref_count(X509_OBJECT *a); void X509_OBJECT_free_contents(X509_OBJECT *a); X509_STORE *X509_STORE_new(void); void X509_STORE_free(X509_STORE *v); struct stack_st_X509 
*X509_STORE_get1_certs(X509_STORE_CTX *st, X509_NAME *nm); struct stack_st_X509_CRL *X509_STORE_get1_crls(X509_STORE_CTX *st, X509_NAME *nm); int X509_STORE_set_flags(X509_STORE *ctx, unsigned long flags); int X509_STORE_set_purpose(X509_STORE *ctx, int purpose); int X509_STORE_set_trust(X509_STORE *ctx, int trust); int X509_STORE_set1_param(X509_STORE *ctx, X509_VERIFY_PARAM *pm); void X509_STORE_set_verify_cb(X509_STORE *ctx, int (*verify_cb) (int, X509_STORE_CTX *)); void X509_STORE_set_lookup_crls_cb(X509_STORE *ctx, struct stack_st_X509_CRL *(*cb) (X509_STORE_CTX *ctx, X509_NAME *nm)); X509_STORE_CTX *X509_STORE_CTX_new(void); int X509_STORE_CTX_get1_issuer(X509 **issuer, X509_STORE_CTX *ctx, X509 *x); void X509_STORE_CTX_free(X509_STORE_CTX *ctx); int X509_STORE_CTX_init(X509_STORE_CTX *ctx, X509_STORE *store, X509 *x509, struct stack_st_X509 *chain); void X509_STORE_CTX_trusted_stack(X509_STORE_CTX *ctx, struct stack_st_X509 *sk); void X509_STORE_CTX_cleanup(X509_STORE_CTX *ctx); X509_STORE *X509_STORE_CTX_get0_store(X509_STORE_CTX *ctx); X509_LOOKUP *X509_STORE_add_lookup(X509_STORE *v, X509_LOOKUP_METHOD *m); X509_LOOKUP_METHOD *X509_LOOKUP_hash_dir(void); X509_LOOKUP_METHOD *X509_LOOKUP_file(void); int X509_STORE_add_cert(X509_STORE *ctx, X509 *x); int X509_STORE_add_crl(X509_STORE *ctx, X509_CRL *x); int X509_STORE_get_by_subject(X509_STORE_CTX *vs, int type, X509_NAME *name, X509_OBJECT *ret); int X509_LOOKUP_ctrl(X509_LOOKUP *ctx, int cmd, const char *argc, long argl, char **ret); int X509_load_cert_file(X509_LOOKUP *ctx, const char *file, int type); int X509_load_crl_file(X509_LOOKUP *ctx, const char *file, int type); int X509_load_cert_crl_file(X509_LOOKUP *ctx, const char *file, int type); X509_LOOKUP *X509_LOOKUP_new(X509_LOOKUP_METHOD *method); void X509_LOOKUP_free(X509_LOOKUP *ctx); int X509_LOOKUP_init(X509_LOOKUP *ctx); int X509_LOOKUP_by_subject(X509_LOOKUP *ctx, int type, X509_NAME *name, X509_OBJECT *ret); int 
X509_LOOKUP_by_issuer_serial(X509_LOOKUP *ctx, int type, X509_NAME *name, ASN1_INTEGER *serial, X509_OBJECT *ret); int X509_LOOKUP_by_fingerprint(X509_LOOKUP *ctx, int type, unsigned char *bytes, int len, X509_OBJECT *ret); int X509_LOOKUP_by_alias(X509_LOOKUP *ctx, int type, char *str, int len, X509_OBJECT *ret); int X509_LOOKUP_shutdown(X509_LOOKUP *ctx); int X509_STORE_load_locations(X509_STORE *ctx, const char *file, const char *dir); int X509_STORE_set_default_paths(X509_STORE *ctx); int X509_STORE_CTX_get_ex_new_index(long argl, void *argp, CRYPTO_EX_new *new_func, CRYPTO_EX_dup *dup_func, CRYPTO_EX_free *free_func); int X509_STORE_CTX_set_ex_data(X509_STORE_CTX *ctx, int idx, void *data); void *X509_STORE_CTX_get_ex_data(X509_STORE_CTX *ctx, int idx); int X509_STORE_CTX_get_error(X509_STORE_CTX *ctx); void X509_STORE_CTX_set_error(X509_STORE_CTX *ctx, int s); int X509_STORE_CTX_get_error_depth(X509_STORE_CTX *ctx); X509 *X509_STORE_CTX_get_current_cert(X509_STORE_CTX *ctx); X509 *X509_STORE_CTX_get0_current_issuer(X509_STORE_CTX *ctx); X509_CRL *X509_STORE_CTX_get0_current_crl(X509_STORE_CTX *ctx); X509_STORE_CTX *X509_STORE_CTX_get0_parent_ctx(X509_STORE_CTX *ctx); struct stack_st_X509 *X509_STORE_CTX_get_chain(X509_STORE_CTX *ctx); struct stack_st_X509 *X509_STORE_CTX_get1_chain(X509_STORE_CTX *ctx); void X509_STORE_CTX_set_cert(X509_STORE_CTX *c, X509 *x); void X509_STORE_CTX_set_chain(X509_STORE_CTX *c, struct stack_st_X509 *sk); void X509_STORE_CTX_set0_crls(X509_STORE_CTX *c, struct stack_st_X509_CRL *sk); int X509_STORE_CTX_set_purpose(X509_STORE_CTX *ctx, int purpose); int X509_STORE_CTX_set_trust(X509_STORE_CTX *ctx, int trust); int X509_STORE_CTX_purpose_inherit(X509_STORE_CTX *ctx, int def_purpose, int purpose, int trust); void X509_STORE_CTX_set_flags(X509_STORE_CTX *ctx, unsigned long flags); void X509_STORE_CTX_set_time(X509_STORE_CTX *ctx, unsigned long flags, time_t t); void X509_STORE_CTX_set_verify_cb(X509_STORE_CTX *ctx, int (*verify_cb) 
(int, X509_STORE_CTX *)); X509_POLICY_TREE *X509_STORE_CTX_get0_policy_tree(X509_STORE_CTX *ctx); int X509_STORE_CTX_get_explicit_policy(X509_STORE_CTX *ctx); X509_VERIFY_PARAM *X509_STORE_CTX_get0_param(X509_STORE_CTX *ctx); void X509_STORE_CTX_set0_param(X509_STORE_CTX *ctx, X509_VERIFY_PARAM *param); int X509_STORE_CTX_set_default(X509_STORE_CTX *ctx, const char *name); X509_VERIFY_PARAM *X509_VERIFY_PARAM_new(void); void X509_VERIFY_PARAM_free(X509_VERIFY_PARAM *param); int X509_VERIFY_PARAM_inherit(X509_VERIFY_PARAM *to, const X509_VERIFY_PARAM *from); int X509_VERIFY_PARAM_set1(X509_VERIFY_PARAM *to, const X509_VERIFY_PARAM *from); int X509_VERIFY_PARAM_set1_name(X509_VERIFY_PARAM *param, const char *name); int X509_VERIFY_PARAM_set_flags(X509_VERIFY_PARAM *param, unsigned long flags); int X509_VERIFY_PARAM_clear_flags(X509_VERIFY_PARAM *param, unsigned long flags); unsigned long X509_VERIFY_PARAM_get_flags(X509_VERIFY_PARAM *param); int X509_VERIFY_PARAM_set_purpose(X509_VERIFY_PARAM *param, int purpose); int X509_VERIFY_PARAM_set_trust(X509_VERIFY_PARAM *param, int trust); void X509_VERIFY_PARAM_set_depth(X509_VERIFY_PARAM *param, int depth); void X509_VERIFY_PARAM_set_time(X509_VERIFY_PARAM *param, time_t t); int X509_VERIFY_PARAM_add0_policy(X509_VERIFY_PARAM *param, ASN1_OBJECT *policy); int X509_VERIFY_PARAM_set1_policies(X509_VERIFY_PARAM *param, struct stack_st_ASN1_OBJECT *policies); int X509_VERIFY_PARAM_set1_host(X509_VERIFY_PARAM *param, const char *name, size_t namelen); int X509_VERIFY_PARAM_add1_host(X509_VERIFY_PARAM *param, const char *name, size_t namelen); void X509_VERIFY_PARAM_set_hostflags(X509_VERIFY_PARAM *param, unsigned int flags); char *X509_VERIFY_PARAM_get0_peername(X509_VERIFY_PARAM *); int X509_VERIFY_PARAM_set1_email(X509_VERIFY_PARAM *param, const char *email, size_t emaillen); int X509_VERIFY_PARAM_set1_ip(X509_VERIFY_PARAM *param, const unsigned char *ip, size_t iplen); int X509_VERIFY_PARAM_set1_ip_asc(X509_VERIFY_PARAM 
*param, const char *ipasc); int X509_VERIFY_PARAM_get_depth(const X509_VERIFY_PARAM *param); const char *X509_VERIFY_PARAM_get0_name(const X509_VERIFY_PARAM *param); int X509_VERIFY_PARAM_add0_table(X509_VERIFY_PARAM *param); int X509_VERIFY_PARAM_get_count(void); const X509_VERIFY_PARAM *X509_VERIFY_PARAM_get0(int id); const X509_VERIFY_PARAM *X509_VERIFY_PARAM_lookup(const char *name); void X509_VERIFY_PARAM_table_cleanup(void); int X509_policy_check(X509_POLICY_TREE **ptree, int *pexplicit_policy, struct stack_st_X509 *certs, struct stack_st_ASN1_OBJECT *policy_oids, unsigned int flags); void X509_policy_tree_free(X509_POLICY_TREE *tree); int X509_policy_tree_level_count(const X509_POLICY_TREE *tree); X509_POLICY_LEVEL *X509_policy_tree_get0_level(const X509_POLICY_TREE *tree, int i); struct stack_st_X509_POLICY_NODE *X509_policy_tree_get0_policies(const X509_POLICY_TREE *tree); struct stack_st_X509_POLICY_NODE *X509_policy_tree_get0_user_policies(const X509_POLICY_TREE *tree); int X509_policy_level_node_count(X509_POLICY_LEVEL *level); X509_POLICY_NODE *X509_policy_level_get0_node(X509_POLICY_LEVEL *level, int i); const ASN1_OBJECT *X509_policy_node_get0_policy(const X509_POLICY_NODE *node); struct stack_st_POLICYQUALINFO *X509_policy_node_get0_qualifiers(const X509_POLICY_NODE *node); const X509_POLICY_NODE *X509_policy_node_get0_parent(const X509_POLICY_NODE *node); # 582 "/usr/include/openssl/x509.h" 2 3 4 # 1 "/usr/include/openssl/pkcs7.h" 1 3 4 # 64 "/usr/include/openssl/pkcs7.h" 3 4 # 1 "/usr/include/openssl/e_os2.h" 1 3 4 # 56 "/usr/include/openssl/e_os2.h" 3 4 # 1 "/usr/include/openssl/opensslconf.h" 1 3 4 # 57 "/usr/include/openssl/e_os2.h" 2 3 4 # 65 "/usr/include/openssl/pkcs7.h" 2 3 4 # 86 "/usr/include/openssl/pkcs7.h" 3 4 typedef struct pkcs7_issuer_and_serial_st { X509_NAME *issuer; ASN1_INTEGER *serial; } PKCS7_ISSUER_AND_SERIAL; typedef struct pkcs7_signer_info_st { ASN1_INTEGER *version; PKCS7_ISSUER_AND_SERIAL *issuer_and_serial; X509_ALGOR 
*digest_alg; struct stack_st_X509_ATTRIBUTE *auth_attr; X509_ALGOR *digest_enc_alg; ASN1_OCTET_STRING *enc_digest; struct stack_st_X509_ATTRIBUTE *unauth_attr; EVP_PKEY *pkey; } PKCS7_SIGNER_INFO; struct stack_st_PKCS7_SIGNER_INFO { _STACK stack; }; typedef struct pkcs7_recip_info_st { ASN1_INTEGER *version; PKCS7_ISSUER_AND_SERIAL *issuer_and_serial; X509_ALGOR *key_enc_algor; ASN1_OCTET_STRING *enc_key; X509 *cert; } PKCS7_RECIP_INFO; struct stack_st_PKCS7_RECIP_INFO { _STACK stack; }; typedef struct pkcs7_signed_st { ASN1_INTEGER *version; struct stack_st_X509_ALGOR *md_algs; struct stack_st_X509 *cert; struct stack_st_X509_CRL *crl; struct stack_st_PKCS7_SIGNER_INFO *signer_info; struct pkcs7_st *contents; } PKCS7_SIGNED; typedef struct pkcs7_enc_content_st { ASN1_OBJECT *content_type; X509_ALGOR *algorithm; ASN1_OCTET_STRING *enc_data; const EVP_CIPHER *cipher; } PKCS7_ENC_CONTENT; typedef struct pkcs7_enveloped_st { ASN1_INTEGER *version; struct stack_st_PKCS7_RECIP_INFO *recipientinfo; PKCS7_ENC_CONTENT *enc_data; } PKCS7_ENVELOPE; typedef struct pkcs7_signedandenveloped_st { ASN1_INTEGER *version; struct stack_st_X509_ALGOR *md_algs; struct stack_st_X509 *cert; struct stack_st_X509_CRL *crl; struct stack_st_PKCS7_SIGNER_INFO *signer_info; PKCS7_ENC_CONTENT *enc_data; struct stack_st_PKCS7_RECIP_INFO *recipientinfo; } PKCS7_SIGN_ENVELOPE; typedef struct pkcs7_digest_st { ASN1_INTEGER *version; X509_ALGOR *md; struct pkcs7_st *contents; ASN1_OCTET_STRING *digest; } PKCS7_DIGEST; typedef struct pkcs7_encrypted_st { ASN1_INTEGER *version; PKCS7_ENC_CONTENT *enc_data; } PKCS7_ENCRYPT; typedef struct pkcs7_st { unsigned char *asn1; long length; int state; int detached; ASN1_OBJECT *type; union { char *ptr; ASN1_OCTET_STRING *data; PKCS7_SIGNED *sign; PKCS7_ENVELOPE *enveloped; PKCS7_SIGN_ENVELOPE *signed_and_enveloped; PKCS7_DIGEST *digest; PKCS7_ENCRYPT *encrypted; ASN1_TYPE *other; } d; } PKCS7; struct stack_st_PKCS7 { _STACK stack; }; # 258 
"/usr/include/openssl/pkcs7.h" 3 4 PKCS7_ISSUER_AND_SERIAL *PKCS7_ISSUER_AND_SERIAL_new(void); void PKCS7_ISSUER_AND_SERIAL_free(PKCS7_ISSUER_AND_SERIAL *a); PKCS7_ISSUER_AND_SERIAL *d2i_PKCS7_ISSUER_AND_SERIAL(PKCS7_ISSUER_AND_SERIAL **a, const unsigned char **in, long len); int i2d_PKCS7_ISSUER_AND_SERIAL(PKCS7_ISSUER_AND_SERIAL *a, unsigned char **out); extern const ASN1_ITEM PKCS7_ISSUER_AND_SERIAL_it; int PKCS7_ISSUER_AND_SERIAL_digest(PKCS7_ISSUER_AND_SERIAL *data, const EVP_MD *type, unsigned char *md, unsigned int *len); PKCS7 *d2i_PKCS7_fp(FILE *fp, PKCS7 **p7); int i2d_PKCS7_fp(FILE *fp, PKCS7 *p7); PKCS7 *PKCS7_dup(PKCS7 *p7); PKCS7 *d2i_PKCS7_bio(BIO *bp, PKCS7 **p7); int i2d_PKCS7_bio(BIO *bp, PKCS7 *p7); int i2d_PKCS7_bio_stream(BIO *out, PKCS7 *p7, BIO *in, int flags); int PEM_write_bio_PKCS7_stream(BIO *out, PKCS7 *p7, BIO *in, int flags); PKCS7_SIGNER_INFO *PKCS7_SIGNER_INFO_new(void); void PKCS7_SIGNER_INFO_free(PKCS7_SIGNER_INFO *a); PKCS7_SIGNER_INFO *d2i_PKCS7_SIGNER_INFO(PKCS7_SIGNER_INFO **a, const unsigned char **in, long len); int i2d_PKCS7_SIGNER_INFO(PKCS7_SIGNER_INFO *a, unsigned char **out); extern const ASN1_ITEM PKCS7_SIGNER_INFO_it; PKCS7_RECIP_INFO *PKCS7_RECIP_INFO_new(void); void PKCS7_RECIP_INFO_free(PKCS7_RECIP_INFO *a); PKCS7_RECIP_INFO *d2i_PKCS7_RECIP_INFO(PKCS7_RECIP_INFO **a, const unsigned char **in, long len); int i2d_PKCS7_RECIP_INFO(PKCS7_RECIP_INFO *a, unsigned char **out); extern const ASN1_ITEM PKCS7_RECIP_INFO_it; PKCS7_SIGNED *PKCS7_SIGNED_new(void); void PKCS7_SIGNED_free(PKCS7_SIGNED *a); PKCS7_SIGNED *d2i_PKCS7_SIGNED(PKCS7_SIGNED **a, const unsigned char **in, long len); int i2d_PKCS7_SIGNED(PKCS7_SIGNED *a, unsigned char **out); extern const ASN1_ITEM PKCS7_SIGNED_it; PKCS7_ENC_CONTENT *PKCS7_ENC_CONTENT_new(void); void PKCS7_ENC_CONTENT_free(PKCS7_ENC_CONTENT *a); PKCS7_ENC_CONTENT *d2i_PKCS7_ENC_CONTENT(PKCS7_ENC_CONTENT **a, const unsigned char **in, long len); int i2d_PKCS7_ENC_CONTENT(PKCS7_ENC_CONTENT 
*a, unsigned char **out); extern const ASN1_ITEM PKCS7_ENC_CONTENT_it; PKCS7_ENVELOPE *PKCS7_ENVELOPE_new(void); void PKCS7_ENVELOPE_free(PKCS7_ENVELOPE *a); PKCS7_ENVELOPE *d2i_PKCS7_ENVELOPE(PKCS7_ENVELOPE **a, const unsigned char **in, long len); int i2d_PKCS7_ENVELOPE(PKCS7_ENVELOPE *a, unsigned char **out); extern const ASN1_ITEM PKCS7_ENVELOPE_it; PKCS7_SIGN_ENVELOPE *PKCS7_SIGN_ENVELOPE_new(void); void PKCS7_SIGN_ENVELOPE_free(PKCS7_SIGN_ENVELOPE *a); PKCS7_SIGN_ENVELOPE *d2i_PKCS7_SIGN_ENVELOPE(PKCS7_SIGN_ENVELOPE **a, const unsigned char **in, long len); int i2d_PKCS7_SIGN_ENVELOPE(PKCS7_SIGN_ENVELOPE *a, unsigned char **out); extern const ASN1_ITEM PKCS7_SIGN_ENVELOPE_it; PKCS7_DIGEST *PKCS7_DIGEST_new(void); void PKCS7_DIGEST_free(PKCS7_DIGEST *a); PKCS7_DIGEST *d2i_PKCS7_DIGEST(PKCS7_DIGEST **a, const unsigned char **in, long len); int i2d_PKCS7_DIGEST(PKCS7_DIGEST *a, unsigned char **out); extern const ASN1_ITEM PKCS7_DIGEST_it; PKCS7_ENCRYPT *PKCS7_ENCRYPT_new(void); void PKCS7_ENCRYPT_free(PKCS7_ENCRYPT *a); PKCS7_ENCRYPT *d2i_PKCS7_ENCRYPT(PKCS7_ENCRYPT **a, const unsigned char **in, long len); int i2d_PKCS7_ENCRYPT(PKCS7_ENCRYPT *a, unsigned char **out); extern const ASN1_ITEM PKCS7_ENCRYPT_it; PKCS7 *PKCS7_new(void); void PKCS7_free(PKCS7 *a); PKCS7 *d2i_PKCS7(PKCS7 **a, const unsigned char **in, long len); int i2d_PKCS7(PKCS7 *a, unsigned char **out); extern const ASN1_ITEM PKCS7_it; extern const ASN1_ITEM PKCS7_ATTR_SIGN_it; extern const ASN1_ITEM PKCS7_ATTR_VERIFY_it; int i2d_PKCS7_NDEF(PKCS7 *a, unsigned char **out); int PKCS7_print_ctx(BIO *out, PKCS7 *x, int indent, const ASN1_PCTX *pctx); long PKCS7_ctrl(PKCS7 *p7, int cmd, long larg, char *parg); int PKCS7_set_type(PKCS7 *p7, int type); int PKCS7_set0_type_other(PKCS7 *p7, int type, ASN1_TYPE *other); int PKCS7_set_content(PKCS7 *p7, PKCS7 *p7_data); int PKCS7_SIGNER_INFO_set(PKCS7_SIGNER_INFO *p7i, X509 *x509, EVP_PKEY *pkey, const EVP_MD *dgst); int 
PKCS7_SIGNER_INFO_sign(PKCS7_SIGNER_INFO *si); int PKCS7_add_signer(PKCS7 *p7, PKCS7_SIGNER_INFO *p7i); int PKCS7_add_certificate(PKCS7 *p7, X509 *x509); int PKCS7_add_crl(PKCS7 *p7, X509_CRL *x509); int PKCS7_content_new(PKCS7 *p7, int nid); int PKCS7_dataVerify(X509_STORE *cert_store, X509_STORE_CTX *ctx, BIO *bio, PKCS7 *p7, PKCS7_SIGNER_INFO *si); int PKCS7_signatureVerify(BIO *bio, PKCS7 *p7, PKCS7_SIGNER_INFO *si, X509 *x509); BIO *PKCS7_dataInit(PKCS7 *p7, BIO *bio); int PKCS7_dataFinal(PKCS7 *p7, BIO *bio); BIO *PKCS7_dataDecode(PKCS7 *p7, EVP_PKEY *pkey, BIO *in_bio, X509 *pcert); PKCS7_SIGNER_INFO *PKCS7_add_signature(PKCS7 *p7, X509 *x509, EVP_PKEY *pkey, const EVP_MD *dgst); X509 *PKCS7_cert_from_signer_info(PKCS7 *p7, PKCS7_SIGNER_INFO *si); int PKCS7_set_digest(PKCS7 *p7, const EVP_MD *md); struct stack_st_PKCS7_SIGNER_INFO *PKCS7_get_signer_info(PKCS7 *p7); PKCS7_RECIP_INFO *PKCS7_add_recipient(PKCS7 *p7, X509 *x509); void PKCS7_SIGNER_INFO_get0_algs(PKCS7_SIGNER_INFO *si, EVP_PKEY **pk, X509_ALGOR **pdig, X509_ALGOR **psig); void PKCS7_RECIP_INFO_get0_alg(PKCS7_RECIP_INFO *ri, X509_ALGOR **penc); int PKCS7_add_recipient_info(PKCS7 *p7, PKCS7_RECIP_INFO *ri); int PKCS7_RECIP_INFO_set(PKCS7_RECIP_INFO *p7i, X509 *x509); int PKCS7_set_cipher(PKCS7 *p7, const EVP_CIPHER *cipher); int PKCS7_stream(unsigned char ***boundary, PKCS7 *p7); PKCS7_ISSUER_AND_SERIAL *PKCS7_get_issuer_and_serial(PKCS7 *p7, int idx); ASN1_OCTET_STRING *PKCS7_digest_from_attributes(struct stack_st_X509_ATTRIBUTE *sk); int PKCS7_add_signed_attribute(PKCS7_SIGNER_INFO *p7si, int nid, int type, void *data); int PKCS7_add_attribute(PKCS7_SIGNER_INFO *p7si, int nid, int atrtype, void *value); ASN1_TYPE *PKCS7_get_attribute(PKCS7_SIGNER_INFO *si, int nid); ASN1_TYPE *PKCS7_get_signed_attribute(PKCS7_SIGNER_INFO *si, int nid); int PKCS7_set_signed_attributes(PKCS7_SIGNER_INFO *p7si, struct stack_st_X509_ATTRIBUTE *sk); int PKCS7_set_attributes(PKCS7_SIGNER_INFO *p7si, struct 
stack_st_X509_ATTRIBUTE *sk); PKCS7 *PKCS7_sign(X509 *signcert, EVP_PKEY *pkey, struct stack_st_X509 *certs, BIO *data, int flags); PKCS7_SIGNER_INFO *PKCS7_sign_add_signer(PKCS7 *p7, X509 *signcert, EVP_PKEY *pkey, const EVP_MD *md, int flags); int PKCS7_final(PKCS7 *p7, BIO *data, int flags); int PKCS7_verify(PKCS7 *p7, struct stack_st_X509 *certs, X509_STORE *store, BIO *indata, BIO *out, int flags); struct stack_st_X509 *PKCS7_get0_signers(PKCS7 *p7, struct stack_st_X509 *certs, int flags); PKCS7 *PKCS7_encrypt(struct stack_st_X509 *certs, BIO *in, const EVP_CIPHER *cipher, int flags); int PKCS7_decrypt(PKCS7 *p7, EVP_PKEY *pkey, X509 *cert, BIO *data, int flags); int PKCS7_add_attrib_smimecap(PKCS7_SIGNER_INFO *si, struct stack_st_X509_ALGOR *cap); struct stack_st_X509_ALGOR *PKCS7_get_smimecap(PKCS7_SIGNER_INFO *si); int PKCS7_simple_smimecap(struct stack_st_X509_ALGOR *sk, int nid, int arg); int PKCS7_add_attrib_content_type(PKCS7_SIGNER_INFO *si, ASN1_OBJECT *coid); int PKCS7_add0_attrib_signing_time(PKCS7_SIGNER_INFO *si, ASN1_TIME *t); int PKCS7_add1_attrib_digest(PKCS7_SIGNER_INFO *si, const unsigned char *md, int mdlen); int SMIME_write_PKCS7(BIO *bio, PKCS7 *p7, BIO *data, int flags); PKCS7 *SMIME_read_PKCS7(BIO *bio, BIO **bcont); BIO *BIO_new_PKCS7(BIO *out, PKCS7 *p7); void ERR_load_PKCS7_strings(void); # 583 "/usr/include/openssl/x509.h" 2 3 4 # 608 "/usr/include/openssl/x509.h" 3 4 void X509_CRL_set_default_method(const X509_CRL_METHOD *meth); X509_CRL_METHOD *X509_CRL_METHOD_new(int (*crl_init) (X509_CRL *crl), int (*crl_free) (X509_CRL *crl), int (*crl_lookup) (X509_CRL *crl, X509_REVOKED **ret, ASN1_INTEGER *ser, X509_NAME *issuer), int (*crl_verify) (X509_CRL *crl, EVP_PKEY *pk)); void X509_CRL_METHOD_free(X509_CRL_METHOD *m); void X509_CRL_set_meth_data(X509_CRL *crl, void *dat); void *X509_CRL_get_meth_data(X509_CRL *crl); const char *X509_verify_cert_error_string(long n); int X509_verify(X509 *a, EVP_PKEY *r); int X509_REQ_verify(X509_REQ 
*a, EVP_PKEY *r); int X509_CRL_verify(X509_CRL *a, EVP_PKEY *r); int NETSCAPE_SPKI_verify(NETSCAPE_SPKI *a, EVP_PKEY *r); NETSCAPE_SPKI *NETSCAPE_SPKI_b64_decode(const char *str, int len); char *NETSCAPE_SPKI_b64_encode(NETSCAPE_SPKI *x); EVP_PKEY *NETSCAPE_SPKI_get_pubkey(NETSCAPE_SPKI *x); int NETSCAPE_SPKI_set_pubkey(NETSCAPE_SPKI *x, EVP_PKEY *pkey); int NETSCAPE_SPKI_print(BIO *out, NETSCAPE_SPKI *spki); int X509_signature_dump(BIO *bp, const ASN1_STRING *sig, int indent); int X509_signature_print(BIO *bp, X509_ALGOR *alg, ASN1_STRING *sig); int X509_sign(X509 *x, EVP_PKEY *pkey, const EVP_MD *md); int X509_sign_ctx(X509 *x, EVP_MD_CTX *ctx); int X509_http_nbio(OCSP_REQ_CTX *rctx, X509 **pcert); int X509_REQ_sign(X509_REQ *x, EVP_PKEY *pkey, const EVP_MD *md); int X509_REQ_sign_ctx(X509_REQ *x, EVP_MD_CTX *ctx); int X509_CRL_sign(X509_CRL *x, EVP_PKEY *pkey, const EVP_MD *md); int X509_CRL_sign_ctx(X509_CRL *x, EVP_MD_CTX *ctx); int X509_CRL_http_nbio(OCSP_REQ_CTX *rctx, X509_CRL **pcrl); int NETSCAPE_SPKI_sign(NETSCAPE_SPKI *x, EVP_PKEY *pkey, const EVP_MD *md); int X509_pubkey_digest(const X509 *data, const EVP_MD *type, unsigned char *md, unsigned int *len); int X509_digest(const X509 *data, const EVP_MD *type, unsigned char *md, unsigned int *len); int X509_CRL_digest(const X509_CRL *data, const EVP_MD *type, unsigned char *md, unsigned int *len); int X509_REQ_digest(const X509_REQ *data, const EVP_MD *type, unsigned char *md, unsigned int *len); int X509_NAME_digest(const X509_NAME *data, const EVP_MD *type, unsigned char *md, unsigned int *len); X509 *d2i_X509_fp(FILE *fp, X509 **x509); int i2d_X509_fp(FILE *fp, X509 *x509); X509_CRL *d2i_X509_CRL_fp(FILE *fp, X509_CRL **crl); int i2d_X509_CRL_fp(FILE *fp, X509_CRL *crl); X509_REQ *d2i_X509_REQ_fp(FILE *fp, X509_REQ **req); int i2d_X509_REQ_fp(FILE *fp, X509_REQ *req); RSA *d2i_RSAPrivateKey_fp(FILE *fp, RSA **rsa); int i2d_RSAPrivateKey_fp(FILE *fp, RSA *rsa); RSA *d2i_RSAPublicKey_fp(FILE *fp, RSA 
**rsa); int i2d_RSAPublicKey_fp(FILE *fp, RSA *rsa); RSA *d2i_RSA_PUBKEY_fp(FILE *fp, RSA **rsa); int i2d_RSA_PUBKEY_fp(FILE *fp, RSA *rsa); DSA *d2i_DSA_PUBKEY_fp(FILE *fp, DSA **dsa); int i2d_DSA_PUBKEY_fp(FILE *fp, DSA *dsa); DSA *d2i_DSAPrivateKey_fp(FILE *fp, DSA **dsa); int i2d_DSAPrivateKey_fp(FILE *fp, DSA *dsa); EC_KEY *d2i_EC_PUBKEY_fp(FILE *fp, EC_KEY **eckey); int i2d_EC_PUBKEY_fp(FILE *fp, EC_KEY *eckey); EC_KEY *d2i_ECPrivateKey_fp(FILE *fp, EC_KEY **eckey); int i2d_ECPrivateKey_fp(FILE *fp, EC_KEY *eckey); X509_SIG *d2i_PKCS8_fp(FILE *fp, X509_SIG **p8); int i2d_PKCS8_fp(FILE *fp, X509_SIG *p8); PKCS8_PRIV_KEY_INFO *d2i_PKCS8_PRIV_KEY_INFO_fp(FILE *fp, PKCS8_PRIV_KEY_INFO **p8inf); int i2d_PKCS8_PRIV_KEY_INFO_fp(FILE *fp, PKCS8_PRIV_KEY_INFO *p8inf); int i2d_PKCS8PrivateKeyInfo_fp(FILE *fp, EVP_PKEY *key); int i2d_PrivateKey_fp(FILE *fp, EVP_PKEY *pkey); EVP_PKEY *d2i_PrivateKey_fp(FILE *fp, EVP_PKEY **a); int i2d_PUBKEY_fp(FILE *fp, EVP_PKEY *pkey); EVP_PKEY *d2i_PUBKEY_fp(FILE *fp, EVP_PKEY **a); X509 *d2i_X509_bio(BIO *bp, X509 **x509); int i2d_X509_bio(BIO *bp, X509 *x509); X509_CRL *d2i_X509_CRL_bio(BIO *bp, X509_CRL **crl); int i2d_X509_CRL_bio(BIO *bp, X509_CRL *crl); X509_REQ *d2i_X509_REQ_bio(BIO *bp, X509_REQ **req); int i2d_X509_REQ_bio(BIO *bp, X509_REQ *req); RSA *d2i_RSAPrivateKey_bio(BIO *bp, RSA **rsa); int i2d_RSAPrivateKey_bio(BIO *bp, RSA *rsa); RSA *d2i_RSAPublicKey_bio(BIO *bp, RSA **rsa); int i2d_RSAPublicKey_bio(BIO *bp, RSA *rsa); RSA *d2i_RSA_PUBKEY_bio(BIO *bp, RSA **rsa); int i2d_RSA_PUBKEY_bio(BIO *bp, RSA *rsa); DSA *d2i_DSA_PUBKEY_bio(BIO *bp, DSA **dsa); int i2d_DSA_PUBKEY_bio(BIO *bp, DSA *dsa); DSA *d2i_DSAPrivateKey_bio(BIO *bp, DSA **dsa); int i2d_DSAPrivateKey_bio(BIO *bp, DSA *dsa); EC_KEY *d2i_EC_PUBKEY_bio(BIO *bp, EC_KEY **eckey); int i2d_EC_PUBKEY_bio(BIO *bp, EC_KEY *eckey); EC_KEY *d2i_ECPrivateKey_bio(BIO *bp, EC_KEY **eckey); int i2d_ECPrivateKey_bio(BIO *bp, EC_KEY *eckey); X509_SIG *d2i_PKCS8_bio(BIO 
*bp, X509_SIG **p8); int i2d_PKCS8_bio(BIO *bp, X509_SIG *p8); PKCS8_PRIV_KEY_INFO *d2i_PKCS8_PRIV_KEY_INFO_bio(BIO *bp, PKCS8_PRIV_KEY_INFO **p8inf); int i2d_PKCS8_PRIV_KEY_INFO_bio(BIO *bp, PKCS8_PRIV_KEY_INFO *p8inf); int i2d_PKCS8PrivateKeyInfo_bio(BIO *bp, EVP_PKEY *key); int i2d_PrivateKey_bio(BIO *bp, EVP_PKEY *pkey); EVP_PKEY *d2i_PrivateKey_bio(BIO *bp, EVP_PKEY **a); int i2d_PUBKEY_bio(BIO *bp, EVP_PKEY *pkey); EVP_PKEY *d2i_PUBKEY_bio(BIO *bp, EVP_PKEY **a); X509 *X509_dup(X509 *x509); X509_ATTRIBUTE *X509_ATTRIBUTE_dup(X509_ATTRIBUTE *xa); X509_EXTENSION *X509_EXTENSION_dup(X509_EXTENSION *ex); X509_CRL *X509_CRL_dup(X509_CRL *crl); X509_REVOKED *X509_REVOKED_dup(X509_REVOKED *rev); X509_REQ *X509_REQ_dup(X509_REQ *req); X509_ALGOR *X509_ALGOR_dup(X509_ALGOR *xn); int X509_ALGOR_set0(X509_ALGOR *alg, ASN1_OBJECT *aobj, int ptype, void *pval); void X509_ALGOR_get0(ASN1_OBJECT **paobj, int *pptype, void **ppval, X509_ALGOR *algor); void X509_ALGOR_set_md(X509_ALGOR *alg, const EVP_MD *md); int X509_ALGOR_cmp(const X509_ALGOR *a, const X509_ALGOR *b); X509_NAME *X509_NAME_dup(X509_NAME *xn); X509_NAME_ENTRY *X509_NAME_ENTRY_dup(X509_NAME_ENTRY *ne); int X509_cmp_time(const ASN1_TIME *s, time_t *t); int X509_cmp_current_time(const ASN1_TIME *s); ASN1_TIME *X509_time_adj(ASN1_TIME *s, long adj, time_t *t); ASN1_TIME *X509_time_adj_ex(ASN1_TIME *s, int offset_day, long offset_sec, time_t *t); ASN1_TIME *X509_gmtime_adj(ASN1_TIME *s, long adj); const char *X509_get_default_cert_area(void); const char *X509_get_default_cert_dir(void); const char *X509_get_default_cert_file(void); const char *X509_get_default_cert_dir_env(void); const char *X509_get_default_cert_file_env(void); const char *X509_get_default_private_dir(void); X509_REQ *X509_to_X509_REQ(X509 *x, EVP_PKEY *pkey, const EVP_MD *md); X509 *X509_REQ_to_X509(X509_REQ *r, int days, EVP_PKEY *pkey); X509_ALGOR *X509_ALGOR_new(void); void X509_ALGOR_free(X509_ALGOR *a); X509_ALGOR 
*d2i_X509_ALGOR(X509_ALGOR **a, const unsigned char **in, long len); int i2d_X509_ALGOR(X509_ALGOR *a, unsigned char **out); extern const ASN1_ITEM X509_ALGOR_it; X509_ALGORS *d2i_X509_ALGORS(X509_ALGORS **a, const unsigned char **in, long len); int i2d_X509_ALGORS(X509_ALGORS *a, unsigned char **out); extern const ASN1_ITEM X509_ALGORS_it; X509_VAL *X509_VAL_new(void); void X509_VAL_free(X509_VAL *a); X509_VAL *d2i_X509_VAL(X509_VAL **a, const unsigned char **in, long len); int i2d_X509_VAL(X509_VAL *a, unsigned char **out); extern const ASN1_ITEM X509_VAL_it; X509_PUBKEY *X509_PUBKEY_new(void); void X509_PUBKEY_free(X509_PUBKEY *a); X509_PUBKEY *d2i_X509_PUBKEY(X509_PUBKEY **a, const unsigned char **in, long len); int i2d_X509_PUBKEY(X509_PUBKEY *a, unsigned char **out); extern const ASN1_ITEM X509_PUBKEY_it; int X509_PUBKEY_set(X509_PUBKEY **x, EVP_PKEY *pkey); EVP_PKEY *X509_PUBKEY_get(X509_PUBKEY *key); int X509_get_pubkey_parameters(EVP_PKEY *pkey, struct stack_st_X509 *chain); int i2d_PUBKEY(EVP_PKEY *a, unsigned char **pp); EVP_PKEY *d2i_PUBKEY(EVP_PKEY **a, const unsigned char **pp, long length); int i2d_RSA_PUBKEY(RSA *a, unsigned char **pp); RSA *d2i_RSA_PUBKEY(RSA **a, const unsigned char **pp, long length); int i2d_DSA_PUBKEY(DSA *a, unsigned char **pp); DSA *d2i_DSA_PUBKEY(DSA **a, const unsigned char **pp, long length); int i2d_EC_PUBKEY(EC_KEY *a, unsigned char **pp); EC_KEY *d2i_EC_PUBKEY(EC_KEY **a, const unsigned char **pp, long length); X509_SIG *X509_SIG_new(void); void X509_SIG_free(X509_SIG *a); X509_SIG *d2i_X509_SIG(X509_SIG **a, const unsigned char **in, long len); int i2d_X509_SIG(X509_SIG *a, unsigned char **out); extern const ASN1_ITEM X509_SIG_it; X509_REQ_INFO *X509_REQ_INFO_new(void); void X509_REQ_INFO_free(X509_REQ_INFO *a); X509_REQ_INFO *d2i_X509_REQ_INFO(X509_REQ_INFO **a, const unsigned char **in, long len); int i2d_X509_REQ_INFO(X509_REQ_INFO *a, unsigned char **out); extern const ASN1_ITEM X509_REQ_INFO_it; X509_REQ 
*X509_REQ_new(void); void X509_REQ_free(X509_REQ *a); X509_REQ *d2i_X509_REQ(X509_REQ **a, const unsigned char **in, long len); int i2d_X509_REQ(X509_REQ *a, unsigned char **out); extern const ASN1_ITEM X509_REQ_it; X509_ATTRIBUTE *X509_ATTRIBUTE_new(void); void X509_ATTRIBUTE_free(X509_ATTRIBUTE *a); X509_ATTRIBUTE *d2i_X509_ATTRIBUTE(X509_ATTRIBUTE **a, const unsigned char **in, long len); int i2d_X509_ATTRIBUTE(X509_ATTRIBUTE *a, unsigned char **out); extern const ASN1_ITEM X509_ATTRIBUTE_it; X509_ATTRIBUTE *X509_ATTRIBUTE_create(int nid, int atrtype, void *value); X509_EXTENSION *X509_EXTENSION_new(void); void X509_EXTENSION_free(X509_EXTENSION *a); X509_EXTENSION *d2i_X509_EXTENSION(X509_EXTENSION **a, const unsigned char **in, long len); int i2d_X509_EXTENSION(X509_EXTENSION *a, unsigned char **out); extern const ASN1_ITEM X509_EXTENSION_it; X509_EXTENSIONS *d2i_X509_EXTENSIONS(X509_EXTENSIONS **a, const unsigned char **in, long len); int i2d_X509_EXTENSIONS(X509_EXTENSIONS *a, unsigned char **out); extern const ASN1_ITEM X509_EXTENSIONS_it; X509_NAME_ENTRY *X509_NAME_ENTRY_new(void); void X509_NAME_ENTRY_free(X509_NAME_ENTRY *a); X509_NAME_ENTRY *d2i_X509_NAME_ENTRY(X509_NAME_ENTRY **a, const unsigned char **in, long len); int i2d_X509_NAME_ENTRY(X509_NAME_ENTRY *a, unsigned char **out); extern const ASN1_ITEM X509_NAME_ENTRY_it; X509_NAME *X509_NAME_new(void); void X509_NAME_free(X509_NAME *a); X509_NAME *d2i_X509_NAME(X509_NAME **a, const unsigned char **in, long len); int i2d_X509_NAME(X509_NAME *a, unsigned char **out); extern const ASN1_ITEM X509_NAME_it; int X509_NAME_set(X509_NAME **xn, X509_NAME *name); X509_CINF *X509_CINF_new(void); void X509_CINF_free(X509_CINF *a); X509_CINF *d2i_X509_CINF(X509_CINF **a, const unsigned char **in, long len); int i2d_X509_CINF(X509_CINF *a, unsigned char **out); extern const ASN1_ITEM X509_CINF_it; X509 *X509_new(void); void X509_free(X509 *a); X509 *d2i_X509(X509 **a, const unsigned char **in, long len); int 
i2d_X509(X509 *a, unsigned char **out); extern const ASN1_ITEM X509_it; X509_CERT_AUX *X509_CERT_AUX_new(void); void X509_CERT_AUX_free(X509_CERT_AUX *a); X509_CERT_AUX *d2i_X509_CERT_AUX(X509_CERT_AUX **a, const unsigned char **in, long len); int i2d_X509_CERT_AUX(X509_CERT_AUX *a, unsigned char **out); extern const ASN1_ITEM X509_CERT_AUX_it; X509_CERT_PAIR *X509_CERT_PAIR_new(void); void X509_CERT_PAIR_free(X509_CERT_PAIR *a); X509_CERT_PAIR *d2i_X509_CERT_PAIR(X509_CERT_PAIR **a, const unsigned char **in, long len); int i2d_X509_CERT_PAIR(X509_CERT_PAIR *a, unsigned char **out); extern const ASN1_ITEM X509_CERT_PAIR_it; int X509_get_ex_new_index(long argl, void *argp, CRYPTO_EX_new *new_func, CRYPTO_EX_dup *dup_func, CRYPTO_EX_free *free_func); int X509_set_ex_data(X509 *r, int idx, void *arg); void *X509_get_ex_data(X509 *r, int idx); int i2d_X509_AUX(X509 *a, unsigned char **pp); X509 *d2i_X509_AUX(X509 **a, const unsigned char **pp, long length); int i2d_re_X509_tbs(X509 *x, unsigned char **pp); void X509_get0_signature(ASN1_BIT_STRING **psig, X509_ALGOR **palg, const X509 *x); int X509_get_signature_nid(const X509 *x); int X509_alias_set1(X509 *x, unsigned char *name, int len); int X509_keyid_set1(X509 *x, unsigned char *id, int len); unsigned char *X509_alias_get0(X509 *x, int *len); unsigned char *X509_keyid_get0(X509 *x, int *len); int (*X509_TRUST_set_default(int (*trust) (int, X509 *, int))) (int, X509 *, int); int X509_TRUST_set(int *t, int trust); int X509_add1_trust_object(X509 *x, ASN1_OBJECT *obj); int X509_add1_reject_object(X509 *x, ASN1_OBJECT *obj); void X509_trust_clear(X509 *x); void X509_reject_clear(X509 *x); X509_REVOKED *X509_REVOKED_new(void); void X509_REVOKED_free(X509_REVOKED *a); X509_REVOKED *d2i_X509_REVOKED(X509_REVOKED **a, const unsigned char **in, long len); int i2d_X509_REVOKED(X509_REVOKED *a, unsigned char **out); extern const ASN1_ITEM X509_REVOKED_it; X509_CRL_INFO *X509_CRL_INFO_new(void); void 
X509_CRL_INFO_free(X509_CRL_INFO *a); X509_CRL_INFO *d2i_X509_CRL_INFO(X509_CRL_INFO **a, const unsigned char **in, long len); int i2d_X509_CRL_INFO(X509_CRL_INFO *a, unsigned char **out); extern const ASN1_ITEM X509_CRL_INFO_it; X509_CRL *X509_CRL_new(void); void X509_CRL_free(X509_CRL *a); X509_CRL *d2i_X509_CRL(X509_CRL **a, const unsigned char **in, long len); int i2d_X509_CRL(X509_CRL *a, unsigned char **out); extern const ASN1_ITEM X509_CRL_it; int X509_CRL_add0_revoked(X509_CRL *crl, X509_REVOKED *rev); int X509_CRL_get0_by_serial(X509_CRL *crl, X509_REVOKED **ret, ASN1_INTEGER *serial); int X509_CRL_get0_by_cert(X509_CRL *crl, X509_REVOKED **ret, X509 *x); X509_PKEY *X509_PKEY_new(void); void X509_PKEY_free(X509_PKEY *a); int i2d_X509_PKEY(X509_PKEY *a, unsigned char **pp); X509_PKEY *d2i_X509_PKEY(X509_PKEY **a, const unsigned char **pp, long length); NETSCAPE_SPKI *NETSCAPE_SPKI_new(void); void NETSCAPE_SPKI_free(NETSCAPE_SPKI *a); NETSCAPE_SPKI *d2i_NETSCAPE_SPKI(NETSCAPE_SPKI **a, const unsigned char **in, long len); int i2d_NETSCAPE_SPKI(NETSCAPE_SPKI *a, unsigned char **out); extern const ASN1_ITEM NETSCAPE_SPKI_it; NETSCAPE_SPKAC *NETSCAPE_SPKAC_new(void); void NETSCAPE_SPKAC_free(NETSCAPE_SPKAC *a); NETSCAPE_SPKAC *d2i_NETSCAPE_SPKAC(NETSCAPE_SPKAC **a, const unsigned char **in, long len); int i2d_NETSCAPE_SPKAC(NETSCAPE_SPKAC *a, unsigned char **out); extern const ASN1_ITEM NETSCAPE_SPKAC_it; NETSCAPE_CERT_SEQUENCE *NETSCAPE_CERT_SEQUENCE_new(void); void NETSCAPE_CERT_SEQUENCE_free(NETSCAPE_CERT_SEQUENCE *a); NETSCAPE_CERT_SEQUENCE *d2i_NETSCAPE_CERT_SEQUENCE(NETSCAPE_CERT_SEQUENCE **a, const unsigned char **in, long len); int i2d_NETSCAPE_CERT_SEQUENCE(NETSCAPE_CERT_SEQUENCE *a, unsigned char **out); extern const ASN1_ITEM NETSCAPE_CERT_SEQUENCE_it; X509_INFO *X509_INFO_new(void); void X509_INFO_free(X509_INFO *a); char *X509_NAME_oneline(X509_NAME *a, char *buf, int size); int ASN1_verify(i2d_of_void *i2d, X509_ALGOR *algor1, ASN1_BIT_STRING 
*signature, char *data, EVP_PKEY *pkey); int ASN1_digest(i2d_of_void *i2d, const EVP_MD *type, char *data, unsigned char *md, unsigned int *len); int ASN1_sign(i2d_of_void *i2d, X509_ALGOR *algor1, X509_ALGOR *algor2, ASN1_BIT_STRING *signature, char *data, EVP_PKEY *pkey, const EVP_MD *type); int ASN1_item_digest(const ASN1_ITEM *it, const EVP_MD *type, void *data, unsigned char *md, unsigned int *len); int ASN1_item_verify(const ASN1_ITEM *it, X509_ALGOR *algor1, ASN1_BIT_STRING *signature, void *data, EVP_PKEY *pkey); int ASN1_item_sign(const ASN1_ITEM *it, X509_ALGOR *algor1, X509_ALGOR *algor2, ASN1_BIT_STRING *signature, void *data, EVP_PKEY *pkey, const EVP_MD *type); int ASN1_item_sign_ctx(const ASN1_ITEM *it, X509_ALGOR *algor1, X509_ALGOR *algor2, ASN1_BIT_STRING *signature, void *asn, EVP_MD_CTX *ctx); int X509_set_version(X509 *x, long version); int X509_set_serialNumber(X509 *x, ASN1_INTEGER *serial); ASN1_INTEGER *X509_get_serialNumber(X509 *x); int X509_set_issuer_name(X509 *x, X509_NAME *name); X509_NAME *X509_get_issuer_name(X509 *a); int X509_set_subject_name(X509 *x, X509_NAME *name); X509_NAME *X509_get_subject_name(X509 *a); int X509_set_notBefore(X509 *x, const ASN1_TIME *tm); int X509_set_notAfter(X509 *x, const ASN1_TIME *tm); int X509_set_pubkey(X509 *x, EVP_PKEY *pkey); EVP_PKEY *X509_get_pubkey(X509 *x); ASN1_BIT_STRING *X509_get0_pubkey_bitstr(const X509 *x); int X509_certificate_type(X509 *x, EVP_PKEY *pubkey ); int X509_REQ_set_version(X509_REQ *x, long version); int X509_REQ_set_subject_name(X509_REQ *req, X509_NAME *name); int X509_REQ_set_pubkey(X509_REQ *x, EVP_PKEY *pkey); EVP_PKEY *X509_REQ_get_pubkey(X509_REQ *req); int X509_REQ_extension_nid(int nid); int *X509_REQ_get_extension_nids(void); void X509_REQ_set_extension_nids(int *nids); struct stack_st_X509_EXTENSION *X509_REQ_get_extensions(X509_REQ *req); int X509_REQ_add_extensions_nid(X509_REQ *req, struct stack_st_X509_EXTENSION *exts, int nid); int 
X509_REQ_add_extensions(X509_REQ *req, struct stack_st_X509_EXTENSION *exts); int X509_REQ_get_attr_count(const X509_REQ *req); int X509_REQ_get_attr_by_NID(const X509_REQ *req, int nid, int lastpos); int X509_REQ_get_attr_by_OBJ(const X509_REQ *req, ASN1_OBJECT *obj, int lastpos); X509_ATTRIBUTE *X509_REQ_get_attr(const X509_REQ *req, int loc); X509_ATTRIBUTE *X509_REQ_delete_attr(X509_REQ *req, int loc); int X509_REQ_add1_attr(X509_REQ *req, X509_ATTRIBUTE *attr); int X509_REQ_add1_attr_by_OBJ(X509_REQ *req, const ASN1_OBJECT *obj, int type, const unsigned char *bytes, int len); int X509_REQ_add1_attr_by_NID(X509_REQ *req, int nid, int type, const unsigned char *bytes, int len); int X509_REQ_add1_attr_by_txt(X509_REQ *req, const char *attrname, int type, const unsigned char *bytes, int len); int X509_CRL_set_version(X509_CRL *x, long version); int X509_CRL_set_issuer_name(X509_CRL *x, X509_NAME *name); int X509_CRL_set_lastUpdate(X509_CRL *x, const ASN1_TIME *tm); int X509_CRL_set_nextUpdate(X509_CRL *x, const ASN1_TIME *tm); int X509_CRL_sort(X509_CRL *crl); int X509_REVOKED_set_serialNumber(X509_REVOKED *x, ASN1_INTEGER *serial); int X509_REVOKED_set_revocationDate(X509_REVOKED *r, ASN1_TIME *tm); X509_CRL *X509_CRL_diff(X509_CRL *base, X509_CRL *newer, EVP_PKEY *skey, const EVP_MD *md, unsigned int flags); int X509_REQ_check_private_key(X509_REQ *x509, EVP_PKEY *pkey); int X509_check_private_key(X509 *x509, EVP_PKEY *pkey); int X509_chain_check_suiteb(int *perror_depth, X509 *x, struct stack_st_X509 *chain, unsigned long flags); int X509_CRL_check_suiteb(X509_CRL *crl, EVP_PKEY *pk, unsigned long flags); struct stack_st_X509 *X509_chain_up_ref(struct stack_st_X509 *chain); int X509_issuer_and_serial_cmp(const X509 *a, const X509 *b); unsigned long X509_issuer_and_serial_hash(X509 *a); int X509_issuer_name_cmp(const X509 *a, const X509 *b); unsigned long X509_issuer_name_hash(X509 *a); int X509_subject_name_cmp(const X509 *a, const X509 *b); unsigned long 
X509_subject_name_hash(X509 *x); unsigned long X509_issuer_name_hash_old(X509 *a); unsigned long X509_subject_name_hash_old(X509 *x); int X509_cmp(const X509 *a, const X509 *b); int X509_NAME_cmp(const X509_NAME *a, const X509_NAME *b); unsigned long X509_NAME_hash(X509_NAME *x); unsigned long X509_NAME_hash_old(X509_NAME *x); int X509_CRL_cmp(const X509_CRL *a, const X509_CRL *b); int X509_CRL_match(const X509_CRL *a, const X509_CRL *b); int X509_print_ex_fp(FILE *bp, X509 *x, unsigned long nmflag, unsigned long cflag); int X509_print_fp(FILE *bp, X509 *x); int X509_CRL_print_fp(FILE *bp, X509_CRL *x); int X509_REQ_print_fp(FILE *bp, X509_REQ *req); int X509_NAME_print_ex_fp(FILE *fp, X509_NAME *nm, int indent, unsigned long flags); int X509_NAME_print(BIO *bp, X509_NAME *name, int obase); int X509_NAME_print_ex(BIO *out, X509_NAME *nm, int indent, unsigned long flags); int X509_print_ex(BIO *bp, X509 *x, unsigned long nmflag, unsigned long cflag); int X509_print(BIO *bp, X509 *x); int X509_ocspid_print(BIO *bp, X509 *x); int X509_CERT_AUX_print(BIO *bp, X509_CERT_AUX *x, int indent); int X509_CRL_print(BIO *bp, X509_CRL *x); int X509_REQ_print_ex(BIO *bp, X509_REQ *x, unsigned long nmflag, unsigned long cflag); int X509_REQ_print(BIO *bp, X509_REQ *req); int X509_NAME_entry_count(X509_NAME *name); int X509_NAME_get_text_by_NID(X509_NAME *name, int nid, char *buf, int len); int X509_NAME_get_text_by_OBJ(X509_NAME *name, ASN1_OBJECT *obj, char *buf, int len); int X509_NAME_get_index_by_NID(X509_NAME *name, int nid, int lastpos); int X509_NAME_get_index_by_OBJ(X509_NAME *name, ASN1_OBJECT *obj, int lastpos); X509_NAME_ENTRY *X509_NAME_get_entry(X509_NAME *name, int loc); X509_NAME_ENTRY *X509_NAME_delete_entry(X509_NAME *name, int loc); int X509_NAME_add_entry(X509_NAME *name, X509_NAME_ENTRY *ne, int loc, int set); int X509_NAME_add_entry_by_OBJ(X509_NAME *name, ASN1_OBJECT *obj, int type, unsigned char *bytes, int len, int loc, int set); int 
X509_NAME_add_entry_by_NID(X509_NAME *name, int nid, int type, unsigned char *bytes, int len, int loc, int set); X509_NAME_ENTRY *X509_NAME_ENTRY_create_by_txt(X509_NAME_ENTRY **ne, const char *field, int type, const unsigned char *bytes, int len); X509_NAME_ENTRY *X509_NAME_ENTRY_create_by_NID(X509_NAME_ENTRY **ne, int nid, int type, unsigned char *bytes, int len); int X509_NAME_add_entry_by_txt(X509_NAME *name, const char *field, int type, const unsigned char *bytes, int len, int loc, int set); X509_NAME_ENTRY *X509_NAME_ENTRY_create_by_OBJ(X509_NAME_ENTRY **ne, ASN1_OBJECT *obj, int type, const unsigned char *bytes, int len); int X509_NAME_ENTRY_set_object(X509_NAME_ENTRY *ne, ASN1_OBJECT *obj); int X509_NAME_ENTRY_set_data(X509_NAME_ENTRY *ne, int type, const unsigned char *bytes, int len); ASN1_OBJECT *X509_NAME_ENTRY_get_object(X509_NAME_ENTRY *ne); ASN1_STRING *X509_NAME_ENTRY_get_data(X509_NAME_ENTRY *ne); int X509v3_get_ext_count(const struct stack_st_X509_EXTENSION *x); int X509v3_get_ext_by_NID(const struct stack_st_X509_EXTENSION *x, int nid, int lastpos); int X509v3_get_ext_by_OBJ(const struct stack_st_X509_EXTENSION *x, ASN1_OBJECT *obj, int lastpos); int X509v3_get_ext_by_critical(const struct stack_st_X509_EXTENSION *x, int crit, int lastpos); X509_EXTENSION *X509v3_get_ext(const struct stack_st_X509_EXTENSION *x, int loc); X509_EXTENSION *X509v3_delete_ext(struct stack_st_X509_EXTENSION *x, int loc); struct stack_st_X509_EXTENSION *X509v3_add_ext(struct stack_st_X509_EXTENSION **x, X509_EXTENSION *ex, int loc); int X509_get_ext_count(X509 *x); int X509_get_ext_by_NID(X509 *x, int nid, int lastpos); int X509_get_ext_by_OBJ(X509 *x, ASN1_OBJECT *obj, int lastpos); int X509_get_ext_by_critical(X509 *x, int crit, int lastpos); X509_EXTENSION *X509_get_ext(X509 *x, int loc); X509_EXTENSION *X509_delete_ext(X509 *x, int loc); int X509_add_ext(X509 *x, X509_EXTENSION *ex, int loc); void *X509_get_ext_d2i(X509 *x, int nid, int *crit, int *idx); int 
X509_add1_ext_i2d(X509 *x, int nid, void *value, int crit, unsigned long flags); int X509_CRL_get_ext_count(X509_CRL *x); int X509_CRL_get_ext_by_NID(X509_CRL *x, int nid, int lastpos); int X509_CRL_get_ext_by_OBJ(X509_CRL *x, ASN1_OBJECT *obj, int lastpos); int X509_CRL_get_ext_by_critical(X509_CRL *x, int crit, int lastpos); X509_EXTENSION *X509_CRL_get_ext(X509_CRL *x, int loc); X509_EXTENSION *X509_CRL_delete_ext(X509_CRL *x, int loc); int X509_CRL_add_ext(X509_CRL *x, X509_EXTENSION *ex, int loc); void *X509_CRL_get_ext_d2i(X509_CRL *x, int nid, int *crit, int *idx); int X509_CRL_add1_ext_i2d(X509_CRL *x, int nid, void *value, int crit, unsigned long flags); int X509_REVOKED_get_ext_count(X509_REVOKED *x); int X509_REVOKED_get_ext_by_NID(X509_REVOKED *x, int nid, int lastpos); int X509_REVOKED_get_ext_by_OBJ(X509_REVOKED *x, ASN1_OBJECT *obj, int lastpos); int X509_REVOKED_get_ext_by_critical(X509_REVOKED *x, int crit, int lastpos); X509_EXTENSION *X509_REVOKED_get_ext(X509_REVOKED *x, int loc); X509_EXTENSION *X509_REVOKED_delete_ext(X509_REVOKED *x, int loc); int X509_REVOKED_add_ext(X509_REVOKED *x, X509_EXTENSION *ex, int loc); void *X509_REVOKED_get_ext_d2i(X509_REVOKED *x, int nid, int *crit, int *idx); int X509_REVOKED_add1_ext_i2d(X509_REVOKED *x, int nid, void *value, int crit, unsigned long flags); X509_EXTENSION *X509_EXTENSION_create_by_NID(X509_EXTENSION **ex, int nid, int crit, ASN1_OCTET_STRING *data); X509_EXTENSION *X509_EXTENSION_create_by_OBJ(X509_EXTENSION **ex, ASN1_OBJECT *obj, int crit, ASN1_OCTET_STRING *data); int X509_EXTENSION_set_object(X509_EXTENSION *ex, ASN1_OBJECT *obj); int X509_EXTENSION_set_critical(X509_EXTENSION *ex, int crit); int X509_EXTENSION_set_data(X509_EXTENSION *ex, ASN1_OCTET_STRING *data); ASN1_OBJECT *X509_EXTENSION_get_object(X509_EXTENSION *ex); ASN1_OCTET_STRING *X509_EXTENSION_get_data(X509_EXTENSION *ne); int X509_EXTENSION_get_critical(X509_EXTENSION *ex); int X509at_get_attr_count(const struct 
stack_st_X509_ATTRIBUTE *x); int X509at_get_attr_by_NID(const struct stack_st_X509_ATTRIBUTE *x, int nid, int lastpos); int X509at_get_attr_by_OBJ(const struct stack_st_X509_ATTRIBUTE *sk, ASN1_OBJECT *obj, int lastpos); X509_ATTRIBUTE *X509at_get_attr(const struct stack_st_X509_ATTRIBUTE *x, int loc); X509_ATTRIBUTE *X509at_delete_attr(struct stack_st_X509_ATTRIBUTE *x, int loc); struct stack_st_X509_ATTRIBUTE *X509at_add1_attr(struct stack_st_X509_ATTRIBUTE **x, X509_ATTRIBUTE *attr); struct stack_st_X509_ATTRIBUTE *X509at_add1_attr_by_OBJ(struct stack_st_X509_ATTRIBUTE **x, const ASN1_OBJECT *obj, int type, const unsigned char *bytes, int len); struct stack_st_X509_ATTRIBUTE *X509at_add1_attr_by_NID(struct stack_st_X509_ATTRIBUTE **x, int nid, int type, const unsigned char *bytes, int len); struct stack_st_X509_ATTRIBUTE *X509at_add1_attr_by_txt(struct stack_st_X509_ATTRIBUTE **x, const char *attrname, int type, const unsigned char *bytes, int len); void *X509at_get0_data_by_OBJ(struct stack_st_X509_ATTRIBUTE *x, ASN1_OBJECT *obj, int lastpos, int type); X509_ATTRIBUTE *X509_ATTRIBUTE_create_by_NID(X509_ATTRIBUTE **attr, int nid, int atrtype, const void *data, int len); X509_ATTRIBUTE *X509_ATTRIBUTE_create_by_OBJ(X509_ATTRIBUTE **attr, const ASN1_OBJECT *obj, int atrtype, const void *data, int len); X509_ATTRIBUTE *X509_ATTRIBUTE_create_by_txt(X509_ATTRIBUTE **attr, const char *atrname, int type, const unsigned char *bytes, int len); int X509_ATTRIBUTE_set1_object(X509_ATTRIBUTE *attr, const ASN1_OBJECT *obj); int X509_ATTRIBUTE_set1_data(X509_ATTRIBUTE *attr, int attrtype, const void *data, int len); void *X509_ATTRIBUTE_get0_data(X509_ATTRIBUTE *attr, int idx, int atrtype, void *data); int X509_ATTRIBUTE_count(X509_ATTRIBUTE *attr); ASN1_OBJECT *X509_ATTRIBUTE_get0_object(X509_ATTRIBUTE *attr); ASN1_TYPE *X509_ATTRIBUTE_get0_type(X509_ATTRIBUTE *attr, int idx); int EVP_PKEY_get_attr_count(const EVP_PKEY *key); int EVP_PKEY_get_attr_by_NID(const EVP_PKEY *key, 
int nid, int lastpos); int EVP_PKEY_get_attr_by_OBJ(const EVP_PKEY *key, ASN1_OBJECT *obj, int lastpos); X509_ATTRIBUTE *EVP_PKEY_get_attr(const EVP_PKEY *key, int loc); X509_ATTRIBUTE *EVP_PKEY_delete_attr(EVP_PKEY *key, int loc); int EVP_PKEY_add1_attr(EVP_PKEY *key, X509_ATTRIBUTE *attr); int EVP_PKEY_add1_attr_by_OBJ(EVP_PKEY *key, const ASN1_OBJECT *obj, int type, const unsigned char *bytes, int len); int EVP_PKEY_add1_attr_by_NID(EVP_PKEY *key, int nid, int type, const unsigned char *bytes, int len); int EVP_PKEY_add1_attr_by_txt(EVP_PKEY *key, const char *attrname, int type, const unsigned char *bytes, int len); int X509_verify_cert(X509_STORE_CTX *ctx); X509 *X509_find_by_issuer_and_serial(struct stack_st_X509 *sk, X509_NAME *name, ASN1_INTEGER *serial); X509 *X509_find_by_subject(struct stack_st_X509 *sk, X509_NAME *name); PBEPARAM *PBEPARAM_new(void); void PBEPARAM_free(PBEPARAM *a); PBEPARAM *d2i_PBEPARAM(PBEPARAM **a, const unsigned char **in, long len); int i2d_PBEPARAM(PBEPARAM *a, unsigned char **out); extern const ASN1_ITEM PBEPARAM_it; PBE2PARAM *PBE2PARAM_new(void); void PBE2PARAM_free(PBE2PARAM *a); PBE2PARAM *d2i_PBE2PARAM(PBE2PARAM **a, const unsigned char **in, long len); int i2d_PBE2PARAM(PBE2PARAM *a, unsigned char **out); extern const ASN1_ITEM PBE2PARAM_it; PBKDF2PARAM *PBKDF2PARAM_new(void); void PBKDF2PARAM_free(PBKDF2PARAM *a); PBKDF2PARAM *d2i_PBKDF2PARAM(PBKDF2PARAM **a, const unsigned char **in, long len); int i2d_PBKDF2PARAM(PBKDF2PARAM *a, unsigned char **out); extern const ASN1_ITEM PBKDF2PARAM_it; int PKCS5_pbe_set0_algor(X509_ALGOR *algor, int alg, int iter, const unsigned char *salt, int saltlen); X509_ALGOR *PKCS5_pbe_set(int alg, int iter, const unsigned char *salt, int saltlen); X509_ALGOR *PKCS5_pbe2_set(const EVP_CIPHER *cipher, int iter, unsigned char *salt, int saltlen); X509_ALGOR *PKCS5_pbe2_set_iv(const EVP_CIPHER *cipher, int iter, unsigned char *salt, int saltlen, unsigned char *aiv, int prf_nid); X509_ALGOR 
*PKCS5_pbkdf2_set(int iter, unsigned char *salt, int saltlen, int prf_nid, int keylen); PKCS8_PRIV_KEY_INFO *PKCS8_PRIV_KEY_INFO_new(void); void PKCS8_PRIV_KEY_INFO_free(PKCS8_PRIV_KEY_INFO *a); PKCS8_PRIV_KEY_INFO *d2i_PKCS8_PRIV_KEY_INFO(PKCS8_PRIV_KEY_INFO **a, const unsigned char **in, long len); int i2d_PKCS8_PRIV_KEY_INFO(PKCS8_PRIV_KEY_INFO *a, unsigned char **out); extern const ASN1_ITEM PKCS8_PRIV_KEY_INFO_it; EVP_PKEY *EVP_PKCS82PKEY(PKCS8_PRIV_KEY_INFO *p8); PKCS8_PRIV_KEY_INFO *EVP_PKEY2PKCS8(EVP_PKEY *pkey); PKCS8_PRIV_KEY_INFO *EVP_PKEY2PKCS8_broken(EVP_PKEY *pkey, int broken); PKCS8_PRIV_KEY_INFO *PKCS8_set_broken(PKCS8_PRIV_KEY_INFO *p8, int broken); int PKCS8_pkey_set0(PKCS8_PRIV_KEY_INFO *priv, ASN1_OBJECT *aobj, int version, int ptype, void *pval, unsigned char *penc, int penclen); int PKCS8_pkey_get0(ASN1_OBJECT **ppkalg, const unsigned char **pk, int *ppklen, X509_ALGOR **pa, PKCS8_PRIV_KEY_INFO *p8); int X509_PUBKEY_set0_param(X509_PUBKEY *pub, ASN1_OBJECT *aobj, int ptype, void *pval, unsigned char *penc, int penclen); int X509_PUBKEY_get0_param(ASN1_OBJECT **ppkalg, const unsigned char **pk, int *ppklen, X509_ALGOR **pa, X509_PUBKEY *pub); int X509_check_trust(X509 *x, int id, int flags); int X509_TRUST_get_count(void); X509_TRUST *X509_TRUST_get0(int idx); int X509_TRUST_get_by_id(int id); int X509_TRUST_add(int id, int flags, int (*ck) (X509_TRUST *, X509 *, int), char *name, int arg1, void *arg2); void X509_TRUST_cleanup(void); int X509_TRUST_get_flags(X509_TRUST *xp); char *X509_TRUST_get0_name(X509_TRUST *xp); int X509_TRUST_get_trust(X509_TRUST *xp); void ERR_load_X509_strings(void); # 157 "/usr/include/openssl/ssl.h" 2 3 4 # 1 "/usr/include/openssl/pem.h" 1 3 4 # 62 "/usr/include/openssl/pem.h" 3 4 # 1 "/usr/include/openssl/e_os2.h" 1 3 4 # 56 "/usr/include/openssl/e_os2.h" 3 4 # 1 "/usr/include/openssl/opensslconf.h" 1 3 4 # 57 "/usr/include/openssl/e_os2.h" 2 3 4 # 63 "/usr/include/openssl/pem.h" 2 3 4 # 71 
"/usr/include/openssl/pem.h" 3 4 # 1 "/usr/include/openssl/pem2.h" 1 3 4 # 72 "/usr/include/openssl/pem.h" 2 3 4 # 145 "/usr/include/openssl/pem.h" 3 4 typedef struct PEM_Encode_Seal_st { EVP_ENCODE_CTX encode; EVP_MD_CTX md; EVP_CIPHER_CTX cipher; } PEM_ENCODE_SEAL_CTX; typedef struct pem_recip_st { char *name; X509_NAME *dn; int cipher; int key_enc; } PEM_USER; typedef struct pem_ctx_st { int type; struct { int version; int mode; } proc_type; char *domain; struct { int cipher; } DEK_info; PEM_USER *originator; int num_recipient; PEM_USER **recipient; EVP_MD *md; int md_enc; int md_len; char *md_data; EVP_CIPHER *dec; int key_len; unsigned char *key; int data_enc; int data_len; unsigned char *data; } PEM_CTX; # 389 "/usr/include/openssl/pem.h" 3 4 typedef int pem_password_cb (char *buf, int size, int rwflag, void *userdata); int PEM_get_EVP_CIPHER_INFO(char *header, EVP_CIPHER_INFO *cipher); int PEM_do_header(EVP_CIPHER_INFO *cipher, unsigned char *data, long *len, pem_password_cb *callback, void *u); int PEM_read_bio(BIO *bp, char **name, char **header, unsigned char **data, long *len); int PEM_write_bio(BIO *bp, const char *name, const char *hdr, const unsigned char *data, long len); int PEM_bytes_read_bio(unsigned char **pdata, long *plen, char **pnm, const char *name, BIO *bp, pem_password_cb *cb, void *u); void *PEM_ASN1_read_bio(d2i_of_void *d2i, const char *name, BIO *bp, void **x, pem_password_cb *cb, void *u); int PEM_ASN1_write_bio(i2d_of_void *i2d, const char *name, BIO *bp, void *x, const EVP_CIPHER *enc, unsigned char *kstr, int klen, pem_password_cb *cb, void *u); struct stack_st_X509_INFO *PEM_X509_INFO_read_bio(BIO *bp, struct stack_st_X509_INFO *sk, pem_password_cb *cb, void *u); int PEM_X509_INFO_write_bio(BIO *bp, X509_INFO *xi, EVP_CIPHER *enc, unsigned char *kstr, int klen, pem_password_cb *cd, void *u); int PEM_read(FILE *fp, char **name, char **header, unsigned char **data, long *len); int PEM_write(FILE *fp, const char *name, const char 
*hdr, const unsigned char *data, long len); void *PEM_ASN1_read(d2i_of_void *d2i, const char *name, FILE *fp, void **x, pem_password_cb *cb, void *u); int PEM_ASN1_write(i2d_of_void *i2d, const char *name, FILE *fp, void *x, const EVP_CIPHER *enc, unsigned char *kstr, int klen, pem_password_cb *callback, void *u); struct stack_st_X509_INFO *PEM_X509_INFO_read(FILE *fp, struct stack_st_X509_INFO *sk, pem_password_cb *cb, void *u); int PEM_SealInit(PEM_ENCODE_SEAL_CTX *ctx, EVP_CIPHER *type, EVP_MD *md_type, unsigned char **ek, int *ekl, unsigned char *iv, EVP_PKEY **pubk, int npubk); void PEM_SealUpdate(PEM_ENCODE_SEAL_CTX *ctx, unsigned char *out, int *outl, unsigned char *in, int inl); int PEM_SealFinal(PEM_ENCODE_SEAL_CTX *ctx, unsigned char *sig, int *sigl, unsigned char *out, int *outl, EVP_PKEY *priv); void PEM_SignInit(EVP_MD_CTX *ctx, EVP_MD *type); void PEM_SignUpdate(EVP_MD_CTX *ctx, unsigned char *d, unsigned int cnt); int PEM_SignFinal(EVP_MD_CTX *ctx, unsigned char *sigret, unsigned int *siglen, EVP_PKEY *pkey); int PEM_def_callback(char *buf, int num, int w, void *key); void PEM_proc_type(char *buf, int type); void PEM_dek_info(char *buf, const char *type, int len, char *str); X509 *PEM_read_bio_X509(BIO *bp, X509 **x, pem_password_cb *cb, void *u); X509 *PEM_read_X509(FILE *fp, X509 **x, pem_password_cb *cb, void *u); int PEM_write_bio_X509(BIO *bp, X509 *x); int PEM_write_X509(FILE *fp, X509 *x); X509 *PEM_read_bio_X509_AUX(BIO *bp, X509 **x, pem_password_cb *cb, void *u); X509 *PEM_read_X509_AUX(FILE *fp, X509 **x, pem_password_cb *cb, void *u); int PEM_write_bio_X509_AUX(BIO *bp, X509 *x); int PEM_write_X509_AUX(FILE *fp, X509 *x); X509_CERT_PAIR *PEM_read_bio_X509_CERT_PAIR(BIO *bp, X509_CERT_PAIR **x, pem_password_cb *cb, void *u); X509_CERT_PAIR *PEM_read_X509_CERT_PAIR(FILE *fp, X509_CERT_PAIR **x, pem_password_cb *cb, void *u); int PEM_write_bio_X509_CERT_PAIR(BIO *bp, X509_CERT_PAIR *x); int PEM_write_X509_CERT_PAIR(FILE *fp, X509_CERT_PAIR 
*x); X509_REQ *PEM_read_bio_X509_REQ(BIO *bp, X509_REQ **x, pem_password_cb *cb, void *u); X509_REQ *PEM_read_X509_REQ(FILE *fp, X509_REQ **x, pem_password_cb *cb, void *u); int PEM_write_bio_X509_REQ(BIO *bp, X509_REQ *x); int PEM_write_X509_REQ(FILE *fp, X509_REQ *x); int PEM_write_bio_X509_REQ_NEW(BIO *bp, X509_REQ *x); int PEM_write_X509_REQ_NEW(FILE *fp, X509_REQ *x); X509_CRL *PEM_read_bio_X509_CRL(BIO *bp, X509_CRL **x, pem_password_cb *cb, void *u); X509_CRL *PEM_read_X509_CRL(FILE *fp, X509_CRL **x, pem_password_cb *cb, void *u); int PEM_write_bio_X509_CRL(BIO *bp, X509_CRL *x); int PEM_write_X509_CRL(FILE *fp, X509_CRL *x); PKCS7 *PEM_read_bio_PKCS7(BIO *bp, PKCS7 **x, pem_password_cb *cb, void *u); PKCS7 *PEM_read_PKCS7(FILE *fp, PKCS7 **x, pem_password_cb *cb, void *u); int PEM_write_bio_PKCS7(BIO *bp, PKCS7 *x); int PEM_write_PKCS7(FILE *fp, PKCS7 *x); NETSCAPE_CERT_SEQUENCE *PEM_read_bio_NETSCAPE_CERT_SEQUENCE(BIO *bp, NETSCAPE_CERT_SEQUENCE **x, pem_password_cb *cb, void *u); NETSCAPE_CERT_SEQUENCE *PEM_read_NETSCAPE_CERT_SEQUENCE(FILE *fp, NETSCAPE_CERT_SEQUENCE **x, pem_password_cb *cb, void *u); int PEM_write_bio_NETSCAPE_CERT_SEQUENCE(BIO *bp, NETSCAPE_CERT_SEQUENCE *x); int PEM_write_NETSCAPE_CERT_SEQUENCE(FILE *fp, NETSCAPE_CERT_SEQUENCE *x); X509_SIG *PEM_read_bio_PKCS8(BIO *bp, X509_SIG **x, pem_password_cb *cb, void *u); X509_SIG *PEM_read_PKCS8(FILE *fp, X509_SIG **x, pem_password_cb *cb, void *u); int PEM_write_bio_PKCS8(BIO *bp, X509_SIG *x); int PEM_write_PKCS8(FILE *fp, X509_SIG *x); PKCS8_PRIV_KEY_INFO *PEM_read_bio_PKCS8_PRIV_KEY_INFO(BIO *bp, PKCS8_PRIV_KEY_INFO **x, pem_password_cb *cb, void *u); PKCS8_PRIV_KEY_INFO *PEM_read_PKCS8_PRIV_KEY_INFO(FILE *fp, PKCS8_PRIV_KEY_INFO **x, pem_password_cb *cb, void *u); int PEM_write_bio_PKCS8_PRIV_KEY_INFO(BIO *bp, PKCS8_PRIV_KEY_INFO *x); int PEM_write_PKCS8_PRIV_KEY_INFO(FILE *fp, PKCS8_PRIV_KEY_INFO *x); RSA *PEM_read_bio_RSAPrivateKey(BIO *bp, RSA **x, pem_password_cb *cb, void *u); RSA 
*PEM_read_RSAPrivateKey(FILE *fp, RSA **x, pem_password_cb *cb, void *u); int PEM_write_bio_RSAPrivateKey(BIO *bp, RSA *x, const EVP_CIPHER *enc, unsigned char *kstr, int klen, pem_password_cb *cb, void *u); int PEM_write_RSAPrivateKey(FILE *fp, RSA *x, const EVP_CIPHER *enc, unsigned char *kstr, int klen, pem_password_cb *cb, void *u); RSA *PEM_read_bio_RSAPublicKey(BIO *bp, RSA **x, pem_password_cb *cb, void *u); RSA *PEM_read_RSAPublicKey(FILE *fp, RSA **x, pem_password_cb *cb, void *u); int PEM_write_bio_RSAPublicKey(BIO *bp, const RSA *x); int PEM_write_RSAPublicKey(FILE *fp, const RSA *x); RSA *PEM_read_bio_RSA_PUBKEY(BIO *bp, RSA **x, pem_password_cb *cb, void *u); RSA *PEM_read_RSA_PUBKEY(FILE *fp, RSA **x, pem_password_cb *cb, void *u); int PEM_write_bio_RSA_PUBKEY(BIO *bp, RSA *x); int PEM_write_RSA_PUBKEY(FILE *fp, RSA *x); DSA *PEM_read_bio_DSAPrivateKey(BIO *bp, DSA **x, pem_password_cb *cb, void *u); DSA *PEM_read_DSAPrivateKey(FILE *fp, DSA **x, pem_password_cb *cb, void *u); int PEM_write_bio_DSAPrivateKey(BIO *bp, DSA *x, const EVP_CIPHER *enc, unsigned char *kstr, int klen, pem_password_cb *cb, void *u); int PEM_write_DSAPrivateKey(FILE *fp, DSA *x, const EVP_CIPHER *enc, unsigned char *kstr, int klen, pem_password_cb *cb, void *u); DSA *PEM_read_bio_DSA_PUBKEY(BIO *bp, DSA **x, pem_password_cb *cb, void *u); DSA *PEM_read_DSA_PUBKEY(FILE *fp, DSA **x, pem_password_cb *cb, void *u); int PEM_write_bio_DSA_PUBKEY(BIO *bp, DSA *x); int PEM_write_DSA_PUBKEY(FILE *fp, DSA *x); DSA *PEM_read_bio_DSAparams(BIO *bp, DSA **x, pem_password_cb *cb, void *u); DSA *PEM_read_DSAparams(FILE *fp, DSA **x, pem_password_cb *cb, void *u); int PEM_write_bio_DSAparams(BIO *bp, const DSA *x); int PEM_write_DSAparams(FILE *fp, const DSA *x); EC_GROUP *PEM_read_bio_ECPKParameters(BIO *bp, EC_GROUP **x, pem_password_cb *cb, void *u); EC_GROUP *PEM_read_ECPKParameters(FILE *fp, EC_GROUP **x, pem_password_cb *cb, void *u); int PEM_write_bio_ECPKParameters(BIO *bp, const 
EC_GROUP *x); int PEM_write_ECPKParameters(FILE *fp, const EC_GROUP *x); EC_KEY *PEM_read_bio_ECPrivateKey(BIO *bp, EC_KEY **x, pem_password_cb *cb, void *u); EC_KEY *PEM_read_ECPrivateKey(FILE *fp, EC_KEY **x, pem_password_cb *cb, void *u); int PEM_write_bio_ECPrivateKey(BIO *bp, EC_KEY *x, const EVP_CIPHER *enc, unsigned char *kstr, int klen, pem_password_cb *cb, void *u); int PEM_write_ECPrivateKey(FILE *fp, EC_KEY *x, const EVP_CIPHER *enc, unsigned char *kstr, int klen, pem_password_cb *cb, void *u); EC_KEY *PEM_read_bio_EC_PUBKEY(BIO *bp, EC_KEY **x, pem_password_cb *cb, void *u); EC_KEY *PEM_read_EC_PUBKEY(FILE *fp, EC_KEY **x, pem_password_cb *cb, void *u); int PEM_write_bio_EC_PUBKEY(BIO *bp, EC_KEY *x); int PEM_write_EC_PUBKEY(FILE *fp, EC_KEY *x); DH *PEM_read_bio_DHparams(BIO *bp, DH **x, pem_password_cb *cb, void *u); DH *PEM_read_DHparams(FILE *fp, DH **x, pem_password_cb *cb, void *u); int PEM_write_bio_DHparams(BIO *bp, const DH *x); int PEM_write_DHparams(FILE *fp, const DH *x); int PEM_write_bio_DHxparams(BIO *bp, const DH *x); int PEM_write_DHxparams(FILE *fp, const DH *x); EVP_PKEY *PEM_read_bio_PrivateKey(BIO *bp, EVP_PKEY **x, pem_password_cb *cb, void *u); EVP_PKEY *PEM_read_PrivateKey(FILE *fp, EVP_PKEY **x, pem_password_cb *cb, void *u); int PEM_write_bio_PrivateKey(BIO *bp, EVP_PKEY *x, const EVP_CIPHER *enc, unsigned char *kstr, int klen, pem_password_cb *cb, void *u); int PEM_write_PrivateKey(FILE *fp, EVP_PKEY *x, const EVP_CIPHER *enc, unsigned char *kstr, int klen, pem_password_cb *cb, void *u); EVP_PKEY *PEM_read_bio_PUBKEY(BIO *bp, EVP_PKEY **x, pem_password_cb *cb, void *u); EVP_PKEY *PEM_read_PUBKEY(FILE *fp, EVP_PKEY **x, pem_password_cb *cb, void *u); int PEM_write_bio_PUBKEY(BIO *bp, EVP_PKEY *x); int PEM_write_PUBKEY(FILE *fp, EVP_PKEY *x); int PEM_write_bio_PKCS8PrivateKey_nid(BIO *bp, EVP_PKEY *x, int nid, char *kstr, int klen, pem_password_cb *cb, void *u); int PEM_write_bio_PKCS8PrivateKey(BIO *, EVP_PKEY *, const 
EVP_CIPHER *, char *, int, pem_password_cb *, void *); int i2d_PKCS8PrivateKey_bio(BIO *bp, EVP_PKEY *x, const EVP_CIPHER *enc, char *kstr, int klen, pem_password_cb *cb, void *u); int i2d_PKCS8PrivateKey_nid_bio(BIO *bp, EVP_PKEY *x, int nid, char *kstr, int klen, pem_password_cb *cb, void *u); EVP_PKEY *d2i_PKCS8PrivateKey_bio(BIO *bp, EVP_PKEY **x, pem_password_cb *cb, void *u); int i2d_PKCS8PrivateKey_fp(FILE *fp, EVP_PKEY *x, const EVP_CIPHER *enc, char *kstr, int klen, pem_password_cb *cb, void *u); int i2d_PKCS8PrivateKey_nid_fp(FILE *fp, EVP_PKEY *x, int nid, char *kstr, int klen, pem_password_cb *cb, void *u); int PEM_write_PKCS8PrivateKey_nid(FILE *fp, EVP_PKEY *x, int nid, char *kstr, int klen, pem_password_cb *cb, void *u); EVP_PKEY *d2i_PKCS8PrivateKey_fp(FILE *fp, EVP_PKEY **x, pem_password_cb *cb, void *u); int PEM_write_PKCS8PrivateKey(FILE *fp, EVP_PKEY *x, const EVP_CIPHER *enc, char *kstr, int klen, pem_password_cb *cd, void *u); EVP_PKEY *PEM_read_bio_Parameters(BIO *bp, EVP_PKEY **x); int PEM_write_bio_Parameters(BIO *bp, EVP_PKEY *x); EVP_PKEY *b2i_PrivateKey(const unsigned char **in, long length); EVP_PKEY *b2i_PublicKey(const unsigned char **in, long length); EVP_PKEY *b2i_PrivateKey_bio(BIO *in); EVP_PKEY *b2i_PublicKey_bio(BIO *in); int i2b_PrivateKey_bio(BIO *out, EVP_PKEY *pk); int i2b_PublicKey_bio(BIO *out, EVP_PKEY *pk); EVP_PKEY *b2i_PVK_bio(BIO *in, pem_password_cb *cb, void *u); int i2b_PVK_bio(BIO *out, EVP_PKEY *pk, int enclevel, pem_password_cb *cb, void *u); # 535 "/usr/include/openssl/pem.h" 3 4 void ERR_load_PEM_strings(void); # 163 "/usr/include/openssl/ssl.h" 2 3 4 # 1 "/usr/include/openssl/hmac.h" 1 3 4 # 61 "/usr/include/openssl/hmac.h" 3 4 # 1 "/usr/include/openssl/opensslconf.h" 1 3 4 # 62 "/usr/include/openssl/hmac.h" 2 3 4 # 75 "/usr/include/openssl/hmac.h" 3 4 typedef struct hmac_ctx_st { const EVP_MD *md; EVP_MD_CTX md_ctx; EVP_MD_CTX i_ctx; EVP_MD_CTX o_ctx; unsigned int key_length; unsigned char key[128]; } 
HMAC_CTX; void HMAC_CTX_init(HMAC_CTX *ctx); void HMAC_CTX_cleanup(HMAC_CTX *ctx); int HMAC_Init(HMAC_CTX *ctx, const void *key, int len, const EVP_MD *md); int HMAC_Init_ex(HMAC_CTX *ctx, const void *key, int len, const EVP_MD *md, ENGINE *impl); int HMAC_Update(HMAC_CTX *ctx, const unsigned char *data, size_t len); int HMAC_Final(HMAC_CTX *ctx, unsigned char *md, unsigned int *len); unsigned char *HMAC(const EVP_MD *evp_md, const void *key, int key_len, const unsigned char *d, size_t n, unsigned char *md, unsigned int *md_len); int HMAC_CTX_copy(HMAC_CTX *dctx, HMAC_CTX *sctx); void HMAC_CTX_set_flags(HMAC_CTX *ctx, unsigned long flags); # 164 "/usr/include/openssl/ssl.h" 2 3 4 # 1 "/usr/include/openssl/kssl.h" 1 3 4 # 67 "/usr/include/openssl/kssl.h" 3 4 # 1 "/usr/include/openssl/opensslconf.h" 1 3 4 # 68 "/usr/include/openssl/kssl.h" 2 3 4 # 166 "/usr/include/openssl/ssl.h" 2 3 4 # 372 "/usr/include/openssl/ssl.h" 3 4 typedef struct ssl_st *ssl_crock_st; typedef struct tls_session_ticket_ext_st TLS_SESSION_TICKET_EXT; typedef struct ssl_method_st SSL_METHOD; typedef struct ssl_cipher_st SSL_CIPHER; typedef struct ssl_session_st SSL_SESSION; typedef struct tls_sigalgs_st TLS_SIGALGS; typedef struct ssl_conf_ctx_st SSL_CONF_CTX; struct stack_st_SSL_CIPHER { _STACK stack; }; typedef struct srtp_protection_profile_st { const char *name; unsigned long id; } SRTP_PROTECTION_PROFILE; struct stack_st_SRTP_PROTECTION_PROFILE { _STACK stack; }; typedef int (*tls_session_ticket_ext_cb_fn) (SSL *s, const unsigned char *data, int len, void *arg); typedef int (*tls_session_secret_cb_fn) (SSL *s, void *secret, int *secret_len, struct stack_st_SSL_CIPHER *peer_ciphers, SSL_CIPHER **cipher, void *arg); typedef int (*custom_ext_add_cb) (SSL *s, unsigned int ext_type, const unsigned char **out, size_t *outlen, int *al, void *add_arg); typedef void (*custom_ext_free_cb) (SSL *s, unsigned int ext_type, const unsigned char *out, void *add_arg); typedef int (*custom_ext_parse_cb) 
(SSL *s, unsigned int ext_type, const unsigned char *in, size_t inlen, int *al, void *parse_arg); struct ssl_cipher_st { int valid; const char *name; unsigned long id; unsigned long algorithm_mkey; unsigned long algorithm_auth; unsigned long algorithm_enc; unsigned long algorithm_mac; unsigned long algorithm_ssl; unsigned long algo_strength; unsigned long algorithm2; int strength_bits; int alg_bits; }; struct ssl_method_st { int version; int (*ssl_new) (SSL *s); void (*ssl_clear) (SSL *s); void (*ssl_free) (SSL *s); int (*ssl_accept) (SSL *s); int (*ssl_connect) (SSL *s); int (*ssl_read) (SSL *s, void *buf, int len); int (*ssl_peek) (SSL *s, void *buf, int len); int (*ssl_write) (SSL *s, const void *buf, int len); int (*ssl_shutdown) (SSL *s); int (*ssl_renegotiate) (SSL *s); int (*ssl_renegotiate_check) (SSL *s); long (*ssl_get_message) (SSL *s, int st1, int stn, int mt, long max, int *ok); int (*ssl_read_bytes) (SSL *s, int type, unsigned char *buf, int len, int peek); int (*ssl_write_bytes) (SSL *s, int type, const void *buf_, int len); int (*ssl_dispatch_alert) (SSL *s); long (*ssl_ctrl) (SSL *s, int cmd, long larg, void *parg); long (*ssl_ctx_ctrl) (SSL_CTX *ctx, int cmd, long larg, void *parg); const SSL_CIPHER *(*get_cipher_by_char) (const unsigned char *ptr); int (*put_cipher_by_char) (const SSL_CIPHER *cipher, unsigned char *ptr); int (*ssl_pending) (const SSL *s); int (*num_ciphers) (void); const SSL_CIPHER *(*get_cipher) (unsigned ncipher); const struct ssl_method_st *(*get_ssl_method) (int version); long (*get_timeout) (void); struct ssl3_enc_method *ssl3_enc; int (*ssl_version) (void); long (*ssl_callback_ctrl) (SSL *s, int cb_id, void (*fp) (void)); long (*ssl_ctx_callback_ctrl) (SSL_CTX *s, int cb_id, void (*fp) (void)); }; # 498 "/usr/include/openssl/ssl.h" 3 4 struct ssl_session_st { int ssl_version; unsigned int key_arg_length; unsigned char key_arg[8]; int master_key_length; unsigned char master_key[48]; unsigned int session_id_length; unsigned 
char session_id[32]; unsigned int sid_ctx_length; unsigned char sid_ctx[32]; char *psk_identity_hint; char *psk_identity; int not_resumable; struct sess_cert_st *sess_cert; X509 *peer; long verify_result; int references; long timeout; long time; unsigned int compress_meth; const SSL_CIPHER *cipher; unsigned long cipher_id; struct stack_st_SSL_CIPHER *ciphers; CRYPTO_EX_DATA ex_data; struct ssl_session_st *prev, *next; char *tlsext_hostname; size_t tlsext_ecpointformatlist_length; unsigned char *tlsext_ecpointformatlist; size_t tlsext_ellipticcurvelist_length; unsigned char *tlsext_ellipticcurvelist; unsigned char *tlsext_tick; size_t tlsext_ticklen; long tlsext_tick_lifetime_hint; char *srp_username; }; # 834 "/usr/include/openssl/ssl.h" 3 4 void SSL_CTX_set_msg_callback(SSL_CTX *ctx, void (*cb) (int write_p, int version, int content_type, const void *buf, size_t len, SSL *ssl, void *arg)); void SSL_set_msg_callback(SSL *ssl, void (*cb) (int write_p, int version, int content_type, const void *buf, size_t len, SSL *ssl, void *arg)); typedef struct srp_ctx_st { void *SRP_cb_arg; int (*TLS_ext_srp_username_callback) (SSL *, int *, void *); int (*SRP_verify_param_callback) (SSL *, void *); char *(*SRP_give_srp_client_pwd_callback) (SSL *, void *); char *login; BIGNUM *N, *g, *s, *B, *A; BIGNUM *a, *b, *v; char *info; int strength; unsigned long srp_Mask; } SRP_CTX; int SSL_SRP_CTX_init(SSL *s); int SSL_CTX_SRP_CTX_init(SSL_CTX *ctx); int SSL_SRP_CTX_free(SSL *ctx); int SSL_CTX_SRP_CTX_free(SSL_CTX *ctx); int SSL_srp_server_param_with_username(SSL *s, int *ad); int SRP_generate_server_master_secret(SSL *s, unsigned char *master_key); int SRP_Calc_A_param(SSL *s); int SRP_generate_client_master_secret(SSL *s, unsigned char *master_key); # 905 "/usr/include/openssl/ssl.h" 3 4 typedef int (*GEN_SESSION_CB) (const SSL *ssl, unsigned char *id, unsigned int *id_len); typedef struct ssl_comp_st SSL_COMP; struct ssl_comp_st { int id; const char *name; COMP_METHOD *method; }; 
struct stack_st_SSL_COMP { _STACK stack; }; struct lhash_st_SSL_SESSION { int dummy; }; struct ssl_ctx_st { const SSL_METHOD *method; struct stack_st_SSL_CIPHER *cipher_list; struct stack_st_SSL_CIPHER *cipher_list_by_id; struct x509_store_st *cert_store; struct lhash_st_SSL_SESSION *sessions; unsigned long session_cache_size; struct ssl_session_st *session_cache_head; struct ssl_session_st *session_cache_tail; int session_cache_mode; long session_timeout; # 960 "/usr/include/openssl/ssl.h" 3 4 int (*new_session_cb) (struct ssl_st *ssl, SSL_SESSION *sess); void (*remove_session_cb) (struct ssl_ctx_st *ctx, SSL_SESSION *sess); SSL_SESSION *(*get_session_cb) (struct ssl_st *ssl, unsigned char *data, int len, int *copy); struct { int sess_connect; int sess_connect_renegotiate; int sess_connect_good; int sess_accept; int sess_accept_renegotiate; int sess_accept_good; int sess_miss; int sess_timeout; int sess_cache_full; int sess_hit; int sess_cb_hit; } stats; int references; int (*app_verify_callback) (X509_STORE_CTX *, void *); void *app_verify_arg; pem_password_cb *default_passwd_callback; void *default_passwd_callback_userdata; int (*client_cert_cb) (SSL *ssl, X509 **x509, EVP_PKEY **pkey); int (*app_gen_cookie_cb) (SSL *ssl, unsigned char *cookie, unsigned int *cookie_len); int (*app_verify_cookie_cb) (SSL *ssl, unsigned char *cookie, unsigned int cookie_len); CRYPTO_EX_DATA ex_data; const EVP_MD *rsa_md5; const EVP_MD *md5; const EVP_MD *sha1; struct stack_st_X509 *extra_certs; struct stack_st_SSL_COMP *comp_methods; void (*info_callback) (const SSL *ssl, int type, int val); struct stack_st_X509_NAME *client_CA; unsigned long options; unsigned long mode; long max_cert_list; struct cert_st *cert; int read_ahead; void (*msg_callback) (int write_p, int version, int content_type, const void *buf, size_t len, SSL *ssl, void *arg); void *msg_callback_arg; int verify_mode; unsigned int sid_ctx_length; unsigned char sid_ctx[32]; int (*default_verify_callback) (int ok, 
X509_STORE_CTX *ctx); GEN_SESSION_CB generate_session_id; X509_VERIFY_PARAM *param; int quiet_shutdown; unsigned int max_send_fragment; ENGINE *client_cert_engine; int (*tlsext_servername_callback) (SSL *, int *, void *); void *tlsext_servername_arg; unsigned char tlsext_tick_key_name[16]; unsigned char tlsext_tick_hmac_key[16]; unsigned char tlsext_tick_aes_key[16]; int (*tlsext_ticket_key_cb) (SSL *ssl, unsigned char *name, unsigned char *iv, EVP_CIPHER_CTX *ectx, HMAC_CTX *hctx, int enc); int (*tlsext_status_cb) (SSL *ssl, void *arg); void *tlsext_status_arg; int (*tlsext_opaque_prf_input_callback) (SSL *, void *peerinput, size_t len, void *arg); void *tlsext_opaque_prf_input_callback_arg; char *psk_identity_hint; unsigned int (*psk_client_callback) (SSL *ssl, const char *hint, char *identity, unsigned int max_identity_len, unsigned char *psk, unsigned int max_psk_len); unsigned int (*psk_server_callback) (SSL *ssl, const char *identity, unsigned char *psk, unsigned int max_psk_len); unsigned int freelist_max_len; struct ssl3_buf_freelist_st *wbuf_freelist; struct ssl3_buf_freelist_st *rbuf_freelist; SRP_CTX srp_ctx; # 1131 "/usr/include/openssl/ssl.h" 3 4 int (*next_protos_advertised_cb) (SSL *s, const unsigned char **buf, unsigned int *len, void *arg); void *next_protos_advertised_cb_arg; int (*next_proto_select_cb) (SSL *s, unsigned char **out, unsigned char *outlen, const unsigned char *in, unsigned int inlen, void *arg); void *next_proto_select_cb_arg; struct stack_st_SRTP_PROTECTION_PROFILE *srtp_profiles; # 1162 "/usr/include/openssl/ssl.h" 3 4 int (*alpn_select_cb) (SSL *s, const unsigned char **out, unsigned char *outlen, const unsigned char *in, unsigned int inlen, void *arg); void *alpn_select_cb_arg; unsigned char *alpn_client_proto_list; unsigned alpn_client_proto_list_len; size_t tlsext_ecpointformatlist_length; unsigned char *tlsext_ecpointformatlist; size_t tlsext_ellipticcurvelist_length; unsigned char *tlsext_ellipticcurvelist; }; # 1199 
"/usr/include/openssl/ssl.h" 3 4 struct lhash_st_SSL_SESSION *SSL_CTX_sessions(SSL_CTX *ctx); # 1225 "/usr/include/openssl/ssl.h" 3 4 void SSL_CTX_sess_set_new_cb(SSL_CTX *ctx, int (*new_session_cb) (struct ssl_st *ssl, SSL_SESSION *sess)); int (*SSL_CTX_sess_get_new_cb(SSL_CTX *ctx)) (struct ssl_st *ssl, SSL_SESSION *sess); void SSL_CTX_sess_set_remove_cb(SSL_CTX *ctx, void (*remove_session_cb) (struct ssl_ctx_st *ctx, SSL_SESSION *sess)); void (*SSL_CTX_sess_get_remove_cb(SSL_CTX *ctx)) (struct ssl_ctx_st *ctx, SSL_SESSION *sess); void SSL_CTX_sess_set_get_cb(SSL_CTX *ctx, SSL_SESSION *(*get_session_cb) (struct ssl_st *ssl, unsigned char *data, int len, int *copy)); SSL_SESSION *(*SSL_CTX_sess_get_get_cb(SSL_CTX *ctx)) (struct ssl_st *ssl, unsigned char *Data, int len, int *copy); void SSL_CTX_set_info_callback(SSL_CTX *ctx, void (*cb) (const SSL *ssl, int type, int val)); void (*SSL_CTX_get_info_callback(SSL_CTX *ctx)) (const SSL *ssl, int type, int val); void SSL_CTX_set_client_cert_cb(SSL_CTX *ctx, int (*client_cert_cb) (SSL *ssl, X509 **x509, EVP_PKEY **pkey)); int (*SSL_CTX_get_client_cert_cb(SSL_CTX *ctx)) (SSL *ssl, X509 **x509, EVP_PKEY **pkey); int SSL_CTX_set_client_cert_engine(SSL_CTX *ctx, ENGINE *e); void SSL_CTX_set_cookie_generate_cb(SSL_CTX *ctx, int (*app_gen_cookie_cb) (SSL *ssl, unsigned char *cookie, unsigned int *cookie_len)); void SSL_CTX_set_cookie_verify_cb(SSL_CTX *ctx, int (*app_verify_cookie_cb) (SSL *ssl, unsigned char *cookie, unsigned int cookie_len)); void SSL_CTX_set_next_protos_advertised_cb(SSL_CTX *s, int (*cb) (SSL *ssl, const unsigned char **out, unsigned int *outlen, void *arg), void *arg); void SSL_CTX_set_next_proto_select_cb(SSL_CTX *s, int (*cb) (SSL *ssl, unsigned char **out, unsigned char *outlen, const unsigned char *in, unsigned int inlen, void *arg), void *arg); void SSL_get0_next_proto_negotiated(const SSL *s, const unsigned char **data, unsigned *len); int SSL_select_next_proto(unsigned char **out, unsigned char 
*outlen, const unsigned char *in, unsigned int inlen, const unsigned char *client, unsigned int client_len); int SSL_CTX_set_alpn_protos(SSL_CTX *ctx, const unsigned char *protos, unsigned protos_len); int SSL_set_alpn_protos(SSL *ssl, const unsigned char *protos, unsigned protos_len); void SSL_CTX_set_alpn_select_cb(SSL_CTX *ctx, int (*cb) (SSL *ssl, const unsigned char **out, unsigned char *outlen, const unsigned char *in, unsigned int inlen, void *arg), void *arg); void SSL_get0_alpn_selected(const SSL *ssl, const unsigned char **data, unsigned *len); # 1321 "/usr/include/openssl/ssl.h" 3 4 void SSL_CTX_set_psk_client_callback(SSL_CTX *ctx, unsigned int (*psk_client_callback) (SSL *ssl, const char *hint, char *identity, unsigned int max_identity_len, unsigned char *psk, unsigned int max_psk_len)); void SSL_set_psk_client_callback(SSL *ssl, unsigned int (*psk_client_callback) (SSL *ssl, const char *hint, char *identity, unsigned int max_identity_len, unsigned char *psk, unsigned int max_psk_len)); void SSL_CTX_set_psk_server_callback(SSL_CTX *ctx, unsigned int (*psk_server_callback) (SSL *ssl, const char *identity, unsigned char *psk, unsigned int max_psk_len)); void SSL_set_psk_server_callback(SSL *ssl, unsigned int (*psk_server_callback) (SSL *ssl, const char *identity, unsigned char *psk, unsigned int max_psk_len)); int SSL_CTX_use_psk_identity_hint(SSL_CTX *ctx, const char *identity_hint); int SSL_use_psk_identity_hint(SSL *s, const char *identity_hint); const char *SSL_get_psk_identity_hint(const SSL *s); const char *SSL_get_psk_identity(const SSL *s); int SSL_CTX_add_client_custom_ext(SSL_CTX *ctx, unsigned int ext_type, custom_ext_add_cb add_cb, custom_ext_free_cb free_cb, void *add_arg, custom_ext_parse_cb parse_cb, void *parse_arg); int SSL_CTX_add_server_custom_ext(SSL_CTX *ctx, unsigned int ext_type, custom_ext_add_cb add_cb, custom_ext_free_cb free_cb, void *add_arg, custom_ext_parse_cb parse_cb, void *parse_arg); int SSL_extension_supported(unsigned 
int ext_type); # 1422 "/usr/include/openssl/ssl.h" 3 4 struct ssl_st { int version; int type; const SSL_METHOD *method; BIO *rbio; BIO *wbio; BIO *bbio; # 1455 "/usr/include/openssl/ssl.h" 3 4 int rwstate; int in_handshake; int (*handshake_func) (SSL *); # 1467 "/usr/include/openssl/ssl.h" 3 4 int server; int new_session; int quiet_shutdown; int shutdown; int state; int rstate; BUF_MEM *init_buf; void *init_msg; int init_num; int init_off; unsigned char *packet; unsigned int packet_length; struct ssl2_state_st *s2; struct ssl3_state_st *s3; struct dtls1_state_st *d1; int read_ahead; void (*msg_callback) (int write_p, int version, int content_type, const void *buf, size_t len, SSL *ssl, void *arg); void *msg_callback_arg; int hit; X509_VERIFY_PARAM *param; struct stack_st_SSL_CIPHER *cipher_list; struct stack_st_SSL_CIPHER *cipher_list_by_id; int mac_flags; EVP_CIPHER_CTX *enc_read_ctx; EVP_MD_CTX *read_hash; COMP_CTX *expand; EVP_CIPHER_CTX *enc_write_ctx; EVP_MD_CTX *write_hash; COMP_CTX *compress; struct cert_st *cert; unsigned int sid_ctx_length; unsigned char sid_ctx[32]; SSL_SESSION *session; GEN_SESSION_CB generate_session_id; int verify_mode; int (*verify_callback) (int ok, X509_STORE_CTX *ctx); void (*info_callback) (const SSL *ssl, int type, int val); int error; int error_code; unsigned int (*psk_client_callback) (SSL *ssl, const char *hint, char *identity, unsigned int max_identity_len, unsigned char *psk, unsigned int max_psk_len); unsigned int (*psk_server_callback) (SSL *ssl, const char *identity, unsigned char *psk, unsigned int max_psk_len); SSL_CTX *ctx; int debug; long verify_result; CRYPTO_EX_DATA ex_data; struct stack_st_X509_NAME *client_CA; int references; unsigned long options; unsigned long mode; long max_cert_list; int first_packet; int client_version; unsigned int max_send_fragment; void (*tlsext_debug_cb) (SSL *s, int client_server, int type, unsigned char *data, int len, void *arg); void *tlsext_debug_arg; char *tlsext_hostname; int 
servername_done; int tlsext_status_type; int tlsext_status_expected; struct stack_st_OCSP_RESPID *tlsext_ocsp_ids; X509_EXTENSIONS *tlsext_ocsp_exts; unsigned char *tlsext_ocsp_resp; int tlsext_ocsp_resplen; int tlsext_ticket_expected; size_t tlsext_ecpointformatlist_length; unsigned char *tlsext_ecpointformatlist; size_t tlsext_ellipticcurvelist_length; unsigned char *tlsext_ellipticcurvelist; void *tlsext_opaque_prf_input; size_t tlsext_opaque_prf_input_len; TLS_SESSION_TICKET_EXT *tlsext_session_ticket; tls_session_ticket_ext_cb_fn tls_session_ticket_ext_cb; void *tls_session_ticket_ext_cb_arg; tls_session_secret_cb_fn tls_session_secret_cb; void *tls_session_secret_cb_arg; SSL_CTX *initial_ctx; # 1648 "/usr/include/openssl/ssl.h" 3 4 unsigned char *next_proto_negotiated; unsigned char next_proto_negotiated_len; struct stack_st_SRTP_PROTECTION_PROFILE *srtp_profiles; SRTP_PROTECTION_PROFILE *srtp_profile; unsigned int tlsext_heartbeat; unsigned int tlsext_hb_pending; unsigned int tlsext_hb_seq; # 1675 "/usr/include/openssl/ssl.h" 3 4 int renegotiate; SRP_CTX srp_ctx; unsigned char *alpn_client_proto_list; unsigned alpn_client_proto_list_len; }; # 1 "/usr/include/openssl/ssl2.h" 1 3 4 # 163 "/usr/include/openssl/ssl2.h" 3 4 typedef struct ssl2_state_st { int three_byte_header; int clear_text; int escape; int ssl2_rollback; unsigned int wnum; int wpend_tot; const unsigned char *wpend_buf; int wpend_off; int wpend_len; int wpend_ret; int rbuf_left; int rbuf_offs; unsigned char *rbuf; unsigned char *wbuf; unsigned char *write_ptr; unsigned int padding; unsigned int rlength; int ract_data_length; unsigned int wlength; int wact_data_length; unsigned char *ract_data; unsigned char *wact_data; unsigned char *mac_data; unsigned char *read_key; unsigned char *write_key; unsigned int challenge_length; unsigned char challenge[32]; unsigned int conn_id_length; unsigned char conn_id[16]; unsigned int key_material_length; unsigned char key_material[24 * 2]; unsigned long 
read_sequence; unsigned long write_sequence; struct { unsigned int conn_id_length; unsigned int cert_type; unsigned int cert_length; unsigned int csl; unsigned int clear; unsigned int enc; unsigned char ccl[32]; unsigned int cipher_spec_length; unsigned int session_id_length; unsigned int clen; unsigned int rlen; } tmp; } SSL2_STATE; # 1697 "/usr/include/openssl/ssl.h" 2 3 4 # 1 "/usr/include/openssl/ssl3.h" 1 3 4 # 125 "/usr/include/openssl/ssl3.h" 3 4 # 1 "/usr/include/openssl/ssl.h" 1 3 4 # 126 "/usr/include/openssl/ssl3.h" 2 3 4 # 403 "/usr/include/openssl/ssl3.h" 3 4 typedef struct ssl3_record_st { int type; unsigned int length; unsigned int off; unsigned char *data; unsigned char *input; unsigned char *comp; unsigned long epoch; unsigned char seq_num[8]; } SSL3_RECORD; typedef struct ssl3_buffer_st { unsigned char *buf; size_t len; int offset; int left; } SSL3_BUFFER; # 481 "/usr/include/openssl/ssl3.h" 3 4 typedef struct ssl3_state_st { long flags; int delay_buf_pop_ret; unsigned char read_sequence[8]; int read_mac_secret_size; unsigned char read_mac_secret[64]; unsigned char write_sequence[8]; int write_mac_secret_size; unsigned char write_mac_secret[64]; unsigned char server_random[32]; unsigned char client_random[32]; int need_empty_fragments; int empty_fragment_done; int init_extra; SSL3_BUFFER rbuf; SSL3_BUFFER wbuf; SSL3_RECORD rrec; SSL3_RECORD wrec; unsigned char alert_fragment[2]; unsigned int alert_fragment_len; unsigned char handshake_fragment[4]; unsigned int handshake_fragment_len; unsigned int wnum; int wpend_tot; int wpend_type; int wpend_ret; const unsigned char *wpend_buf; BIO *handshake_buffer; EVP_MD_CTX **handshake_dgst; int change_cipher_spec; int warn_alert; int fatal_alert; int alert_dispatch; unsigned char send_alert[2]; int renegotiate; int total_renegotiations; int num_renegotiations; int in_read_app_data; void *client_opaque_prf_input; size_t client_opaque_prf_input_len; void *server_opaque_prf_input; size_t 
server_opaque_prf_input_len; struct { unsigned char cert_verify_md[64 * 2]; unsigned char finish_md[64 * 2]; int finish_md_len; unsigned char peer_finish_md[64 * 2]; int peer_finish_md_len; unsigned long message_size; int message_type; const SSL_CIPHER *new_cipher; DH *dh; EC_KEY *ecdh; int next_state; int reuse_message; int cert_req; int ctype_num; char ctype[9]; struct stack_st_X509_NAME *ca_names; int use_rsa_tmp; int key_block_length; unsigned char *key_block; const EVP_CIPHER *new_sym_enc; const EVP_MD *new_hash; int new_mac_pkey_type; int new_mac_secret_size; const SSL_COMP *new_compression; int cert_request; } tmp; unsigned char previous_client_finished[64]; unsigned char previous_client_finished_len; unsigned char previous_server_finished[64]; unsigned char previous_server_finished_len; int send_connection_binding; int next_proto_neg_seen; # 615 "/usr/include/openssl/ssl3.h" 3 4 char is_probably_safari; # 628 "/usr/include/openssl/ssl3.h" 3 4 unsigned char *alpn_selected; unsigned alpn_selected_len; } SSL3_STATE; # 1698 "/usr/include/openssl/ssl.h" 2 3 4 # 1 "/usr/include/openssl/tls1.h" 1 3 4 # 309 "/usr/include/openssl/tls1.h" 3 4 const char *SSL_get_servername(const SSL *s, const int type); int SSL_get_servername_type(const SSL *s); int SSL_export_keying_material(SSL *s, unsigned char *out, size_t olen, const char *label, size_t llen, const unsigned char *p, size_t plen, int use_context); int SSL_get_sigalgs(SSL *s, int idx, int *psign, int *phash, int *psignandhash, unsigned char *rsig, unsigned char *rhash); int SSL_get_shared_sigalgs(SSL *s, int idx, int *psign, int *phash, int *psignandhash, unsigned char *rsig, unsigned char *rhash); int SSL_check_chain(SSL *s, X509 *x, EVP_PKEY *pk, struct stack_st_X509 *chain); # 802 "/usr/include/openssl/tls1.h" 3 4 struct tls_session_ticket_ext_st { unsigned short length; void *data; }; # 1699 "/usr/include/openssl/ssl.h" 2 3 4 # 1 "/usr/include/openssl/dtls1.h" 1 3 4 # 64 "/usr/include/openssl/dtls1.h" 3 4 # 1 
"/usr/include/openssl/pqueue.h" 1 3 4 # 65 "/usr/include/openssl/pqueue.h" 3 4 # 1 "/usr/include/string.h" 1 3 4 # 27 "/usr/include/string.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 33 "/usr/include/string.h" 2 3 4 extern void *memcpy (void *__restrict __dest, const void *__restrict __src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern void *memmove (void *__dest, const void *__src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern void *memccpy (void *__restrict __dest, const void *__restrict __src, int __c, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern void *memset (void *__s, int __c, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int memcmp (const void *__s1, const void *__s2, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); # 92 "/usr/include/string.h" 3 4 extern void *memchr (const void *__s, int __c, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); # 123 "/usr/include/string.h" 3 4 extern char *strcpy (char *__restrict __dest, const char *__restrict __src) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char *strncpy (char *__restrict __dest, const char *__restrict __src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char *strcat (char *__restrict __dest, const char *__restrict __src) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char *strncat (char *__restrict __dest, const char *__restrict __src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int strcmp (const char *__s1, const char *__s2) __attribute__ ((__nothrow__ 
*ctx); int SSL_CONF_cmd(SSL_CONF_CTX *cctx, const char *cmd, const char *value); int SSL_CONF_cmd_argv(SSL_CONF_CTX *cctx, int *pargc, char ***pargv); int SSL_CONF_cmd_value_type(SSL_CONF_CTX *cctx, const char *cmd); # 2590 "/usr/include/openssl/ssl.h" 3 4 void ERR_load_SSL_strings(void); # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Found header files ['openssl/ssl.h'] in ['/usr/include', '/usr/lib/openmpi'] Popping language C ================================================================================ TEST checkSharedLibrary from config.packages.ssl(/home/florian/software/petsc/config/BuildSystem/config/package.py:738) TESTING: checkSharedLibrary from config.packages.ssl(config/BuildSystem/config/package.py:738) By default we don't care about checking if the library is shared Popping language C ================================================================================ TEST alternateConfigureLibrary from config.packages.sprng(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.sprng(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default PETSc clone, checking for Sowing Checking for program /home/florian/software/bin/pdflatex...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/pdflatex...not found Checking for program /usr/local/sbin/pdflatex...not found Checking for program /usr/local/bin/pdflatex...not found Checking for program /usr/bin/pdflatex...found Defined make macro "PDFLATEX" to "/usr/bin/pdflatex" Checking for program /home/florian/software/bin/bfort...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/bfort...not found Checking for program /usr/local/sbin/bfort...not found Checking for program /usr/local/bin/bfort...not found Checking for program /usr/bin/bfort...not found Checking for program 
/usr/lib/jvm/default/bin/bfort...not found Checking for program /opt/paraview/bin/bfort...not found Checking for program /usr/bin/site_perl/bfort...not found Checking for program /usr/bin/vendor_perl/bfort...not found Checking for program /usr/bin/core_perl/bfort...not found Checking for program /home/florian/bfort...not found Checking for program /home/florian/software/petsc/bin/win32fe/bfort...not found Checking for program /home/florian/software/bin/doctext...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/doctext...not found Checking for program /usr/local/sbin/doctext...not found Checking for program /usr/local/bin/doctext...not found Checking for program /usr/bin/doctext...not found Checking for program /usr/lib/jvm/default/bin/doctext...not found Checking for program /opt/paraview/bin/doctext...not found Checking for program /usr/bin/site_perl/doctext...not found Checking for program /usr/bin/vendor_perl/doctext...not found Checking for program /usr/bin/core_perl/doctext...not found Checking for program /home/florian/doctext...not found Checking for program /home/florian/software/petsc/bin/win32fe/doctext...not found Checking for program /home/florian/software/bin/mapnames...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/mapnames...not found Checking for program /usr/local/sbin/mapnames...not found Checking for program /usr/local/bin/mapnames...not found Checking for program /usr/bin/mapnames...not found Checking for program /usr/lib/jvm/default/bin/mapnames...not found Checking for program /opt/paraview/bin/mapnames...not found Checking for program /usr/bin/site_perl/mapnames...not found Checking for program /usr/bin/vendor_perl/mapnames...not found Checking for program /usr/bin/core_perl/mapnames...not found Checking for program /home/florian/mapnames...not found Checking for program /home/florian/software/petsc/bin/win32fe/mapnames...not found Checking for program /home/florian/software/bin/bib2html...not found Checking 
for program /home/florian/.gem/ruby/2.4.0/bin/bib2html...not found Checking for program /usr/local/sbin/bib2html...not found Checking for program /usr/local/bin/bib2html...not found Checking for program /usr/bin/bib2html...not found Checking for program /usr/lib/jvm/default/bin/bib2html...not found Checking for program /opt/paraview/bin/bib2html...not found Checking for program /usr/bin/site_perl/bib2html...not found Checking for program /usr/bin/vendor_perl/bib2html...not found Checking for program /usr/bin/core_perl/bib2html...not found Checking for program /home/florian/bib2html...not found Checking for program /home/florian/software/petsc/bin/win32fe/bib2html...not found Bfort not found. Installing sowing for FortranStubs Pushing language C ================================================================================ TEST configureLibrary from config.packages.sowing(/home/florian/software/petsc/config/BuildSystem/config/package.py:679) TESTING: configureLibrary from config.packages.sowing(config/BuildSystem/config/package.py:679) Find an installation and check if it can work with PETSc ================================================================================== Checking for a functional sowing Looking for SOWING at git.sowing, hg.sowing or a directory starting with ['sowing'] Could not locate an existing copy of SOWING: [] Downloading sowing =============================================================================== Trying to download git://https://bitbucket.org/petsc/pkg-sowing.git for SOWING =============================================================================== Executing: git clone https://bitbucket.org/petsc/pkg-sowing.git /home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing Looking for SOWING at git.sowing, hg.sowing or a directory starting with ['sowing'] Found a copy of SOWING in git.sowing Executing: ['git', 'rev-parse', '--git-dir'] stdout: .git Executing: ['git', 'cat-file', '-e', 'v1.1.20-pre2^{commit}'] 
Executing: ['git', 'rev-parse', 'v1.1.20-pre2'] stdout: ccefa3bdb30a16ad32f3a5117ce4149d4ed0e738 Executing: ['git', 'stash'] stdout: No local changes to save Executing: ['git', 'clean', '-f', '-d', '-x'] Executing: ['git', 'checkout', '-f', 'ccefa3bdb30a16ad32f3a5117ce4149d4ed0e738'] Have to rebuild SOWING, /home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/sowing.petscconf != /home/florian/software/petsc/arch-linux2-c-debug/lib/petsc/conf/pkg.conf.sowing =============================================================================== Running configure on SOWING; this may take several minutes =============================================================================== Executing: cd /home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing && ./configure --prefix=/home/florian/software/petsc/arch-linux2-c-debug stdout: checking for ranlib... ranlib checking for a BSD-compatible install... /usr/bin/install -c checking whether install works... yes checking for ar... ar checking for gcc... gcc checking whether the C compiler works... yes checking for C compiler default output file name... a.out checking for suffix of executables... checking whether we are cross compiling... no checking for suffix of object files... o checking whether we are using the GNU C compiler... yes checking whether gcc accepts -g... yes checking for gcc option to accept ISO C89... none needed checking for c++... c++ checking whether we are using the GNU C++ compiler... yes checking whether c++ accepts -g... yes checking for virtual path format... VPATH checking for latex... /usr/bin/latex checking for gs... /usr/bin/gs checking for pnmcrop... no checking for pbmtoxbm... no checking for ppmtogif... no checking for pnmquant... no checking for perl... /usr/bin/perl checking how to run the C preprocessor... gcc -E checking for grep that handles long lines and -e... /usr/bin/grep checking for egrep... /usr/bin/grep -E checking for ANSI C header files... 
yes checking for sys/types.h... yes checking for sys/stat.h... yes checking for stdlib.h... yes checking for string.h... yes checking for memory.h... yes checking for strings.h... yes checking for inttypes.h... yes checking for stdint.h... yes checking for unistd.h... yes checking fcntl.h usability... yes checking fcntl.h presence... yes checking for fcntl.h... yes checking sys/time.h usability... yes checking sys/time.h presence... yes checking for sys/time.h... yes checking for unistd.h... (cached) yes checking pwd.h usability... yes checking pwd.h presence... yes checking for pwd.h... yes checking for stdlib.h... (cached) yes checking netdb.h usability... yes checking netdb.h presence... yes checking for netdb.h... yes checking for string.h... (cached) yes checking for an ANSI C-conforming const... yes checking for C/C++ restrict keyword... __restrict checking for uid_t in sys/types.h... yes checking for size_t... yes checking whether time.h and sys/time.h may both be included... yes checking whether struct tm is in sys/time.h or time.h... time.h checking size of void *... 8 checking size of int... 4 checking size of long... 8 checking size of long long... 8 checking for vprintf... yes checking for _doprnt... no checking for getcwd... yes checking for gethostname... yes checking for getwd... yes checking for mkdir... yes checking that mkdir accepts -p... yes checking for uname... yes checking for gethostbyname... yes checking how to run the C++ preprocessor... c++ -E checking time.h usability... yes checking time.h presence... yes checking for time.h... yes checking sys/param.h usability... yes checking sys/param.h presence... yes checking for sys/param.h... yes checking for realpath... yes checking for readlink... 
yes configure: creating ./config.status config.status: creating Makefile config.status: creating Makerules config.status: creating src/Makefile config.status: creating src/sys/Makefile config.status: creating src/sys/testing/Makefile config.status: creating src/tohtml/Makefile config.status: creating src/tohtml/tohtmlpath.h config.status: creating src/tohtml/testing/Makefile config.status: creating bin/pstoxbm config.status: creating bin/pstogif config.status: creating bin/bib2html config.status: creating src/bfort/Makefile config.status: creating src/bfort/testing/Makefile config.status: creating src/textfilt/Makefile config.status: creating src/doctext/Makefile config.status: creating src/doctext/docpath.h config.status: creating src/doctext/test/Makefile config.status: creating src/mapnames/Makefile config.status: creating src/bib2html/Makefile config.status: creating docs/Makefile config.status: creating docs/doctext/Makefile config.status: creating include/patchlevel.h config.status: creating include/textfilt/textpath.h config.status: creating include/sowingconfig.h config.status: executing bib2html commands =============================================================================== Running make on SOWING; this may take several minutes =============================================================================== Executing: cd /home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing && /usr/bin/make clean stdout: for dir in src docs ; do ( cd $dir && /usr/bin/make clean ) ; done make[1]: Entering directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src' for dir in sys bfort tohtml doctext textfilt mapnames bib2html ; do ( cd $dir ; /usr/bin/make clean ) ; done make[2]: Entering directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/sys' rm -f *.o *~ make[2]: Leaving directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/sys' 
make[2]: Entering directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/bfort' /bin/rm -f *.o *~ bfort /bin/rm -f bfort\ win32/debug/* make[2]: Leaving directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/bfort' make[2]: Entering directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/tohtml' /bin/rm -f *.o *~ tohtml tortf /bin/rm -f tohtml\ win32/debug/* (cd testing && /usr/bin/make clean ) make[3]: Entering directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/tohtml/testing' rm -rf test[1-9] test1[0-9] test2[0-9] rm -f test[1-9].html rm -f latex.err *.hux img*.xbm img*.gif rm -f up.gif previous.gif next.gif rm -f test1[0-9].html test2[0-9].html test7a.html rm -f test[0-9].htm test[1-2][0-9].htm rm -f testf1.ps testf1.gif rm -f inplace subfiles rm -f *.ler *.aux *.out make[3]: Leaving directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/tohtml/testing' make[2]: Leaving directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/tohtml' make[2]: Entering directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/doctext' /bin/rm -f *.o *~ doctext doc2lt (cd test ; if [ -s Makefile ] ; then /usr/bin/make clean ; fi ) make[3]: Entering directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/doctext/test' rm -f *.o *~ *.3 *.2 *.html *.tex f1.cit make[3]: Leaving directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/doctext/test' /bin/rm -f doctext\ win32/debug/* make[2]: Leaving directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/doctext' make[2]: Entering directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/textfilt' /bin/rm -f *.o *~ make[2]: Leaving directory 
'/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/textfilt' make[2]: Entering directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/mapnames' /bin/rm -f *.o *~ mapnames ccc make[2]: Leaving directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/mapnames' make[2]: Entering directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/bib2html' rm -f tout.htm tout-bib.htm make[2]: Leaving directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/bib2html' make[1]: Leaving directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src' make[1]: Entering directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/docs' rm -f bfort.ps tohtml.ps install.ps doctext.ps \ bfort.pdf tohtml.pdf install.pdf doctext.pdf \ *.aux *.dvi *.toc *.log *.fn *.hux *.err *.blg *.bbl (cd doctext&& /usr/bin/make clean) make[2]: Entering directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/docs/doctext' rm -f *.fn *.aux *.blg *.toc *.lof *.lot *.dvi *.fns *.bbl *.log \ *.err *.hux doctext.ps doctext.pdf make[2]: Leaving directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/docs/doctext' make[1]: Leaving directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/docs' /bin/rm -f lib/libsowing.a lib/libtfilter.a Executing: cd /home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing && /usr/bin/make stdout: (cd src/sys && /usr/bin/make ) make[1]: Entering directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/sys' gcc -I../../include -I../../include -c arch.c gcc -I../../include -I../../include -c txt.c gcc -I../../include -I../../include -c daytime.c gcc -I../../include -I../../include -c file.c gcc -I../../include 
-I../../include -c tr.c gcc -I../../include -I../../include -c getopts.c gcc -I../../include -I../../include -c rdconfig.c ar cr ../../lib/libsowing.a arch.o txt.o daytime.o file.o tr.o getopts.o rdconfig.o ranlib ../../lib/libsowing.a make[1]: Leaving directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/sys' (cd src/tohtml && /usr/bin/make ) make[1]: Entering directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/tohtml' gcc -I../../include -I. -I../../include -c tohtml.c gcc -I../../include -I. -I../../include -c tex2html.c gcc -I../../include -I. -I../../include -c search.c gcc -I../../include -I. -I../../include -c texactio.c gcc -I../../include -I. -I../../include -c rdaux.c gcc -I../../include -I. -I../../include -c rdindx.c gcc -I../../include -I. -I../../include -c label.c gcc -I../../include -I. -I../../include -c scan.c gcc -I../../include -I. -I../../include -c refmap.c gcc -I../../include -I. -I../../include -c style.c gcc -I../../include -I. -I../../include -c dimen.c gcc -I../../include -I. -I../../include -c userdef.c gcc -I../../include -I. -I../../include -c tabular.c gcc -I../../include -I. -I../../include -c biblio.c gcc -I../../include -I. -I../../include -c environ.c gcc -I../../include -I. -I../../include -c math.c gcc -I../../include -I. -I../../include -c rddefs.c gcc -I../../include -I. -I../../include -c latexinfo.c gcc -I../../include -I. -I../../include -c accent.c gcc -I../../include -I. 
-I../../include -c simpleif.c gcc -o tohtml tohtml.o tex2html.o search.o texactio.o rdaux.o rdindx.o label.o scan.o refmap.o style.o dimen.o userdef.o tabular.o biblio.o environ.o math.o rddefs.o latexinfo.o accent.o simpleif.o ../../lib/libsowing.a make[1]: Leaving directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/tohtml' (cd src/bfort && /usr/bin/make ) make[1]: Entering directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/bfort' gcc -DBASEDEF='"/home/florian/software/petsc/arch-linux2-c-debug/share/bfort-base.txt"' -DBASEPATH='"/home/florian/software/petsc/arch-linux2-c-debug/share"' -I../../include -I../../include -c bfort.c gcc -DBASEDEF='"/home/florian/software/petsc/arch-linux2-c-debug/share/bfort-base.txt"' -DBASEPATH='"/home/florian/software/petsc/arch-linux2-c-debug/share"' -I../../include -I../../include -c doc.c gcc -o bfort bfort.o doc.o ../../lib/libsowing.a make[1]: Leaving directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/bfort' (cd src/textfilt && /usr/bin/make ) make[1]: Entering directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/textfilt' c++ -I../../include/textfilt -I../../include -I../../include -I../../include/textfilt -c cmdline.cc c++ -I../../include/textfilt -I../../include -I../../include -I../../include/textfilt -c file.cc c++ -I../../include/textfilt -I../../include -I../../include -I../../include/textfilt -c instream.cc c++ -I../../include/textfilt -I../../include -I../../include -I../../include/textfilt -c outstream.cc c++ -I../../include/textfilt -I../../include -I../../include -I../../include/textfilt -c search.cc c++ -I../../include/textfilt -I../../include -I../../include -I../../include/textfilt -c maptok.cc c++ -I../../include/textfilt -I../../include -I../../include -I../../include/textfilt -c textout.cc c++ -I../../include/textfilt -I../../include -I../../include 
-I../../include/textfilt -c texthtml.cc c++ -I../../include/textfilt -I../../include -I../../include -I../../include/textfilt -c textnroff.cc c++ -I../../include/textfilt -I../../include -I../../include -I../../include/textfilt -c texttex.cc c++ -I../../include/textfilt -I../../include -I../../include -I../../include/textfilt -c inutil.cc c++ -I../../include/textfilt -I../../include -I../../include -I../../include/textfilt -c errhand.cc ar cr ../../lib/libtfilter.a cmdline.o file.o instream.o outstream.o search.o maptok.o textout.o texthtml.o textnroff.o texttex.o inutil.o errhand.o ranlib ../../lib/libtfilter.a make[1]: Leaving directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/textfilt' (cd src/doctext && /usr/bin/make ) make[1]: Entering directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/doctext' c++ -I../../include/textfilt -I../../include -I. -I../../include -c doctext.cc c++ -I../../include/textfilt -I../../include -I. -I../../include -c docutil.cc c++ -I../../include/textfilt -I../../include -I. -I../../include -c keyword.cc c++ -I../../include/textfilt -I../../include -I. -I../../include -c dotfmat.cc c++ -I../../include/textfilt -I../../include -I. -I../../include -c incfiles.cc c++ -I../../include/textfilt -I../../include -I. -I../../include -c quotefmt.cc c++ -I../../include/textfilt -I../../include -I. -I../../include -c textb.cc c++ -I../../include/textfilt -I../../include -I. -I../../include -c docfields.cc c++ -o doctext doctext.o docutil.o keyword.o dotfmat.o \ incfiles.o quotefmt.o textb.o docfields.o ../../lib/libtfilter.a c++ -I../../include/textfilt -I../../include -I. 
-I../../include -c doc2lt.cc c++ -o doc2lt doc2lt.o docutil.o docfields.o ../../lib/libtfilter.a make[1]: Leaving directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/doctext' (cd src/textfilt && /usr/bin/make ) make[1]: Entering directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/textfilt' make[1]: Nothing to be done for 'ALL'. make[1]: Leaving directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/textfilt' (cd src/mapnames && /usr/bin/make ) make[1]: Entering directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/mapnames' c++ -I../../include/textfilt -I../../include -c mapnames.cc c++ -o mapnames mapnames.o ../../lib/libtfilter.a make[1]: Leaving directory '/home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing/src/mapnames' =============================================================================== Running make install on SOWING; this may take several minutes =============================================================================== Executing: cd /home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing && /usr/bin/make install stdout: /usr/bin/install -c bin/bib2html /home/florian/software/petsc/arch-linux2-c-debug/bin/bib2html /usr/bin/install -c src/doctext/doctext /home/florian/software/petsc/arch-linux2-c-debug/bin/doctext /usr/bin/install -c src/doctext/doc2lt /home/florian/software/petsc/arch-linux2-c-debug/bin/doc2lt /usr/bin/install -c src/tohtml/tohtml /home/florian/software/petsc/arch-linux2-c-debug/bin/tohtml if [ "`cd bin && pwd`" != "`cd /home/florian/software/petsc/arch-linux2-c-debug/bin && pwd`" ] ; then \ /usr/bin/install -c bin/pstoxbm /home/florian/software/petsc/arch-linux2-c-debug/bin/pstoxbm ; \ /usr/bin/install -c bin/pstogif /home/florian/software/petsc/arch-linux2-c-debug/bin/pstogif ; \ fi /usr/bin/install -c src/bfort/bfort 
/home/florian/software/petsc/arch-linux2-c-debug/bin/bfort /usr/bin/install -c src/mapnames/mapnames /home/florian/software/petsc/arch-linux2-c-debug/bin/mapnames if [ "`cd ./share && pwd`" != "`cd /home/florian/software/petsc/arch-linux2-c-debug/share && pwd`" ] ; then \ /usr/bin/install -c -m 644 ./share/pstoppm.ps /home/florian/software/petsc/arch-linux2-c-debug/share/pstoppm.ps ;\ /usr/bin/install -c -m 644 ./share/basedefs.txt /home/florian/software/petsc/arch-linux2-c-debug/share/basedefs.txt ;\ /usr/bin/install -c -m 644 ./share/blueball.gif /home/florian/software/petsc/arch-linux2-c-debug/share/blueball.gif ;\ /usr/bin/install -c -m 644 ./share/greenball.gif /home/florian/software/petsc/arch-linux2-c-debug/share/greenball.gif ;\ /usr/bin/install -c -m 644 ./share/purpleball.gif /home/florian/software/petsc/arch-linux2-c-debug/share/purpleball.gif ;\ /usr/bin/install -c -m 644 ./share/redball.gif /home/florian/software/petsc/arch-linux2-c-debug/share/redball.gif ;\ /usr/bin/install -c -m 644 ./share/yellowball.gif /home/florian/software/petsc/arch-linux2-c-debug/share/yellowball.gif ;\ /usr/bin/install -c -m 644 ./share/next.xbm /home/florian/software/petsc/arch-linux2-c-debug/share/next.xbm ;\ /usr/bin/install -c -m 644 ./share/up.xbm /home/florian/software/petsc/arch-linux2-c-debug/share/up.xbm ;\ /usr/bin/install -c -m 644 ./share/previous.xbm /home/florian/software/petsc/arch-linux2-c-debug/share/previous.xbm ;\ /usr/bin/install -c -m 644 ./share/next.gif /home/florian/software/petsc/arch-linux2-c-debug/share/next.gif ;\ /usr/bin/install -c -m 644 ./share/up.gif /home/florian/software/petsc/arch-linux2-c-debug/share/up.gif ;\ /usr/bin/install -c -m 644 ./share/previous.gif /home/florian/software/petsc/arch-linux2-c-debug/share/previous.gif ;\ /usr/bin/install -c -m 644 ./share/html.def /home/florian/software/petsc/arch-linux2-c-debug/share/html.def ;\ /usr/bin/install -c -m 644 ./share/latex.def 
/home/florian/software/petsc/arch-linux2-c-debug/share/latex.def ;\ /usr/bin/install -c -m 644 ./share/nroff.def /home/florian/software/petsc/arch-linux2-c-debug/share/nroff.def ;\ /usr/bin/install -c -m 644 ./share/refman.def /home/florian/software/petsc/arch-linux2-c-debug/share/refman.def ;\ /usr/bin/install -c -m 644 ./share/refman.sty /home/florian/software/petsc/arch-linux2-c-debug/share/refman.sty ;\ /usr/bin/install -c -m 644 ./share/doctext/html.def /home/florian/software/petsc/arch-linux2-c-debug/share/doctext/html.def ;\ /usr/bin/install -c -m 644 ./share/doctext/htmlcolor.def /home/florian/software/petsc/arch-linux2-c-debug/share/doctext/htmlcolor.def ;\ /usr/bin/install -c -m 644 ./share/doctext/htmltabl.def /home/florian/software/petsc/arch-linux2-c-debug/share/doctext/htmltabl.def ;\ /usr/bin/install -c -m 644 ./share/doctext/htmlargtbl.def /home/florian/software/petsc/arch-linux2-c-debug/share/doctext/htmlargtbl.def ;\ /usr/bin/install -c -m 644 ./share/doctext/latex.def /home/florian/software/petsc/arch-linux2-c-debug/share/doctext/latex.def ;\ /usr/bin/install -c -m 644 ./share/doctext/latexargtbl.def /home/florian/software/petsc/arch-linux2-c-debug/share/doctext/latexargtbl.def ;\ /usr/bin/install -c -m 644 ./share/doctext/nroff.def /home/florian/software/petsc/arch-linux2-c-debug/share/doctext/nroff.def ;\ fi if [ "`cd ./man/man1 && pwd`" != "`cd /home/florian/software/petsc/arch-linux2-c-debug/share/man/man1 && pwd`" ] ; then \ /usr/bin/install -c -m 644 ./man/man1/tohtml.1 /home/florian/software/petsc/arch-linux2-c-debug/share/man/man1/tohtml.1 ;\ /usr/bin/install -c -m 644 ./man/man1/doctext.1 /home/florian/software/petsc/arch-linux2-c-debug/share/man/man1/doctext.1 ;\ /usr/bin/install -c -m 644 ./man/man1/bfort.1 /home/florian/software/petsc/arch-linux2-c-debug/share/man/man1/bfort.1 ;\ fi ********Output of running make on SOWING follows ******* checking for ranlib... ranlib checking for a BSD-compatible install... 
/usr/bin/install -c
../../lib/libsowing.a(file.o): In function `SYOpenWritableFile': file.c:(.text+0xc1f): warning: the use of `mktemp' is dangerous, better use `mkstemp' or `mkdtemp'
********End of Output of running make on SOWING ******* Not checking for library in Download SOWING: [] because no functions given to check for 
================================================================================ TEST check from config.libraries(/home/florian/software/petsc/config/BuildSystem/config/libraries.py:146) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:146) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names No functions to check for in library [] [] Checking for headers Download SOWING: ['/home/florian/software/petsc/arch-linux2-c-debug/include', '/usr/include', '/usr/lib/openmpi'] ================================================================================ TEST checkSharedLibrary from config.packages.sowing(/home/florian/software/petsc/config/BuildSystem/config/package.py:738) TESTING: checkSharedLibrary from config.packages.sowing(config/BuildSystem/config/package.py:738) By default we don't care about checking if the library is shared Popping language C Checking for program /home/florian/software/petsc/arch-linux2-c-debug/bin/bfort...found Defined make macro "BFORT" to "/home/florian/software/petsc/arch-linux2-c-debug/bin/bfort" Checking for program /home/florian/software/petsc/arch-linux2-c-debug/bin/doctext...found Defined make macro "DOCTEXT" to "/home/florian/software/petsc/arch-linux2-c-debug/bin/doctext" Checking for program /home/florian/software/petsc/arch-linux2-c-debug/bin/mapnames...found Defined make macro "MAPNAMES" to "/home/florian/software/petsc/arch-linux2-c-debug/bin/mapnames" Checking for program /home/florian/software/petsc/arch-linux2-c-debug/bin/bib2html...found Defined make macro "BIB2HTML" to "/home/florian/software/petsc/arch-linux2-c-debug/bin/bib2html" Running /home/florian/software/petsc/arch-linux2-c-debug/bin/bfort to generate fortran stubs ================================================================================ TEST alternateConfigureLibrary from 
config.packages.saws(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.saws(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.revolve(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.revolve(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default Pushing language C ================================================================================ TEST configureLibrary from config.packages.pthread(/home/florian/software/petsc/config/BuildSystem/config/packages/pthread.py:19) TESTING: configureLibrary from config.packages.pthread(config/BuildSystem/config/packages/pthread.py:19) Checks for pthread_barrier_t, cpu_set_t, and sys/sysctl.h ================================================================================== Checking for a functional pthread Checking for library in Compiler specific search PTHREAD: [] ================================================================================ TEST check from config.libraries(/home/florian/software/petsc/config/BuildSystem/config/libraries.py:146) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:146) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for functions [pthread_create] in library [] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails 
-I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char pthread_create(); static void _check_pthread_create() { pthread_create(); } int main() { _check_pthread_create();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Popping language C Checking for headers Compiler specific search PTHREAD: ['/usr/include', '/usr/lib/openmpi'] Pushing language C ================================================================================ TEST checkInclude from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:86) TESTING: checkInclude from config.headers(config/BuildSystem/config/headers.py:86) Checks if a particular include file can be 
found along particular include paths Checking for header files ['pthread.h'] in ['/usr/include', '/usr/lib/openmpi'] Checking include with compiler flags var CPPFLAGS ['/usr/include', '/usr/lib/openmpi'] Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.headers -I/usr/include -I/usr/lib/openmpi /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/pthread.h" 1 3 4 # 21 "/usr/include/pthread.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 22 "/usr/include/pthread.h" 2 3 4 # 1 "/usr/include/endian.h" 1 3 4 # 36 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/endian.h" 1 3 4 # 37 "/usr/include/endian.h" 2 3 4 # 60 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/byteswap.h" 1 3 4 # 27 "/usr/include/bits/byteswap.h" 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 # 30 "/usr/include/bits/types.h" 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long 
int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 28 "/usr/include/bits/byteswap.h" 2 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 29 "/usr/include/bits/byteswap.h" 2 3 4 # 1 "/usr/include/bits/byteswap-16.h" 1 3 4 # 36 "/usr/include/bits/byteswap.h" 2 3 4 # 44 "/usr/include/bits/byteswap.h" 3 4 static __inline unsigned int __bswap_32 (unsigned int __bsx) { 
return __builtin_bswap32 (__bsx); } # 108 "/usr/include/bits/byteswap.h" 3 4 static __inline __uint64_t __bswap_64 (__uint64_t __bsx) { return __builtin_bswap64 (__bsx); } # 61 "/usr/include/endian.h" 2 3 4 # 23 "/usr/include/pthread.h" 2 3 4 # 1 "/usr/include/sched.h" 1 3 4 # 28 "/usr/include/sched.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 29 "/usr/include/sched.h" 2 3 4 # 1 "/usr/include/time.h" 1 3 4 # 73 "/usr/include/time.h" 3 4 typedef __time_t time_t; # 120 "/usr/include/time.h" 3 4 struct timespec { __time_t tv_sec; __syscall_slong_t tv_nsec; }; # 35 "/usr/include/sched.h" 2 3 4 typedef __pid_t pid_t; # 1 "/usr/include/bits/sched.h" 1 3 4 # 73 "/usr/include/bits/sched.h" 3 4 struct sched_param { int __sched_priority; }; # 96 "/usr/include/bits/sched.h" 3 4 struct __sched_param { int __sched_priority; }; # 119 "/usr/include/bits/sched.h" 3 4 typedef unsigned long int __cpu_mask; typedef struct { __cpu_mask __bits[1024 / (8 * sizeof (__cpu_mask))]; } cpu_set_t; # 202 "/usr/include/bits/sched.h" 3 4 extern int __sched_cpucount (size_t __setsize, const cpu_set_t *__setp) __attribute__ ((__nothrow__ , __leaf__)); extern cpu_set_t *__sched_cpualloc (size_t __count) __attribute__ ((__nothrow__ , __leaf__)) ; extern void __sched_cpufree (cpu_set_t *__set) __attribute__ ((__nothrow__ , __leaf__)); # 44 "/usr/include/sched.h" 2 3 4 extern int sched_setparam (__pid_t __pid, const struct sched_param *__param) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_getparam (__pid_t __pid, struct sched_param *__param) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_setscheduler (__pid_t __pid, int __policy, const struct sched_param *__param) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_getscheduler (__pid_t __pid) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_yield (void) __attribute__ 
((__nothrow__ , __leaf__)); extern int sched_get_priority_max (int __algorithm) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_get_priority_min (int __algorithm) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_rr_get_interval (__pid_t __pid, struct timespec *__t) __attribute__ ((__nothrow__ , __leaf__)); # 126 "/usr/include/sched.h" 3 4 # 24 "/usr/include/pthread.h" 2 3 4 # 1 "/usr/include/time.h" 1 3 4 # 29 "/usr/include/time.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 38 "/usr/include/time.h" 2 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 42 "/usr/include/time.h" 2 3 4 # 57 "/usr/include/time.h" 3 4 typedef __clock_t clock_t; # 91 "/usr/include/time.h" 3 4 typedef __clockid_t clockid_t; # 103 "/usr/include/time.h" 3 4 typedef __timer_t timer_t; # 131 "/usr/include/time.h" 3 4 struct tm { int tm_sec; int tm_min; int tm_hour; int tm_mday; int tm_mon; int tm_year; int tm_wday; int tm_yday; int tm_isdst; long int tm_gmtoff; const char *tm_zone; }; struct itimerspec { struct timespec it_interval; struct timespec it_value; }; struct sigevent; # 186 "/usr/include/time.h" 3 4 extern clock_t clock (void) __attribute__ ((__nothrow__ , __leaf__)); extern time_t time (time_t *__timer) __attribute__ ((__nothrow__ , __leaf__)); extern double difftime (time_t __time1, time_t __time0) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern time_t mktime (struct tm *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern size_t strftime (char *__restrict __s, size_t __maxsize, const char *__restrict __format, const struct tm *__restrict __tp) __attribute__ ((__nothrow__ , __leaf__)); # 221 "/usr/include/time.h" 3 4 # 1 "/usr/include/xlocale.h" 1 3 4 # 27 "/usr/include/xlocale.h" 3 4 typedef struct __locale_struct { struct __locale_data *__locales[13]; const unsigned short int *__ctype_b; const int *__ctype_tolower; const int *__ctype_toupper; const char *__names[13]; } *__locale_t; typedef __locale_t 
locale_t; # 222 "/usr/include/time.h" 2 3 4 extern size_t strftime_l (char *__restrict __s, size_t __maxsize, const char *__restrict __format, const struct tm *__restrict __tp, __locale_t __loc) __attribute__ ((__nothrow__ , __leaf__)); # 236 "/usr/include/time.h" 3 4 extern struct tm *gmtime (const time_t *__timer) __attribute__ ((__nothrow__ , __leaf__)); extern struct tm *localtime (const time_t *__timer) __attribute__ ((__nothrow__ , __leaf__)); extern struct tm *gmtime_r (const time_t *__restrict __timer, struct tm *__restrict __tp) __attribute__ ((__nothrow__ , __leaf__)); extern struct tm *localtime_r (const time_t *__restrict __timer, struct tm *__restrict __tp) __attribute__ ((__nothrow__ , __leaf__)); extern char *asctime (const struct tm *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern char *ctime (const time_t *__timer) __attribute__ ((__nothrow__ , __leaf__)); extern char *asctime_r (const struct tm *__restrict __tp, char *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)); extern char *ctime_r (const time_t *__restrict __timer, char *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)); extern char *__tzname[2]; extern int __daylight; extern long int __timezone; extern char *tzname[2]; extern void tzset (void) __attribute__ ((__nothrow__ , __leaf__)); extern int daylight; extern long int timezone; extern int stime (const time_t *__when) __attribute__ ((__nothrow__ , __leaf__)); # 319 "/usr/include/time.h" 3 4 extern time_t timegm (struct tm *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern time_t timelocal (struct tm *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern int dysize (int __year) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 334 "/usr/include/time.h" 3 4 extern int nanosleep (const struct timespec *__requested_time, struct timespec *__remaining); extern int clock_getres (clockid_t __clock_id, struct timespec *__res) __attribute__ ((__nothrow__ , __leaf__)); extern int clock_gettime 
(clockid_t __clock_id, struct timespec *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern int clock_settime (clockid_t __clock_id, const struct timespec *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern int clock_nanosleep (clockid_t __clock_id, int __flags, const struct timespec *__req, struct timespec *__rem); extern int clock_getcpuclockid (pid_t __pid, clockid_t *__clock_id) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_create (clockid_t __clock_id, struct sigevent *__restrict __evp, timer_t *__restrict __timerid) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_delete (timer_t __timerid) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_settime (timer_t __timerid, int __flags, const struct itimerspec *__restrict __value, struct itimerspec *__restrict __ovalue) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_gettime (timer_t __timerid, struct itimerspec *__value) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_getoverrun (timer_t __timerid) __attribute__ ((__nothrow__ , __leaf__)); extern int timespec_get (struct timespec *__ts, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 430 "/usr/include/time.h" 3 4 # 25 "/usr/include/pthread.h" 2 3 4 # 1 "/usr/include/bits/pthreadtypes.h" 1 3 4 # 21 "/usr/include/bits/pthreadtypes.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 22 "/usr/include/bits/pthreadtypes.h" 2 3 4 # 60 "/usr/include/bits/pthreadtypes.h" 3 4 typedef unsigned long int pthread_t; union pthread_attr_t { char __size[56]; long int __align; }; typedef union pthread_attr_t pthread_attr_t; typedef struct __pthread_internal_list { struct __pthread_internal_list *__prev; struct __pthread_internal_list *__next; } __pthread_list_t; # 90 "/usr/include/bits/pthreadtypes.h" 3 4 typedef union { struct __pthread_mutex_s { int __lock; unsigned int __count; int __owner; unsigned int __nusers; int __kind; short __spins; short __elision; __pthread_list_t __list; 
# 125 "/usr/include/bits/pthreadtypes.h" 3 4 } __data; char __size[40]; long int __align; } pthread_mutex_t; typedef union { char __size[4]; int __align; } pthread_mutexattr_t; typedef union { struct { int __lock; unsigned int __futex; __extension__ unsigned long long int __total_seq; __extension__ unsigned long long int __wakeup_seq; __extension__ unsigned long long int __woken_seq; void *__mutex; unsigned int __nwaiters; unsigned int __broadcast_seq; } __data; char __size[48]; __extension__ long long int __align; } pthread_cond_t; typedef union { char __size[4]; int __align; } pthread_condattr_t; typedef unsigned int pthread_key_t; typedef int pthread_once_t; typedef union { struct { int __lock; unsigned int __nr_readers; unsigned int __readers_wakeup; unsigned int __writer_wakeup; unsigned int __nr_readers_queued; unsigned int __nr_writers_queued; int __writer; int __shared; signed char __rwelision; unsigned char __pad1[7]; unsigned long int __pad2; unsigned int __flags; } __data; # 220 "/usr/include/bits/pthreadtypes.h" 3 4 char __size[56]; long int __align; } pthread_rwlock_t; typedef union { char __size[8]; long int __align; } pthread_rwlockattr_t; typedef volatile int pthread_spinlock_t; typedef union { char __size[32]; long int __align; } pthread_barrier_t; typedef union { char __size[4]; int __align; } pthread_barrierattr_t; # 27 "/usr/include/pthread.h" 2 3 4 # 1 "/usr/include/bits/setjmp.h" 1 3 4 # 26 "/usr/include/bits/setjmp.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 27 "/usr/include/bits/setjmp.h" 2 3 4 typedef long int __jmp_buf[8]; # 28 "/usr/include/pthread.h" 2 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 29 "/usr/include/pthread.h" 2 3 4 enum { PTHREAD_CREATE_JOINABLE, PTHREAD_CREATE_DETACHED }; enum { PTHREAD_MUTEX_TIMED_NP, PTHREAD_MUTEX_RECURSIVE_NP, PTHREAD_MUTEX_ERRORCHECK_NP, PTHREAD_MUTEX_ADAPTIVE_NP , PTHREAD_MUTEX_NORMAL = PTHREAD_MUTEX_TIMED_NP, PTHREAD_MUTEX_RECURSIVE = PTHREAD_MUTEX_RECURSIVE_NP, PTHREAD_MUTEX_ERRORCHECK = 
PTHREAD_MUTEX_ERRORCHECK_NP, PTHREAD_MUTEX_DEFAULT = PTHREAD_MUTEX_NORMAL }; enum { PTHREAD_MUTEX_STALLED, PTHREAD_MUTEX_STALLED_NP = PTHREAD_MUTEX_STALLED, PTHREAD_MUTEX_ROBUST, PTHREAD_MUTEX_ROBUST_NP = PTHREAD_MUTEX_ROBUST }; enum { PTHREAD_PRIO_NONE, PTHREAD_PRIO_INHERIT, PTHREAD_PRIO_PROTECT }; # 114 "/usr/include/pthread.h" 3 4 enum { PTHREAD_RWLOCK_PREFER_READER_NP, PTHREAD_RWLOCK_PREFER_WRITER_NP, PTHREAD_RWLOCK_PREFER_WRITER_NONRECURSIVE_NP, PTHREAD_RWLOCK_DEFAULT_NP = PTHREAD_RWLOCK_PREFER_READER_NP }; # 155 "/usr/include/pthread.h" 3 4 enum { PTHREAD_INHERIT_SCHED, PTHREAD_EXPLICIT_SCHED }; enum { PTHREAD_SCOPE_SYSTEM, PTHREAD_SCOPE_PROCESS }; enum { PTHREAD_PROCESS_PRIVATE, PTHREAD_PROCESS_SHARED }; # 190 "/usr/include/pthread.h" 3 4 struct _pthread_cleanup_buffer { void (*__routine) (void *); void *__arg; int __canceltype; struct _pthread_cleanup_buffer *__prev; }; enum { PTHREAD_CANCEL_ENABLE, PTHREAD_CANCEL_DISABLE }; enum { PTHREAD_CANCEL_DEFERRED, PTHREAD_CANCEL_ASYNCHRONOUS }; # 228 "/usr/include/pthread.h" 3 4 extern int pthread_create (pthread_t *__restrict __newthread, const pthread_attr_t *__restrict __attr, void *(*__start_routine) (void *), void *__restrict __arg) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1, 3))); extern void pthread_exit (void *__retval) __attribute__ ((__noreturn__)); extern int pthread_join (pthread_t __th, void **__thread_return); # 271 "/usr/include/pthread.h" 3 4 extern int pthread_detach (pthread_t __th) __attribute__ ((__nothrow__ , __leaf__)); extern pthread_t pthread_self (void) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern int pthread_equal (pthread_t __thread1, pthread_t __thread2) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern int pthread_attr_init (pthread_attr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_attr_destroy (pthread_attr_t *__attr) __attribute__ ((__nothrow__ , 
__leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_attr_getdetachstate (const pthread_attr_t *__attr, int *__detachstate) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_setdetachstate (pthread_attr_t *__attr, int __detachstate) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_attr_getguardsize (const pthread_attr_t *__attr, size_t *__guardsize) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_setguardsize (pthread_attr_t *__attr, size_t __guardsize) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_attr_getschedparam (const pthread_attr_t *__restrict __attr, struct sched_param *__restrict __param) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_setschedparam (pthread_attr_t *__restrict __attr, const struct sched_param *__restrict __param) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_getschedpolicy (const pthread_attr_t *__restrict __attr, int *__restrict __policy) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_setschedpolicy (pthread_attr_t *__attr, int __policy) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_attr_getinheritsched (const pthread_attr_t *__restrict __attr, int *__restrict __inherit) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_setinheritsched (pthread_attr_t *__attr, int __inherit) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_attr_getscope (const pthread_attr_t *__restrict __attr, int *__restrict __scope) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_setscope 
(pthread_attr_t *__attr, int __scope) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_attr_getstackaddr (const pthread_attr_t *__restrict __attr, void **__restrict __stackaddr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))) __attribute__ ((__deprecated__)); extern int pthread_attr_setstackaddr (pthread_attr_t *__attr, void *__stackaddr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) __attribute__ ((__deprecated__)); extern int pthread_attr_getstacksize (const pthread_attr_t *__restrict __attr, size_t *__restrict __stacksize) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_setstacksize (pthread_attr_t *__attr, size_t __stacksize) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_attr_getstack (const pthread_attr_t *__restrict __attr, void **__restrict __stackaddr, size_t *__restrict __stacksize) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2, 3))); extern int pthread_attr_setstack (pthread_attr_t *__attr, void *__stackaddr, size_t __stacksize) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 429 "/usr/include/pthread.h" 3 4 extern int pthread_setschedparam (pthread_t __target_thread, int __policy, const struct sched_param *__param) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3))); extern int pthread_getschedparam (pthread_t __target_thread, int *__restrict __policy, struct sched_param *__restrict __param) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 3))); extern int pthread_setschedprio (pthread_t __target_thread, int __prio) __attribute__ ((__nothrow__ , __leaf__)); # 494 "/usr/include/pthread.h" 3 4 extern int pthread_once (pthread_once_t *__once_control, void (*__init_routine) (void)) __attribute__ ((__nonnull__ (1, 2))); # 506 "/usr/include/pthread.h" 
3 4 extern int pthread_setcancelstate (int __state, int *__oldstate); extern int pthread_setcanceltype (int __type, int *__oldtype); extern int pthread_cancel (pthread_t __th); extern void pthread_testcancel (void); typedef struct { struct { __jmp_buf __cancel_jmp_buf; int __mask_was_saved; } __cancel_jmp_buf[1]; void *__pad[4]; } __pthread_unwind_buf_t __attribute__ ((__aligned__)); # 540 "/usr/include/pthread.h" 3 4 struct __pthread_cleanup_frame { void (*__cancel_routine) (void *); void *__cancel_arg; int __do_it; int __cancel_type; }; # 680 "/usr/include/pthread.h" 3 4 extern void __pthread_register_cancel (__pthread_unwind_buf_t *__buf) ; # 692 "/usr/include/pthread.h" 3 4 extern void __pthread_unregister_cancel (__pthread_unwind_buf_t *__buf) ; # 733 "/usr/include/pthread.h" 3 4 extern void __pthread_unwind_next (__pthread_unwind_buf_t *__buf) __attribute__ ((__noreturn__)) __attribute__ ((__weak__)) ; struct __jmp_buf_tag; extern int __sigsetjmp (struct __jmp_buf_tag *__env, int __savemask) __attribute__ ((__nothrow__)); extern int pthread_mutex_init (pthread_mutex_t *__mutex, const pthread_mutexattr_t *__mutexattr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutex_destroy (pthread_mutex_t *__mutex) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutex_trylock (pthread_mutex_t *__mutex) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutex_lock (pthread_mutex_t *__mutex) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutex_timedlock (pthread_mutex_t *__restrict __mutex, const struct timespec *__restrict __abstime) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_mutex_unlock (pthread_mutex_t *__mutex) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutex_getprioceiling (const pthread_mutex_t * __restrict __mutex, int 
*__restrict __prioceiling) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_mutex_setprioceiling (pthread_mutex_t *__restrict __mutex, int __prioceiling, int *__restrict __old_ceiling) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 3))); extern int pthread_mutex_consistent (pthread_mutex_t *__mutex) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 806 "/usr/include/pthread.h" 3 4 extern int pthread_mutexattr_init (pthread_mutexattr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutexattr_destroy (pthread_mutexattr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutexattr_getpshared (const pthread_mutexattr_t * __restrict __attr, int *__restrict __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_mutexattr_setpshared (pthread_mutexattr_t *__attr, int __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutexattr_gettype (const pthread_mutexattr_t *__restrict __attr, int *__restrict __kind) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_mutexattr_settype (pthread_mutexattr_t *__attr, int __kind) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutexattr_getprotocol (const pthread_mutexattr_t * __restrict __attr, int *__restrict __protocol) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_mutexattr_setprotocol (pthread_mutexattr_t *__attr, int __protocol) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutexattr_getprioceiling (const pthread_mutexattr_t * __restrict __attr, int *__restrict __prioceiling) __attribute__ ((__nothrow__ , __leaf__)) 
__attribute__ ((__nonnull__ (1, 2))); extern int pthread_mutexattr_setprioceiling (pthread_mutexattr_t *__attr, int __prioceiling) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutexattr_getrobust (const pthread_mutexattr_t *__attr, int *__robustness) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_mutexattr_setrobust (pthread_mutexattr_t *__attr, int __robustness) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 888 "/usr/include/pthread.h" 3 4 extern int pthread_rwlock_init (pthread_rwlock_t *__restrict __rwlock, const pthread_rwlockattr_t *__restrict __attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlock_destroy (pthread_rwlock_t *__rwlock) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlock_rdlock (pthread_rwlock_t *__rwlock) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlock_tryrdlock (pthread_rwlock_t *__rwlock) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlock_timedrdlock (pthread_rwlock_t *__restrict __rwlock, const struct timespec *__restrict __abstime) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_rwlock_wrlock (pthread_rwlock_t *__rwlock) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlock_trywrlock (pthread_rwlock_t *__rwlock) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlock_timedwrlock (pthread_rwlock_t *__restrict __rwlock, const struct timespec *__restrict __abstime) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_rwlock_unlock (pthread_rwlock_t *__rwlock) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlockattr_init (pthread_rwlockattr_t *__attr) 
__attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlockattr_destroy (pthread_rwlockattr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlockattr_getpshared (const pthread_rwlockattr_t * __restrict __attr, int *__restrict __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_rwlockattr_setpshared (pthread_rwlockattr_t *__attr, int __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlockattr_getkind_np (const pthread_rwlockattr_t * __restrict __attr, int *__restrict __pref) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_rwlockattr_setkind_np (pthread_rwlockattr_t *__attr, int __pref) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_cond_init (pthread_cond_t *__restrict __cond, const pthread_condattr_t *__restrict __cond_attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_cond_destroy (pthread_cond_t *__cond) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_cond_signal (pthread_cond_t *__cond) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_cond_broadcast (pthread_cond_t *__cond) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_cond_wait (pthread_cond_t *__restrict __cond, pthread_mutex_t *__restrict __mutex) __attribute__ ((__nonnull__ (1, 2))); # 1000 "/usr/include/pthread.h" 3 4 extern int pthread_cond_timedwait (pthread_cond_t *__restrict __cond, pthread_mutex_t *__restrict __mutex, const struct timespec *__restrict __abstime) __attribute__ ((__nonnull__ (1, 2, 3))); extern int pthread_condattr_init (pthread_condattr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ 
((__nonnull__ (1))); extern int pthread_condattr_destroy (pthread_condattr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_condattr_getpshared (const pthread_condattr_t * __restrict __attr, int *__restrict __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_condattr_setpshared (pthread_condattr_t *__attr, int __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_condattr_getclock (const pthread_condattr_t * __restrict __attr, __clockid_t *__restrict __clock_id) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_condattr_setclock (pthread_condattr_t *__attr, __clockid_t __clock_id) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 1044 "/usr/include/pthread.h" 3 4 extern int pthread_spin_init (pthread_spinlock_t *__lock, int __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_spin_destroy (pthread_spinlock_t *__lock) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_spin_lock (pthread_spinlock_t *__lock) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_spin_trylock (pthread_spinlock_t *__lock) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_spin_unlock (pthread_spinlock_t *__lock) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_barrier_init (pthread_barrier_t *__restrict __barrier, const pthread_barrierattr_t *__restrict __attr, unsigned int __count) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_barrier_destroy (pthread_barrier_t *__barrier) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_barrier_wait (pthread_barrier_t *__barrier) 
__attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_barrierattr_init (pthread_barrierattr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_barrierattr_destroy (pthread_barrierattr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_barrierattr_getpshared (const pthread_barrierattr_t * __restrict __attr, int *__restrict __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_barrierattr_setpshared (pthread_barrierattr_t *__attr, int __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 1111 "/usr/include/pthread.h" 3 4 extern int pthread_key_create (pthread_key_t *__key, void (*__destr_function) (void *)) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_key_delete (pthread_key_t __key) __attribute__ ((__nothrow__ , __leaf__)); extern void *pthread_getspecific (pthread_key_t __key) __attribute__ ((__nothrow__ , __leaf__)); extern int pthread_setspecific (pthread_key_t __key, const void *__pointer) __attribute__ ((__nothrow__ , __leaf__)) ; extern int pthread_getcpuclockid (pthread_t __thread_id, __clockid_t *__clock_id) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); # 1145 "/usr/include/pthread.h" 3 4 extern int pthread_atfork (void (*__prepare) (void), void (*__parent) (void), void (*__child) (void)) __attribute__ ((__nothrow__ , __leaf__)); # 1159 "/usr/include/pthread.h" 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Found header files ['pthread.h'] in ['/usr/include', '/usr/lib/openmpi'] Popping language C All intermediate test results are stored in /tmp/petsc-KvGRNM/config.packages.pthread Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.pthread/conftest.o 
-I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.pthread/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.packages.pthread/conftest.c: In function 'main': /tmp/petsc-KvGRNM/config.packages.pthread/conftest.c:6:20: warning: unused variable 'a' [-Wunused-variable] pthread_barrier_t *a; ^ Source: #include "confdefs.h" #include "conffix.h" #include int main() { pthread_barrier_t *a; ; return 0; } Defined "HAVE_PTHREAD_BARRIER_T" to "1" Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.pthread/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.packages.pthread -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.pthread/conftest.c Possible ERROR while running compiler: stderr: 
/tmp/petsc-KvGRNM/config.packages.pthread/conftest.c: In function 'main': /tmp/petsc-KvGRNM/config.packages.pthread/conftest.c:6:12: warning: unused variable 'a' [-Wunused-variable] cpu_set_t *a; ^ Source: #include "confdefs.h" #include "conffix.h" #include int main() { cpu_set_t *a; ; return 0; } Defined "HAVE_SCHED_CPU_SET_T" to "1" Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/config.headers /tmp/petsc-KvGRNM/config.packages.pthread/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.packages.pthread/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.packages.pthread/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.packages.pthread/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.packages.pthread/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.packages.pthread/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.packages.pthread/conftest.c" 2 # 1 "/usr/include/sys/sysctl.h" 1 3 4 # 21 "/usr/include/sys/sysctl.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 22 "/usr/include/sys/sysctl.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 24 "/usr/include/sys/sysctl.h" 2 3 4 # 43 "/usr/include/sys/sysctl.h" 3 4 # 1 "/usr/include/linux/sysctl.h" 1 3 4 
# 25 "/usr/include/linux/sysctl.h" 3 4 # 1 "/usr/include/linux/kernel.h" 1 3 4 # 26 "/usr/include/linux/sysctl.h" 2 3 4 # 1 "/usr/include/linux/types.h" 1 3 4 # 27 "/usr/include/linux/sysctl.h" 2 3 4 struct completion; struct __sysctl_args { int *name; int nlen; void *oldval; size_t *oldlenp; void *newval; size_t newlen; unsigned long __unused[4]; }; enum { CTL_KERN=1, CTL_VM=2, CTL_NET=3, CTL_PROC=4, CTL_FS=5, CTL_DEBUG=6, CTL_DEV=7, CTL_BUS=8, CTL_ABI=9, CTL_CPU=10, CTL_ARLAN=254, CTL_S390DBF=5677, CTL_SUNRPC=7249, CTL_PM=9899, CTL_FRV=9898, }; enum { CTL_BUS_ISA=1 }; enum { INOTIFY_MAX_USER_INSTANCES=1, INOTIFY_MAX_USER_WATCHES=2, INOTIFY_MAX_QUEUED_EVENTS=3 }; enum { KERN_OSTYPE=1, KERN_OSRELEASE=2, KERN_OSREV=3, KERN_VERSION=4, KERN_SECUREMASK=5, KERN_PROF=6, KERN_NODENAME=7, KERN_DOMAINNAME=8, KERN_PANIC=15, KERN_REALROOTDEV=16, KERN_SPARC_REBOOT=21, KERN_CTLALTDEL=22, KERN_PRINTK=23, KERN_NAMETRANS=24, KERN_PPC_HTABRECLAIM=25, KERN_PPC_ZEROPAGED=26, KERN_PPC_POWERSAVE_NAP=27, KERN_MODPROBE=28, KERN_SG_BIG_BUFF=29, KERN_ACCT=30, KERN_PPC_L2CR=31, KERN_RTSIGNR=32, KERN_RTSIGMAX=33, KERN_SHMMAX=34, KERN_MSGMAX=35, KERN_MSGMNB=36, KERN_MSGPOOL=37, KERN_SYSRQ=38, KERN_MAX_THREADS=39, KERN_RANDOM=40, KERN_SHMALL=41, KERN_MSGMNI=42, KERN_SEM=43, KERN_SPARC_STOP_A=44, KERN_SHMMNI=45, KERN_OVERFLOWUID=46, KERN_OVERFLOWGID=47, KERN_SHMPATH=48, KERN_HOTPLUG=49, KERN_IEEE_EMULATION_WARNINGS=50, KERN_S390_USER_DEBUG_LOGGING=51, KERN_CORE_USES_PID=52, KERN_TAINTED=53, KERN_CADPID=54, KERN_PIDMAX=55, KERN_CORE_PATTERN=56, KERN_PANIC_ON_OOPS=57, KERN_HPPA_PWRSW=58, KERN_HPPA_UNALIGNED=59, KERN_PRINTK_RATELIMIT=60, KERN_PRINTK_RATELIMIT_BURST=61, KERN_PTY=62, KERN_NGROUPS_MAX=63, KERN_SPARC_SCONS_PWROFF=64, KERN_HZ_TIMER=65, KERN_UNKNOWN_NMI_PANIC=66, KERN_BOOTLOADER_TYPE=67, KERN_RANDOMIZE=68, KERN_SETUID_DUMPABLE=69, KERN_SPIN_RETRY=70, KERN_ACPI_VIDEO_FLAGS=71, KERN_IA64_UNALIGNED=72, KERN_COMPAT_LOG=73, KERN_MAX_LOCK_DEPTH=74, KERN_NMI_WATCHDOG=75, KERN_PANIC_ON_NMI=76, 
KERN_PANIC_ON_WARN=77, }; enum { VM_UNUSED1=1, VM_UNUSED2=2, VM_UNUSED3=3, VM_UNUSED4=4, VM_OVERCOMMIT_MEMORY=5, VM_UNUSED5=6, VM_UNUSED7=7, VM_UNUSED8=8, VM_UNUSED9=9, VM_PAGE_CLUSTER=10, VM_DIRTY_BACKGROUND=11, VM_DIRTY_RATIO=12, VM_DIRTY_WB_CS=13, VM_DIRTY_EXPIRE_CS=14, VM_NR_PDFLUSH_THREADS=15, VM_OVERCOMMIT_RATIO=16, VM_PAGEBUF=17, VM_HUGETLB_PAGES=18, VM_SWAPPINESS=19, VM_LOWMEM_RESERVE_RATIO=20, VM_MIN_FREE_KBYTES=21, VM_MAX_MAP_COUNT=22, VM_LAPTOP_MODE=23, VM_BLOCK_DUMP=24, VM_HUGETLB_GROUP=25, VM_VFS_CACHE_PRESSURE=26, VM_LEGACY_VA_LAYOUT=27, VM_SWAP_TOKEN_TIMEOUT=28, VM_DROP_PAGECACHE=29, VM_PERCPU_PAGELIST_FRACTION=30, VM_ZONE_RECLAIM_MODE=31, VM_MIN_UNMAPPED=32, VM_PANIC_ON_OOM=33, VM_VDSO_ENABLED=34, VM_MIN_SLAB=35, }; enum { NET_CORE=1, NET_ETHER=2, NET_802=3, NET_UNIX=4, NET_IPV4=5, NET_IPX=6, NET_ATALK=7, NET_NETROM=8, NET_AX25=9, NET_BRIDGE=10, NET_ROSE=11, NET_IPV6=12, NET_X25=13, NET_TR=14, NET_DECNET=15, NET_ECONET=16, NET_SCTP=17, NET_LLC=18, NET_NETFILTER=19, NET_DCCP=20, NET_IRDA=412, }; enum { RANDOM_POOLSIZE=1, RANDOM_ENTROPY_COUNT=2, RANDOM_READ_THRESH=3, RANDOM_WRITE_THRESH=4, RANDOM_BOOT_ID=5, RANDOM_UUID=6 }; enum { PTY_MAX=1, PTY_NR=2 }; enum { BUS_ISA_MEM_BASE=1, BUS_ISA_PORT_BASE=2, BUS_ISA_PORT_SHIFT=3 }; enum { NET_CORE_WMEM_MAX=1, NET_CORE_RMEM_MAX=2, NET_CORE_WMEM_DEFAULT=3, NET_CORE_RMEM_DEFAULT=4, NET_CORE_MAX_BACKLOG=6, NET_CORE_FASTROUTE=7, NET_CORE_MSG_COST=8, NET_CORE_MSG_BURST=9, NET_CORE_OPTMEM_MAX=10, NET_CORE_HOT_LIST_LENGTH=11, NET_CORE_DIVERT_VERSION=12, NET_CORE_NO_CONG_THRESH=13, NET_CORE_NO_CONG=14, NET_CORE_LO_CONG=15, NET_CORE_MOD_CONG=16, NET_CORE_DEV_WEIGHT=17, NET_CORE_SOMAXCONN=18, NET_CORE_BUDGET=19, NET_CORE_AEVENT_ETIME=20, NET_CORE_AEVENT_RSEQTH=21, NET_CORE_WARNINGS=22, }; enum { NET_UNIX_DESTROY_DELAY=1, NET_UNIX_DELETE_DELAY=2, NET_UNIX_MAX_DGRAM_QLEN=3, }; enum { NET_NF_CONNTRACK_MAX=1, NET_NF_CONNTRACK_TCP_TIMEOUT_SYN_SENT=2, NET_NF_CONNTRACK_TCP_TIMEOUT_SYN_RECV=3, 
NET_NF_CONNTRACK_TCP_TIMEOUT_ESTABLISHED=4, NET_NF_CONNTRACK_TCP_TIMEOUT_FIN_WAIT=5, NET_NF_CONNTRACK_TCP_TIMEOUT_CLOSE_WAIT=6, NET_NF_CONNTRACK_TCP_TIMEOUT_LAST_ACK=7, NET_NF_CONNTRACK_TCP_TIMEOUT_TIME_WAIT=8, NET_NF_CONNTRACK_TCP_TIMEOUT_CLOSE=9, NET_NF_CONNTRACK_UDP_TIMEOUT=10, NET_NF_CONNTRACK_UDP_TIMEOUT_STREAM=11, NET_NF_CONNTRACK_ICMP_TIMEOUT=12, NET_NF_CONNTRACK_GENERIC_TIMEOUT=13, NET_NF_CONNTRACK_BUCKETS=14, NET_NF_CONNTRACK_LOG_INVALID=15, NET_NF_CONNTRACK_TCP_TIMEOUT_MAX_RETRANS=16, NET_NF_CONNTRACK_TCP_LOOSE=17, NET_NF_CONNTRACK_TCP_BE_LIBERAL=18, NET_NF_CONNTRACK_TCP_MAX_RETRANS=19, NET_NF_CONNTRACK_SCTP_TIMEOUT_CLOSED=20, NET_NF_CONNTRACK_SCTP_TIMEOUT_COOKIE_WAIT=21, NET_NF_CONNTRACK_SCTP_TIMEOUT_COOKIE_ECHOED=22, NET_NF_CONNTRACK_SCTP_TIMEOUT_ESTABLISHED=23, NET_NF_CONNTRACK_SCTP_TIMEOUT_SHUTDOWN_SENT=24, NET_NF_CONNTRACK_SCTP_TIMEOUT_SHUTDOWN_RECD=25, NET_NF_CONNTRACK_SCTP_TIMEOUT_SHUTDOWN_ACK_SENT=26, NET_NF_CONNTRACK_COUNT=27, NET_NF_CONNTRACK_ICMPV6_TIMEOUT=28, NET_NF_CONNTRACK_FRAG6_TIMEOUT=29, NET_NF_CONNTRACK_FRAG6_LOW_THRESH=30, NET_NF_CONNTRACK_FRAG6_HIGH_THRESH=31, NET_NF_CONNTRACK_CHECKSUM=32, }; enum { NET_IPV4_FORWARD=8, NET_IPV4_DYNADDR=9, NET_IPV4_CONF=16, NET_IPV4_NEIGH=17, NET_IPV4_ROUTE=18, NET_IPV4_FIB_HASH=19, NET_IPV4_NETFILTER=20, NET_IPV4_TCP_TIMESTAMPS=33, NET_IPV4_TCP_WINDOW_SCALING=34, NET_IPV4_TCP_SACK=35, NET_IPV4_TCP_RETRANS_COLLAPSE=36, NET_IPV4_DEFAULT_TTL=37, NET_IPV4_AUTOCONFIG=38, NET_IPV4_NO_PMTU_DISC=39, NET_IPV4_TCP_SYN_RETRIES=40, NET_IPV4_IPFRAG_HIGH_THRESH=41, NET_IPV4_IPFRAG_LOW_THRESH=42, NET_IPV4_IPFRAG_TIME=43, NET_IPV4_TCP_MAX_KA_PROBES=44, NET_IPV4_TCP_KEEPALIVE_TIME=45, NET_IPV4_TCP_KEEPALIVE_PROBES=46, NET_IPV4_TCP_RETRIES1=47, NET_IPV4_TCP_RETRIES2=48, NET_IPV4_TCP_FIN_TIMEOUT=49, NET_IPV4_IP_MASQ_DEBUG=50, NET_TCP_SYNCOOKIES=51, NET_TCP_STDURG=52, NET_TCP_RFC1337=53, NET_TCP_SYN_TAILDROP=54, NET_TCP_MAX_SYN_BACKLOG=55, NET_IPV4_LOCAL_PORT_RANGE=56, NET_IPV4_ICMP_ECHO_IGNORE_ALL=57, 
NET_IPV4_ICMP_ECHO_IGNORE_BROADCASTS=58, NET_IPV4_ICMP_SOURCEQUENCH_RATE=59, NET_IPV4_ICMP_DESTUNREACH_RATE=60, NET_IPV4_ICMP_TIMEEXCEED_RATE=61, NET_IPV4_ICMP_PARAMPROB_RATE=62, NET_IPV4_ICMP_ECHOREPLY_RATE=63, NET_IPV4_ICMP_IGNORE_BOGUS_ERROR_RESPONSES=64, NET_IPV4_IGMP_MAX_MEMBERSHIPS=65, NET_TCP_TW_RECYCLE=66, NET_IPV4_ALWAYS_DEFRAG=67, NET_IPV4_TCP_KEEPALIVE_INTVL=68, NET_IPV4_INET_PEER_THRESHOLD=69, NET_IPV4_INET_PEER_MINTTL=70, NET_IPV4_INET_PEER_MAXTTL=71, NET_IPV4_INET_PEER_GC_MINTIME=72, NET_IPV4_INET_PEER_GC_MAXTIME=73, NET_TCP_ORPHAN_RETRIES=74, NET_TCP_ABORT_ON_OVERFLOW=75, NET_TCP_SYNACK_RETRIES=76, NET_TCP_MAX_ORPHANS=77, NET_TCP_MAX_TW_BUCKETS=78, NET_TCP_FACK=79, NET_TCP_REORDERING=80, NET_TCP_ECN=81, NET_TCP_DSACK=82, NET_TCP_MEM=83, NET_TCP_WMEM=84, NET_TCP_RMEM=85, NET_TCP_APP_WIN=86, NET_TCP_ADV_WIN_SCALE=87, NET_IPV4_NONLOCAL_BIND=88, NET_IPV4_ICMP_RATELIMIT=89, NET_IPV4_ICMP_RATEMASK=90, NET_TCP_TW_REUSE=91, NET_TCP_FRTO=92, NET_TCP_LOW_LATENCY=93, NET_IPV4_IPFRAG_SECRET_INTERVAL=94, NET_IPV4_IGMP_MAX_MSF=96, NET_TCP_NO_METRICS_SAVE=97, NET_TCP_DEFAULT_WIN_SCALE=105, NET_TCP_MODERATE_RCVBUF=106, NET_TCP_TSO_WIN_DIVISOR=107, NET_TCP_BIC_BETA=108, NET_IPV4_ICMP_ERRORS_USE_INBOUND_IFADDR=109, NET_TCP_CONG_CONTROL=110, NET_TCP_ABC=111, NET_IPV4_IPFRAG_MAX_DIST=112, NET_TCP_MTU_PROBING=113, NET_TCP_BASE_MSS=114, NET_IPV4_TCP_WORKAROUND_SIGNED_WINDOWS=115, NET_TCP_DMA_COPYBREAK=116, NET_TCP_SLOW_START_AFTER_IDLE=117, NET_CIPSOV4_CACHE_ENABLE=118, NET_CIPSOV4_CACHE_BUCKET_SIZE=119, NET_CIPSOV4_RBM_OPTFMT=120, NET_CIPSOV4_RBM_STRICTVALID=121, NET_TCP_AVAIL_CONG_CONTROL=122, NET_TCP_ALLOWED_CONG_CONTROL=123, NET_TCP_MAX_SSTHRESH=124, NET_TCP_FRTO_RESPONSE=125, }; enum { NET_IPV4_ROUTE_FLUSH=1, NET_IPV4_ROUTE_MIN_DELAY=2, NET_IPV4_ROUTE_MAX_DELAY=3, NET_IPV4_ROUTE_GC_THRESH=4, NET_IPV4_ROUTE_MAX_SIZE=5, NET_IPV4_ROUTE_GC_MIN_INTERVAL=6, NET_IPV4_ROUTE_GC_TIMEOUT=7, NET_IPV4_ROUTE_GC_INTERVAL=8, NET_IPV4_ROUTE_REDIRECT_LOAD=9, 
NET_IPV4_ROUTE_REDIRECT_NUMBER=10, NET_IPV4_ROUTE_REDIRECT_SILENCE=11, NET_IPV4_ROUTE_ERROR_COST=12, NET_IPV4_ROUTE_ERROR_BURST=13, NET_IPV4_ROUTE_GC_ELASTICITY=14, NET_IPV4_ROUTE_MTU_EXPIRES=15, NET_IPV4_ROUTE_MIN_PMTU=16, NET_IPV4_ROUTE_MIN_ADVMSS=17, NET_IPV4_ROUTE_SECRET_INTERVAL=18, NET_IPV4_ROUTE_GC_MIN_INTERVAL_MS=19, }; enum { NET_PROTO_CONF_ALL=-2, NET_PROTO_CONF_DEFAULT=-3 }; enum { NET_IPV4_CONF_FORWARDING=1, NET_IPV4_CONF_MC_FORWARDING=2, NET_IPV4_CONF_PROXY_ARP=3, NET_IPV4_CONF_ACCEPT_REDIRECTS=4, NET_IPV4_CONF_SECURE_REDIRECTS=5, NET_IPV4_CONF_SEND_REDIRECTS=6, NET_IPV4_CONF_SHARED_MEDIA=7, NET_IPV4_CONF_RP_FILTER=8, NET_IPV4_CONF_ACCEPT_SOURCE_ROUTE=9, NET_IPV4_CONF_BOOTP_RELAY=10, NET_IPV4_CONF_LOG_MARTIANS=11, NET_IPV4_CONF_TAG=12, NET_IPV4_CONF_ARPFILTER=13, NET_IPV4_CONF_MEDIUM_ID=14, NET_IPV4_CONF_NOXFRM=15, NET_IPV4_CONF_NOPOLICY=16, NET_IPV4_CONF_FORCE_IGMP_VERSION=17, NET_IPV4_CONF_ARP_ANNOUNCE=18, NET_IPV4_CONF_ARP_IGNORE=19, NET_IPV4_CONF_PROMOTE_SECONDARIES=20, NET_IPV4_CONF_ARP_ACCEPT=21, NET_IPV4_CONF_ARP_NOTIFY=22, }; enum { NET_IPV4_NF_CONNTRACK_MAX=1, NET_IPV4_NF_CONNTRACK_TCP_TIMEOUT_SYN_SENT=2, NET_IPV4_NF_CONNTRACK_TCP_TIMEOUT_SYN_RECV=3, NET_IPV4_NF_CONNTRACK_TCP_TIMEOUT_ESTABLISHED=4, NET_IPV4_NF_CONNTRACK_TCP_TIMEOUT_FIN_WAIT=5, NET_IPV4_NF_CONNTRACK_TCP_TIMEOUT_CLOSE_WAIT=6, NET_IPV4_NF_CONNTRACK_TCP_TIMEOUT_LAST_ACK=7, NET_IPV4_NF_CONNTRACK_TCP_TIMEOUT_TIME_WAIT=8, NET_IPV4_NF_CONNTRACK_TCP_TIMEOUT_CLOSE=9, NET_IPV4_NF_CONNTRACK_UDP_TIMEOUT=10, NET_IPV4_NF_CONNTRACK_UDP_TIMEOUT_STREAM=11, NET_IPV4_NF_CONNTRACK_ICMP_TIMEOUT=12, NET_IPV4_NF_CONNTRACK_GENERIC_TIMEOUT=13, NET_IPV4_NF_CONNTRACK_BUCKETS=14, NET_IPV4_NF_CONNTRACK_LOG_INVALID=15, NET_IPV4_NF_CONNTRACK_TCP_TIMEOUT_MAX_RETRANS=16, NET_IPV4_NF_CONNTRACK_TCP_LOOSE=17, NET_IPV4_NF_CONNTRACK_TCP_BE_LIBERAL=18, NET_IPV4_NF_CONNTRACK_TCP_MAX_RETRANS=19, NET_IPV4_NF_CONNTRACK_SCTP_TIMEOUT_CLOSED=20, NET_IPV4_NF_CONNTRACK_SCTP_TIMEOUT_COOKIE_WAIT=21, 
NET_IPV4_NF_CONNTRACK_SCTP_TIMEOUT_COOKIE_ECHOED=22, NET_IPV4_NF_CONNTRACK_SCTP_TIMEOUT_ESTABLISHED=23, NET_IPV4_NF_CONNTRACK_SCTP_TIMEOUT_SHUTDOWN_SENT=24, NET_IPV4_NF_CONNTRACK_SCTP_TIMEOUT_SHUTDOWN_RECD=25, NET_IPV4_NF_CONNTRACK_SCTP_TIMEOUT_SHUTDOWN_ACK_SENT=26, NET_IPV4_NF_CONNTRACK_COUNT=27, NET_IPV4_NF_CONNTRACK_CHECKSUM=28, }; enum { NET_IPV6_CONF=16, NET_IPV6_NEIGH=17, NET_IPV6_ROUTE=18, NET_IPV6_ICMP=19, NET_IPV6_BINDV6ONLY=20, NET_IPV6_IP6FRAG_HIGH_THRESH=21, NET_IPV6_IP6FRAG_LOW_THRESH=22, NET_IPV6_IP6FRAG_TIME=23, NET_IPV6_IP6FRAG_SECRET_INTERVAL=24, NET_IPV6_MLD_MAX_MSF=25, }; enum { NET_IPV6_ROUTE_FLUSH=1, NET_IPV6_ROUTE_GC_THRESH=2, NET_IPV6_ROUTE_MAX_SIZE=3, NET_IPV6_ROUTE_GC_MIN_INTERVAL=4, NET_IPV6_ROUTE_GC_TIMEOUT=5, NET_IPV6_ROUTE_GC_INTERVAL=6, NET_IPV6_ROUTE_GC_ELASTICITY=7, NET_IPV6_ROUTE_MTU_EXPIRES=8, NET_IPV6_ROUTE_MIN_ADVMSS=9, NET_IPV6_ROUTE_GC_MIN_INTERVAL_MS=10 }; enum { NET_IPV6_FORWARDING=1, NET_IPV6_HOP_LIMIT=2, NET_IPV6_MTU=3, NET_IPV6_ACCEPT_RA=4, NET_IPV6_ACCEPT_REDIRECTS=5, NET_IPV6_AUTOCONF=6, NET_IPV6_DAD_TRANSMITS=7, NET_IPV6_RTR_SOLICITS=8, NET_IPV6_RTR_SOLICIT_INTERVAL=9, NET_IPV6_RTR_SOLICIT_DELAY=10, NET_IPV6_USE_TEMPADDR=11, NET_IPV6_TEMP_VALID_LFT=12, NET_IPV6_TEMP_PREFERED_LFT=13, NET_IPV6_REGEN_MAX_RETRY=14, NET_IPV6_MAX_DESYNC_FACTOR=15, NET_IPV6_MAX_ADDRESSES=16, NET_IPV6_FORCE_MLD_VERSION=17, NET_IPV6_ACCEPT_RA_DEFRTR=18, NET_IPV6_ACCEPT_RA_PINFO=19, NET_IPV6_ACCEPT_RA_RTR_PREF=20, NET_IPV6_RTR_PROBE_INTERVAL=21, NET_IPV6_ACCEPT_RA_RT_INFO_MAX_PLEN=22, NET_IPV6_PROXY_NDP=23, NET_IPV6_ACCEPT_SOURCE_ROUTE=25, NET_IPV6_ACCEPT_RA_FROM_LOCAL=26, __NET_IPV6_MAX }; enum { NET_IPV6_ICMP_RATELIMIT=1 }; enum { NET_NEIGH_MCAST_SOLICIT=1, NET_NEIGH_UCAST_SOLICIT=2, NET_NEIGH_APP_SOLICIT=3, NET_NEIGH_RETRANS_TIME=4, NET_NEIGH_REACHABLE_TIME=5, NET_NEIGH_DELAY_PROBE_TIME=6, NET_NEIGH_GC_STALE_TIME=7, NET_NEIGH_UNRES_QLEN=8, NET_NEIGH_PROXY_QLEN=9, NET_NEIGH_ANYCAST_DELAY=10, NET_NEIGH_PROXY_DELAY=11, NET_NEIGH_LOCKTIME=12, 
NET_NEIGH_GC_INTERVAL=13, NET_NEIGH_GC_THRESH1=14, NET_NEIGH_GC_THRESH2=15, NET_NEIGH_GC_THRESH3=16, NET_NEIGH_RETRANS_TIME_MS=17, NET_NEIGH_REACHABLE_TIME_MS=18, }; enum { NET_DCCP_DEFAULT=1, }; enum { NET_IPX_PPROP_BROADCASTING=1, NET_IPX_FORWARDING=2 }; enum { NET_LLC2=1, NET_LLC_STATION=2, }; enum { NET_LLC2_TIMEOUT=1, }; enum { NET_LLC_STATION_ACK_TIMEOUT=1, }; enum { NET_LLC2_ACK_TIMEOUT=1, NET_LLC2_P_TIMEOUT=2, NET_LLC2_REJ_TIMEOUT=3, NET_LLC2_BUSY_TIMEOUT=4, }; enum { NET_ATALK_AARP_EXPIRY_TIME=1, NET_ATALK_AARP_TICK_TIME=2, NET_ATALK_AARP_RETRANSMIT_LIMIT=3, NET_ATALK_AARP_RESOLVE_TIME=4 }; enum { NET_NETROM_DEFAULT_PATH_QUALITY=1, NET_NETROM_OBSOLESCENCE_COUNT_INITIALISER=2, NET_NETROM_NETWORK_TTL_INITIALISER=3, NET_NETROM_TRANSPORT_TIMEOUT=4, NET_NETROM_TRANSPORT_MAXIMUM_TRIES=5, NET_NETROM_TRANSPORT_ACKNOWLEDGE_DELAY=6, NET_NETROM_TRANSPORT_BUSY_DELAY=7, NET_NETROM_TRANSPORT_REQUESTED_WINDOW_SIZE=8, NET_NETROM_TRANSPORT_NO_ACTIVITY_TIMEOUT=9, NET_NETROM_ROUTING_CONTROL=10, NET_NETROM_LINK_FAILS_COUNT=11, NET_NETROM_RESET=12 }; enum { NET_AX25_IP_DEFAULT_MODE=1, NET_AX25_DEFAULT_MODE=2, NET_AX25_BACKOFF_TYPE=3, NET_AX25_CONNECT_MODE=4, NET_AX25_STANDARD_WINDOW=5, NET_AX25_EXTENDED_WINDOW=6, NET_AX25_T1_TIMEOUT=7, NET_AX25_T2_TIMEOUT=8, NET_AX25_T3_TIMEOUT=9, NET_AX25_IDLE_TIMEOUT=10, NET_AX25_N2=11, NET_AX25_PACLEN=12, NET_AX25_PROTOCOL=13, NET_AX25_DAMA_SLAVE_TIMEOUT=14 }; enum { NET_ROSE_RESTART_REQUEST_TIMEOUT=1, NET_ROSE_CALL_REQUEST_TIMEOUT=2, NET_ROSE_RESET_REQUEST_TIMEOUT=3, NET_ROSE_CLEAR_REQUEST_TIMEOUT=4, NET_ROSE_ACK_HOLD_BACK_TIMEOUT=5, NET_ROSE_ROUTING_CONTROL=6, NET_ROSE_LINK_FAIL_TIMEOUT=7, NET_ROSE_MAX_VCS=8, NET_ROSE_WINDOW_SIZE=9, NET_ROSE_NO_ACTIVITY_TIMEOUT=10 }; enum { NET_X25_RESTART_REQUEST_TIMEOUT=1, NET_X25_CALL_REQUEST_TIMEOUT=2, NET_X25_RESET_REQUEST_TIMEOUT=3, NET_X25_CLEAR_REQUEST_TIMEOUT=4, NET_X25_ACK_HOLD_BACK_TIMEOUT=5, NET_X25_FORWARD=6 }; enum { NET_TR_RIF_TIMEOUT=1 }; enum { NET_DECNET_NODE_TYPE = 1, 
NET_DECNET_NODE_ADDRESS = 2, NET_DECNET_NODE_NAME = 3, NET_DECNET_DEFAULT_DEVICE = 4, NET_DECNET_TIME_WAIT = 5, NET_DECNET_DN_COUNT = 6, NET_DECNET_DI_COUNT = 7, NET_DECNET_DR_COUNT = 8, NET_DECNET_DST_GC_INTERVAL = 9, NET_DECNET_CONF = 10, NET_DECNET_NO_FC_MAX_CWND = 11, NET_DECNET_MEM = 12, NET_DECNET_RMEM = 13, NET_DECNET_WMEM = 14, NET_DECNET_DEBUG_LEVEL = 255 }; enum { NET_DECNET_CONF_LOOPBACK = -2, NET_DECNET_CONF_DDCMP = -3, NET_DECNET_CONF_PPP = -4, NET_DECNET_CONF_X25 = -5, NET_DECNET_CONF_GRE = -6, NET_DECNET_CONF_ETHER = -7 }; enum { NET_DECNET_CONF_DEV_PRIORITY = 1, NET_DECNET_CONF_DEV_T1 = 2, NET_DECNET_CONF_DEV_T2 = 3, NET_DECNET_CONF_DEV_T3 = 4, NET_DECNET_CONF_DEV_FORWARDING = 5, NET_DECNET_CONF_DEV_BLKSIZE = 6, NET_DECNET_CONF_DEV_STATE = 7 }; enum { NET_SCTP_RTO_INITIAL = 1, NET_SCTP_RTO_MIN = 2, NET_SCTP_RTO_MAX = 3, NET_SCTP_RTO_ALPHA = 4, NET_SCTP_RTO_BETA = 5, NET_SCTP_VALID_COOKIE_LIFE = 6, NET_SCTP_ASSOCIATION_MAX_RETRANS = 7, NET_SCTP_PATH_MAX_RETRANS = 8, NET_SCTP_MAX_INIT_RETRANSMITS = 9, NET_SCTP_HB_INTERVAL = 10, NET_SCTP_PRESERVE_ENABLE = 11, NET_SCTP_MAX_BURST = 12, NET_SCTP_ADDIP_ENABLE = 13, NET_SCTP_PRSCTP_ENABLE = 14, NET_SCTP_SNDBUF_POLICY = 15, NET_SCTP_SACK_TIMEOUT = 16, NET_SCTP_RCVBUF_POLICY = 17, }; enum { NET_BRIDGE_NF_CALL_ARPTABLES = 1, NET_BRIDGE_NF_CALL_IPTABLES = 2, NET_BRIDGE_NF_CALL_IP6TABLES = 3, NET_BRIDGE_NF_FILTER_VLAN_TAGGED = 4, NET_BRIDGE_NF_FILTER_PPPOE_TAGGED = 5, }; enum { NET_IRDA_DISCOVERY=1, NET_IRDA_DEVNAME=2, NET_IRDA_DEBUG=3, NET_IRDA_FAST_POLL=4, NET_IRDA_DISCOVERY_SLOTS=5, NET_IRDA_DISCOVERY_TIMEOUT=6, NET_IRDA_SLOT_TIMEOUT=7, NET_IRDA_MAX_BAUD_RATE=8, NET_IRDA_MIN_TX_TURN_TIME=9, NET_IRDA_MAX_TX_DATA_SIZE=10, NET_IRDA_MAX_TX_WINDOW=11, NET_IRDA_MAX_NOREPLY_TIME=12, NET_IRDA_WARN_NOREPLY_TIME=13, NET_IRDA_LAP_KEEPALIVE_TIME=14, }; enum { FS_NRINODE=1, FS_STATINODE=2, FS_MAXINODE=3, FS_NRDQUOT=4, FS_MAXDQUOT=5, FS_NRFILE=6, FS_MAXFILE=7, FS_DENTRY=8, FS_NRSUPER=9, FS_MAXSUPER=10, FS_OVERFLOWUID=11, 
FS_OVERFLOWGID=12, FS_LEASES=13, FS_DIR_NOTIFY=14, FS_LEASE_TIME=15, FS_DQSTATS=16, FS_XFS=17, FS_AIO_NR=18, FS_AIO_MAX_NR=19, FS_INOTIFY=20, FS_OCFS2=988, }; enum { FS_DQ_LOOKUPS = 1, FS_DQ_DROPS = 2, FS_DQ_READS = 3, FS_DQ_WRITES = 4, FS_DQ_CACHE_HITS = 5, FS_DQ_ALLOCATED = 6, FS_DQ_FREE = 7, FS_DQ_SYNCS = 8, FS_DQ_WARNINGS = 9, }; enum { DEV_CDROM=1, DEV_HWMON=2, DEV_PARPORT=3, DEV_RAID=4, DEV_MAC_HID=5, DEV_SCSI=6, DEV_IPMI=7, }; enum { DEV_CDROM_INFO=1, DEV_CDROM_AUTOCLOSE=2, DEV_CDROM_AUTOEJECT=3, DEV_CDROM_DEBUG=4, DEV_CDROM_LOCK=5, DEV_CDROM_CHECK_MEDIA=6 }; enum { DEV_PARPORT_DEFAULT=-3 }; enum { DEV_RAID_SPEED_LIMIT_MIN=1, DEV_RAID_SPEED_LIMIT_MAX=2 }; enum { DEV_PARPORT_DEFAULT_TIMESLICE=1, DEV_PARPORT_DEFAULT_SPINTIME=2 }; enum { DEV_PARPORT_SPINTIME=1, DEV_PARPORT_BASE_ADDR=2, DEV_PARPORT_IRQ=3, DEV_PARPORT_DMA=4, DEV_PARPORT_MODES=5, DEV_PARPORT_DEVICES=6, DEV_PARPORT_AUTOPROBE=16 }; enum { DEV_PARPORT_DEVICES_ACTIVE=-3, }; enum { DEV_PARPORT_DEVICE_TIMESLICE=1, }; enum { DEV_MAC_HID_KEYBOARD_SENDS_LINUX_KEYCODES=1, DEV_MAC_HID_KEYBOARD_LOCK_KEYCODES=2, DEV_MAC_HID_MOUSE_BUTTON_EMULATION=3, DEV_MAC_HID_MOUSE_BUTTON2_KEYCODE=4, DEV_MAC_HID_MOUSE_BUTTON3_KEYCODE=5, DEV_MAC_HID_ADB_MOUSE_SENDS_KEYCODES=6 }; enum { DEV_SCSI_LOGGING_LEVEL=1, }; enum { DEV_IPMI_POWEROFF_POWERCYCLE=1, }; enum { ABI_DEFHANDLER_COFF=1, ABI_DEFHANDLER_ELF=2, ABI_DEFHANDLER_LCALL7=3, ABI_DEFHANDLER_LIBCSO=4, ABI_TRACE=5, ABI_FAKE_UTSNAME=6, }; # 44 "/usr/include/sys/sysctl.h" 2 3 4 # 63 "/usr/include/sys/sysctl.h" 3 4 # 1 "/usr/include/bits/sysctl.h" 1 3 4 # 64 "/usr/include/sys/sysctl.h" 2 3 4 extern int sysctl (int *__name, int __nlen, void *__oldval, size_t *__oldlenp, void *__newval, size_t __newlen) __attribute__ ((__nothrow__ , __leaf__)); # 3 "/tmp/petsc-KvGRNM/config.packages.pthread/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Defined "HAVE_SYS_SYSCTL_H" to "1" 
================================================================================ TEST checkSharedLibrary from config.packages.pthread(/home/florian/software/petsc/config/BuildSystem/config/package.py:738) TESTING: checkSharedLibrary from config.packages.pthread(config/BuildSystem/config/package.py:738) By default we don't care about checking if the library is shared Popping language C ================================================================================ TEST alternateConfigureLibrary from config.packages.papi(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.papi(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.pami(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.pami(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.p4est(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.p4est(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.opengles(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.opengles(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from 
config.packages.mpe(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.mpe(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.libpng(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.libpng(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.libjpeg(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.libjpeg(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default Checking for program /home/florian/software/bin/lgrind...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/lgrind...not found Checking for program /usr/local/sbin/lgrind...not found Checking for program /usr/local/bin/lgrind...not found Checking for program /usr/bin/lgrind...not found Checking for program /usr/lib/jvm/default/bin/lgrind...not found Checking for program /opt/paraview/bin/lgrind...not found Checking for program /usr/bin/site_perl/lgrind...not found Checking for program /usr/bin/vendor_perl/lgrind...not found Checking for program /usr/bin/core_perl/lgrind...not found Checking for program /home/florian/lgrind...not found Checking for program /home/florian/software/petsc/bin/win32fe/lgrind...not found ================================================================================ TEST alternateConfigureLibrary from config.packages.gmp(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from 
config.packages.gmp(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.mpfr(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.mpfr(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.opengl(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.opengl(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.glut(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.glut(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.giflib(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.giflib(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.scientificpython(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.scientificpython(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default 
================================================================================ TEST alternateConfigureLibrary from config.packages.fiat(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.fiat(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.ctetgen(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.ctetgen(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.concurrencykit(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.concurrencykit(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.cgns(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.cgns(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST locateC2html from config.packages.c2html(/home/florian/software/petsc/config/BuildSystem/config/packages/c2html.py:33) TESTING: locateC2html from config.packages.c2html(config/BuildSystem/config/packages/c2html.py:33) Looking for default C2html executable Checking for program /home/florian/software/bin/c2html...not found Checking for program /home/florian/.gem/ruby/2.4.0/bin/c2html...not found Checking for 
program /usr/local/sbin/c2html...not found Checking for program /usr/local/bin/c2html...not found Checking for program /usr/bin/c2html...not found Checking for program /usr/lib/jvm/default/bin/c2html...not found Checking for program /opt/paraview/bin/c2html...not found Checking for program /usr/bin/site_perl/c2html...not found Checking for program /usr/bin/vendor_perl/c2html...not found Checking for program /usr/bin/core_perl/c2html...not found Checking for program /home/florian/c2html...not found Checking for program /home/florian/software/petsc/bin/win32fe/c2html...not found ================================================================================ TEST alternateConfigureLibrary from config.packages.boost(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.boost(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.openmp(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.openmp(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default Pushing language C ================================================================================ TEST configureLibrary from config.packages.hwloc(/home/florian/software/petsc/config/BuildSystem/config/package.py:679) TESTING: configureLibrary from config.packages.hwloc(config/BuildSystem/config/package.py:679) Find an installation and check if it can work with PETSc ================================================================================== Checking for a functional hwloc Checking for library in Compiler specific search HWLOC: [] ================================================================================ TEST check from 
config.libraries(/home/florian/software/petsc/config/BuildSystem/config/libraries.py:146) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:146) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for functions [hwloc_topology_init] in library [] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char hwloc_topology_init(); static void _check_hwloc_topology_init() { hwloc_topology_init(); } int main() { _check_hwloc_topology_init();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: /tmp/petsc-KvGRNM/config.libraries/conftest.o: undefined reference to symbol 'hwloc_topology_init' /usr/lib/libhwloc.so.5: error adding symbols: DSO missing from command line collect2: error: ld returned 1 exit status Popping language C Checking for library in Compiler specific search HWLOC: ['libhwloc.a'] ================================================================================ TEST check from config.libraries(/home/florian/software/petsc/config/BuildSystem/config/libraries.py:146) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:146) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for functions [hwloc_topology_init] in library ['libhwloc.a'] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers 
-I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char hwloc_topology_init(); static void _check_hwloc_topology_init() { hwloc_topology_init(); } int main() { _check_hwloc_topology_init();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lhwloc -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_LIBHWLOC" to "1" Popping language C Checking for headers Compiler specific search HWLOC: ['/usr/include', '/usr/lib/openmpi'] Pushing language C 
================================================================================ TEST checkInclude from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:86) TESTING: checkInclude from config.headers(config/BuildSystem/config/headers.py:86) Checks if a particular include file can be found along particular include paths Checking for header files ['hwloc.h'] in ['/usr/include', '/usr/lib/openmpi'] Checking include with compiler flags var CPPFLAGS ['/usr/include', '/usr/lib/openmpi'] Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/config.headers -I/usr/include -I/usr/lib/openmpi /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "" # 1 "" # 31 "" # 1 "/usr/include/stdc-predef.h" 1 3 4 # 32 "" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/hwloc.h" 1 3 4 # 53 "/usr/include/hwloc.h" 3 4 # 1 "/usr/include/hwloc/autogen/config.h" 1 3 4 # 179 "/usr/include/hwloc/autogen/config.h" 3 4 # 1 "/usr/include/pthread.h" 1 3 4 # 21 "/usr/include/pthread.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 22 "/usr/include/pthread.h" 2 3 4 # 1 "/usr/include/endian.h" 1 3 4 # 36 
"/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/endian.h" 1 3 4 # 37 "/usr/include/endian.h" 2 3 4 # 60 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/byteswap.h" 1 3 4 # 27 "/usr/include/bits/byteswap.h" 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 # 30 "/usr/include/bits/types.h" 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t; typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long 
int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 28 "/usr/include/bits/byteswap.h" 2 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 29 "/usr/include/bits/byteswap.h" 2 3 4 # 1 "/usr/include/bits/byteswap-16.h" 1 3 4 # 36 "/usr/include/bits/byteswap.h" 2 3 4 # 44 "/usr/include/bits/byteswap.h" 3 4 static __inline unsigned int __bswap_32 (unsigned int __bsx) { return __builtin_bswap32 (__bsx); } # 108 "/usr/include/bits/byteswap.h" 3 4 static __inline __uint64_t __bswap_64 (__uint64_t __bsx) { return __builtin_bswap64 (__bsx); } # 61 "/usr/include/endian.h" 2 3 4 # 23 "/usr/include/pthread.h" 2 3 4 # 1 "/usr/include/sched.h" 1 3 4 # 28 "/usr/include/sched.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 29 "/usr/include/sched.h" 2 3 4 # 1 "/usr/include/time.h" 1 3 4 # 73 "/usr/include/time.h" 3 4 typedef __time_t time_t; # 120 "/usr/include/time.h" 3 4 struct timespec { __time_t tv_sec; __syscall_slong_t tv_nsec; }; # 35 "/usr/include/sched.h" 2 3 4 typedef __pid_t pid_t; # 1 "/usr/include/bits/sched.h" 1 3 4 # 73 "/usr/include/bits/sched.h" 3 4 struct sched_param { int __sched_priority; }; # 96 "/usr/include/bits/sched.h" 3 4 struct __sched_param { int __sched_priority; }; # 119 "/usr/include/bits/sched.h" 3 4 typedef unsigned long int __cpu_mask; typedef struct { __cpu_mask __bits[1024 / (8 * sizeof (__cpu_mask))]; } cpu_set_t; # 202 "/usr/include/bits/sched.h" 3 4 extern int __sched_cpucount (size_t __setsize, const cpu_set_t *__setp) __attribute__ ((__nothrow__ , __leaf__)); extern cpu_set_t *__sched_cpualloc (size_t __count) __attribute__ ((__nothrow__ , __leaf__)) ; extern void __sched_cpufree (cpu_set_t *__set) __attribute__ ((__nothrow__ , __leaf__)); # 44 
"/usr/include/sched.h" 2 3 4 extern int sched_setparam (__pid_t __pid, const struct sched_param *__param) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_getparam (__pid_t __pid, struct sched_param *__param) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_setscheduler (__pid_t __pid, int __policy, const struct sched_param *__param) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_getscheduler (__pid_t __pid) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_yield (void) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_get_priority_max (int __algorithm) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_get_priority_min (int __algorithm) __attribute__ ((__nothrow__ , __leaf__)); extern int sched_rr_get_interval (__pid_t __pid, struct timespec *__t) __attribute__ ((__nothrow__ , __leaf__)); # 126 "/usr/include/sched.h" 3 4 # 24 "/usr/include/pthread.h" 2 3 4 # 1 "/usr/include/time.h" 1 3 4 # 29 "/usr/include/time.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 38 "/usr/include/time.h" 2 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 42 "/usr/include/time.h" 2 3 4 # 57 "/usr/include/time.h" 3 4 typedef __clock_t clock_t; # 91 "/usr/include/time.h" 3 4 typedef __clockid_t clockid_t; # 103 "/usr/include/time.h" 3 4 typedef __timer_t timer_t; # 131 "/usr/include/time.h" 3 4 struct tm { int tm_sec; int tm_min; int tm_hour; int tm_mday; int tm_mon; int tm_year; int tm_wday; int tm_yday; int tm_isdst; long int tm_gmtoff; const char *tm_zone; }; struct itimerspec { struct timespec it_interval; struct timespec it_value; }; struct sigevent; # 186 "/usr/include/time.h" 3 4 extern clock_t clock (void) __attribute__ ((__nothrow__ , __leaf__)); extern time_t time (time_t *__timer) __attribute__ ((__nothrow__ , __leaf__)); extern double difftime (time_t __time1, time_t __time0) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern time_t mktime (struct tm *__tp) __attribute__ 
((__nothrow__ , __leaf__)); extern size_t strftime (char *__restrict __s, size_t __maxsize, const char *__restrict __format, const struct tm *__restrict __tp) __attribute__ ((__nothrow__ , __leaf__)); # 221 "/usr/include/time.h" 3 4 # 1 "/usr/include/xlocale.h" 1 3 4 # 27 "/usr/include/xlocale.h" 3 4 typedef struct __locale_struct { struct __locale_data *__locales[13]; const unsigned short int *__ctype_b; const int *__ctype_tolower; const int *__ctype_toupper; const char *__names[13]; } *__locale_t; typedef __locale_t locale_t; # 222 "/usr/include/time.h" 2 3 4 extern size_t strftime_l (char *__restrict __s, size_t __maxsize, const char *__restrict __format, const struct tm *__restrict __tp, __locale_t __loc) __attribute__ ((__nothrow__ , __leaf__)); # 236 "/usr/include/time.h" 3 4 extern struct tm *gmtime (const time_t *__timer) __attribute__ ((__nothrow__ , __leaf__)); extern struct tm *localtime (const time_t *__timer) __attribute__ ((__nothrow__ , __leaf__)); extern struct tm *gmtime_r (const time_t *__restrict __timer, struct tm *__restrict __tp) __attribute__ ((__nothrow__ , __leaf__)); extern struct tm *localtime_r (const time_t *__restrict __timer, struct tm *__restrict __tp) __attribute__ ((__nothrow__ , __leaf__)); extern char *asctime (const struct tm *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern char *ctime (const time_t *__timer) __attribute__ ((__nothrow__ , __leaf__)); extern char *asctime_r (const struct tm *__restrict __tp, char *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)); extern char *ctime_r (const time_t *__restrict __timer, char *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)); extern char *__tzname[2]; extern int __daylight; extern long int __timezone; extern char *tzname[2]; extern void tzset (void) __attribute__ ((__nothrow__ , __leaf__)); extern int daylight; extern long int timezone; extern int stime (const time_t *__when) __attribute__ ((__nothrow__ , __leaf__)); # 319 "/usr/include/time.h" 3 4 extern 
time_t timegm (struct tm *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern time_t timelocal (struct tm *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern int dysize (int __year) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 334 "/usr/include/time.h" 3 4 extern int nanosleep (const struct timespec *__requested_time, struct timespec *__remaining); extern int clock_getres (clockid_t __clock_id, struct timespec *__res) __attribute__ ((__nothrow__ , __leaf__)); extern int clock_gettime (clockid_t __clock_id, struct timespec *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern int clock_settime (clockid_t __clock_id, const struct timespec *__tp) __attribute__ ((__nothrow__ , __leaf__)); extern int clock_nanosleep (clockid_t __clock_id, int __flags, const struct timespec *__req, struct timespec *__rem); extern int clock_getcpuclockid (pid_t __pid, clockid_t *__clock_id) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_create (clockid_t __clock_id, struct sigevent *__restrict __evp, timer_t *__restrict __timerid) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_delete (timer_t __timerid) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_settime (timer_t __timerid, int __flags, const struct itimerspec *__restrict __value, struct itimerspec *__restrict __ovalue) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_gettime (timer_t __timerid, struct itimerspec *__value) __attribute__ ((__nothrow__ , __leaf__)); extern int timer_getoverrun (timer_t __timerid) __attribute__ ((__nothrow__ , __leaf__)); extern int timespec_get (struct timespec *__ts, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 430 "/usr/include/time.h" 3 4 # 25 "/usr/include/pthread.h" 2 3 4 # 1 "/usr/include/bits/pthreadtypes.h" 1 3 4 # 21 "/usr/include/bits/pthreadtypes.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 22 "/usr/include/bits/pthreadtypes.h" 2 3 4 # 60 
"/usr/include/bits/pthreadtypes.h" 3 4 typedef unsigned long int pthread_t; union pthread_attr_t { char __size[56]; long int __align; }; typedef union pthread_attr_t pthread_attr_t; typedef struct __pthread_internal_list { struct __pthread_internal_list *__prev; struct __pthread_internal_list *__next; } __pthread_list_t; # 90 "/usr/include/bits/pthreadtypes.h" 3 4 typedef union { struct __pthread_mutex_s { int __lock; unsigned int __count; int __owner; unsigned int __nusers; int __kind; short __spins; short __elision; __pthread_list_t __list; # 125 "/usr/include/bits/pthreadtypes.h" 3 4 } __data; char __size[40]; long int __align; } pthread_mutex_t; typedef union { char __size[4]; int __align; } pthread_mutexattr_t; typedef union { struct { int __lock; unsigned int __futex; __extension__ unsigned long long int __total_seq; __extension__ unsigned long long int __wakeup_seq; __extension__ unsigned long long int __woken_seq; void *__mutex; unsigned int __nwaiters; unsigned int __broadcast_seq; } __data; char __size[48]; __extension__ long long int __align; } pthread_cond_t; typedef union { char __size[4]; int __align; } pthread_condattr_t; typedef unsigned int pthread_key_t; typedef int pthread_once_t; typedef union { struct { int __lock; unsigned int __nr_readers; unsigned int __readers_wakeup; unsigned int __writer_wakeup; unsigned int __nr_readers_queued; unsigned int __nr_writers_queued; int __writer; int __shared; signed char __rwelision; unsigned char __pad1[7]; unsigned long int __pad2; unsigned int __flags; } __data; # 220 "/usr/include/bits/pthreadtypes.h" 3 4 char __size[56]; long int __align; } pthread_rwlock_t; typedef union { char __size[8]; long int __align; } pthread_rwlockattr_t; typedef volatile int pthread_spinlock_t; typedef union { char __size[32]; long int __align; } pthread_barrier_t; typedef union { char __size[4]; int __align; } pthread_barrierattr_t; # 27 "/usr/include/pthread.h" 2 3 4 # 1 "/usr/include/bits/setjmp.h" 1 3 4 # 26 
"/usr/include/bits/setjmp.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 27 "/usr/include/bits/setjmp.h" 2 3 4 typedef long int __jmp_buf[8]; # 28 "/usr/include/pthread.h" 2 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 29 "/usr/include/pthread.h" 2 3 4 enum { PTHREAD_CREATE_JOINABLE, PTHREAD_CREATE_DETACHED }; enum { PTHREAD_MUTEX_TIMED_NP, PTHREAD_MUTEX_RECURSIVE_NP, PTHREAD_MUTEX_ERRORCHECK_NP, PTHREAD_MUTEX_ADAPTIVE_NP , PTHREAD_MUTEX_NORMAL = PTHREAD_MUTEX_TIMED_NP, PTHREAD_MUTEX_RECURSIVE = PTHREAD_MUTEX_RECURSIVE_NP, PTHREAD_MUTEX_ERRORCHECK = PTHREAD_MUTEX_ERRORCHECK_NP, PTHREAD_MUTEX_DEFAULT = PTHREAD_MUTEX_NORMAL }; enum { PTHREAD_MUTEX_STALLED, PTHREAD_MUTEX_STALLED_NP = PTHREAD_MUTEX_STALLED, PTHREAD_MUTEX_ROBUST, PTHREAD_MUTEX_ROBUST_NP = PTHREAD_MUTEX_ROBUST }; enum { PTHREAD_PRIO_NONE, PTHREAD_PRIO_INHERIT, PTHREAD_PRIO_PROTECT }; # 114 "/usr/include/pthread.h" 3 4 enum { PTHREAD_RWLOCK_PREFER_READER_NP, PTHREAD_RWLOCK_PREFER_WRITER_NP, PTHREAD_RWLOCK_PREFER_WRITER_NONRECURSIVE_NP, PTHREAD_RWLOCK_DEFAULT_NP = PTHREAD_RWLOCK_PREFER_READER_NP }; # 155 "/usr/include/pthread.h" 3 4 enum { PTHREAD_INHERIT_SCHED, PTHREAD_EXPLICIT_SCHED }; enum { PTHREAD_SCOPE_SYSTEM, PTHREAD_SCOPE_PROCESS }; enum { PTHREAD_PROCESS_PRIVATE, PTHREAD_PROCESS_SHARED }; # 190 "/usr/include/pthread.h" 3 4 struct _pthread_cleanup_buffer { void (*__routine) (void *); void *__arg; int __canceltype; struct _pthread_cleanup_buffer *__prev; }; enum { PTHREAD_CANCEL_ENABLE, PTHREAD_CANCEL_DISABLE }; enum { PTHREAD_CANCEL_DEFERRED, PTHREAD_CANCEL_ASYNCHRONOUS }; # 228 "/usr/include/pthread.h" 3 4 extern int pthread_create (pthread_t *__restrict __newthread, const pthread_attr_t *__restrict __attr, void *(*__start_routine) (void *), void *__restrict __arg) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1, 3))); extern void pthread_exit (void *__retval) __attribute__ ((__noreturn__)); extern int pthread_join (pthread_t __th, void **__thread_return); # 271 
"/usr/include/pthread.h" 3 4 extern int pthread_detach (pthread_t __th) __attribute__ ((__nothrow__ , __leaf__)); extern pthread_t pthread_self (void) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern int pthread_equal (pthread_t __thread1, pthread_t __thread2) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern int pthread_attr_init (pthread_attr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_attr_destroy (pthread_attr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_attr_getdetachstate (const pthread_attr_t *__attr, int *__detachstate) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_setdetachstate (pthread_attr_t *__attr, int __detachstate) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_attr_getguardsize (const pthread_attr_t *__attr, size_t *__guardsize) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_setguardsize (pthread_attr_t *__attr, size_t __guardsize) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_attr_getschedparam (const pthread_attr_t *__restrict __attr, struct sched_param *__restrict __param) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_setschedparam (pthread_attr_t *__restrict __attr, const struct sched_param *__restrict __param) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_getschedpolicy (const pthread_attr_t *__restrict __attr, int *__restrict __policy) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_setschedpolicy (pthread_attr_t *__attr, int __policy) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ 
((__nonnull__ (1))); extern int pthread_attr_getinheritsched (const pthread_attr_t *__restrict __attr, int *__restrict __inherit) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_setinheritsched (pthread_attr_t *__attr, int __inherit) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_attr_getscope (const pthread_attr_t *__restrict __attr, int *__restrict __scope) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_setscope (pthread_attr_t *__attr, int __scope) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_attr_getstackaddr (const pthread_attr_t *__restrict __attr, void **__restrict __stackaddr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))) __attribute__ ((__deprecated__)); extern int pthread_attr_setstackaddr (pthread_attr_t *__attr, void *__stackaddr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) __attribute__ ((__deprecated__)); extern int pthread_attr_getstacksize (const pthread_attr_t *__restrict __attr, size_t *__restrict __stacksize) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_attr_setstacksize (pthread_attr_t *__attr, size_t __stacksize) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_attr_getstack (const pthread_attr_t *__restrict __attr, void **__restrict __stackaddr, size_t *__restrict __stacksize) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2, 3))); extern int pthread_attr_setstack (pthread_attr_t *__attr, void *__stackaddr, size_t __stacksize) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 429 "/usr/include/pthread.h" 3 4 extern int pthread_setschedparam (pthread_t __target_thread, int __policy, const struct sched_param *__param) 
__attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3))); extern int pthread_getschedparam (pthread_t __target_thread, int *__restrict __policy, struct sched_param *__restrict __param) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 3))); extern int pthread_setschedprio (pthread_t __target_thread, int __prio) __attribute__ ((__nothrow__ , __leaf__)); # 494 "/usr/include/pthread.h" 3 4 extern int pthread_once (pthread_once_t *__once_control, void (*__init_routine) (void)) __attribute__ ((__nonnull__ (1, 2))); # 506 "/usr/include/pthread.h" 3 4 extern int pthread_setcancelstate (int __state, int *__oldstate); extern int pthread_setcanceltype (int __type, int *__oldtype); extern int pthread_cancel (pthread_t __th); extern void pthread_testcancel (void); typedef struct { struct { __jmp_buf __cancel_jmp_buf; int __mask_was_saved; } __cancel_jmp_buf[1]; void *__pad[4]; } __pthread_unwind_buf_t __attribute__ ((__aligned__)); # 540 "/usr/include/pthread.h" 3 4 struct __pthread_cleanup_frame { void (*__cancel_routine) (void *); void *__cancel_arg; int __do_it; int __cancel_type; }; # 680 "/usr/include/pthread.h" 3 4 extern void __pthread_register_cancel (__pthread_unwind_buf_t *__buf) ; # 692 "/usr/include/pthread.h" 3 4 extern void __pthread_unregister_cancel (__pthread_unwind_buf_t *__buf) ; # 733 "/usr/include/pthread.h" 3 4 extern void __pthread_unwind_next (__pthread_unwind_buf_t *__buf) __attribute__ ((__noreturn__)) __attribute__ ((__weak__)) ; struct __jmp_buf_tag; extern int __sigsetjmp (struct __jmp_buf_tag *__env, int __savemask) __attribute__ ((__nothrow__)); extern int pthread_mutex_init (pthread_mutex_t *__mutex, const pthread_mutexattr_t *__mutexattr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutex_destroy (pthread_mutex_t *__mutex) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutex_trylock (pthread_mutex_t 
*__mutex) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutex_lock (pthread_mutex_t *__mutex) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutex_timedlock (pthread_mutex_t *__restrict __mutex, const struct timespec *__restrict __abstime) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_mutex_unlock (pthread_mutex_t *__mutex) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutex_getprioceiling (const pthread_mutex_t * __restrict __mutex, int *__restrict __prioceiling) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_mutex_setprioceiling (pthread_mutex_t *__restrict __mutex, int __prioceiling, int *__restrict __old_ceiling) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 3))); extern int pthread_mutex_consistent (pthread_mutex_t *__mutex) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 806 "/usr/include/pthread.h" 3 4 extern int pthread_mutexattr_init (pthread_mutexattr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutexattr_destroy (pthread_mutexattr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutexattr_getpshared (const pthread_mutexattr_t * __restrict __attr, int *__restrict __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_mutexattr_setpshared (pthread_mutexattr_t *__attr, int __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutexattr_gettype (const pthread_mutexattr_t *__restrict __attr, int *__restrict __kind) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_mutexattr_settype (pthread_mutexattr_t *__attr, int __kind) 
__attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutexattr_getprotocol (const pthread_mutexattr_t * __restrict __attr, int *__restrict __protocol) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_mutexattr_setprotocol (pthread_mutexattr_t *__attr, int __protocol) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutexattr_getprioceiling (const pthread_mutexattr_t * __restrict __attr, int *__restrict __prioceiling) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_mutexattr_setprioceiling (pthread_mutexattr_t *__attr, int __prioceiling) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_mutexattr_getrobust (const pthread_mutexattr_t *__attr, int *__robustness) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_mutexattr_setrobust (pthread_mutexattr_t *__attr, int __robustness) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 888 "/usr/include/pthread.h" 3 4 extern int pthread_rwlock_init (pthread_rwlock_t *__restrict __rwlock, const pthread_rwlockattr_t *__restrict __attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlock_destroy (pthread_rwlock_t *__rwlock) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlock_rdlock (pthread_rwlock_t *__rwlock) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlock_tryrdlock (pthread_rwlock_t *__rwlock) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlock_timedrdlock (pthread_rwlock_t *__restrict __rwlock, const struct timespec *__restrict __abstime) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_rwlock_wrlock 
(pthread_rwlock_t *__rwlock) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlock_trywrlock (pthread_rwlock_t *__rwlock) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlock_timedwrlock (pthread_rwlock_t *__restrict __rwlock, const struct timespec *__restrict __abstime) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_rwlock_unlock (pthread_rwlock_t *__rwlock) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlockattr_init (pthread_rwlockattr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlockattr_destroy (pthread_rwlockattr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlockattr_getpshared (const pthread_rwlockattr_t * __restrict __attr, int *__restrict __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_rwlockattr_setpshared (pthread_rwlockattr_t *__attr, int __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_rwlockattr_getkind_np (const pthread_rwlockattr_t * __restrict __attr, int *__restrict __pref) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_rwlockattr_setkind_np (pthread_rwlockattr_t *__attr, int __pref) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_cond_init (pthread_cond_t *__restrict __cond, const pthread_condattr_t *__restrict __cond_attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_cond_destroy (pthread_cond_t *__cond) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_cond_signal (pthread_cond_t *__cond) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int 
pthread_cond_broadcast (pthread_cond_t *__cond) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_cond_wait (pthread_cond_t *__restrict __cond, pthread_mutex_t *__restrict __mutex) __attribute__ ((__nonnull__ (1, 2))); # 1000 "/usr/include/pthread.h" 3 4 extern int pthread_cond_timedwait (pthread_cond_t *__restrict __cond, pthread_mutex_t *__restrict __mutex, const struct timespec *__restrict __abstime) __attribute__ ((__nonnull__ (1, 2, 3))); extern int pthread_condattr_init (pthread_condattr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_condattr_destroy (pthread_condattr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_condattr_getpshared (const pthread_condattr_t * __restrict __attr, int *__restrict __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_condattr_setpshared (pthread_condattr_t *__attr, int __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_condattr_getclock (const pthread_condattr_t * __restrict __attr, __clockid_t *__restrict __clock_id) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_condattr_setclock (pthread_condattr_t *__attr, __clockid_t __clock_id) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 1044 "/usr/include/pthread.h" 3 4 extern int pthread_spin_init (pthread_spinlock_t *__lock, int __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_spin_destroy (pthread_spinlock_t *__lock) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_spin_lock (pthread_spinlock_t *__lock) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_spin_trylock (pthread_spinlock_t *__lock) __attribute__ 
((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_spin_unlock (pthread_spinlock_t *__lock) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_barrier_init (pthread_barrier_t *__restrict __barrier, const pthread_barrierattr_t *__restrict __attr, unsigned int __count) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_barrier_destroy (pthread_barrier_t *__barrier) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_barrier_wait (pthread_barrier_t *__barrier) __attribute__ ((__nothrow__)) __attribute__ ((__nonnull__ (1))); extern int pthread_barrierattr_init (pthread_barrierattr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_barrierattr_destroy (pthread_barrierattr_t *__attr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_barrierattr_getpshared (const pthread_barrierattr_t * __restrict __attr, int *__restrict __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int pthread_barrierattr_setpshared (pthread_barrierattr_t *__attr, int __pshared) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 1111 "/usr/include/pthread.h" 3 4 extern int pthread_key_create (pthread_key_t *__key, void (*__destr_function) (void *)) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int pthread_key_delete (pthread_key_t __key) __attribute__ ((__nothrow__ , __leaf__)); extern void *pthread_getspecific (pthread_key_t __key) __attribute__ ((__nothrow__ , __leaf__)); extern int pthread_setspecific (pthread_key_t __key, const void *__pointer) __attribute__ ((__nothrow__ , __leaf__)) ; extern int pthread_getcpuclockid (pthread_t __thread_id, __clockid_t *__clock_id) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); # 1145 
"/usr/include/pthread.h" 3 4 extern int pthread_atfork (void (*__prepare) (void), void (*__parent) (void), void (*__child) (void)) __attribute__ ((__nothrow__ , __leaf__)); # 1159 "/usr/include/pthread.h" 3 4 # 180 "/usr/include/hwloc/autogen/config.h" 2 3 4 # 1 "/usr/include/unistd.h" 1 3 4 # 27 "/usr/include/unistd.h" 3 4 # 205 "/usr/include/unistd.h" 3 4 # 1 "/usr/include/bits/posix_opt.h" 1 3 4 # 206 "/usr/include/unistd.h" 2 3 4 # 1 "/usr/include/bits/environments.h" 1 3 4 # 22 "/usr/include/bits/environments.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 23 "/usr/include/bits/environments.h" 2 3 4 # 210 "/usr/include/unistd.h" 2 3 4 # 223 "/usr/include/unistd.h" 3 4 typedef __ssize_t ssize_t; # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 230 "/usr/include/unistd.h" 2 3 4 typedef __gid_t gid_t; typedef __uid_t uid_t; typedef __off_t off_t; # 258 "/usr/include/unistd.h" 3 4 typedef __useconds_t useconds_t; # 270 "/usr/include/unistd.h" 3 4 typedef __intptr_t intptr_t; typedef __socklen_t socklen_t; # 290 "/usr/include/unistd.h" 3 4 extern int access (const char *__name, int __type) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 307 "/usr/include/unistd.h" 3 4 extern int faccessat (int __fd, const char *__file, int __type, int __flag) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))) ; # 337 "/usr/include/unistd.h" 3 4 extern __off_t lseek (int __fd, __off_t __offset, int __whence) __attribute__ ((__nothrow__ , __leaf__)); # 356 "/usr/include/unistd.h" 3 4 extern int close (int __fd); extern ssize_t read (int __fd, void *__buf, size_t __nbytes) ; extern ssize_t write (int __fd, const void *__buf, size_t __n) ; # 379 "/usr/include/unistd.h" 3 4 extern ssize_t pread (int __fd, void *__buf, size_t __nbytes, __off_t __offset) ; extern ssize_t pwrite (int __fd, const void *__buf, size_t __n, __off_t __offset) ; # 420 "/usr/include/unistd.h" 3 4 extern int pipe (int __pipedes[2]) 
__attribute__ ((__nothrow__ , __leaf__)) ; # 435 "/usr/include/unistd.h" 3 4 extern unsigned int alarm (unsigned int __seconds) __attribute__ ((__nothrow__ , __leaf__)); # 447 "/usr/include/unistd.h" 3 4 extern unsigned int sleep (unsigned int __seconds); extern __useconds_t ualarm (__useconds_t __value, __useconds_t __interval) __attribute__ ((__nothrow__ , __leaf__)); extern int usleep (__useconds_t __useconds); # 472 "/usr/include/unistd.h" 3 4 extern int pause (void); extern int chown (const char *__file, __uid_t __owner, __gid_t __group) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; extern int fchown (int __fd, __uid_t __owner, __gid_t __group) __attribute__ ((__nothrow__ , __leaf__)) ; extern int lchown (const char *__file, __uid_t __owner, __gid_t __group) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; extern int fchownat (int __fd, const char *__file, __uid_t __owner, __gid_t __group, int __flag) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))) ; extern int chdir (const char *__path) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; extern int fchdir (int __fd) __attribute__ ((__nothrow__ , __leaf__)) ; # 514 "/usr/include/unistd.h" 3 4 extern char *getcwd (char *__buf, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) ; # 528 "/usr/include/unistd.h" 3 4 extern char *getwd (char *__buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) __attribute__ ((__deprecated__)) ; extern int dup (int __fd) __attribute__ ((__nothrow__ , __leaf__)) ; extern int dup2 (int __fd, int __fd2) __attribute__ ((__nothrow__ , __leaf__)); # 546 "/usr/include/unistd.h" 3 4 extern char **__environ; extern int execve (const char *__path, char *const __argv[], char *const __envp[]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int fexecve (int __fd, char *const __argv[], char *const __envp[]) 
__attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern int execv (const char *__path, char *const __argv[]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int execle (const char *__path, const char *__arg, ...) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int execl (const char *__path, const char *__arg, ...) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int execvp (const char *__file, char *const __argv[]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int execlp (const char *__file, const char *__arg, ...) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); # 601 "/usr/include/unistd.h" 3 4 extern int nice (int __inc) __attribute__ ((__nothrow__ , __leaf__)) ; extern void _exit (int __status) __attribute__ ((__noreturn__)); # 1 "/usr/include/bits/confname.h" 1 3 4 # 24 "/usr/include/bits/confname.h" 3 4 enum { _PC_LINK_MAX, _PC_MAX_CANON, _PC_MAX_INPUT, _PC_NAME_MAX, _PC_PATH_MAX, _PC_PIPE_BUF, _PC_CHOWN_RESTRICTED, _PC_NO_TRUNC, _PC_VDISABLE, _PC_SYNC_IO, _PC_ASYNC_IO, _PC_PRIO_IO, _PC_SOCK_MAXBUF, _PC_FILESIZEBITS, _PC_REC_INCR_XFER_SIZE, _PC_REC_MAX_XFER_SIZE, _PC_REC_MIN_XFER_SIZE, _PC_REC_XFER_ALIGN, _PC_ALLOC_SIZE_MIN, _PC_SYMLINK_MAX, _PC_2_SYMLINKS }; enum { _SC_ARG_MAX, _SC_CHILD_MAX, _SC_CLK_TCK, _SC_NGROUPS_MAX, _SC_OPEN_MAX, _SC_STREAM_MAX, _SC_TZNAME_MAX, _SC_JOB_CONTROL, _SC_SAVED_IDS, _SC_REALTIME_SIGNALS, _SC_PRIORITY_SCHEDULING, _SC_TIMERS, _SC_ASYNCHRONOUS_IO, _SC_PRIORITIZED_IO, _SC_SYNCHRONIZED_IO, _SC_FSYNC, _SC_MAPPED_FILES, _SC_MEMLOCK, _SC_MEMLOCK_RANGE, _SC_MEMORY_PROTECTION, _SC_MESSAGE_PASSING, _SC_SEMAPHORES, _SC_SHARED_MEMORY_OBJECTS, _SC_AIO_LISTIO_MAX, _SC_AIO_MAX, _SC_AIO_PRIO_DELTA_MAX, _SC_DELAYTIMER_MAX, _SC_MQ_OPEN_MAX, _SC_MQ_PRIO_MAX, _SC_VERSION, _SC_PAGESIZE, _SC_RTSIG_MAX, _SC_SEM_NSEMS_MAX, _SC_SEM_VALUE_MAX, 
_SC_SIGQUEUE_MAX, _SC_TIMER_MAX, _SC_BC_BASE_MAX, _SC_BC_DIM_MAX, _SC_BC_SCALE_MAX, _SC_BC_STRING_MAX, _SC_COLL_WEIGHTS_MAX, _SC_EQUIV_CLASS_MAX, _SC_EXPR_NEST_MAX, _SC_LINE_MAX, _SC_RE_DUP_MAX, _SC_CHARCLASS_NAME_MAX, _SC_2_VERSION, _SC_2_C_BIND, _SC_2_C_DEV, _SC_2_FORT_DEV, _SC_2_FORT_RUN, _SC_2_SW_DEV, _SC_2_LOCALEDEF, _SC_PII, _SC_PII_XTI, _SC_PII_SOCKET, _SC_PII_INTERNET, _SC_PII_OSI, _SC_POLL, _SC_SELECT, _SC_UIO_MAXIOV, _SC_IOV_MAX = _SC_UIO_MAXIOV, _SC_PII_INTERNET_STREAM, _SC_PII_INTERNET_DGRAM, _SC_PII_OSI_COTS, _SC_PII_OSI_CLTS, _SC_PII_OSI_M, _SC_T_IOV_MAX, _SC_THREADS, _SC_THREAD_SAFE_FUNCTIONS, _SC_GETGR_R_SIZE_MAX, _SC_GETPW_R_SIZE_MAX, _SC_LOGIN_NAME_MAX, _SC_TTY_NAME_MAX, _SC_THREAD_DESTRUCTOR_ITERATIONS, _SC_THREAD_KEYS_MAX, _SC_THREAD_STACK_MIN, _SC_THREAD_THREADS_MAX, _SC_THREAD_ATTR_STACKADDR, _SC_THREAD_ATTR_STACKSIZE, _SC_THREAD_PRIORITY_SCHEDULING, _SC_THREAD_PRIO_INHERIT, _SC_THREAD_PRIO_PROTECT, _SC_THREAD_PROCESS_SHARED, _SC_NPROCESSORS_CONF, _SC_NPROCESSORS_ONLN, _SC_PHYS_PAGES, _SC_AVPHYS_PAGES, _SC_ATEXIT_MAX, _SC_PASS_MAX, _SC_XOPEN_VERSION, _SC_XOPEN_XCU_VERSION, _SC_XOPEN_UNIX, _SC_XOPEN_CRYPT, _SC_XOPEN_ENH_I18N, _SC_XOPEN_SHM, _SC_2_CHAR_TERM, _SC_2_C_VERSION, _SC_2_UPE, _SC_XOPEN_XPG2, _SC_XOPEN_XPG3, _SC_XOPEN_XPG4, _SC_CHAR_BIT, _SC_CHAR_MAX, _SC_CHAR_MIN, _SC_INT_MAX, _SC_INT_MIN, _SC_LONG_BIT, _SC_WORD_BIT, _SC_MB_LEN_MAX, _SC_NZERO, _SC_SSIZE_MAX, _SC_SCHAR_MAX, _SC_SCHAR_MIN, _SC_SHRT_MAX, _SC_SHRT_MIN, _SC_UCHAR_MAX, _SC_UINT_MAX, _SC_ULONG_MAX, _SC_USHRT_MAX, _SC_NL_ARGMAX, _SC_NL_LANGMAX, _SC_NL_MSGMAX, _SC_NL_NMAX, _SC_NL_SETMAX, _SC_NL_TEXTMAX, _SC_XBS5_ILP32_OFF32, _SC_XBS5_ILP32_OFFBIG, _SC_XBS5_LP64_OFF64, _SC_XBS5_LPBIG_OFFBIG, _SC_XOPEN_LEGACY, _SC_XOPEN_REALTIME, _SC_XOPEN_REALTIME_THREADS, _SC_ADVISORY_INFO, _SC_BARRIERS, _SC_BASE, _SC_C_LANG_SUPPORT, _SC_C_LANG_SUPPORT_R, _SC_CLOCK_SELECTION, _SC_CPUTIME, _SC_THREAD_CPUTIME, _SC_DEVICE_IO, _SC_DEVICE_SPECIFIC, _SC_DEVICE_SPECIFIC_R, _SC_FD_MGMT, _SC_FIFO, 
_SC_PIPE, _SC_FILE_ATTRIBUTES, _SC_FILE_LOCKING, _SC_FILE_SYSTEM, _SC_MONOTONIC_CLOCK, _SC_MULTI_PROCESS, _SC_SINGLE_PROCESS, _SC_NETWORKING, _SC_READER_WRITER_LOCKS, _SC_SPIN_LOCKS, _SC_REGEXP, _SC_REGEX_VERSION, _SC_SHELL, _SC_SIGNALS, _SC_SPAWN, _SC_SPORADIC_SERVER, _SC_THREAD_SPORADIC_SERVER, _SC_SYSTEM_DATABASE, _SC_SYSTEM_DATABASE_R, _SC_TIMEOUTS, _SC_TYPED_MEMORY_OBJECTS, _SC_USER_GROUPS, _SC_USER_GROUPS_R, _SC_2_PBS, _SC_2_PBS_ACCOUNTING, _SC_2_PBS_LOCATE, _SC_2_PBS_MESSAGE, _SC_2_PBS_TRACK, _SC_SYMLOOP_MAX, _SC_STREAMS, _SC_2_PBS_CHECKPOINT, _SC_V6_ILP32_OFF32, _SC_V6_ILP32_OFFBIG, _SC_V6_LP64_OFF64, _SC_V6_LPBIG_OFFBIG, _SC_HOST_NAME_MAX, _SC_TRACE, _SC_TRACE_EVENT_FILTER, _SC_TRACE_INHERIT, _SC_TRACE_LOG, _SC_LEVEL1_ICACHE_SIZE, _SC_LEVEL1_ICACHE_ASSOC, _SC_LEVEL1_ICACHE_LINESIZE, _SC_LEVEL1_DCACHE_SIZE, _SC_LEVEL1_DCACHE_ASSOC, _SC_LEVEL1_DCACHE_LINESIZE, _SC_LEVEL2_CACHE_SIZE, _SC_LEVEL2_CACHE_ASSOC, _SC_LEVEL2_CACHE_LINESIZE, _SC_LEVEL3_CACHE_SIZE, _SC_LEVEL3_CACHE_ASSOC, _SC_LEVEL3_CACHE_LINESIZE, _SC_LEVEL4_CACHE_SIZE, _SC_LEVEL4_CACHE_ASSOC, _SC_LEVEL4_CACHE_LINESIZE, _SC_IPV6 = _SC_LEVEL1_ICACHE_SIZE + 50, _SC_RAW_SOCKETS, _SC_V7_ILP32_OFF32, _SC_V7_ILP32_OFFBIG, _SC_V7_LP64_OFF64, _SC_V7_LPBIG_OFFBIG, _SC_SS_REPL_MAX, _SC_TRACE_EVENT_NAME_MAX, _SC_TRACE_NAME_MAX, _SC_TRACE_SYS_MAX, _SC_TRACE_USER_EVENT_MAX, _SC_XOPEN_STREAMS, _SC_THREAD_ROBUST_PRIO_INHERIT, _SC_THREAD_ROBUST_PRIO_PROTECT }; enum { _CS_PATH, _CS_V6_WIDTH_RESTRICTED_ENVS, _CS_GNU_LIBC_VERSION, _CS_GNU_LIBPTHREAD_VERSION, _CS_V5_WIDTH_RESTRICTED_ENVS, _CS_V7_WIDTH_RESTRICTED_ENVS, _CS_LFS_CFLAGS = 1000, _CS_LFS_LDFLAGS, _CS_LFS_LIBS, _CS_LFS_LINTFLAGS, _CS_LFS64_CFLAGS, _CS_LFS64_LDFLAGS, _CS_LFS64_LIBS, _CS_LFS64_LINTFLAGS, _CS_XBS5_ILP32_OFF32_CFLAGS = 1100, _CS_XBS5_ILP32_OFF32_LDFLAGS, _CS_XBS5_ILP32_OFF32_LIBS, _CS_XBS5_ILP32_OFF32_LINTFLAGS, _CS_XBS5_ILP32_OFFBIG_CFLAGS, _CS_XBS5_ILP32_OFFBIG_LDFLAGS, _CS_XBS5_ILP32_OFFBIG_LIBS, _CS_XBS5_ILP32_OFFBIG_LINTFLAGS, 
_CS_XBS5_LP64_OFF64_CFLAGS, _CS_XBS5_LP64_OFF64_LDFLAGS, _CS_XBS5_LP64_OFF64_LIBS, _CS_XBS5_LP64_OFF64_LINTFLAGS, _CS_XBS5_LPBIG_OFFBIG_CFLAGS, _CS_XBS5_LPBIG_OFFBIG_LDFLAGS, _CS_XBS5_LPBIG_OFFBIG_LIBS, _CS_XBS5_LPBIG_OFFBIG_LINTFLAGS, _CS_POSIX_V6_ILP32_OFF32_CFLAGS, _CS_POSIX_V6_ILP32_OFF32_LDFLAGS, _CS_POSIX_V6_ILP32_OFF32_LIBS, _CS_POSIX_V6_ILP32_OFF32_LINTFLAGS, _CS_POSIX_V6_ILP32_OFFBIG_CFLAGS, _CS_POSIX_V6_ILP32_OFFBIG_LDFLAGS, _CS_POSIX_V6_ILP32_OFFBIG_LIBS, _CS_POSIX_V6_ILP32_OFFBIG_LINTFLAGS, _CS_POSIX_V6_LP64_OFF64_CFLAGS, _CS_POSIX_V6_LP64_OFF64_LDFLAGS, _CS_POSIX_V6_LP64_OFF64_LIBS, _CS_POSIX_V6_LP64_OFF64_LINTFLAGS, _CS_POSIX_V6_LPBIG_OFFBIG_CFLAGS, _CS_POSIX_V6_LPBIG_OFFBIG_LDFLAGS, _CS_POSIX_V6_LPBIG_OFFBIG_LIBS, _CS_POSIX_V6_LPBIG_OFFBIG_LINTFLAGS, _CS_POSIX_V7_ILP32_OFF32_CFLAGS, _CS_POSIX_V7_ILP32_OFF32_LDFLAGS, _CS_POSIX_V7_ILP32_OFF32_LIBS, _CS_POSIX_V7_ILP32_OFF32_LINTFLAGS, _CS_POSIX_V7_ILP32_OFFBIG_CFLAGS, _CS_POSIX_V7_ILP32_OFFBIG_LDFLAGS, _CS_POSIX_V7_ILP32_OFFBIG_LIBS, _CS_POSIX_V7_ILP32_OFFBIG_LINTFLAGS, _CS_POSIX_V7_LP64_OFF64_CFLAGS, _CS_POSIX_V7_LP64_OFF64_LDFLAGS, _CS_POSIX_V7_LP64_OFF64_LIBS, _CS_POSIX_V7_LP64_OFF64_LINTFLAGS, _CS_POSIX_V7_LPBIG_OFFBIG_CFLAGS, _CS_POSIX_V7_LPBIG_OFFBIG_LDFLAGS, _CS_POSIX_V7_LPBIG_OFFBIG_LIBS, _CS_POSIX_V7_LPBIG_OFFBIG_LINTFLAGS, _CS_V6_ENV, _CS_V7_ENV }; # 613 "/usr/include/unistd.h" 2 3 4 extern long int pathconf (const char *__path, int __name) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern long int fpathconf (int __fd, int __name) __attribute__ ((__nothrow__ , __leaf__)); extern long int sysconf (int __name) __attribute__ ((__nothrow__ , __leaf__)); extern size_t confstr (int __name, char *__buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)); extern __pid_t getpid (void) __attribute__ ((__nothrow__ , __leaf__)); extern __pid_t getppid (void) __attribute__ ((__nothrow__ , __leaf__)); extern __pid_t getpgrp (void) __attribute__ ((__nothrow__ , 
__leaf__)); extern __pid_t __getpgid (__pid_t __pid) __attribute__ ((__nothrow__ , __leaf__)); extern __pid_t getpgid (__pid_t __pid) __attribute__ ((__nothrow__ , __leaf__)); extern int setpgid (__pid_t __pid, __pid_t __pgid) __attribute__ ((__nothrow__ , __leaf__)); # 663 "/usr/include/unistd.h" 3 4 extern int setpgrp (void) __attribute__ ((__nothrow__ , __leaf__)); extern __pid_t setsid (void) __attribute__ ((__nothrow__ , __leaf__)); extern __pid_t getsid (__pid_t __pid) __attribute__ ((__nothrow__ , __leaf__)); extern __uid_t getuid (void) __attribute__ ((__nothrow__ , __leaf__)); extern __uid_t geteuid (void) __attribute__ ((__nothrow__ , __leaf__)); extern __gid_t getgid (void) __attribute__ ((__nothrow__ , __leaf__)); extern __gid_t getegid (void) __attribute__ ((__nothrow__ , __leaf__)); extern int getgroups (int __size, __gid_t __list[]) __attribute__ ((__nothrow__ , __leaf__)) ; # 703 "/usr/include/unistd.h" 3 4 extern int setuid (__uid_t __uid) __attribute__ ((__nothrow__ , __leaf__)) ; extern int setreuid (__uid_t __ruid, __uid_t __euid) __attribute__ ((__nothrow__ , __leaf__)) ; extern int seteuid (__uid_t __uid) __attribute__ ((__nothrow__ , __leaf__)) ; extern int setgid (__gid_t __gid) __attribute__ ((__nothrow__ , __leaf__)) ; extern int setregid (__gid_t __rgid, __gid_t __egid) __attribute__ ((__nothrow__ , __leaf__)) ; extern int setegid (__gid_t __gid) __attribute__ ((__nothrow__ , __leaf__)) ; # 759 "/usr/include/unistd.h" 3 4 extern __pid_t fork (void) __attribute__ ((__nothrow__)); extern __pid_t vfork (void) __attribute__ ((__nothrow__ , __leaf__)); extern char *ttyname (int __fd) __attribute__ ((__nothrow__ , __leaf__)); extern int ttyname_r (int __fd, char *__buf, size_t __buflen) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))) ; extern int isatty (int __fd) __attribute__ ((__nothrow__ , __leaf__)); extern int ttyslot (void) __attribute__ ((__nothrow__ , __leaf__)); extern int link (const char *__from, const 
char *__to) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))) ; extern int linkat (int __fromfd, const char *__from, int __tofd, const char *__to, int __flags) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 4))) ; extern int symlink (const char *__from, const char *__to) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))) ; extern ssize_t readlink (const char *__restrict __path, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))) ; extern int symlinkat (const char *__from, int __tofd, const char *__to) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 3))) ; extern ssize_t readlinkat (int __fd, const char *__restrict __path, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 3))) ; extern int unlink (const char *__name) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int unlinkat (int __fd, const char *__name, int __flag) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern int rmdir (const char *__path) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern __pid_t tcgetpgrp (int __fd) __attribute__ ((__nothrow__ , __leaf__)); extern int tcsetpgrp (int __fd, __pid_t __pgrp_id) __attribute__ ((__nothrow__ , __leaf__)); extern char *getlogin (void); extern int getlogin_r (char *__name, size_t __name_len) __attribute__ ((__nonnull__ (1))); extern int setlogin (const char *__name) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 873 "/usr/include/unistd.h" 3 4 # 1 "/usr/include/getopt.h" 1 3 4 # 57 "/usr/include/getopt.h" 3 4 extern char *optarg; # 71 "/usr/include/getopt.h" 3 4 extern int optind; extern int opterr; extern int optopt; # 150 "/usr/include/getopt.h" 3 4 extern int getopt (int ___argc, char *const *___argv, 
const char *__shortopts) __attribute__ ((__nothrow__ , __leaf__)); # 874 "/usr/include/unistd.h" 2 3 4 extern int gethostname (char *__name, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int sethostname (const char *__name, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; extern int sethostid (long int __id) __attribute__ ((__nothrow__ , __leaf__)) ; extern int getdomainname (char *__name, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; extern int setdomainname (const char *__name, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; extern int vhangup (void) __attribute__ ((__nothrow__ , __leaf__)); extern int revoke (const char *__file) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; extern int profil (unsigned short int *__sample_buffer, size_t __size, size_t __offset, unsigned int __scale) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int acct (const char *__name) __attribute__ ((__nothrow__ , __leaf__)); extern char *getusershell (void) __attribute__ ((__nothrow__ , __leaf__)); extern void endusershell (void) __attribute__ ((__nothrow__ , __leaf__)); extern void setusershell (void) __attribute__ ((__nothrow__ , __leaf__)); extern int daemon (int __nochdir, int __noclose) __attribute__ ((__nothrow__ , __leaf__)) ; extern int chroot (const char *__path) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; extern char *getpass (const char *__prompt) __attribute__ ((__nonnull__ (1))); extern int fsync (int __fd); # 971 "/usr/include/unistd.h" 3 4 extern long int gethostid (void); extern void sync (void) __attribute__ ((__nothrow__ , __leaf__)); extern int getpagesize (void) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); extern int getdtablesize (void) __attribute__ ((__nothrow__ , 
__leaf__)); # 995 "/usr/include/unistd.h" 3 4 extern int truncate (const char *__file, __off_t __length) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; # 1018 "/usr/include/unistd.h" 3 4 extern int ftruncate (int __fd, __off_t __length) __attribute__ ((__nothrow__ , __leaf__)) ; # 1039 "/usr/include/unistd.h" 3 4 extern int brk (void *__addr) __attribute__ ((__nothrow__ , __leaf__)) ; extern void *sbrk (intptr_t __delta) __attribute__ ((__nothrow__ , __leaf__)); # 1060 "/usr/include/unistd.h" 3 4 extern long int syscall (long int __sysno, ...) __attribute__ ((__nothrow__ , __leaf__)); # 1083 "/usr/include/unistd.h" 3 4 extern int lockf (int __fd, int __cmd, __off_t __len) ; # 1114 "/usr/include/unistd.h" 3 4 extern int fdatasync (int __fildes); # 1166 "/usr/include/unistd.h" 3 4 # 186 "/usr/include/hwloc/autogen/config.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdint.h" 1 3 4 # 9 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdint.h" 3 4 # 1 "/usr/include/stdint.h" 1 3 4 # 26 "/usr/include/stdint.h" 3 4 # 1 "/usr/include/bits/wchar.h" 1 3 4 # 27 "/usr/include/stdint.h" 2 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/stdint.h" 2 3 4 # 36 "/usr/include/stdint.h" 3 4 typedef signed char int8_t; typedef short int int16_t; typedef int int32_t; typedef long int int64_t; typedef unsigned char uint8_t; typedef unsigned short int uint16_t; typedef unsigned int uint32_t; typedef unsigned long int uint64_t; # 65 "/usr/include/stdint.h" 3 4 typedef signed char int_least8_t; typedef short int int_least16_t; typedef int int_least32_t; typedef long int int_least64_t; typedef unsigned char uint_least8_t; typedef unsigned short int uint_least16_t; typedef unsigned int uint_least32_t; typedef unsigned long int uint_least64_t; # 90 "/usr/include/stdint.h" 3 4 typedef signed char int_fast8_t; typedef long int int_fast16_t; typedef long int int_fast32_t; typedef long int int_fast64_t; # 103 "/usr/include/stdint.h" 3 
4 typedef unsigned char uint_fast8_t; typedef unsigned long int uint_fast16_t; typedef unsigned long int uint_fast32_t; typedef unsigned long int uint_fast64_t; # 122 "/usr/include/stdint.h" 3 4 typedef unsigned long int uintptr_t; # 134 "/usr/include/stdint.h" 3 4 typedef long int intmax_t; typedef unsigned long int uintmax_t; # 10 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdint.h" 2 3 4 # 188 "/usr/include/hwloc/autogen/config.h" 2 3 4 typedef uint64_t hwloc_uint64_t; # 54 "/usr/include/hwloc.h" 2 3 4 # 1 "/usr/include/sys/types.h" 1 3 4 # 27 "/usr/include/sys/types.h" 3 4 typedef __u_char u_char; typedef __u_short u_short; typedef __u_int u_int; typedef __u_long u_long; typedef __quad_t quad_t; typedef __u_quad_t u_quad_t; typedef __fsid_t fsid_t; typedef __loff_t loff_t; typedef __ino_t ino_t; # 60 "/usr/include/sys/types.h" 3 4 typedef __dev_t dev_t; # 70 "/usr/include/sys/types.h" 3 4 typedef __mode_t mode_t; typedef __nlink_t nlink_t; # 104 "/usr/include/sys/types.h" 3 4 typedef __id_t id_t; # 115 "/usr/include/sys/types.h" 3 4 typedef __daddr_t daddr_t; typedef __caddr_t caddr_t; typedef __key_t key_t; # 146 "/usr/include/sys/types.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 147 "/usr/include/sys/types.h" 2 3 4 typedef unsigned long int ulong; typedef unsigned short int ushort; typedef unsigned int uint; # 200 "/usr/include/sys/types.h" 3 4 typedef unsigned int u_int8_t __attribute__ ((__mode__ (__QI__))); typedef unsigned int u_int16_t __attribute__ ((__mode__ (__HI__))); typedef unsigned int u_int32_t __attribute__ ((__mode__ (__SI__))); typedef unsigned int u_int64_t __attribute__ ((__mode__ (__DI__))); typedef int register_t __attribute__ ((__mode__ (__word__))); # 219 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/sys/select.h" 1 3 4 # 30 "/usr/include/sys/select.h" 3 4 # 1 "/usr/include/bits/select.h" 1 3 4 # 22 "/usr/include/bits/select.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 23 
"/usr/include/bits/select.h" 2 3 4 # 31 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/sigset.h" 1 3 4 # 22 "/usr/include/bits/sigset.h" 3 4 typedef int __sig_atomic_t; typedef struct { unsigned long int __val[(1024 / (8 * sizeof (unsigned long int)))]; } __sigset_t; # 34 "/usr/include/sys/select.h" 2 3 4 typedef __sigset_t sigset_t; # 47 "/usr/include/sys/select.h" 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 30 "/usr/include/bits/time.h" 3 4 struct timeval { __time_t tv_sec; __suseconds_t tv_usec; }; # 48 "/usr/include/sys/select.h" 2 3 4 typedef __suseconds_t suseconds_t; typedef long int __fd_mask; # 66 "/usr/include/sys/select.h" 3 4 typedef struct { __fd_mask __fds_bits[1024 / (8 * (int) sizeof (__fd_mask))]; } fd_set; typedef __fd_mask fd_mask; # 98 "/usr/include/sys/select.h" 3 4 # 108 "/usr/include/sys/select.h" 3 4 extern int select (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, struct timeval *__restrict __timeout); # 120 "/usr/include/sys/select.h" 3 4 extern int pselect (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, const struct timespec *__restrict __timeout, const __sigset_t *__restrict __sigmask); # 133 "/usr/include/sys/select.h" 3 4 # 220 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/sysmacros.h" 1 3 4 # 24 "/usr/include/sys/sysmacros.h" 3 4 __extension__ extern unsigned int gnu_dev_major (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned int gnu_dev_minor (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned long long int gnu_dev_makedev (unsigned int __major, unsigned int __minor) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 58 "/usr/include/sys/sysmacros.h" 3 4 # 223 "/usr/include/sys/types.h" 2 3 4 typedef __blksize_t blksize_t; 
typedef __blkcnt_t blkcnt_t; typedef __fsblkcnt_t fsblkcnt_t; typedef __fsfilcnt_t fsfilcnt_t; # 273 "/usr/include/sys/types.h" 3 4 # 55 "/usr/include/hwloc.h" 2 3 4 # 1 "/usr/include/stdio.h" 1 3 4 # 29 "/usr/include/stdio.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 34 "/usr/include/stdio.h" 2 3 4 # 44 "/usr/include/stdio.h" 3 4 struct _IO_FILE; typedef struct _IO_FILE FILE; # 64 "/usr/include/stdio.h" 3 4 typedef struct _IO_FILE __FILE; # 74 "/usr/include/stdio.h" 3 4 # 1 "/usr/include/libio.h" 1 3 4 # 31 "/usr/include/libio.h" 3 4 # 1 "/usr/include/_G_config.h" 1 3 4 # 15 "/usr/include/_G_config.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 16 "/usr/include/_G_config.h" 2 3 4 # 1 "/usr/include/wchar.h" 1 3 4 # 82 "/usr/include/wchar.h" 3 4 typedef struct { int __count; union { unsigned int __wch; char __wchb[4]; } __value; } __mbstate_t; # 21 "/usr/include/_G_config.h" 2 3 4 typedef struct { __off_t __pos; __mbstate_t __state; } _G_fpos_t; typedef struct { __off64_t __pos; __mbstate_t __state; } _G_fpos64_t; # 32 "/usr/include/libio.h" 2 3 4 # 49 "/usr/include/libio.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdarg.h" 1 3 4 # 40 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stdarg.h" 3 4 typedef __builtin_va_list __gnuc_va_list; # 50 "/usr/include/libio.h" 2 3 4 # 144 "/usr/include/libio.h" 3 4 struct _IO_jump_t; struct _IO_FILE; typedef void _IO_lock_t; struct _IO_marker { struct _IO_marker *_next; struct _IO_FILE *_sbuf; int _pos; # 173 "/usr/include/libio.h" 3 4 }; enum __codecvt_result { __codecvt_ok, __codecvt_partial, __codecvt_error, __codecvt_noconv }; # 241 "/usr/include/libio.h" 3 4 struct _IO_FILE { int _flags; char* _IO_read_ptr; char* _IO_read_end; char* _IO_read_base; char* _IO_write_base; char* _IO_write_ptr; char* _IO_write_end; char* _IO_buf_base; char* _IO_buf_end; char *_IO_save_base; char *_IO_backup_base; char *_IO_save_end; struct _IO_marker *_markers; struct 
_IO_FILE *_chain; int _fileno; int _flags2; __off_t _old_offset; unsigned short _cur_column; signed char _vtable_offset; char _shortbuf[1]; _IO_lock_t *_lock; # 289 "/usr/include/libio.h" 3 4 __off64_t _offset; void *__pad1; void *__pad2; void *__pad3; void *__pad4; size_t __pad5; int _mode; char _unused2[15 * sizeof (int) - 4 * sizeof (void *) - sizeof (size_t)]; }; typedef struct _IO_FILE _IO_FILE; struct _IO_FILE_plus; extern struct _IO_FILE_plus _IO_2_1_stdin_; extern struct _IO_FILE_plus _IO_2_1_stdout_; extern struct _IO_FILE_plus _IO_2_1_stderr_; # 333 "/usr/include/libio.h" 3 4 typedef __ssize_t __io_read_fn (void *__cookie, char *__buf, size_t __nbytes); typedef __ssize_t __io_write_fn (void *__cookie, const char *__buf, size_t __n); typedef int __io_seek_fn (void *__cookie, __off64_t *__pos, int __w); typedef int __io_close_fn (void *__cookie); # 385 "/usr/include/libio.h" 3 4 extern int __underflow (_IO_FILE *); extern int __uflow (_IO_FILE *); extern int __overflow (_IO_FILE *, int); # 429 "/usr/include/libio.h" 3 4 extern int _IO_getc (_IO_FILE *__fp); extern int _IO_putc (int __c, _IO_FILE *__fp); extern int _IO_feof (_IO_FILE *__fp) __attribute__ ((__nothrow__ , __leaf__)); extern int _IO_ferror (_IO_FILE *__fp) __attribute__ ((__nothrow__ , __leaf__)); extern int _IO_peekc_locked (_IO_FILE *__fp); extern void _IO_flockfile (_IO_FILE *) __attribute__ ((__nothrow__ , __leaf__)); extern void _IO_funlockfile (_IO_FILE *) __attribute__ ((__nothrow__ , __leaf__)); extern int _IO_ftrylockfile (_IO_FILE *) __attribute__ ((__nothrow__ , __leaf__)); # 459 "/usr/include/libio.h" 3 4 extern int _IO_vfscanf (_IO_FILE * __restrict, const char * __restrict, __gnuc_va_list, int *__restrict); extern int _IO_vfprintf (_IO_FILE *__restrict, const char *__restrict, __gnuc_va_list); extern __ssize_t _IO_padn (_IO_FILE *, int, __ssize_t); extern size_t _IO_sgetn (_IO_FILE *, void *, size_t); extern __off64_t _IO_seekoff (_IO_FILE *, __off64_t, int, int); extern __off64_t 
_IO_seekpos (_IO_FILE *, __off64_t, int); extern void _IO_free_backup_area (_IO_FILE *) __attribute__ ((__nothrow__ , __leaf__)); # 75 "/usr/include/stdio.h" 2 3 4 typedef __gnuc_va_list va_list; # 110 "/usr/include/stdio.h" 3 4 typedef _G_fpos_t fpos_t; # 166 "/usr/include/stdio.h" 3 4 # 1 "/usr/include/bits/stdio_lim.h" 1 3 4 # 167 "/usr/include/stdio.h" 2 3 4 extern struct _IO_FILE *stdin; extern struct _IO_FILE *stdout; extern struct _IO_FILE *stderr; extern int remove (const char *__filename) __attribute__ ((__nothrow__ , __leaf__)); extern int rename (const char *__old, const char *__new) __attribute__ ((__nothrow__ , __leaf__)); extern int renameat (int __oldfd, const char *__old, int __newfd, const char *__new) __attribute__ ((__nothrow__ , __leaf__)); extern FILE *tmpfile (void) ; # 211 "/usr/include/stdio.h" 3 4 extern char *tmpnam (char *__s) __attribute__ ((__nothrow__ , __leaf__)) ; extern char *tmpnam_r (char *__s) __attribute__ ((__nothrow__ , __leaf__)) ; # 229 "/usr/include/stdio.h" 3 4 extern char *tempnam (const char *__dir, const char *__pfx) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) ; extern int fclose (FILE *__stream); extern int fflush (FILE *__stream); # 254 "/usr/include/stdio.h" 3 4 extern int fflush_unlocked (FILE *__stream); # 268 "/usr/include/stdio.h" 3 4 extern FILE *fopen (const char *__restrict __filename, const char *__restrict __modes) ; extern FILE *freopen (const char *__restrict __filename, const char *__restrict __modes, FILE *__restrict __stream) ; # 297 "/usr/include/stdio.h" 3 4 # 308 "/usr/include/stdio.h" 3 4 extern FILE *fdopen (int __fd, const char *__modes) __attribute__ ((__nothrow__ , __leaf__)) ; # 321 "/usr/include/stdio.h" 3 4 extern FILE *fmemopen (void *__s, size_t __len, const char *__modes) __attribute__ ((__nothrow__ , __leaf__)) ; extern FILE *open_memstream (char **__bufloc, size_t *__sizeloc) __attribute__ ((__nothrow__ , __leaf__)) ; extern void setbuf (FILE *__restrict 
__stream, char *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)); extern int setvbuf (FILE *__restrict __stream, char *__restrict __buf, int __modes, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); extern void setbuffer (FILE *__restrict __stream, char *__restrict __buf, size_t __size) __attribute__ ((__nothrow__ , __leaf__)); extern void setlinebuf (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)); extern int fprintf (FILE *__restrict __stream, const char *__restrict __format, ...); extern int printf (const char *__restrict __format, ...); extern int sprintf (char *__restrict __s, const char *__restrict __format, ...) __attribute__ ((__nothrow__)); extern int vfprintf (FILE *__restrict __s, const char *__restrict __format, __gnuc_va_list __arg); extern int vprintf (const char *__restrict __format, __gnuc_va_list __arg); extern int vsprintf (char *__restrict __s, const char *__restrict __format, __gnuc_va_list __arg) __attribute__ ((__nothrow__)); extern int snprintf (char *__restrict __s, size_t __maxlen, const char *__restrict __format, ...) __attribute__ ((__nothrow__)) __attribute__ ((__format__ (__printf__, 3, 4))); extern int vsnprintf (char *__restrict __s, size_t __maxlen, const char *__restrict __format, __gnuc_va_list __arg) __attribute__ ((__nothrow__)) __attribute__ ((__format__ (__printf__, 3, 0))); # 414 "/usr/include/stdio.h" 3 4 extern int vdprintf (int __fd, const char *__restrict __fmt, __gnuc_va_list __arg) __attribute__ ((__format__ (__printf__, 2, 0))); extern int dprintf (int __fd, const char *__restrict __fmt, ...) __attribute__ ((__format__ (__printf__, 2, 3))); extern int fscanf (FILE *__restrict __stream, const char *__restrict __format, ...) ; extern int scanf (const char *__restrict __format, ...) ; extern int sscanf (const char *__restrict __s, const char *__restrict __format, ...) 
__attribute__ ((__nothrow__ , __leaf__)); # 445 "/usr/include/stdio.h" 3 4 extern int fscanf (FILE *__restrict __stream, const char *__restrict __format, ...) __asm__ ("" "__isoc99_fscanf") ; extern int scanf (const char *__restrict __format, ...) __asm__ ("" "__isoc99_scanf") ; extern int sscanf (const char *__restrict __s, const char *__restrict __format, ...) __asm__ ("" "__isoc99_sscanf") __attribute__ ((__nothrow__ , __leaf__)) ; # 465 "/usr/include/stdio.h" 3 4 extern int vfscanf (FILE *__restrict __s, const char *__restrict __format, __gnuc_va_list __arg) __attribute__ ((__format__ (__scanf__, 2, 0))) ; extern int vscanf (const char *__restrict __format, __gnuc_va_list __arg) __attribute__ ((__format__ (__scanf__, 1, 0))) ; extern int vsscanf (const char *__restrict __s, const char *__restrict __format, __gnuc_va_list __arg) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__format__ (__scanf__, 2, 0))); # 496 "/usr/include/stdio.h" 3 4 extern int vfscanf (FILE *__restrict __s, const char *__restrict __format, __gnuc_va_list __arg) __asm__ ("" "__isoc99_vfscanf") __attribute__ ((__format__ (__scanf__, 2, 0))) ; extern int vscanf (const char *__restrict __format, __gnuc_va_list __arg) __asm__ ("" "__isoc99_vscanf") __attribute__ ((__format__ (__scanf__, 1, 0))) ; extern int vsscanf (const char *__restrict __s, const char *__restrict __format, __gnuc_va_list __arg) __asm__ ("" "__isoc99_vsscanf") __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__format__ (__scanf__, 2, 0))); # 524 "/usr/include/stdio.h" 3 4 extern int fgetc (FILE *__stream); extern int getc (FILE *__stream); extern int getchar (void); # 552 "/usr/include/stdio.h" 3 4 extern int getc_unlocked (FILE *__stream); extern int getchar_unlocked (void); # 563 "/usr/include/stdio.h" 3 4 extern int fgetc_unlocked (FILE *__stream); extern int fputc (int __c, FILE *__stream); extern int putc (int __c, FILE *__stream); extern int putchar (int __c); # 596 "/usr/include/stdio.h" 3 4 extern 
int fputc_unlocked (int __c, FILE *__stream); extern int putc_unlocked (int __c, FILE *__stream); extern int putchar_unlocked (int __c); extern int getw (FILE *__stream); extern int putw (int __w, FILE *__stream); extern char *fgets (char *__restrict __s, int __n, FILE *__restrict __stream) ; # 642 "/usr/include/stdio.h" 3 4 # 667 "/usr/include/stdio.h" 3 4 extern __ssize_t __getdelim (char **__restrict __lineptr, size_t *__restrict __n, int __delimiter, FILE *__restrict __stream) ; extern __ssize_t getdelim (char **__restrict __lineptr, size_t *__restrict __n, int __delimiter, FILE *__restrict __stream) ; extern __ssize_t getline (char **__restrict __lineptr, size_t *__restrict __n, FILE *__restrict __stream) ; extern int fputs (const char *__restrict __s, FILE *__restrict __stream); extern int puts (const char *__s); extern int ungetc (int __c, FILE *__stream); extern size_t fread (void *__restrict __ptr, size_t __size, size_t __n, FILE *__restrict __stream) ; extern size_t fwrite (const void *__restrict __ptr, size_t __size, size_t __n, FILE *__restrict __s); # 739 "/usr/include/stdio.h" 3 4 extern size_t fread_unlocked (void *__restrict __ptr, size_t __size, size_t __n, FILE *__restrict __stream) ; extern size_t fwrite_unlocked (const void *__restrict __ptr, size_t __size, size_t __n, FILE *__restrict __stream); extern int fseek (FILE *__stream, long int __off, int __whence); extern long int ftell (FILE *__stream) ; extern void rewind (FILE *__stream); # 775 "/usr/include/stdio.h" 3 4 extern int fseeko (FILE *__stream, __off_t __off, int __whence); extern __off_t ftello (FILE *__stream) ; # 794 "/usr/include/stdio.h" 3 4 extern int fgetpos (FILE *__restrict __stream, fpos_t *__restrict __pos); extern int fsetpos (FILE *__stream, const fpos_t *__pos); # 817 "/usr/include/stdio.h" 3 4 # 826 "/usr/include/stdio.h" 3 4 extern void clearerr (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)); extern int feof (FILE *__stream) __attribute__ ((__nothrow__ , 
__leaf__)) ; extern int ferror (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)) ; extern void clearerr_unlocked (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)); extern int feof_unlocked (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)) ; extern int ferror_unlocked (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)) ; extern void perror (const char *__s); # 1 "/usr/include/bits/sys_errlist.h" 1 3 4 # 26 "/usr/include/bits/sys_errlist.h" 3 4 extern int sys_nerr; extern const char *const sys_errlist[]; # 856 "/usr/include/stdio.h" 2 3 4 extern int fileno (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)) ; extern int fileno_unlocked (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)) ; # 874 "/usr/include/stdio.h" 3 4 extern FILE *popen (const char *__command, const char *__modes) ; extern int pclose (FILE *__stream); extern char *ctermid (char *__s) __attribute__ ((__nothrow__ , __leaf__)); # 914 "/usr/include/stdio.h" 3 4 extern void flockfile (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)); extern int ftrylockfile (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)) ; extern void funlockfile (FILE *__stream) __attribute__ ((__nothrow__ , __leaf__)); # 944 "/usr/include/stdio.h" 3 4 # 56 "/usr/include/hwloc.h" 2 3 4 # 1 "/usr/include/string.h" 1 3 4 # 27 "/usr/include/string.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 33 "/usr/include/string.h" 2 3 4 extern void *memcpy (void *__restrict __dest, const void *__restrict __src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern void *memmove (void *__dest, const void *__src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern void *memccpy (void *__restrict __dest, const void *__restrict __src, int __c, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern void *memset (void *__s, int __c, 
size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int memcmp (const void *__s1, const void *__s2, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); # 92 "/usr/include/string.h" 3 4 extern void *memchr (const void *__s, int __c, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); # 123 "/usr/include/string.h" 3 4 extern char *strcpy (char *__restrict __dest, const char *__restrict __src) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char *strncpy (char *__restrict __dest, const char *__restrict __src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char *strcat (char *__restrict __dest, const char *__restrict __src) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char *strncat (char *__restrict __dest, const char *__restrict __src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int strcmp (const char *__s1, const char *__s2) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); extern int strncmp (const char *__s1, const char *__s2, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); extern int strcoll (const char *__s1, const char *__s2) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); extern size_t strxfrm (char *__restrict __dest, const char *__restrict __src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); # 162 "/usr/include/string.h" 3 4 extern int strcoll_l (const char *__s1, const char *__s2, __locale_t __l) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ 
((__nonnull__ (1, 2, 3))); extern size_t strxfrm_l (char *__dest, const char *__src, size_t __n, __locale_t __l) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 4))); extern char *strdup (const char *__s) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) __attribute__ ((__nonnull__ (1))); extern char *strndup (const char *__string, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) __attribute__ ((__nonnull__ (1))); # 206 "/usr/include/string.h" 3 4 # 231 "/usr/include/string.h" 3 4 extern char *strchr (const char *__s, int __c) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); # 258 "/usr/include/string.h" 3 4 extern char *strrchr (const char *__s, int __c) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); # 277 "/usr/include/string.h" 3 4 extern size_t strcspn (const char *__s, const char *__reject) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); extern size_t strspn (const char *__s, const char *__accept) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); # 310 "/usr/include/string.h" 3 4 extern char *strpbrk (const char *__s, const char *__accept) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); # 337 "/usr/include/string.h" 3 4 extern char *strstr (const char *__haystack, const char *__needle) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); extern char *strtok (char *__restrict __s, const char *__restrict __delim) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern char *__strtok_r (char *__restrict __s, const char *__restrict __delim, char **__restrict __save_ptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ 
((__nonnull__ (2, 3))); extern char *strtok_r (char *__restrict __s, const char *__restrict __delim, char **__restrict __save_ptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 3))); # 392 "/usr/include/string.h" 3 4 extern size_t strlen (const char *__s) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); extern size_t strnlen (const char *__string, size_t __maxlen) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); extern char *strerror (int __errnum) __attribute__ ((__nothrow__ , __leaf__)); # 422 "/usr/include/string.h" 3 4 extern int strerror_r (int __errnum, char *__buf, size_t __buflen) __asm__ ("" "__xpg_strerror_r") __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); # 440 "/usr/include/string.h" 3 4 extern char *strerror_l (int __errnum, __locale_t __l) __attribute__ ((__nothrow__ , __leaf__)); extern void __bzero (void *__s, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern void bcopy (const void *__src, void *__dest, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern void bzero (void *__s, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int bcmp (const void *__s1, const void *__s2, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); # 484 "/usr/include/string.h" 3 4 extern char *index (const char *__s, int __c) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); # 512 "/usr/include/string.h" 3 4 extern char *rindex (const char *__s, int __c) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))); extern int ffs (int __i) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 529 
"/usr/include/string.h" 3 4 extern int strcasecmp (const char *__s1, const char *__s2) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); extern int strncasecmp (const char *__s1, const char *__s2, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1, 2))); # 552 "/usr/include/string.h" 3 4 extern char *strsep (char **__restrict __stringp, const char *__restrict __delim) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char *strsignal (int __sig) __attribute__ ((__nothrow__ , __leaf__)); extern char *__stpcpy (char *__restrict __dest, const char *__restrict __src) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char *stpcpy (char *__restrict __dest, const char *__restrict __src) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char *__stpncpy (char *__restrict __dest, const char *__restrict __src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern char *stpncpy (char *__restrict __dest, const char *__restrict __src, size_t __n) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); # 656 "/usr/include/string.h" 3 4 # 57 "/usr/include/hwloc.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 1 3 4 # 34 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/syslimits.h" 1 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 1 3 4 # 168 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 3 4 # 1 "/usr/include/limits.h" 1 3 4 # 143 "/usr/include/limits.h" 3 4 # 1 "/usr/include/bits/posix1_lim.h" 1 3 4 # 160 "/usr/include/bits/posix1_lim.h" 3 4 # 1 "/usr/include/bits/local_lim.h" 1 3 4 # 38 "/usr/include/bits/local_lim.h" 3 4 # 1 
"/usr/include/linux/limits.h" 1 3 4 # 39 "/usr/include/bits/local_lim.h" 2 3 4 # 161 "/usr/include/bits/posix1_lim.h" 2 3 4 # 144 "/usr/include/limits.h" 2 3 4 # 1 "/usr/include/bits/posix2_lim.h" 1 3 4 # 148 "/usr/include/limits.h" 2 3 4 # 169 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 2 3 4 # 8 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/syslimits.h" 2 3 4 # 35 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include-fixed/limits.h" 2 3 4 # 58 "/usr/include/hwloc.h" 2 3 4 # 1 "/usr/include/hwloc/rename.h" 1 3 4 # 63 "/usr/include/hwloc.h" 2 3 4 # 1 "/usr/include/hwloc/bitmap.h" 1 3 4 # 17 "/usr/include/hwloc/bitmap.h" 3 4 # 1 "/usr/include/assert.h" 1 3 4 # 64 "/usr/include/assert.h" 3 4 extern void __assert_fail (const char *__assertion, const char *__file, unsigned int __line, const char *__function) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__noreturn__)); extern void __assert_perror_fail (int __errnum, const char *__file, unsigned int __line, const char *__function) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__noreturn__)); extern void __assert (const char *__assertion, const char *__file, int __line) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__noreturn__)); # 18 "/usr/include/hwloc/bitmap.h" 2 3 4 # 51 "/usr/include/hwloc/bitmap.h" 3 4 typedef struct hwloc_bitmap_s * hwloc_bitmap_t; typedef const struct hwloc_bitmap_s * hwloc_const_bitmap_t; # 67 "/usr/include/hwloc/bitmap.h" 3 4 hwloc_bitmap_t hwloc_bitmap_alloc(void) __attribute__((__malloc__)); hwloc_bitmap_t hwloc_bitmap_alloc_full(void) __attribute__((__malloc__)); void hwloc_bitmap_free(hwloc_bitmap_t bitmap); hwloc_bitmap_t hwloc_bitmap_dup(hwloc_const_bitmap_t bitmap) __attribute__((__malloc__)); void hwloc_bitmap_copy(hwloc_bitmap_t dst, hwloc_const_bitmap_t src); # 101 "/usr/include/hwloc/bitmap.h" 3 4 int hwloc_bitmap_snprintf(char * __restrict buf, size_t buflen, hwloc_const_bitmap_t bitmap); int hwloc_bitmap_asprintf(char ** strp, 
hwloc_const_bitmap_t bitmap); int hwloc_bitmap_sscanf(hwloc_bitmap_t bitmap, const char * __restrict string); # 124 "/usr/include/hwloc/bitmap.h" 3 4 int hwloc_bitmap_list_snprintf(char * __restrict buf, size_t buflen, hwloc_const_bitmap_t bitmap); int hwloc_bitmap_list_asprintf(char ** strp, hwloc_const_bitmap_t bitmap); int hwloc_bitmap_list_sscanf(hwloc_bitmap_t bitmap, const char * __restrict string); # 146 "/usr/include/hwloc/bitmap.h" 3 4 int hwloc_bitmap_taskset_snprintf(char * __restrict buf, size_t buflen, hwloc_const_bitmap_t bitmap); int hwloc_bitmap_taskset_asprintf(char ** strp, hwloc_const_bitmap_t bitmap); int hwloc_bitmap_taskset_sscanf(hwloc_bitmap_t bitmap, const char * __restrict string); void hwloc_bitmap_zero(hwloc_bitmap_t bitmap); void hwloc_bitmap_fill(hwloc_bitmap_t bitmap); void hwloc_bitmap_only(hwloc_bitmap_t bitmap, unsigned id); void hwloc_bitmap_allbut(hwloc_bitmap_t bitmap, unsigned id); void hwloc_bitmap_from_ulong(hwloc_bitmap_t bitmap, unsigned long mask); void hwloc_bitmap_from_ith_ulong(hwloc_bitmap_t bitmap, unsigned i, unsigned long mask); void hwloc_bitmap_set(hwloc_bitmap_t bitmap, unsigned id); void hwloc_bitmap_set_range(hwloc_bitmap_t bitmap, unsigned begin, int end); void hwloc_bitmap_set_ith_ulong(hwloc_bitmap_t bitmap, unsigned i, unsigned long mask); void hwloc_bitmap_clr(hwloc_bitmap_t bitmap, unsigned id); void hwloc_bitmap_clr_range(hwloc_bitmap_t bitmap, unsigned begin, int end); void hwloc_bitmap_singlify(hwloc_bitmap_t bitmap); unsigned long hwloc_bitmap_to_ulong(hwloc_const_bitmap_t bitmap) __attribute__((__pure__)); unsigned long hwloc_bitmap_to_ith_ulong(hwloc_const_bitmap_t bitmap, unsigned i) __attribute__((__pure__)); int hwloc_bitmap_isset(hwloc_const_bitmap_t bitmap, unsigned id) __attribute__((__pure__)); int hwloc_bitmap_iszero(hwloc_const_bitmap_t bitmap) __attribute__((__pure__)); int hwloc_bitmap_isfull(hwloc_const_bitmap_t bitmap) __attribute__((__pure__)); int 
hwloc_bitmap_first(hwloc_const_bitmap_t bitmap) __attribute__((__pure__)); int hwloc_bitmap_next(hwloc_const_bitmap_t bitmap, int prev) __attribute__((__pure__)); int hwloc_bitmap_last(hwloc_const_bitmap_t bitmap) __attribute__((__pure__)); # 263 "/usr/include/hwloc/bitmap.h" 3 4 int hwloc_bitmap_weight(hwloc_const_bitmap_t bitmap) __attribute__((__pure__)); # 307 "/usr/include/hwloc/bitmap.h" 3 4 void hwloc_bitmap_or (hwloc_bitmap_t res, hwloc_const_bitmap_t bitmap1, hwloc_const_bitmap_t bitmap2); void hwloc_bitmap_and (hwloc_bitmap_t res, hwloc_const_bitmap_t bitmap1, hwloc_const_bitmap_t bitmap2); void hwloc_bitmap_andnot (hwloc_bitmap_t res, hwloc_const_bitmap_t bitmap1, hwloc_const_bitmap_t bitmap2); void hwloc_bitmap_xor (hwloc_bitmap_t res, hwloc_const_bitmap_t bitmap1, hwloc_const_bitmap_t bitmap2); void hwloc_bitmap_not (hwloc_bitmap_t res, hwloc_const_bitmap_t bitmap); int hwloc_bitmap_intersects (hwloc_const_bitmap_t bitmap1, hwloc_const_bitmap_t bitmap2) __attribute__((__pure__)); int hwloc_bitmap_isincluded (hwloc_const_bitmap_t sub_bitmap, hwloc_const_bitmap_t super_bitmap) __attribute__((__pure__)); int hwloc_bitmap_isequal (hwloc_const_bitmap_t bitmap1, hwloc_const_bitmap_t bitmap2) __attribute__((__pure__)); int hwloc_bitmap_compare_first(hwloc_const_bitmap_t bitmap1, hwloc_const_bitmap_t bitmap2) __attribute__((__pure__)); # 366 "/usr/include/hwloc/bitmap.h" 3 4 int hwloc_bitmap_compare(hwloc_const_bitmap_t bitmap1, hwloc_const_bitmap_t bitmap2) __attribute__((__pure__)); # 69 "/usr/include/hwloc.h" 2 3 4 # 87 "/usr/include/hwloc.h" 3 4 unsigned hwloc_get_api_version(void); # 125 "/usr/include/hwloc.h" 3 4 typedef hwloc_bitmap_t hwloc_cpuset_t; typedef hwloc_const_bitmap_t hwloc_const_cpuset_t; # 144 "/usr/include/hwloc.h" 3 4 typedef hwloc_bitmap_t hwloc_nodeset_t; typedef hwloc_const_bitmap_t hwloc_const_nodeset_t; # 163 "/usr/include/hwloc.h" 3 4 typedef enum { # 173 "/usr/include/hwloc.h" 3 4 HWLOC_OBJ_SYSTEM, HWLOC_OBJ_MACHINE, 
HWLOC_OBJ_NUMANODE, HWLOC_OBJ_PACKAGE, HWLOC_OBJ_CACHE, HWLOC_OBJ_CORE, HWLOC_OBJ_PU, # 207 "/usr/include/hwloc.h" 3 4 HWLOC_OBJ_GROUP, # 220 "/usr/include/hwloc.h" 3 4 HWLOC_OBJ_MISC, HWLOC_OBJ_BRIDGE, HWLOC_OBJ_PCI_DEVICE, HWLOC_OBJ_OS_DEVICE, HWLOC_OBJ_TYPE_MAX # 254 "/usr/include/hwloc.h" 3 4 } hwloc_obj_type_t; typedef enum hwloc_obj_cache_type_e { HWLOC_OBJ_CACHE_UNIFIED, HWLOC_OBJ_CACHE_DATA, HWLOC_OBJ_CACHE_INSTRUCTION } hwloc_obj_cache_type_t; typedef enum hwloc_obj_bridge_type_e { HWLOC_OBJ_BRIDGE_HOST, HWLOC_OBJ_BRIDGE_PCI } hwloc_obj_bridge_type_t; typedef enum hwloc_obj_osdev_type_e { HWLOC_OBJ_OSDEV_BLOCK, HWLOC_OBJ_OSDEV_GPU, HWLOC_OBJ_OSDEV_NETWORK, HWLOC_OBJ_OSDEV_OPENFABRICS, HWLOC_OBJ_OSDEV_DMA, HWLOC_OBJ_OSDEV_COPROC } hwloc_obj_osdev_type_t; # 307 "/usr/include/hwloc.h" 3 4 int hwloc_compare_types (hwloc_obj_type_t type1, hwloc_obj_type_t type2) __attribute__((__const__)); enum hwloc_compare_types_e { HWLOC_TYPE_UNORDERED = 0x7fffffff }; # 321 "/usr/include/hwloc.h" 3 4 union hwloc_obj_attr_u; struct hwloc_obj_memory_s { hwloc_uint64_t total_memory; hwloc_uint64_t local_memory; unsigned page_types_len; struct hwloc_obj_memory_page_type_s { hwloc_uint64_t size; hwloc_uint64_t count; } * page_types; }; struct hwloc_obj { hwloc_obj_type_t type; unsigned os_index; char *name; struct hwloc_obj_memory_s memory; union hwloc_obj_attr_u *attr; unsigned depth; unsigned logical_index; signed os_level; struct hwloc_obj *next_cousin; struct hwloc_obj *prev_cousin; struct hwloc_obj *parent; unsigned sibling_rank; struct hwloc_obj *next_sibling; struct hwloc_obj *prev_sibling; unsigned arity; struct hwloc_obj **children; struct hwloc_obj *first_child; struct hwloc_obj *last_child; void *userdata; hwloc_cpuset_t cpuset; # 411 "/usr/include/hwloc.h" 3 4 hwloc_cpuset_t complete_cpuset; # 422 "/usr/include/hwloc.h" 3 4 hwloc_cpuset_t online_cpuset; hwloc_cpuset_t allowed_cpuset; # 441 "/usr/include/hwloc.h" 3 4 hwloc_nodeset_t nodeset; # 458 
"/usr/include/hwloc.h" 3 4 hwloc_nodeset_t complete_nodeset; # 472 "/usr/include/hwloc.h" 3 4 hwloc_nodeset_t allowed_nodeset; # 485 "/usr/include/hwloc.h" 3 4 struct hwloc_distances_s **distances; unsigned distances_count; struct hwloc_obj_info_s *infos; unsigned infos_count; int symmetric_subtree; }; typedef struct hwloc_obj * hwloc_obj_t; union hwloc_obj_attr_u { struct hwloc_cache_attr_s { hwloc_uint64_t size; unsigned depth; unsigned linesize; int associativity; hwloc_obj_cache_type_t type; } cache; struct hwloc_group_attr_s { unsigned depth; } group; struct hwloc_pcidev_attr_s { unsigned short domain; unsigned char bus, dev, func; unsigned short class_id; unsigned short vendor_id, device_id, subvendor_id, subdevice_id; unsigned char revision; float linkspeed; } pcidev; struct hwloc_bridge_attr_s { union { struct hwloc_pcidev_attr_s pci; } upstream; hwloc_obj_bridge_type_t upstream_type; union { struct { unsigned short domain; unsigned char secondary_bus, subordinate_bus; } pci; } downstream; hwloc_obj_bridge_type_t downstream_type; unsigned depth; } bridge; struct hwloc_osdev_attr_s { hwloc_obj_osdev_type_t type; } osdev; }; # 560 "/usr/include/hwloc.h" 3 4 struct hwloc_distances_s { unsigned relative_depth; unsigned nbobjs; float *latency; # 580 "/usr/include/hwloc.h" 3 4 float latency_max; float latency_base; }; struct hwloc_obj_info_s { char *name; char *value; }; # 604 "/usr/include/hwloc.h" 3 4 struct hwloc_topology; typedef struct hwloc_topology * hwloc_topology_t; int hwloc_topology_init (hwloc_topology_t *topologyp); # 639 "/usr/include/hwloc.h" 3 4 int hwloc_topology_load(hwloc_topology_t topology); void hwloc_topology_destroy (hwloc_topology_t topology); # 657 "/usr/include/hwloc.h" 3 4 int hwloc_topology_dup(hwloc_topology_t *newtopology, hwloc_topology_t oldtopology); # 670 "/usr/include/hwloc.h" 3 4 void hwloc_topology_check(hwloc_topology_t topology); # 708 "/usr/include/hwloc.h" 3 4 int hwloc_topology_ignore_type(hwloc_topology_t topology, 
hwloc_obj_type_t type); # 718 "/usr/include/hwloc.h" 3 4 int hwloc_topology_ignore_type_keep_structure(hwloc_topology_t topology, hwloc_obj_type_t type); int hwloc_topology_ignore_all_keep_structure(hwloc_topology_t topology); enum hwloc_topology_flags_e { # 745 "/usr/include/hwloc.h" 3 4 HWLOC_TOPOLOGY_FLAG_WHOLE_SYSTEM = (1UL<<0), # 765 "/usr/include/hwloc.h" 3 4 HWLOC_TOPOLOGY_FLAG_IS_THISSYSTEM = (1UL<<1), # 778 "/usr/include/hwloc.h" 3 4 HWLOC_TOPOLOGY_FLAG_IO_DEVICES = (1UL<<2), # 787 "/usr/include/hwloc.h" 3 4 HWLOC_TOPOLOGY_FLAG_IO_BRIDGES = (1UL<<3), # 797 "/usr/include/hwloc.h" 3 4 HWLOC_TOPOLOGY_FLAG_WHOLE_IO = (1UL<<4), HWLOC_TOPOLOGY_FLAG_ICACHES = (1UL<<5) }; # 817 "/usr/include/hwloc.h" 3 4 int hwloc_topology_set_flags (hwloc_topology_t topology, unsigned long flags); unsigned long hwloc_topology_get_flags (hwloc_topology_t topology); # 841 "/usr/include/hwloc.h" 3 4 int hwloc_topology_set_pid(hwloc_topology_t __restrict topology, pid_t pid); # 869 "/usr/include/hwloc.h" 3 4 int hwloc_topology_set_fsroot(hwloc_topology_t __restrict topology, const char * __restrict fsroot_path); # 898 "/usr/include/hwloc.h" 3 4 int hwloc_topology_set_synthetic(hwloc_topology_t __restrict topology, const char * __restrict description); # 926 "/usr/include/hwloc.h" 3 4 int hwloc_topology_set_xml(hwloc_topology_t __restrict topology, const char * __restrict xmlpath); # 954 "/usr/include/hwloc.h" 3 4 int hwloc_topology_set_xmlbuffer(hwloc_topology_t __restrict topology, const char * __restrict buffer, int size); # 975 "/usr/include/hwloc.h" 3 4 int hwloc_topology_set_custom(hwloc_topology_t topology); # 995 "/usr/include/hwloc.h" 3 4 int hwloc_topology_set_distance_matrix(hwloc_topology_t __restrict topology, hwloc_obj_type_t type, unsigned nbobjs, unsigned *os_index, float *distances); # 1006 "/usr/include/hwloc.h" 3 4 int hwloc_topology_is_thissystem(hwloc_topology_t __restrict topology) __attribute__((__pure__)); struct hwloc_topology_discovery_support { unsigned char 
pu; }; struct hwloc_topology_cpubind_support { unsigned char set_thisproc_cpubind; unsigned char get_thisproc_cpubind; unsigned char set_proc_cpubind; unsigned char get_proc_cpubind; unsigned char set_thisthread_cpubind; unsigned char get_thisthread_cpubind; unsigned char set_thread_cpubind; unsigned char get_thread_cpubind; unsigned char get_thisproc_last_cpu_location; unsigned char get_proc_last_cpu_location; unsigned char get_thisthread_last_cpu_location; }; struct hwloc_topology_membind_support { unsigned char set_thisproc_membind; unsigned char get_thisproc_membind; unsigned char set_proc_membind; unsigned char get_proc_membind; unsigned char set_thisthread_membind; unsigned char get_thisthread_membind; unsigned char set_area_membind; unsigned char get_area_membind; unsigned char alloc_membind; unsigned char firsttouch_membind; unsigned char bind_membind; unsigned char interleave_membind; unsigned char replicate_membind; unsigned char nexttouch_membind; unsigned char migrate_membind; unsigned char get_area_memlocation; }; struct hwloc_topology_support { struct hwloc_topology_discovery_support *discovery; struct hwloc_topology_cpubind_support *cpubind; struct hwloc_topology_membind_support *membind; }; # 1105 "/usr/include/hwloc.h" 3 4 const struct hwloc_topology_support *hwloc_topology_get_support(hwloc_topology_t __restrict topology); # 1117 "/usr/include/hwloc.h" 3 4 void hwloc_topology_set_userdata(hwloc_topology_t topology, const void *userdata); void * hwloc_topology_get_userdata(hwloc_topology_t topology); # 1143 "/usr/include/hwloc.h" 3 4 unsigned hwloc_topology_get_depth(hwloc_topology_t __restrict topology) __attribute__((__pure__)); # 1167 "/usr/include/hwloc.h" 3 4 int hwloc_get_type_depth (hwloc_topology_t topology, hwloc_obj_type_t type); enum hwloc_get_type_depth_e { HWLOC_TYPE_DEPTH_UNKNOWN = -1, HWLOC_TYPE_DEPTH_MULTIPLE = -2, HWLOC_TYPE_DEPTH_BRIDGE = -3, HWLOC_TYPE_DEPTH_PCI_DEVICE = -4, HWLOC_TYPE_DEPTH_OS_DEVICE = -5 }; # 1186 
"/usr/include/hwloc.h" 3 4 static __inline__ int hwloc_get_type_or_below_depth (hwloc_topology_t topology, hwloc_obj_type_t type) __attribute__((__pure__)); # 1198 "/usr/include/hwloc.h" 3 4 static __inline__ int hwloc_get_type_or_above_depth (hwloc_topology_t topology, hwloc_obj_type_t type) __attribute__((__pure__)); hwloc_obj_type_t hwloc_get_depth_type (hwloc_topology_t topology, unsigned depth) __attribute__((__pure__)); unsigned hwloc_get_nbobjs_by_depth (hwloc_topology_t topology, unsigned depth) __attribute__((__pure__)); static __inline__ int hwloc_get_nbobjs_by_type (hwloc_topology_t topology, hwloc_obj_type_t type) __attribute__((__pure__)); static __inline__ hwloc_obj_t hwloc_get_root_obj (hwloc_topology_t topology) __attribute__((__pure__)); hwloc_obj_t hwloc_get_obj_by_depth (hwloc_topology_t topology, unsigned depth, unsigned idx) __attribute__((__pure__)); static __inline__ hwloc_obj_t hwloc_get_obj_by_type (hwloc_topology_t topology, hwloc_obj_type_t type, unsigned idx) __attribute__((__pure__)); static __inline__ hwloc_obj_t hwloc_get_next_obj_by_depth (hwloc_topology_t topology, unsigned depth, hwloc_obj_t prev); static __inline__ hwloc_obj_t hwloc_get_next_obj_by_type (hwloc_topology_t topology, hwloc_obj_type_t type, hwloc_obj_t prev); # 1273 "/usr/include/hwloc.h" 3 4 const char * hwloc_obj_type_string (hwloc_obj_type_t type) __attribute__((__const__)); # 1288 "/usr/include/hwloc.h" 3 4 int hwloc_obj_type_snprintf(char * __restrict string, size_t size, hwloc_obj_t obj, int verbose); # 1302 "/usr/include/hwloc.h" 3 4 int hwloc_obj_attr_snprintf(char * __restrict string, size_t size, hwloc_obj_t obj, const char * __restrict separator, int verbose); # 1312 "/usr/include/hwloc.h" 3 4 int hwloc_obj_cpuset_snprintf(char * __restrict str, size_t size, size_t nobj, const hwloc_obj_t * __restrict objs); # 1343 "/usr/include/hwloc.h" 3 4 int hwloc_obj_type_sscanf(const char *string, hwloc_obj_type_t *typep, int *depthattrp, void *typeattrp, size_t 
typeattrsize); # 1363 "/usr/include/hwloc.h" 3 4 static __inline__ const char * hwloc_obj_get_info_by_name(hwloc_obj_t obj, const char *name) __attribute__((__pure__)); # 1380 "/usr/include/hwloc.h" 3 4 void hwloc_obj_add_info(hwloc_obj_t obj, const char *name, const char *value); # 1450 "/usr/include/hwloc.h" 3 4 typedef enum { HWLOC_CPUBIND_PROCESS = (1<<0), HWLOC_CPUBIND_THREAD = (1<<1), # 1482 "/usr/include/hwloc.h" 3 4 HWLOC_CPUBIND_STRICT = (1<<2), # 1499 "/usr/include/hwloc.h" 3 4 HWLOC_CPUBIND_NOMEMBIND = (1<<3) } hwloc_cpubind_flags_t; int hwloc_set_cpubind(hwloc_topology_t topology, hwloc_const_cpuset_t set, int flags); int hwloc_get_cpubind(hwloc_topology_t topology, hwloc_cpuset_t set, int flags); # 1527 "/usr/include/hwloc.h" 3 4 int hwloc_set_proc_cpubind(hwloc_topology_t topology, pid_t pid, hwloc_const_cpuset_t set, int flags); # 1540 "/usr/include/hwloc.h" 3 4 int hwloc_get_proc_cpubind(hwloc_topology_t topology, pid_t pid, hwloc_cpuset_t set, int flags); # 1550 "/usr/include/hwloc.h" 3 4 int hwloc_set_thread_cpubind(hwloc_topology_t topology, pthread_t thread, hwloc_const_cpuset_t set, int flags); # 1561 "/usr/include/hwloc.h" 3 4 int hwloc_get_thread_cpubind(hwloc_topology_t topology, pthread_t thread, hwloc_cpuset_t set, int flags); # 1577 "/usr/include/hwloc.h" 3 4 int hwloc_get_last_cpu_location(hwloc_topology_t topology, hwloc_cpuset_t set, int flags); # 1595 "/usr/include/hwloc.h" 3 4 int hwloc_get_proc_last_cpu_location(hwloc_topology_t topology, pid_t pid, hwloc_cpuset_t set, int flags); # 1678 "/usr/include/hwloc.h" 3 4 typedef enum { # 1688 "/usr/include/hwloc.h" 3 4 HWLOC_MEMBIND_DEFAULT = 0, # 1698 "/usr/include/hwloc.h" 3 4 HWLOC_MEMBIND_FIRSTTOUCH = 1, HWLOC_MEMBIND_BIND = 2, # 1711 "/usr/include/hwloc.h" 3 4 HWLOC_MEMBIND_INTERLEAVE = 3, # 1724 "/usr/include/hwloc.h" 3 4 HWLOC_MEMBIND_REPLICATE = 4, HWLOC_MEMBIND_NEXTTOUCH = 5, HWLOC_MEMBIND_MIXED = -1 } hwloc_membind_policy_t; # 1753 "/usr/include/hwloc.h" 3 4 typedef enum { 
HWLOC_MEMBIND_PROCESS = (1<<0), HWLOC_MEMBIND_THREAD = (1<<1), HWLOC_MEMBIND_STRICT = (1<<2), HWLOC_MEMBIND_MIGRATE = (1<<3), # 1790 "/usr/include/hwloc.h" 3 4 HWLOC_MEMBIND_NOCPUBIND = (1<<4), # 1801 "/usr/include/hwloc.h" 3 4 HWLOC_MEMBIND_BYNODESET = (1<<5) } hwloc_membind_flags_t; # 1816 "/usr/include/hwloc.h" 3 4 int hwloc_set_membind_nodeset(hwloc_topology_t topology, hwloc_const_nodeset_t nodeset, hwloc_membind_policy_t policy, int flags); # 1833 "/usr/include/hwloc.h" 3 4 int hwloc_set_membind(hwloc_topology_t topology, hwloc_const_bitmap_t set, hwloc_membind_policy_t policy, int flags); # 1876 "/usr/include/hwloc.h" 3 4 int hwloc_get_membind_nodeset(hwloc_topology_t topology, hwloc_nodeset_t nodeset, hwloc_membind_policy_t * policy, int flags); # 1922 "/usr/include/hwloc.h" 3 4 int hwloc_get_membind(hwloc_topology_t topology, hwloc_bitmap_t set, hwloc_membind_policy_t * policy, int flags); # 1933 "/usr/include/hwloc.h" 3 4 int hwloc_set_proc_membind_nodeset(hwloc_topology_t topology, pid_t pid, hwloc_const_nodeset_t nodeset, hwloc_membind_policy_t policy, int flags); # 1947 "/usr/include/hwloc.h" 3 4 int hwloc_set_proc_membind(hwloc_topology_t topology, pid_t pid, hwloc_const_bitmap_t set, hwloc_membind_policy_t policy, int flags); # 1985 "/usr/include/hwloc.h" 3 4 int hwloc_get_proc_membind_nodeset(hwloc_topology_t topology, pid_t pid, hwloc_nodeset_t nodeset, hwloc_membind_policy_t * policy, int flags); # 2026 "/usr/include/hwloc.h" 3 4 int hwloc_get_proc_membind(hwloc_topology_t topology, pid_t pid, hwloc_bitmap_t set, hwloc_membind_policy_t * policy, int flags); # 2035 "/usr/include/hwloc.h" 3 4 int hwloc_set_area_membind_nodeset(hwloc_topology_t topology, const void *addr, size_t len, hwloc_const_nodeset_t nodeset, hwloc_membind_policy_t policy, int flags); # 2047 "/usr/include/hwloc.h" 3 4 int hwloc_set_area_membind(hwloc_topology_t topology, const void *addr, size_t len, hwloc_const_bitmap_t set, hwloc_membind_policy_t policy, int flags); # 2073 
"/usr/include/hwloc.h" 3 4 int hwloc_get_area_membind_nodeset(hwloc_topology_t topology, const void *addr, size_t len, hwloc_nodeset_t nodeset, hwloc_membind_policy_t * policy, int flags); # 2102 "/usr/include/hwloc.h" 3 4 int hwloc_get_area_membind(hwloc_topology_t topology, const void *addr, size_t len, hwloc_bitmap_t set, hwloc_membind_policy_t * policy, int flags); # 2125 "/usr/include/hwloc.h" 3 4 int hwloc_get_area_memlocation(hwloc_topology_t topology, const void *addr, size_t len, hwloc_bitmap_t set, int flags); # 2134 "/usr/include/hwloc.h" 3 4 void *hwloc_alloc(hwloc_topology_t topology, size_t len); # 2147 "/usr/include/hwloc.h" 3 4 void *hwloc_alloc_membind_nodeset(hwloc_topology_t topology, size_t len, hwloc_const_nodeset_t nodeset, hwloc_membind_policy_t policy, int flags) __attribute__((__malloc__)); # 2163 "/usr/include/hwloc.h" 3 4 void *hwloc_alloc_membind(hwloc_topology_t topology, size_t len, hwloc_const_bitmap_t set, hwloc_membind_policy_t policy, int flags) __attribute__((__malloc__)); static __inline__ void * hwloc_alloc_membind_policy_nodeset(hwloc_topology_t topology, size_t len, hwloc_const_nodeset_t nodeset, hwloc_membind_policy_t policy, int flags) __attribute__((__malloc__)); # 2183 "/usr/include/hwloc.h" 3 4 static __inline__ void * hwloc_alloc_membind_policy(hwloc_topology_t topology, size_t len, hwloc_const_bitmap_t set, hwloc_membind_policy_t policy, int flags) __attribute__((__malloc__)); int hwloc_free(hwloc_topology_t topology, void *addr, size_t len); # 2213 "/usr/include/hwloc.h" 3 4 hwloc_obj_t hwloc_topology_insert_misc_object_by_cpuset(hwloc_topology_t topology, hwloc_const_cpuset_t cpuset, const char *name); # 2230 "/usr/include/hwloc.h" 3 4 hwloc_obj_t hwloc_topology_insert_misc_object_by_parent(hwloc_topology_t topology, hwloc_obj_t parent, const char *name); enum hwloc_restrict_flags_e { HWLOC_RESTRICT_FLAG_ADAPT_DISTANCES = (1<<0), HWLOC_RESTRICT_FLAG_ADAPT_MISC = (1<<1), HWLOC_RESTRICT_FLAG_ADAPT_IO = (1<<2) }; # 2274 
"/usr/include/hwloc.h" 3 4 int hwloc_topology_restrict(hwloc_topology_t __restrict topology, hwloc_const_cpuset_t cpuset, unsigned long flags); # 2309 "/usr/include/hwloc.h" 3 4 int hwloc_custom_insert_topology(hwloc_topology_t newtopology, hwloc_obj_t newparent, hwloc_topology_t oldtopology, hwloc_obj_t oldroot); # 2332 "/usr/include/hwloc.h" 3 4 hwloc_obj_t hwloc_custom_insert_group_object_by_parent(hwloc_topology_t topology, hwloc_obj_t parent, int groupdepth); # 2359 "/usr/include/hwloc.h" 3 4 int hwloc_topology_export_xml(hwloc_topology_t topology, const char *xmlpath); # 2379 "/usr/include/hwloc.h" 3 4 int hwloc_topology_export_xmlbuffer(hwloc_topology_t topology, char **xmlbuffer, int *buflen); void hwloc_free_xmlbuffer(hwloc_topology_t topology, char *xmlbuffer); # 2402 "/usr/include/hwloc.h" 3 4 void hwloc_topology_set_userdata_export_callback(hwloc_topology_t topology, void (*export_cb)(void *reserved, hwloc_topology_t topology, hwloc_obj_t obj)); # 2430 "/usr/include/hwloc.h" 3 4 int hwloc_export_obj_userdata(void *reserved, hwloc_topology_t topology, hwloc_obj_t obj, const char *name, const void *buffer, size_t length); # 2445 "/usr/include/hwloc.h" 3 4 int hwloc_export_obj_userdata_base64(void *reserved, hwloc_topology_t topology, hwloc_obj_t obj, const char *name, const void *buffer, size_t length); # 2469 "/usr/include/hwloc.h" 3 4 void hwloc_topology_set_userdata_import_callback(hwloc_topology_t topology, void (*import_cb)(hwloc_topology_t topology, hwloc_obj_t obj, const char *name, const void *buffer, size_t length)); # 2483 "/usr/include/hwloc.h" 3 4 enum hwloc_topology_export_synthetic_flags_e { HWLOC_TOPOLOGY_EXPORT_SYNTHETIC_FLAG_NO_EXTENDED_TYPES = (1UL<<0), HWLOC_TOPOLOGY_EXPORT_SYNTHETIC_FLAG_NO_ATTRS = (1UL<<1) }; # 2518 "/usr/include/hwloc.h" 3 4 int hwloc_topology_export_synthetic(hwloc_topology_t topology, char *buffer, size_t buflen, unsigned long flags); # 2530 "/usr/include/hwloc.h" 3 4 # 1 "/usr/include/hwloc/helper.h" 1 3 4 # 20 
"/usr/include/hwloc/helper.h" 3 4 # 1 "/usr/include/stdlib.h" 1 3 4 # 32 "/usr/include/stdlib.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 328 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef int wchar_t; # 33 "/usr/include/stdlib.h" 2 3 4 # 1 "/usr/include/bits/waitflags.h" 1 3 4 # 42 "/usr/include/stdlib.h" 2 3 4 # 1 "/usr/include/bits/waitstatus.h" 1 3 4 # 43 "/usr/include/stdlib.h" 2 3 4 # 56 "/usr/include/stdlib.h" 3 4 typedef struct { int quot; int rem; } div_t; typedef struct { long int quot; long int rem; } ldiv_t; __extension__ typedef struct { long long int quot; long long int rem; } lldiv_t; # 100 "/usr/include/stdlib.h" 3 4 extern size_t __ctype_get_mb_cur_max (void) __attribute__ ((__nothrow__ , __leaf__)) ; extern double atof (const char *__nptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; extern int atoi (const char *__nptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; extern long int atol (const char *__nptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; __extension__ extern long long int atoll (const char *__nptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; extern double strtod (const char *__restrict __nptr, char **__restrict __endptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern float strtof (const char *__restrict __nptr, char **__restrict __endptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern long double strtold (const char *__restrict __nptr, char **__restrict __endptr) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern long int strtol (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) 
__attribute__ ((__nonnull__ (1))); extern unsigned long int strtoul (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); __extension__ extern long long int strtoq (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); __extension__ extern unsigned long long int strtouq (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); __extension__ extern long long int strtoll (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); __extension__ extern unsigned long long int strtoull (const char *__restrict __nptr, char **__restrict __endptr, int __base) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 266 "/usr/include/stdlib.h" 3 4 extern char *l64a (long int __n) __attribute__ ((__nothrow__ , __leaf__)) ; extern long int a64l (const char *__s) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__pure__)) __attribute__ ((__nonnull__ (1))) ; # 282 "/usr/include/stdlib.h" 3 4 extern long int random (void) __attribute__ ((__nothrow__ , __leaf__)); extern void srandom (unsigned int __seed) __attribute__ ((__nothrow__ , __leaf__)); extern char *initstate (unsigned int __seed, char *__statebuf, size_t __statelen) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern char *setstate (char *__statebuf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); struct random_data { int32_t *fptr; int32_t *rptr; int32_t *state; int rand_type; int rand_deg; int rand_sep; int32_t *end_ptr; }; extern int random_r (struct random_data *__restrict __buf, int32_t *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ 
((__nonnull__ (1, 2))); extern int srandom_r (unsigned int __seed, struct random_data *__buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern int initstate_r (unsigned int __seed, char *__restrict __statebuf, size_t __statelen, struct random_data *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2, 4))); extern int setstate_r (char *__restrict __statebuf, struct random_data *__restrict __buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int rand (void) __attribute__ ((__nothrow__ , __leaf__)); extern void srand (unsigned int __seed) __attribute__ ((__nothrow__ , __leaf__)); extern int rand_r (unsigned int *__seed) __attribute__ ((__nothrow__ , __leaf__)); extern double drand48 (void) __attribute__ ((__nothrow__ , __leaf__)); extern double erand48 (unsigned short int __xsubi[3]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern long int lrand48 (void) __attribute__ ((__nothrow__ , __leaf__)); extern long int nrand48 (unsigned short int __xsubi[3]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern long int mrand48 (void) __attribute__ ((__nothrow__ , __leaf__)); extern long int jrand48 (unsigned short int __xsubi[3]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern void srand48 (long int __seedval) __attribute__ ((__nothrow__ , __leaf__)); extern unsigned short int *seed48 (unsigned short int __seed16v[3]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern void lcong48 (unsigned short int __param[7]) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); struct drand48_data { unsigned short int __x[3]; unsigned short int __old_x[3]; unsigned short int __c; unsigned short int __init; __extension__ unsigned long long int __a; }; extern int drand48_r (struct drand48_data *__restrict __buffer, double 
*__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int erand48_r (unsigned short int __xsubi[3], struct drand48_data *__restrict __buffer, double *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int lrand48_r (struct drand48_data *__restrict __buffer, long int *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int nrand48_r (unsigned short int __xsubi[3], struct drand48_data *__restrict __buffer, long int *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int mrand48_r (struct drand48_data *__restrict __buffer, long int *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int jrand48_r (unsigned short int __xsubi[3], struct drand48_data *__restrict __buffer, long int *__restrict __result) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int srand48_r (long int __seedval, struct drand48_data *__buffer) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); extern int seed48_r (unsigned short int __seed16v[3], struct drand48_data *__buffer) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern int lcong48_r (unsigned short int __param[7], struct drand48_data *__buffer) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2))); extern void *malloc (size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) ; extern void *calloc (size_t __nmemb, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) ; extern void *realloc (void *__ptr, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__warn_unused_result__)); extern void free (void *__ptr) __attribute__ ((__nothrow__ , __leaf__)); extern void cfree (void 
*__ptr) __attribute__ ((__nothrow__ , __leaf__)); # 1 "/usr/include/alloca.h" 1 3 4 # 24 "/usr/include/alloca.h" 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 25 "/usr/include/alloca.h" 2 3 4 extern void *alloca (size_t __size) __attribute__ ((__nothrow__ , __leaf__)); # 454 "/usr/include/stdlib.h" 2 3 4 extern void *valloc (size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) ; extern int posix_memalign (void **__memptr, size_t __alignment, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; extern void *aligned_alloc (size_t __alignment, size_t __size) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__malloc__)) __attribute__ ((__alloc_size__ (2))) ; extern void abort (void) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__noreturn__)); extern int atexit (void (*__func) (void)) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int at_quick_exit (void (*__func) (void)) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int on_exit (void (*__func) (int __status, void *__arg), void *__arg) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern void exit (int __status) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__noreturn__)); extern void quick_exit (int __status) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__noreturn__)); extern void _Exit (int __status) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__noreturn__)); extern char *getenv (const char *__name) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; # 539 "/usr/include/stdlib.h" 3 4 extern int putenv (char *__string) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int setenv (const char *__name, const char *__value, int __replace) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (2))); 
extern int unsetenv (const char *__name) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); extern int clearenv (void) __attribute__ ((__nothrow__ , __leaf__)); # 567 "/usr/include/stdlib.h" 3 4 extern char *mktemp (char *__template) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 580 "/usr/include/stdlib.h" 3 4 extern int mkstemp (char *__template) __attribute__ ((__nonnull__ (1))) ; # 602 "/usr/include/stdlib.h" 3 4 extern int mkstemps (char *__template, int __suffixlen) __attribute__ ((__nonnull__ (1))) ; # 623 "/usr/include/stdlib.h" 3 4 extern char *mkdtemp (char *__template) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; # 672 "/usr/include/stdlib.h" 3 4 extern int system (const char *__command) ; # 694 "/usr/include/stdlib.h" 3 4 extern char *realpath (const char *__restrict __name, char *__restrict __resolved) __attribute__ ((__nothrow__ , __leaf__)) ; typedef int (*__compar_fn_t) (const void *, const void *); # 712 "/usr/include/stdlib.h" 3 4 extern void *bsearch (const void *__key, const void *__base, size_t __nmemb, size_t __size, __compar_fn_t __compar) __attribute__ ((__nonnull__ (1, 2, 5))) ; extern void qsort (void *__base, size_t __nmemb, size_t __size, __compar_fn_t __compar) __attribute__ ((__nonnull__ (1, 4))); # 735 "/usr/include/stdlib.h" 3 4 extern int abs (int __x) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; extern long int labs (long int __x) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; __extension__ extern long long int llabs (long long int __x) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; extern div_t div (int __numer, int __denom) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; extern ldiv_t ldiv (long int __numer, long int __denom) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; __extension__ extern lldiv_t lldiv (long long int 
__numer, long long int __denom) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)) ; # 772 "/usr/include/stdlib.h" 3 4 extern char *ecvt (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4))) ; extern char *fcvt (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4))) ; extern char *gcvt (double __value, int __ndigit, char *__buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3))) ; extern char *qecvt (long double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4))) ; extern char *qfcvt (long double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4))) ; extern char *qgcvt (long double __value, int __ndigit, char *__buf) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3))) ; extern int ecvt_r (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4, 5))); extern int fcvt_r (double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4, 5))); extern int qecvt_r (long double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (3, 4, 5))); extern int qfcvt_r (long double __value, int __ndigit, int *__restrict __decpt, int *__restrict __sign, char *__restrict __buf, size_t __len) __attribute__ ((__nothrow__ , __leaf__)) 
__attribute__ ((__nonnull__ (3, 4, 5))); extern int mblen (const char *__s, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); extern int mbtowc (wchar_t *__restrict __pwc, const char *__restrict __s, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); extern int wctomb (char *__s, wchar_t __wchar) __attribute__ ((__nothrow__ , __leaf__)); extern size_t mbstowcs (wchar_t *__restrict __pwcs, const char *__restrict __s, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); extern size_t wcstombs (char *__restrict __s, const wchar_t *__restrict __pwcs, size_t __n) __attribute__ ((__nothrow__ , __leaf__)); extern int rpmatch (const char *__response) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))) ; # 859 "/usr/include/stdlib.h" 3 4 extern int getsubopt (char **__restrict __optionp, char *const *__restrict __tokens, char **__restrict __valuep) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1, 2, 3))) ; # 911 "/usr/include/stdlib.h" 3 4 extern int getloadavg (double __loadavg[], int __nelem) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__nonnull__ (1))); # 921 "/usr/include/stdlib.h" 3 4 # 1 "/usr/include/bits/stdlib-float.h" 1 3 4 # 922 "/usr/include/stdlib.h" 2 3 4 # 934 "/usr/include/stdlib.h" 3 4 # 21 "/usr/include/hwloc/helper.h" 2 3 4 # 1 "/usr/include/errno.h" 1 3 4 # 31 "/usr/include/errno.h" 3 4 # 1 "/usr/include/bits/errno.h" 1 3 4 # 24 "/usr/include/bits/errno.h" 3 4 # 1 "/usr/include/linux/errno.h" 1 3 4 # 1 "/usr/include/asm/errno.h" 1 3 4 # 1 "/usr/include/asm-generic/errno.h" 1 3 4 # 1 "/usr/include/asm-generic/errno-base.h" 1 3 4 # 5 "/usr/include/asm-generic/errno.h" 2 3 4 # 1 "/usr/include/asm/errno.h" 2 3 4 # 1 "/usr/include/linux/errno.h" 2 3 4 # 25 "/usr/include/bits/errno.h" 2 3 4 # 50 "/usr/include/bits/errno.h" 3 4 extern int *__errno_location (void) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 36 "/usr/include/errno.h" 2 3 4 # 58 "/usr/include/errno.h" 3 
4 # 22 "/usr/include/hwloc/helper.h" 2 3 4 # 44 "/usr/include/hwloc/helper.h" 3 4 static __inline__ hwloc_obj_t hwloc_get_first_largest_obj_inside_cpuset(hwloc_topology_t topology, hwloc_const_cpuset_t set) { hwloc_obj_t obj = hwloc_get_root_obj(topology); if (!obj->cpuset || !hwloc_bitmap_intersects(obj->cpuset, set)) return ((void *)0); while (!hwloc_bitmap_isincluded(obj->cpuset, set)) { hwloc_obj_t child = obj->first_child; while (child) { if (child->cpuset && hwloc_bitmap_intersects(child->cpuset, set)) break; child = child->next_sibling; } if (!child) return obj; obj = child; } return obj; } # 75 "/usr/include/hwloc/helper.h" 3 4 int hwloc_get_largest_objs_inside_cpuset (hwloc_topology_t topology, hwloc_const_cpuset_t set, hwloc_obj_t * __restrict objs, int max); # 90 "/usr/include/hwloc/helper.h" 3 4 static __inline__ hwloc_obj_t hwloc_get_next_obj_inside_cpuset_by_depth (hwloc_topology_t topology, hwloc_const_cpuset_t set, unsigned depth, hwloc_obj_t prev) { hwloc_obj_t next = hwloc_get_next_obj_by_depth(topology, depth, prev); if (!next || !next->cpuset) return ((void *)0); while (next && (hwloc_bitmap_iszero(next->cpuset) || !hwloc_bitmap_isincluded(next->cpuset, set))) next = next->next_cousin; return next; } # 114 "/usr/include/hwloc/helper.h" 3 4 static __inline__ hwloc_obj_t hwloc_get_next_obj_inside_cpuset_by_type (hwloc_topology_t topology, hwloc_const_cpuset_t set, hwloc_obj_type_t type, hwloc_obj_t prev) { int depth = hwloc_get_type_depth(topology, type); if (depth == HWLOC_TYPE_DEPTH_UNKNOWN || depth == HWLOC_TYPE_DEPTH_MULTIPLE) return ((void *)0); return hwloc_get_next_obj_inside_cpuset_by_depth(topology, set, depth, prev); } # 132 "/usr/include/hwloc/helper.h" 3 4 static __inline__ hwloc_obj_t hwloc_get_obj_inside_cpuset_by_depth (hwloc_topology_t topology, hwloc_const_cpuset_t set, unsigned depth, unsigned idx) __attribute__((__pure__)); static __inline__ hwloc_obj_t hwloc_get_obj_inside_cpuset_by_depth (hwloc_topology_t topology, 
hwloc_const_cpuset_t set, unsigned depth, unsigned idx) { hwloc_obj_t obj = hwloc_get_obj_by_depth (topology, depth, 0); unsigned count = 0; if (!obj || !obj->cpuset) return ((void *)0); while (obj) { if (!hwloc_bitmap_iszero(obj->cpuset) && hwloc_bitmap_isincluded(obj->cpuset, set)) { if (count == idx) return obj; count++; } obj = obj->next_cousin; } return ((void *)0); } # 166 "/usr/include/hwloc/helper.h" 3 4 static __inline__ hwloc_obj_t hwloc_get_obj_inside_cpuset_by_type (hwloc_topology_t topology, hwloc_const_cpuset_t set, hwloc_obj_type_t type, unsigned idx) __attribute__((__pure__)); static __inline__ hwloc_obj_t hwloc_get_obj_inside_cpuset_by_type (hwloc_topology_t topology, hwloc_const_cpuset_t set, hwloc_obj_type_t type, unsigned idx) { int depth = hwloc_get_type_depth(topology, type); if (depth == HWLOC_TYPE_DEPTH_UNKNOWN || depth == HWLOC_TYPE_DEPTH_MULTIPLE) return ((void *)0); return hwloc_get_obj_inside_cpuset_by_depth(topology, set, depth, idx); } # 187 "/usr/include/hwloc/helper.h" 3 4 static __inline__ unsigned hwloc_get_nbobjs_inside_cpuset_by_depth (hwloc_topology_t topology, hwloc_const_cpuset_t set, unsigned depth) __attribute__((__pure__)); static __inline__ unsigned hwloc_get_nbobjs_inside_cpuset_by_depth (hwloc_topology_t topology, hwloc_const_cpuset_t set, unsigned depth) { hwloc_obj_t obj = hwloc_get_obj_by_depth (topology, depth, 0); unsigned count = 0; if (!obj || !obj->cpuset) return 0; while (obj) { if (!hwloc_bitmap_iszero(obj->cpuset) && hwloc_bitmap_isincluded(obj->cpuset, set)) count++; obj = obj->next_cousin; } return count; } # 218 "/usr/include/hwloc/helper.h" 3 4 static __inline__ int hwloc_get_nbobjs_inside_cpuset_by_type (hwloc_topology_t topology, hwloc_const_cpuset_t set, hwloc_obj_type_t type) __attribute__((__pure__)); static __inline__ int hwloc_get_nbobjs_inside_cpuset_by_type (hwloc_topology_t topology, hwloc_const_cpuset_t set, hwloc_obj_type_t type) { int depth = hwloc_get_type_depth(topology, type); if (depth == 
HWLOC_TYPE_DEPTH_UNKNOWN) return 0; if (depth == HWLOC_TYPE_DEPTH_MULTIPLE) return -1; return hwloc_get_nbobjs_inside_cpuset_by_depth(topology, set, depth); } # 244 "/usr/include/hwloc/helper.h" 3 4 static __inline__ int hwloc_get_obj_index_inside_cpuset (hwloc_topology_t topology __attribute__((__unused__)), hwloc_const_cpuset_t set, hwloc_obj_t obj) __attribute__((__pure__)); static __inline__ int hwloc_get_obj_index_inside_cpuset (hwloc_topology_t topology __attribute__((__unused__)), hwloc_const_cpuset_t set, hwloc_obj_t obj) { int idx = 0; if (!hwloc_bitmap_isincluded(obj->cpuset, set)) return -1; while ((obj = obj->prev_cousin) != ((void *)0)) if (!hwloc_bitmap_iszero(obj->cpuset) && hwloc_bitmap_isincluded(obj->cpuset, set)) idx++; return idx; } # 275 "/usr/include/hwloc/helper.h" 3 4 static __inline__ hwloc_obj_t hwloc_get_child_covering_cpuset (hwloc_topology_t topology __attribute__((__unused__)), hwloc_const_cpuset_t set, hwloc_obj_t parent) __attribute__((__pure__)); static __inline__ hwloc_obj_t hwloc_get_child_covering_cpuset (hwloc_topology_t topology __attribute__((__unused__)), hwloc_const_cpuset_t set, hwloc_obj_t parent) { hwloc_obj_t child; if (!parent->cpuset || hwloc_bitmap_iszero(set)) return ((void *)0); child = parent->first_child; while (child) { if (child->cpuset && hwloc_bitmap_isincluded(set, child->cpuset)) return child; child = child->next_sibling; } return ((void *)0); } # 301 "/usr/include/hwloc/helper.h" 3 4 static __inline__ hwloc_obj_t hwloc_get_obj_covering_cpuset (hwloc_topology_t topology, hwloc_const_cpuset_t set) __attribute__((__pure__)); static __inline__ hwloc_obj_t hwloc_get_obj_covering_cpuset (hwloc_topology_t topology, hwloc_const_cpuset_t set) { struct hwloc_obj *current = hwloc_get_root_obj(topology); if (hwloc_bitmap_iszero(set) || !current->cpuset || !hwloc_bitmap_isincluded(set, current->cpuset)) return ((void *)0); while (1) { hwloc_obj_t child = hwloc_get_child_covering_cpuset(topology, set, current); if 
(!child) return current; current = child; } } # 327 "/usr/include/hwloc/helper.h" 3 4 static __inline__ hwloc_obj_t hwloc_get_next_obj_covering_cpuset_by_depth(hwloc_topology_t topology, hwloc_const_cpuset_t set, unsigned depth, hwloc_obj_t prev) { hwloc_obj_t next = hwloc_get_next_obj_by_depth(topology, depth, prev); if (!next || !next->cpuset) return ((void *)0); while (next && !hwloc_bitmap_intersects(set, next->cpuset)) next = next->next_cousin; return next; } # 354 "/usr/include/hwloc/helper.h" 3 4 static __inline__ hwloc_obj_t hwloc_get_next_obj_covering_cpuset_by_type(hwloc_topology_t topology, hwloc_const_cpuset_t set, hwloc_obj_type_t type, hwloc_obj_t prev) { int depth = hwloc_get_type_depth(topology, type); if (depth == HWLOC_TYPE_DEPTH_UNKNOWN || depth == HWLOC_TYPE_DEPTH_MULTIPLE) return ((void *)0); return hwloc_get_next_obj_covering_cpuset_by_depth(topology, set, depth, prev); } # 378 "/usr/include/hwloc/helper.h" 3 4 static __inline__ hwloc_obj_t hwloc_get_ancestor_obj_by_depth (hwloc_topology_t topology __attribute__((__unused__)), unsigned depth, hwloc_obj_t obj) __attribute__((__pure__)); static __inline__ hwloc_obj_t hwloc_get_ancestor_obj_by_depth (hwloc_topology_t topology __attribute__((__unused__)), unsigned depth, hwloc_obj_t obj) { hwloc_obj_t ancestor = obj; if (obj->depth < depth) return ((void *)0); while (ancestor && ancestor->depth > depth) ancestor = ancestor->parent; return ancestor; } static __inline__ hwloc_obj_t hwloc_get_ancestor_obj_by_type (hwloc_topology_t topology __attribute__((__unused__)), hwloc_obj_type_t type, hwloc_obj_t obj) __attribute__((__pure__)); static __inline__ hwloc_obj_t hwloc_get_ancestor_obj_by_type (hwloc_topology_t topology __attribute__((__unused__)), hwloc_obj_type_t type, hwloc_obj_t obj) { hwloc_obj_t ancestor = obj->parent; while (ancestor && ancestor->type != type) ancestor = ancestor->parent; return ancestor; } static __inline__ hwloc_obj_t hwloc_get_common_ancestor_obj (hwloc_topology_t topology 
__attribute__((__unused__)), hwloc_obj_t obj1, hwloc_obj_t obj2) __attribute__((__pure__)); static __inline__ hwloc_obj_t hwloc_get_common_ancestor_obj (hwloc_topology_t topology __attribute__((__unused__)), hwloc_obj_t obj1, hwloc_obj_t obj2) { while (obj1 != obj2) { while (obj1->depth > obj2->depth) obj1 = obj1->parent; while (obj2->depth > obj1->depth) obj2 = obj2->parent; if (obj1 != obj2 && obj1->depth == obj2->depth) { obj1 = obj1->parent; obj2 = obj2->parent; } } return obj1; } static __inline__ int hwloc_obj_is_in_subtree (hwloc_topology_t topology __attribute__((__unused__)), hwloc_obj_t obj, hwloc_obj_t subtree_root) __attribute__((__pure__)); static __inline__ int hwloc_obj_is_in_subtree (hwloc_topology_t topology __attribute__((__unused__)), hwloc_obj_t obj, hwloc_obj_t subtree_root) { return hwloc_bitmap_isincluded(obj->cpuset, subtree_root->cpuset); } static __inline__ hwloc_obj_t hwloc_get_next_child (hwloc_topology_t topology __attribute__((__unused__)), hwloc_obj_t parent, hwloc_obj_t prev) { if (!prev) return parent->first_child; if (prev->parent != parent) return ((void *)0); return prev->next_sibling; } # 480 "/usr/include/hwloc/helper.h" 3 4 static __inline__ int hwloc_get_cache_type_depth (hwloc_topology_t topology, unsigned cachelevel, hwloc_obj_cache_type_t cachetype) { int depth; int found = HWLOC_TYPE_DEPTH_UNKNOWN; for (depth=0; ; depth++) { hwloc_obj_t obj = hwloc_get_obj_by_depth(topology, depth, 0); if (!obj) break; if (obj->type != HWLOC_OBJ_CACHE || obj->attr->cache.depth != cachelevel) continue; if (cachetype == (hwloc_obj_cache_type_t) -1) { if (found != HWLOC_TYPE_DEPTH_UNKNOWN) { return HWLOC_TYPE_DEPTH_MULTIPLE; } found = depth; continue; } if (obj->attr->cache.type == cachetype || obj->attr->cache.type == HWLOC_OBJ_CACHE_UNIFIED) return depth; } return found; } # 517 "/usr/include/hwloc/helper.h" 3 4 static __inline__ hwloc_obj_t hwloc_get_cache_covering_cpuset (hwloc_topology_t topology, hwloc_const_cpuset_t set) 
__attribute__((__pure__)); static __inline__ hwloc_obj_t hwloc_get_cache_covering_cpuset (hwloc_topology_t topology, hwloc_const_cpuset_t set) { hwloc_obj_t current = hwloc_get_obj_covering_cpuset(topology, set); while (current) { if (current->type == HWLOC_OBJ_CACHE) return current; current = current->parent; } return ((void *)0); } static __inline__ hwloc_obj_t hwloc_get_shared_cache_covering_obj (hwloc_topology_t topology __attribute__((__unused__)), hwloc_obj_t obj) __attribute__((__pure__)); static __inline__ hwloc_obj_t hwloc_get_shared_cache_covering_obj (hwloc_topology_t topology __attribute__((__unused__)), hwloc_obj_t obj) { hwloc_obj_t current = obj->parent; if (!obj->cpuset) return ((void *)0); while (current && current->cpuset) { if (!hwloc_bitmap_isequal(current->cpuset, obj->cpuset) && current->type == HWLOC_OBJ_CACHE) return current; current = current->parent; } return ((void *)0); } # 574 "/usr/include/hwloc/helper.h" 3 4 static __inline__ hwloc_obj_t hwloc_get_pu_obj_by_os_index(hwloc_topology_t topology, unsigned os_index) __attribute__((__pure__)); static __inline__ hwloc_obj_t hwloc_get_pu_obj_by_os_index(hwloc_topology_t topology, unsigned os_index) { hwloc_obj_t obj = ((void *)0); while ((obj = hwloc_get_next_obj_by_type(topology, HWLOC_OBJ_PU, obj)) != ((void *)0)) if (obj->os_index == os_index) return obj; return ((void *)0); } # 595 "/usr/include/hwloc/helper.h" 3 4 static __inline__ hwloc_obj_t hwloc_get_numanode_obj_by_os_index(hwloc_topology_t topology, unsigned os_index) __attribute__((__pure__)); static __inline__ hwloc_obj_t hwloc_get_numanode_obj_by_os_index(hwloc_topology_t topology, unsigned os_index) { hwloc_obj_t obj = ((void *)0); while ((obj = hwloc_get_next_obj_by_type(topology, HWLOC_OBJ_NUMANODE, obj)) != ((void *)0)) if (obj->os_index == os_index) return obj; return ((void *)0); } # 619 "/usr/include/hwloc/helper.h" 3 4 unsigned hwloc_get_closest_objs (hwloc_topology_t topology, hwloc_obj_t src, hwloc_obj_t * __restrict 
objs, unsigned max); # 633 "/usr/include/hwloc/helper.h" 3 4 static __inline__ hwloc_obj_t hwloc_get_obj_below_by_type (hwloc_topology_t topology, hwloc_obj_type_t type1, unsigned idx1, hwloc_obj_type_t type2, unsigned idx2) __attribute__((__pure__)); static __inline__ hwloc_obj_t hwloc_get_obj_below_by_type (hwloc_topology_t topology, hwloc_obj_type_t type1, unsigned idx1, hwloc_obj_type_t type2, unsigned idx2) { hwloc_obj_t obj; obj = hwloc_get_obj_by_type (topology, type1, idx1); if (!obj || !obj->cpuset) return ((void *)0); return hwloc_get_obj_inside_cpuset_by_type(topology, obj->cpuset, type2, idx2); } # 667 "/usr/include/hwloc/helper.h" 3 4 static __inline__ hwloc_obj_t hwloc_get_obj_below_array_by_type (hwloc_topology_t topology, int nr, hwloc_obj_type_t *typev, unsigned *idxv) __attribute__((__pure__)); static __inline__ hwloc_obj_t hwloc_get_obj_below_array_by_type (hwloc_topology_t topology, int nr, hwloc_obj_type_t *typev, unsigned *idxv) { hwloc_obj_t obj = hwloc_get_root_obj(topology); int i; for(i=0; i<nr; i++) { if (!obj || !obj->cpuset) return ((void *)0); obj = hwloc_get_obj_inside_cpuset_by_type(topology, obj->cpuset, typev[i], idxv[i]); } return obj; } # 692 "/usr/include/hwloc/helper.h" 3 4 enum hwloc_distrib_flags_e { HWLOC_DISTRIB_FLAG_REVERSE = (1UL<<0) }; # 722 "/usr/include/hwloc/helper.h" 3 4 static __inline__ int hwloc_distrib(hwloc_topology_t topology, hwloc_obj_t *roots, unsigned n_roots, hwloc_cpuset_t *set, unsigned n, unsigned until, unsigned long flags) { unsigned i; unsigned tot_weight; unsigned given, givenweight; hwloc_cpuset_t *cpusetp = set; if (flags & ~HWLOC_DISTRIB_FLAG_REVERSE) { (*__errno_location ()) = 22; return -1; } tot_weight = 0; for (i = 0; i < n_roots; i++) if (roots[i]->cpuset) tot_weight += hwloc_bitmap_weight(roots[i]->cpuset); for (i = 0, given = 0, givenweight = 0; i < n_roots; i++) { unsigned chunk, weight; hwloc_obj_t root = roots[flags & HWLOC_DISTRIB_FLAG_REVERSE ?
n_roots-1-i : i]; hwloc_cpuset_t cpuset = root->cpuset; if (!cpuset) continue; weight = hwloc_bitmap_weight(cpuset); if (!weight) continue; chunk = (( (givenweight+weight) * n + tot_weight-1) / tot_weight) - (( givenweight * n + tot_weight-1) / tot_weight); if (!root->arity || chunk <= 1 || root->depth >= until) { if (chunk) { unsigned j; for (j=0; j < chunk; j++) cpusetp[j] = hwloc_bitmap_dup(cpuset); } else { ((given) ? (void) (0) : __assert_fail ("given", "/usr/include/hwloc/helper.h", 769, __PRETTY_FUNCTION__)); hwloc_bitmap_or(cpusetp[-1], cpusetp[-1], cpuset); } } else { hwloc_distrib(topology, root->children, root->arity, cpusetp, chunk, until, flags); } cpusetp += chunk; given += chunk; givenweight += weight; } return 0; } # 800 "/usr/include/hwloc/helper.h" 3 4 static __inline__ hwloc_const_cpuset_t hwloc_topology_get_complete_cpuset(hwloc_topology_t topology) __attribute__((__pure__)); static __inline__ hwloc_const_cpuset_t hwloc_topology_get_complete_cpuset(hwloc_topology_t topology) { return hwloc_get_root_obj(topology)->complete_cpuset; } # 818 "/usr/include/hwloc/helper.h" 3 4 static __inline__ hwloc_const_cpuset_t hwloc_topology_get_topology_cpuset(hwloc_topology_t topology) __attribute__((__pure__)); static __inline__ hwloc_const_cpuset_t hwloc_topology_get_topology_cpuset(hwloc_topology_t topology) { return hwloc_get_root_obj(topology)->cpuset; } # 835 "/usr/include/hwloc/helper.h" 3 4 static __inline__ hwloc_const_cpuset_t hwloc_topology_get_online_cpuset(hwloc_topology_t topology) __attribute__((__pure__)); static __inline__ hwloc_const_cpuset_t hwloc_topology_get_online_cpuset(hwloc_topology_t topology) { return hwloc_get_root_obj(topology)->online_cpuset; } # 852 "/usr/include/hwloc/helper.h" 3 4 static __inline__ hwloc_const_cpuset_t hwloc_topology_get_allowed_cpuset(hwloc_topology_t topology) __attribute__((__pure__)); static __inline__ hwloc_const_cpuset_t hwloc_topology_get_allowed_cpuset(hwloc_topology_t topology) { return 
hwloc_get_root_obj(topology)->allowed_cpuset; } # 869 "/usr/include/hwloc/helper.h" 3 4 static __inline__ hwloc_const_nodeset_t hwloc_topology_get_complete_nodeset(hwloc_topology_t topology) __attribute__((__pure__)); static __inline__ hwloc_const_nodeset_t hwloc_topology_get_complete_nodeset(hwloc_topology_t topology) { return hwloc_get_root_obj(topology)->complete_nodeset; } # 887 "/usr/include/hwloc/helper.h" 3 4 static __inline__ hwloc_const_nodeset_t hwloc_topology_get_topology_nodeset(hwloc_topology_t topology) __attribute__((__pure__)); static __inline__ hwloc_const_nodeset_t hwloc_topology_get_topology_nodeset(hwloc_topology_t topology) { return hwloc_get_root_obj(topology)->nodeset; } # 904 "/usr/include/hwloc/helper.h" 3 4 static __inline__ hwloc_const_nodeset_t hwloc_topology_get_allowed_nodeset(hwloc_topology_t topology) __attribute__((__pure__)); static __inline__ hwloc_const_nodeset_t hwloc_topology_get_allowed_nodeset(hwloc_topology_t topology) { return hwloc_get_root_obj(topology)->allowed_nodeset; } # 942 "/usr/include/hwloc/helper.h" 3 4 static __inline__ void hwloc_cpuset_to_nodeset(hwloc_topology_t topology, hwloc_const_cpuset_t _cpuset, hwloc_nodeset_t nodeset) { int depth = hwloc_get_type_depth(topology, HWLOC_OBJ_NUMANODE); hwloc_obj_t obj; if (depth == HWLOC_TYPE_DEPTH_UNKNOWN) { if (hwloc_bitmap_iszero(_cpuset)) hwloc_bitmap_zero(nodeset); else hwloc_bitmap_fill(nodeset); return; } hwloc_bitmap_zero(nodeset); obj = ((void *)0); while ((obj = hwloc_get_next_obj_covering_cpuset_by_depth(topology, _cpuset, depth, obj)) != ((void *)0)) hwloc_bitmap_set(nodeset, obj->os_index); } # 970 "/usr/include/hwloc/helper.h" 3 4 static __inline__ void hwloc_cpuset_to_nodeset_strict(struct hwloc_topology *topology, hwloc_const_cpuset_t _cpuset, hwloc_nodeset_t nodeset) { int depth = hwloc_get_type_depth(topology, HWLOC_OBJ_NUMANODE); hwloc_obj_t obj; if (depth == HWLOC_TYPE_DEPTH_UNKNOWN ) return; hwloc_bitmap_zero(nodeset); obj = ((void *)0); while ((obj 
= hwloc_get_next_obj_covering_cpuset_by_depth(topology, _cpuset, depth, obj)) != ((void *)0)) hwloc_bitmap_set(nodeset, obj->os_index); } # 991 "/usr/include/hwloc/helper.h" 3 4 static __inline__ void hwloc_cpuset_from_nodeset(hwloc_topology_t topology, hwloc_cpuset_t _cpuset, hwloc_const_nodeset_t nodeset) { int depth = hwloc_get_type_depth(topology, HWLOC_OBJ_NUMANODE); hwloc_obj_t obj; if (depth == HWLOC_TYPE_DEPTH_UNKNOWN ) { if (hwloc_bitmap_iszero(nodeset)) hwloc_bitmap_zero(_cpuset); else hwloc_bitmap_fill(_cpuset); return; } hwloc_bitmap_zero(_cpuset); obj = ((void *)0); while ((obj = hwloc_get_next_obj_by_depth(topology, depth, obj)) != ((void *)0)) { if (hwloc_bitmap_isset(nodeset, obj->os_index)) hwloc_bitmap_or(_cpuset, _cpuset, obj->cpuset); } } # 1022 "/usr/include/hwloc/helper.h" 3 4 static __inline__ void hwloc_cpuset_from_nodeset_strict(struct hwloc_topology *topology, hwloc_cpuset_t _cpuset, hwloc_const_nodeset_t nodeset) { int depth = hwloc_get_type_depth(topology, HWLOC_OBJ_NUMANODE); hwloc_obj_t obj; if (depth == HWLOC_TYPE_DEPTH_UNKNOWN ) return; hwloc_bitmap_zero(_cpuset); obj = ((void *)0); while ((obj = hwloc_get_next_obj_by_depth(topology, depth, obj)) != ((void *)0)) if (hwloc_bitmap_isset(nodeset, obj->os_index)) hwloc_bitmap_or(_cpuset, _cpuset, obj->cpuset); } # 1064 "/usr/include/hwloc/helper.h" 3 4 static __inline__ const struct hwloc_distances_s * hwloc_get_whole_distance_matrix_by_depth(hwloc_topology_t topology, unsigned depth) { hwloc_obj_t root = hwloc_get_root_obj(topology); unsigned i; for(i=0; i<root->distances_count; i++) if (root->distances[i]->relative_depth == depth) return root->distances[i]; return ((void *)0); } # 1094 "/usr/include/hwloc/helper.h" 3 4 static __inline__ const struct hwloc_distances_s * hwloc_get_whole_distance_matrix_by_type(hwloc_topology_t topology, hwloc_obj_type_t type) { int depth = hwloc_get_type_depth(topology, type); if (depth < 0) return ((void *)0); return
hwloc_get_whole_distance_matrix_by_depth(topology, depth); } # 1116 "/usr/include/hwloc/helper.h" 3 4 static __inline__ const struct hwloc_distances_s * hwloc_get_distance_matrix_covering_obj_by_depth(hwloc_topology_t topology, hwloc_obj_t obj, unsigned depth, unsigned *firstp) { while (obj && obj->cpuset) { unsigned i; for(i=0; i<obj->distances_count; i++) if (obj->distances[i]->relative_depth == depth - obj->depth) { if (!obj->distances[i]->nbobjs) continue; *firstp = hwloc_get_next_obj_inside_cpuset_by_depth(topology, obj->cpuset, depth, ((void *)0))->logical_index; return obj->distances[i]; } obj = obj->parent; } return ((void *)0); } # 1146 "/usr/include/hwloc/helper.h" 3 4 static __inline__ int hwloc_get_latency(hwloc_topology_t topology, hwloc_obj_t obj1, hwloc_obj_t obj2, float *latency, float *reverse_latency) { hwloc_obj_t ancestor; const struct hwloc_distances_s * distances; unsigned first_logical ; if (obj1->depth != obj2->depth) { (*__errno_location ()) = 22; return -1; } ancestor = hwloc_get_common_ancestor_obj(topology, obj1, obj2); distances = hwloc_get_distance_matrix_covering_obj_by_depth(topology, ancestor, obj1->depth, &first_logical); if (distances && distances->latency) { const float * latency_matrix = distances->latency; unsigned nbobjs = distances->nbobjs; unsigned l1 = obj1->logical_index - first_logical; unsigned l2 = obj2->logical_index - first_logical; *latency = latency_matrix[l1*nbobjs+l2]; *reverse_latency = latency_matrix[l2*nbobjs+l1]; return 0; } (*__errno_location ()) = 38; return -1; } # 1190 "/usr/include/hwloc/helper.h" 3 4 static __inline__ hwloc_obj_t hwloc_get_non_io_ancestor_obj(hwloc_topology_t topology __attribute__((__unused__)), hwloc_obj_t ioobj) { hwloc_obj_t obj = ioobj; while (obj && !obj->cpuset) { obj = obj->parent; } return obj; } static __inline__ hwloc_obj_t hwloc_get_next_pcidev(hwloc_topology_t topology, hwloc_obj_t prev) { return hwloc_get_next_obj_by_type(topology, HWLOC_OBJ_PCI_DEVICE, prev); } static __inline__
hwloc_obj_t hwloc_get_pcidev_by_busid(hwloc_topology_t topology, unsigned domain, unsigned bus, unsigned dev, unsigned func) { hwloc_obj_t obj = ((void *)0); while ((obj = hwloc_get_next_pcidev(topology, obj)) != ((void *)0)) { if (obj->attr->pcidev.domain == domain && obj->attr->pcidev.bus == bus && obj->attr->pcidev.dev == dev && obj->attr->pcidev.func == func) return obj; } return ((void *)0); } static __inline__ hwloc_obj_t hwloc_get_pcidev_by_busidstring(hwloc_topology_t topology, const char *busid) { unsigned domain = 0; unsigned bus, dev, func; if (sscanf(busid, "%x:%x.%x", &bus, &dev, &func) != 3 && sscanf(busid, "%x:%x:%x.%x", &domain, &bus, &dev, &func) != 4) { (*__errno_location ()) = 22; return ((void *)0); } return hwloc_get_pcidev_by_busid(topology, domain, bus, dev, func); } static __inline__ hwloc_obj_t hwloc_get_next_osdev(hwloc_topology_t topology, hwloc_obj_t prev) { return hwloc_get_next_obj_by_type(topology, HWLOC_OBJ_OS_DEVICE, prev); } static __inline__ hwloc_obj_t hwloc_get_next_bridge(hwloc_topology_t topology, hwloc_obj_t prev) { return hwloc_get_next_obj_by_type(topology, HWLOC_OBJ_BRIDGE, prev); } static __inline__ int hwloc_bridge_covers_pcibus(hwloc_obj_t bridge, unsigned domain, unsigned bus) { return bridge->type == HWLOC_OBJ_BRIDGE && bridge->attr->bridge.downstream_type == HWLOC_OBJ_BRIDGE_PCI && bridge->attr->bridge.downstream.pci.domain == domain && bridge->attr->bridge.downstream.pci.secondary_bus <= bus && bridge->attr->bridge.downstream.pci.subordinate_bus >= bus; } static __inline__ hwloc_obj_t hwloc_get_hostbridge_by_pcibus(hwloc_topology_t topology, unsigned domain, unsigned bus) { hwloc_obj_t obj = ((void *)0); while ((obj = hwloc_get_next_bridge(topology, obj)) != ((void *)0)) { if (hwloc_bridge_covers_pcibus(obj, domain, bus)) { ((obj->attr->bridge.upstream_type == HWLOC_OBJ_BRIDGE_HOST) ? 
(void) (0) : __assert_fail ("obj->attr->bridge.upstream_type == HWLOC_OBJ_BRIDGE_HOST", "/usr/include/hwloc/helper.h", 1293, __PRETTY_FUNCTION__)); ((obj->parent->type != HWLOC_OBJ_BRIDGE) ? (void) (0) : __assert_fail ("obj->parent->type != HWLOC_OBJ_BRIDGE", "/usr/include/hwloc/helper.h", 1294, __PRETTY_FUNCTION__)); ((obj->parent->cpuset) ? (void) (0) : __assert_fail ("obj->parent->cpuset", "/usr/include/hwloc/helper.h", 1295, __PRETTY_FUNCTION__)); return obj; } } return ((void *)0); } # 2531 "/usr/include/hwloc.h" 2 3 4 # 1 "/usr/include/hwloc/inlines.h" 1 3 4 # 21 "/usr/include/hwloc/inlines.h" 3 4 # 1 "/usr/include/errno.h" 1 3 4 # 22 "/usr/include/hwloc/inlines.h" 2 3 4 static __inline__ int hwloc_get_type_or_below_depth (hwloc_topology_t topology, hwloc_obj_type_t type) { int depth = hwloc_get_type_depth(topology, type); if (depth != HWLOC_TYPE_DEPTH_UNKNOWN) return depth; for(depth = hwloc_get_type_depth(topology, HWLOC_OBJ_PU); ; depth--) if (hwloc_compare_types(hwloc_get_depth_type(topology, depth), type) < 0) return depth+1; } static __inline__ int hwloc_get_type_or_above_depth (hwloc_topology_t topology, hwloc_obj_type_t type) { int depth = hwloc_get_type_depth(topology, type); if (depth != HWLOC_TYPE_DEPTH_UNKNOWN) return depth; for(depth = 0; ; depth++) if (hwloc_compare_types(hwloc_get_depth_type(topology, depth), type) > 0) return depth-1; } static __inline__ int hwloc_get_nbobjs_by_type (hwloc_topology_t topology, hwloc_obj_type_t type) { int depth = hwloc_get_type_depth(topology, type); if (depth == HWLOC_TYPE_DEPTH_UNKNOWN) return 0; if (depth == HWLOC_TYPE_DEPTH_MULTIPLE) return -1; return hwloc_get_nbobjs_by_depth(topology, depth); } static __inline__ hwloc_obj_t hwloc_get_obj_by_type (hwloc_topology_t topology, hwloc_obj_type_t type, unsigned idx) { int depth = hwloc_get_type_depth(topology, type); if (depth == HWLOC_TYPE_DEPTH_UNKNOWN) return ((void *)0); if (depth == HWLOC_TYPE_DEPTH_MULTIPLE) return ((void *)0); return 
hwloc_get_obj_by_depth(topology, depth, idx); } static __inline__ hwloc_obj_t hwloc_get_next_obj_by_depth (hwloc_topology_t topology, unsigned depth, hwloc_obj_t prev) { if (!prev) return hwloc_get_obj_by_depth (topology, depth, 0); if (prev->depth != depth) return ((void *)0); return prev->next_cousin; } static __inline__ hwloc_obj_t hwloc_get_next_obj_by_type (hwloc_topology_t topology, hwloc_obj_type_t type, hwloc_obj_t prev) { int depth = hwloc_get_type_depth(topology, type); if (depth == HWLOC_TYPE_DEPTH_UNKNOWN || depth == HWLOC_TYPE_DEPTH_MULTIPLE) return ((void *)0); return hwloc_get_next_obj_by_depth (topology, depth, prev); } static __inline__ hwloc_obj_t hwloc_get_root_obj (hwloc_topology_t topology) { return hwloc_get_obj_by_depth (topology, 0, 0); } static __inline__ const char * hwloc_obj_get_info_by_name(hwloc_obj_t obj, const char *name) { unsigned i; for(i=0; i<obj->infos_count; i++) if (!strcmp(obj->infos[i].name, name)) return obj->infos[i].value; return ((void *)0); } static __inline__ void * hwloc_alloc_membind_policy_nodeset(hwloc_topology_t topology, size_t len, hwloc_const_nodeset_t nodeset, hwloc_membind_policy_t policy, int flags) { void *p = hwloc_alloc_membind_nodeset(topology, len, nodeset, policy, flags); if (p) return p; hwloc_set_membind_nodeset(topology, nodeset, policy, flags); p = hwloc_alloc(topology, len); if (p && policy != HWLOC_MEMBIND_FIRSTTOUCH) memset(p, 0, len); return p; } static __inline__ void * hwloc_alloc_membind_policy(hwloc_topology_t topology, size_t len, hwloc_const_cpuset_t set, hwloc_membind_policy_t policy, int flags) { void *p = hwloc_alloc_membind(topology, len, set, policy, flags); if (p) return p; hwloc_set_membind(topology, set, policy, flags); p = hwloc_alloc(topology, len); if (p && policy != HWLOC_MEMBIND_FIRSTTOUCH) memset(p, 0, len); return p; } # 2534 "/usr/include/hwloc.h" 2 3 4 # 1 "/usr/include/hwloc/diff.h" 1 3 4 # 60 "/usr/include/hwloc/diff.h" 3 4 typedef enum hwloc_topology_diff_obj_attr_type_e {
HWLOC_TOPOLOGY_DIFF_OBJ_ATTR_SIZE, HWLOC_TOPOLOGY_DIFF_OBJ_ATTR_NAME, HWLOC_TOPOLOGY_DIFF_OBJ_ATTR_INFO } hwloc_topology_diff_obj_attr_type_t; union hwloc_topology_diff_obj_attr_u { struct hwloc_topology_diff_obj_attr_generic_s { hwloc_topology_diff_obj_attr_type_t type; } generic; struct hwloc_topology_diff_obj_attr_uint64_s { hwloc_topology_diff_obj_attr_type_t type; hwloc_uint64_t index; hwloc_uint64_t oldvalue; hwloc_uint64_t newvalue; } uint64; struct hwloc_topology_diff_obj_attr_string_s { hwloc_topology_diff_obj_attr_type_t type; char *name; char *oldvalue; char *newvalue; } string; }; typedef enum hwloc_topology_diff_type_e { HWLOC_TOPOLOGY_DIFF_OBJ_ATTR, # 122 "/usr/include/hwloc/diff.h" 3 4 HWLOC_TOPOLOGY_DIFF_TOO_COMPLEX } hwloc_topology_diff_type_t; typedef union hwloc_topology_diff_u { struct hwloc_topology_diff_generic_s { hwloc_topology_diff_type_t type; union hwloc_topology_diff_u * next; } generic; struct hwloc_topology_diff_obj_attr_s { hwloc_topology_diff_type_t type; union hwloc_topology_diff_u * next; unsigned obj_depth; unsigned obj_index; union hwloc_topology_diff_obj_attr_u diff; } obj_attr; struct hwloc_topology_diff_too_complex_s { hwloc_topology_diff_type_t type; union hwloc_topology_diff_u * next; unsigned obj_depth; unsigned obj_index; } too_complex; } * hwloc_topology_diff_t; # 192 "/usr/include/hwloc/diff.h" 3 4 int hwloc_topology_diff_build(hwloc_topology_t topology, hwloc_topology_t newtopology, unsigned long flags, hwloc_topology_diff_t *diff); enum hwloc_topology_diff_apply_flags_e { HWLOC_TOPOLOGY_DIFF_APPLY_REVERSE = (1UL<<0) }; # 220 "/usr/include/hwloc/diff.h" 3 4 int hwloc_topology_diff_apply(hwloc_topology_t topology, hwloc_topology_diff_t diff, unsigned long flags); int hwloc_topology_diff_destroy(hwloc_topology_t topology, hwloc_topology_diff_t diff); # 243 "/usr/include/hwloc/diff.h" 3 4 int hwloc_topology_diff_load_xml(hwloc_topology_t topology, const char *xmlpath, hwloc_topology_diff_t *diff, char **refname); # 257 
"/usr/include/hwloc/diff.h" 3 4 int hwloc_topology_diff_export_xml(hwloc_topology_t topology, hwloc_topology_diff_t diff, const char *refname, const char *xmlpath); # 273 "/usr/include/hwloc/diff.h" 3 4 int hwloc_topology_diff_load_xmlbuffer(hwloc_topology_t topology, const char *xmlbuffer, int buflen, hwloc_topology_diff_t *diff, char **refname); # 289 "/usr/include/hwloc/diff.h" 3 4 int hwloc_topology_diff_export_xmlbuffer(hwloc_topology_t topology, hwloc_topology_diff_t diff, const char *refname, char **xmlbuffer, int *buflen); # 2537 "/usr/include/hwloc.h" 2 3 4 # 1 "/usr/include/hwloc/deprecated.h" 1 3 4 # 33 "/usr/include/hwloc/deprecated.h" 3 4 hwloc_obj_type_t hwloc_obj_type_of_string (const char * string) __attribute__((__pure__)) __attribute__((__deprecated__)); # 55 "/usr/include/hwloc/deprecated.h" 3 4 int hwloc_obj_snprintf(char * __restrict string, size_t size, hwloc_topology_t topology, hwloc_obj_t obj, const char * __restrict indexprefix, int verbose) __attribute__((__deprecated__)); # 74 "/usr/include/hwloc/deprecated.h" 3 4 static __inline__ void hwloc_distribute(hwloc_topology_t topology, hwloc_obj_t root, hwloc_cpuset_t *set, unsigned n, unsigned until) __attribute__((__deprecated__)); static __inline__ void hwloc_distribute(hwloc_topology_t topology, hwloc_obj_t root, hwloc_cpuset_t *set, unsigned n, unsigned until) { hwloc_distrib(topology, &root, 1, set, n, until, 0); } # 89 "/usr/include/hwloc/deprecated.h" 3 4 static __inline__ void hwloc_distributev(hwloc_topology_t topology, hwloc_obj_t *roots, unsigned n_roots, hwloc_cpuset_t *set, unsigned n, unsigned until) __attribute__((__deprecated__)); static __inline__ void hwloc_distributev(hwloc_topology_t topology, hwloc_obj_t *roots, unsigned n_roots, hwloc_cpuset_t *set, unsigned n, unsigned until) { hwloc_distrib(topology, roots, n_roots, set, n, until, 0); } # 2540 "/usr/include/hwloc.h" 2 3 4 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: 
Preprocess stderr after filtering:: Found header files ['hwloc.h'] in ['/usr/include', '/usr/lib/openmpi'] Popping language C ================================================================================ TEST checkSharedLibrary from config.packages.hwloc(/home/florian/software/petsc/config/BuildSystem/config/package.py:738) TESTING: checkSharedLibrary from config.packages.hwloc(config/BuildSystem/config/package.py:738) By default we don't care about checking if the library is shared Popping language C Pushing language C ================================================================================ TEST configureLibrary from config.packages.X(/home/florian/software/petsc/config/BuildSystem/config/package.py:679) TESTING: configureLibrary from config.packages.X(config/BuildSystem/config/package.py:679) Find an installation and check if it can work with PETSc ================================================================================== Checking for a functional X Checking for library in Compiler specific search X: [] ================================================================================ TEST check from config.libraries(/home/florian/software/petsc/config/BuildSystem/config/libraries.py:146) TESTING: check from config.libraries(config/BuildSystem/config/libraries.py:146) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for functions [XSetWMName] in library [] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing 
-I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char XSetWMName(); static void _check_XSetWMName() { XSetWMName(); } int main() { _check_XSetWMName();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.o: In function `_check_XSetWMName': /tmp/petsc-KvGRNM/config.libraries/conftest.c:5: undefined reference to `XSetWMName' collect2: error: ld returned 1 exit status Popping language C Checking for library in Compiler specific search X: ['libX11.a'] ================================================================================ TEST check from config.libraries(/home/florian/software/petsc/config/BuildSystem/config/libraries.py:146) TESTING: check from 
config.libraries(config/BuildSystem/config/libraries.py:146) Checks that the library "libName" contains "funcs", and if it does defines HAVE_LIB"libName" - libDir may be a list of directories - libName may be a list of library names Checking for functions [XSetWMName] in library ['libX11.a'] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char XSetWMName(); static void _check_XSetWMName() { XSetWMName(); } int main() { _check_XSetWMName();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lX11 -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_LIBX11" to "1" Popping language C Checking for headers Compiler specific search X: ['/usr/include', '/usr/lib/openmpi'] Pushing language C ================================================================================ TEST checkInclude from config.headers(/home/florian/software/petsc/config/BuildSystem/config/headers.py:86) TESTING: checkInclude from config.headers(config/BuildSystem/config/headers.py:86) Checks if a particular include file can be found along particular include paths Checking for header files ['X11/Xlib.h'] in ['/usr/include', '/usr/lib/openmpi'] Checking include with compiler flags var CPPFLAGS ['/usr/include', '/usr/lib/openmpi'] Executing: mpicc -E -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/config.headers -I/usr/include -I/usr/lib/openmpi /tmp/petsc-KvGRNM/config.headers/conftest.c stdout: # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "<built-in>" # 1 "<command-line>" # 31 "<command-line>" # 1
"/usr/include/stdc-predef.h" 1 3 4 # 32 "<command-line>" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conftest.c" # 1 "/tmp/petsc-KvGRNM/config.headers/confdefs.h" 1 # 2 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/tmp/petsc-KvGRNM/config.headers/conffix.h" 1 # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 # 1 "/usr/include/X11/Xlib.h" 1 3 4 # 38 "/usr/include/X11/Xlib.h" 3 4 # 1 "/usr/include/sys/types.h" 1 3 4 # 25 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/features.h" 1 3 4 # 368 "/usr/include/features.h" 3 4 # 1 "/usr/include/sys/cdefs.h" 1 3 4 # 415 "/usr/include/sys/cdefs.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 416 "/usr/include/sys/cdefs.h" 2 3 4 # 369 "/usr/include/features.h" 2 3 4 # 392 "/usr/include/features.h" 3 4 # 1 "/usr/include/gnu/stubs.h" 1 3 4 # 10 "/usr/include/gnu/stubs.h" 3 4 # 1 "/usr/include/gnu/stubs-64.h" 1 3 4 # 11 "/usr/include/gnu/stubs.h" 2 3 4 # 393 "/usr/include/features.h" 2 3 4 # 26 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/bits/types.h" 1 3 4 # 27 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 28 "/usr/include/bits/types.h" 2 3 4 # 30 "/usr/include/bits/types.h" 3 4 typedef unsigned char __u_char; typedef unsigned short int __u_short; typedef unsigned int __u_int; typedef unsigned long int __u_long; typedef signed char __int8_t; typedef unsigned char __uint8_t; typedef signed short int __int16_t; typedef unsigned short int __uint16_t; typedef signed int __int32_t; typedef unsigned int __uint32_t; typedef signed long int __int64_t; typedef unsigned long int __uint64_t; typedef long int __quad_t; typedef unsigned long int __u_quad_t; # 121 "/usr/include/bits/types.h" 3 4 # 1 "/usr/include/bits/typesizes.h" 1 3 4 # 122 "/usr/include/bits/types.h" 2 3 4 typedef unsigned long int __dev_t; typedef unsigned int __uid_t; typedef unsigned int __gid_t; typedef unsigned long int __ino_t; typedef unsigned long int __ino64_t; typedef unsigned int __mode_t; typedef unsigned long int __nlink_t;
typedef long int __off_t; typedef long int __off64_t; typedef int __pid_t; typedef struct { int __val[2]; } __fsid_t; typedef long int __clock_t; typedef unsigned long int __rlim_t; typedef unsigned long int __rlim64_t; typedef unsigned int __id_t; typedef long int __time_t; typedef unsigned int __useconds_t; typedef long int __suseconds_t; typedef int __daddr_t; typedef int __key_t; typedef int __clockid_t; typedef void * __timer_t; typedef long int __blksize_t; typedef long int __blkcnt_t; typedef long int __blkcnt64_t; typedef unsigned long int __fsblkcnt_t; typedef unsigned long int __fsblkcnt64_t; typedef unsigned long int __fsfilcnt_t; typedef unsigned long int __fsfilcnt64_t; typedef long int __fsword_t; typedef long int __ssize_t; typedef long int __syscall_slong_t; typedef unsigned long int __syscall_ulong_t; typedef __off64_t __loff_t; typedef __quad_t *__qaddr_t; typedef char *__caddr_t; typedef long int __intptr_t; typedef unsigned int __socklen_t; # 30 "/usr/include/sys/types.h" 2 3 4 typedef __u_char u_char; typedef __u_short u_short; typedef __u_int u_int; typedef __u_long u_long; typedef __quad_t quad_t; typedef __u_quad_t u_quad_t; typedef __fsid_t fsid_t; typedef __loff_t loff_t; typedef __ino_t ino_t; # 60 "/usr/include/sys/types.h" 3 4 typedef __dev_t dev_t; typedef __gid_t gid_t; typedef __mode_t mode_t; typedef __nlink_t nlink_t; typedef __uid_t uid_t; typedef __off_t off_t; # 98 "/usr/include/sys/types.h" 3 4 typedef __pid_t pid_t; typedef __id_t id_t; typedef __ssize_t ssize_t; typedef __daddr_t daddr_t; typedef __caddr_t caddr_t; typedef __key_t key_t; # 132 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/time.h" 1 3 4 # 57 "/usr/include/time.h" 3 4 typedef __clock_t clock_t; # 73 "/usr/include/time.h" 3 4 typedef __time_t time_t; # 91 "/usr/include/time.h" 3 4 typedef __clockid_t clockid_t; # 103 "/usr/include/time.h" 3 4 typedef __timer_t timer_t; # 133 "/usr/include/sys/types.h" 2 3 4 # 146 "/usr/include/sys/types.h" 3 4 # 1 
"/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 216 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long unsigned int size_t; # 147 "/usr/include/sys/types.h" 2 3 4 typedef unsigned long int ulong; typedef unsigned short int ushort; typedef unsigned int uint; # 194 "/usr/include/sys/types.h" 3 4 typedef int int8_t __attribute__ ((__mode__ (__QI__))); typedef int int16_t __attribute__ ((__mode__ (__HI__))); typedef int int32_t __attribute__ ((__mode__ (__SI__))); typedef int int64_t __attribute__ ((__mode__ (__DI__))); typedef unsigned int u_int8_t __attribute__ ((__mode__ (__QI__))); typedef unsigned int u_int16_t __attribute__ ((__mode__ (__HI__))); typedef unsigned int u_int32_t __attribute__ ((__mode__ (__SI__))); typedef unsigned int u_int64_t __attribute__ ((__mode__ (__DI__))); typedef int register_t __attribute__ ((__mode__ (__word__))); # 216 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/endian.h" 1 3 4 # 36 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/endian.h" 1 3 4 # 37 "/usr/include/endian.h" 2 3 4 # 60 "/usr/include/endian.h" 3 4 # 1 "/usr/include/bits/byteswap.h" 1 3 4 # 28 "/usr/include/bits/byteswap.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 29 "/usr/include/bits/byteswap.h" 2 3 4 # 1 "/usr/include/bits/byteswap-16.h" 1 3 4 # 36 "/usr/include/bits/byteswap.h" 2 3 4 # 44 "/usr/include/bits/byteswap.h" 3 4 static __inline unsigned int __bswap_32 (unsigned int __bsx) { return __builtin_bswap32 (__bsx); } # 108 "/usr/include/bits/byteswap.h" 3 4 static __inline __uint64_t __bswap_64 (__uint64_t __bsx) { return __builtin_bswap64 (__bsx); } # 61 "/usr/include/endian.h" 2 3 4 # 217 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/select.h" 1 3 4 # 30 "/usr/include/sys/select.h" 3 4 # 1 "/usr/include/bits/select.h" 1 3 4 # 22 "/usr/include/bits/select.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 23 "/usr/include/bits/select.h" 2 3 4 # 31 "/usr/include/sys/select.h" 2 3 4 # 1 
"/usr/include/bits/sigset.h" 1 3 4 # 22 "/usr/include/bits/sigset.h" 3 4 typedef int __sig_atomic_t; typedef struct { unsigned long int __val[(1024 / (8 * sizeof (unsigned long int)))]; } __sigset_t; # 34 "/usr/include/sys/select.h" 2 3 4 typedef __sigset_t sigset_t; # 1 "/usr/include/time.h" 1 3 4 # 120 "/usr/include/time.h" 3 4 struct timespec { __time_t tv_sec; __syscall_slong_t tv_nsec; }; # 46 "/usr/include/sys/select.h" 2 3 4 # 1 "/usr/include/bits/time.h" 1 3 4 # 30 "/usr/include/bits/time.h" 3 4 struct timeval { __time_t tv_sec; __suseconds_t tv_usec; }; # 48 "/usr/include/sys/select.h" 2 3 4 typedef __suseconds_t suseconds_t; typedef long int __fd_mask; # 66 "/usr/include/sys/select.h" 3 4 typedef struct { __fd_mask __fds_bits[1024 / (8 * (int) sizeof (__fd_mask))]; } fd_set; typedef __fd_mask fd_mask; # 98 "/usr/include/sys/select.h" 3 4 # 108 "/usr/include/sys/select.h" 3 4 extern int select (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, struct timeval *__restrict __timeout); # 120 "/usr/include/sys/select.h" 3 4 extern int pselect (int __nfds, fd_set *__restrict __readfds, fd_set *__restrict __writefds, fd_set *__restrict __exceptfds, const struct timespec *__restrict __timeout, const __sigset_t *__restrict __sigmask); # 133 "/usr/include/sys/select.h" 3 4 # 220 "/usr/include/sys/types.h" 2 3 4 # 1 "/usr/include/sys/sysmacros.h" 1 3 4 # 24 "/usr/include/sys/sysmacros.h" 3 4 __extension__ extern unsigned int gnu_dev_major (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned int gnu_dev_minor (unsigned long long int __dev) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); __extension__ extern unsigned long long int gnu_dev_makedev (unsigned int __major, unsigned int __minor) __attribute__ ((__nothrow__ , __leaf__)) __attribute__ ((__const__)); # 58 "/usr/include/sys/sysmacros.h" 3 4 # 223 
"/usr/include/sys/types.h" 2 3 4 typedef __blksize_t blksize_t; typedef __blkcnt_t blkcnt_t; typedef __fsblkcnt_t fsblkcnt_t; typedef __fsfilcnt_t fsfilcnt_t; # 270 "/usr/include/sys/types.h" 3 4 # 1 "/usr/include/bits/pthreadtypes.h" 1 3 4 # 21 "/usr/include/bits/pthreadtypes.h" 3 4 # 1 "/usr/include/bits/wordsize.h" 1 3 4 # 22 "/usr/include/bits/pthreadtypes.h" 2 3 4 # 60 "/usr/include/bits/pthreadtypes.h" 3 4 typedef unsigned long int pthread_t; union pthread_attr_t { char __size[56]; long int __align; }; typedef union pthread_attr_t pthread_attr_t; typedef struct __pthread_internal_list { struct __pthread_internal_list *__prev; struct __pthread_internal_list *__next; } __pthread_list_t; # 90 "/usr/include/bits/pthreadtypes.h" 3 4 typedef union { struct __pthread_mutex_s { int __lock; unsigned int __count; int __owner; unsigned int __nusers; int __kind; short __spins; short __elision; __pthread_list_t __list; # 125 "/usr/include/bits/pthreadtypes.h" 3 4 } __data; char __size[40]; long int __align; } pthread_mutex_t; typedef union { char __size[4]; int __align; } pthread_mutexattr_t; typedef union { struct { int __lock; unsigned int __futex; __extension__ unsigned long long int __total_seq; __extension__ unsigned long long int __wakeup_seq; __extension__ unsigned long long int __woken_seq; void *__mutex; unsigned int __nwaiters; unsigned int __broadcast_seq; } __data; char __size[48]; __extension__ long long int __align; } pthread_cond_t; typedef union { char __size[4]; int __align; } pthread_condattr_t; typedef unsigned int pthread_key_t; typedef int pthread_once_t; typedef union { struct { int __lock; unsigned int __nr_readers; unsigned int __readers_wakeup; unsigned int __writer_wakeup; unsigned int __nr_readers_queued; unsigned int __nr_writers_queued; int __writer; int __shared; signed char __rwelision; unsigned char __pad1[7]; unsigned long int __pad2; unsigned int __flags; } __data; # 220 "/usr/include/bits/pthreadtypes.h" 3 4 char __size[56]; long int 
__align; } pthread_rwlock_t; typedef union { char __size[8]; long int __align; } pthread_rwlockattr_t; typedef volatile int pthread_spinlock_t; typedef union { char __size[32]; long int __align; } pthread_barrier_t; typedef union { char __size[4]; int __align; } pthread_barrierattr_t; # 271 "/usr/include/sys/types.h" 2 3 4 # 39 "/usr/include/X11/Xlib.h" 2 3 4 # 1 "/usr/include/X11/X.h" 1 3 4 # 66 "/usr/include/X11/X.h" 3 4 typedef unsigned long XID; typedef unsigned long Mask; typedef unsigned long Atom; typedef unsigned long VisualID; typedef unsigned long Time; # 96 "/usr/include/X11/X.h" 3 4 typedef XID Window; typedef XID Drawable; typedef XID Font; typedef XID Pixmap; typedef XID Cursor; typedef XID Colormap; typedef XID GContext; typedef XID KeySym; typedef unsigned char KeyCode; # 45 "/usr/include/X11/Xlib.h" 2 3 4 # 1 "/usr/include/X11/Xfuncproto.h" 1 3 4 # 48 "/usr/include/X11/Xlib.h" 2 3 4 # 1 "/usr/include/X11/Xosdefs.h" 1 3 4 # 49 "/usr/include/X11/Xlib.h" 2 3 4 # 1 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 1 3 4 # 149 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef long int ptrdiff_t; # 328 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef int wchar_t; # 426 "/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1/include/stddef.h" 3 4 typedef struct { long long __max_align_ll __attribute__((__aligned__(__alignof__(long long)))); long double __max_align_ld __attribute__((__aligned__(__alignof__(long double)))); } max_align_t; # 52 "/usr/include/X11/Xlib.h" 2 3 4 # 62 "/usr/include/X11/Xlib.h" 3 4 extern int _Xmblen( char *str, int len ); # 80 "/usr/include/X11/Xlib.h" 3 4 typedef char *XPointer; # 148 "/usr/include/X11/Xlib.h" 3 4 typedef struct _XExtData { int number; struct _XExtData *next; int (*free_private)( struct _XExtData *extension ); XPointer private_data; } XExtData; typedef struct { int extension; int major_opcode; int first_event; int first_error; } XExtCodes; typedef struct { int depth; int 
bits_per_pixel; int scanline_pad; } XPixmapFormatValues; typedef struct { int function; unsigned long plane_mask; unsigned long foreground; unsigned long background; int line_width; int line_style; int cap_style; int join_style; int fill_style; int fill_rule; int arc_mode; Pixmap tile; Pixmap stipple; int ts_x_origin; int ts_y_origin; Font font; int subwindow_mode; int graphics_exposures; int clip_x_origin; int clip_y_origin; Pixmap clip_mask; int dash_offset; char dashes; } XGCValues; typedef struct _XGC *GC; typedef struct { XExtData *ext_data; VisualID visualid; int class; unsigned long red_mask, green_mask, blue_mask; int bits_per_rgb; int map_entries; } Visual; typedef struct { int depth; int nvisuals; Visual *visuals; } Depth; struct _XDisplay; typedef struct { XExtData *ext_data; struct _XDisplay *display; Window root; int width, height; int mwidth, mheight; int ndepths; Depth *depths; int root_depth; Visual *root_visual; GC default_gc; Colormap cmap; unsigned long white_pixel; unsigned long black_pixel; int max_maps, min_maps; int backing_store; int save_unders; long root_input_mask; } Screen; typedef struct { XExtData *ext_data; int depth; int bits_per_pixel; int scanline_pad; } ScreenFormat; typedef struct { Pixmap background_pixmap; unsigned long background_pixel; Pixmap border_pixmap; unsigned long border_pixel; int bit_gravity; int win_gravity; int backing_store; unsigned long backing_planes; unsigned long backing_pixel; int save_under; long event_mask; long do_not_propagate_mask; int override_redirect; Colormap colormap; Cursor cursor; } XSetWindowAttributes; typedef struct { int x, y; int width, height; int border_width; int depth; Visual *visual; Window root; int class; int bit_gravity; int win_gravity; int backing_store; unsigned long backing_planes; unsigned long backing_pixel; int save_under; Colormap colormap; int map_installed; int map_state; long all_event_masks; long your_event_mask; long do_not_propagate_mask; int override_redirect; Screen 
*screen; } XWindowAttributes; typedef struct { int family; int length; char *address; } XHostAddress; typedef struct { int typelength; int valuelength; char *type; char *value; } XServerInterpretedAddress; typedef struct _XImage { int width, height; int xoffset; int format; char *data; int byte_order; int bitmap_unit; int bitmap_bit_order; int bitmap_pad; int depth; int bytes_per_line; int bits_per_pixel; unsigned long red_mask; unsigned long green_mask; unsigned long blue_mask; XPointer obdata; struct funcs { struct _XImage *(*create_image)( struct _XDisplay* , Visual* , unsigned int , int , int , char* , unsigned int , unsigned int , int , int ); int (*destroy_image) (struct _XImage *); unsigned long (*get_pixel) (struct _XImage *, int, int); int (*put_pixel) (struct _XImage *, int, int, unsigned long); struct _XImage *(*sub_image)(struct _XImage *, int, int, unsigned int, unsigned int); int (*add_pixel) (struct _XImage *, long); } f; } XImage; typedef struct { int x, y; int width, height; int border_width; Window sibling; int stack_mode; } XWindowChanges; typedef struct { unsigned long pixel; unsigned short red, green, blue; char flags; char pad; } XColor; typedef struct { short x1, y1, x2, y2; } XSegment; typedef struct { short x, y; } XPoint; typedef struct { short x, y; unsigned short width, height; } XRectangle; typedef struct { short x, y; unsigned short width, height; short angle1, angle2; } XArc; typedef struct { int key_click_percent; int bell_percent; int bell_pitch; int bell_duration; int led; int led_mode; int key; int auto_repeat_mode; } XKeyboardControl; typedef struct { int key_click_percent; int bell_percent; unsigned int bell_pitch, bell_duration; unsigned long led_mask; int global_auto_repeat; char auto_repeats[32]; } XKeyboardState; typedef struct { Time time; short x, y; } XTimeCoord; typedef struct { int max_keypermod; KeyCode *modifiermap; } XModifierKeymap; # 487 "/usr/include/X11/Xlib.h" 3 4 typedef struct _XDisplay Display; struct 
_XPrivate; struct _XrmHashBucketRec; typedef struct { XExtData *ext_data; struct _XPrivate *private1; int fd; int private2; int proto_major_version; int proto_minor_version; char *vendor; XID private3; XID private4; XID private5; int private6; XID (*resource_alloc)( struct _XDisplay* ); int byte_order; int bitmap_unit; int bitmap_pad; int bitmap_bit_order; int nformats; ScreenFormat *pixmap_format; int private8; int release; struct _XPrivate *private9, *private10; int qlen; unsigned long last_request_read; unsigned long request; XPointer private11; XPointer private12; XPointer private13; XPointer private14; unsigned max_request_size; struct _XrmHashBucketRec *db; int (*private15)( struct _XDisplay* ); char *display_name; int default_screen; int nscreens; Screen *screens; unsigned long motion_buffer; unsigned long private16; int min_keycode; int max_keycode; XPointer private17; XPointer private18; int private19; char *xdefaults; } *_XPrivDisplay; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window window; Window root; Window subwindow; Time time; int x, y; int x_root, y_root; unsigned int state; unsigned int keycode; int same_screen; } XKeyEvent; typedef XKeyEvent XKeyPressedEvent; typedef XKeyEvent XKeyReleasedEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window window; Window root; Window subwindow; Time time; int x, y; int x_root, y_root; unsigned int state; unsigned int button; int same_screen; } XButtonEvent; typedef XButtonEvent XButtonPressedEvent; typedef XButtonEvent XButtonReleasedEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window window; Window root; Window subwindow; Time time; int x, y; int x_root, y_root; unsigned int state; char is_hint; int same_screen; } XMotionEvent; typedef XMotionEvent XPointerMovedEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window window; Window root; Window 
subwindow; Time time; int x, y; int x_root, y_root; int mode; int detail; int same_screen; int focus; unsigned int state; } XCrossingEvent; typedef XCrossingEvent XEnterWindowEvent; typedef XCrossingEvent XLeaveWindowEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window window; int mode; int detail; } XFocusChangeEvent; typedef XFocusChangeEvent XFocusInEvent; typedef XFocusChangeEvent XFocusOutEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window window; char key_vector[32]; } XKeymapEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window window; int x, y; int width, height; int count; } XExposeEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Drawable drawable; int x, y; int width, height; int count; int major_code; int minor_code; } XGraphicsExposeEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Drawable drawable; int major_code; int minor_code; } XNoExposeEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window window; int state; } XVisibilityEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window parent; Window window; int x, y; int width, height; int border_width; int override_redirect; } XCreateWindowEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window event; Window window; } XDestroyWindowEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window event; Window window; int from_configure; } XUnmapEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window event; Window window; int override_redirect; } XMapEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window parent; Window window; } XMapRequestEvent; typedef struct 
{ int type; unsigned long serial; int send_event; Display *display; Window event; Window window; Window parent; int x, y; int override_redirect; } XReparentEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window event; Window window; int x, y; int width, height; int border_width; Window above; int override_redirect; } XConfigureEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window event; Window window; int x, y; } XGravityEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window window; int width, height; } XResizeRequestEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window parent; Window window; int x, y; int width, height; int border_width; Window above; int detail; unsigned long value_mask; } XConfigureRequestEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window event; Window window; int place; } XCirculateEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window parent; Window window; int place; } XCirculateRequestEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window window; Atom atom; Time time; int state; } XPropertyEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window window; Atom selection; Time time; } XSelectionClearEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window owner; Window requestor; Atom selection; Atom target; Atom property; Time time; } XSelectionRequestEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window requestor; Atom selection; Atom target; Atom property; Time time; } XSelectionEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window window; Colormap colormap; int new; int state; } 
XColormapEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window window; Atom message_type; int format; union { char b[20]; short s[10]; long l[5]; } data; } XClientMessageEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window window; int request; int first_keycode; int count; } XMappingEvent; typedef struct { int type; Display *display; XID resourceid; unsigned long serial; unsigned char error_code; unsigned char request_code; unsigned char minor_code; } XErrorEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; Window window; } XAnyEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; int extension; int evtype; } XGenericEvent; typedef struct { int type; unsigned long serial; int send_event; Display *display; int extension; int evtype; unsigned int cookie; void *data; } XGenericEventCookie; typedef union _XEvent { int type; XAnyEvent xany; XKeyEvent xkey; XButtonEvent xbutton; XMotionEvent xmotion; XCrossingEvent xcrossing; XFocusChangeEvent xfocus; XExposeEvent xexpose; XGraphicsExposeEvent xgraphicsexpose; XNoExposeEvent xnoexpose; XVisibilityEvent xvisibility; XCreateWindowEvent xcreatewindow; XDestroyWindowEvent xdestroywindow; XUnmapEvent xunmap; XMapEvent xmap; XMapRequestEvent xmaprequest; XReparentEvent xreparent; XConfigureEvent xconfigure; XGravityEvent xgravity; XResizeRequestEvent xresizerequest; XConfigureRequestEvent xconfigurerequest; XCirculateEvent xcirculate; XCirculateRequestEvent xcirculaterequest; XPropertyEvent xproperty; XSelectionClearEvent xselectionclear; XSelectionRequestEvent xselectionrequest; XSelectionEvent xselection; XColormapEvent xcolormap; XClientMessageEvent xclient; XMappingEvent xmapping; XErrorEvent xerror; XKeymapEvent xkeymap; XGenericEvent xgeneric; XGenericEventCookie xcookie; long pad[24]; } XEvent; typedef struct { short lbearing; short rbearing; short width; short 
ascent; short descent; unsigned short attributes; } XCharStruct; typedef struct { Atom name; unsigned long card32; } XFontProp; typedef struct { XExtData *ext_data; Font fid; unsigned direction; unsigned min_char_or_byte2; unsigned max_char_or_byte2; unsigned min_byte1; unsigned max_byte1; int all_chars_exist; unsigned default_char; int n_properties; XFontProp *properties; XCharStruct min_bounds; XCharStruct max_bounds; XCharStruct *per_char; int ascent; int descent; } XFontStruct; typedef struct { char *chars; int nchars; int delta; Font font; } XTextItem; typedef struct { unsigned char byte1; unsigned char byte2; } XChar2b; typedef struct { XChar2b *chars; int nchars; int delta; Font font; } XTextItem16; typedef union { Display *display; GC gc; Visual *visual; Screen *screen; ScreenFormat *pixmap_format; XFontStruct *font; } XEDataObject; typedef struct { XRectangle max_ink_extent; XRectangle max_logical_extent; } XFontSetExtents; typedef struct _XOM *XOM; typedef struct _XOC *XOC, *XFontSet; typedef struct { char *chars; int nchars; int delta; XFontSet font_set; } XmbTextItem; typedef struct { wchar_t *chars; int nchars; int delta; XFontSet font_set; } XwcTextItem; # 1121 "/usr/include/X11/Xlib.h" 3 4 typedef struct { int charset_count; char **charset_list; } XOMCharSetList; typedef enum { XOMOrientation_LTR_TTB, XOMOrientation_RTL_TTB, XOMOrientation_TTB_LTR, XOMOrientation_TTB_RTL, XOMOrientation_Context } XOrientation; typedef struct { int num_orientation; XOrientation *orientation; } XOMOrientation; typedef struct { int num_font; XFontStruct **font_struct_list; char **font_name_list; } XOMFontInfo; typedef struct _XIM *XIM; typedef struct _XIC *XIC; typedef void (*XIMProc)( XIM, XPointer, XPointer ); typedef int (*XICProc)( XIC, XPointer, XPointer ); typedef void (*XIDProc)( Display*, XPointer, XPointer ); typedef unsigned long XIMStyle; typedef struct { unsigned short count_styles; XIMStyle *supported_styles; } XIMStyles; # 1233 "/usr/include/X11/Xlib.h" 3 
4 typedef void *XVaNestedList; typedef struct { XPointer client_data; XIMProc callback; } XIMCallback; typedef struct { XPointer client_data; XICProc callback; } XICCallback; typedef unsigned long XIMFeedback; # 1257 "/usr/include/X11/Xlib.h" 3 4 typedef struct _XIMText { unsigned short length; XIMFeedback *feedback; int encoding_is_wchar; union { char *multi_byte; wchar_t *wide_char; } string; } XIMText; typedef unsigned long XIMPreeditState; typedef struct _XIMPreeditStateNotifyCallbackStruct { XIMPreeditState state; } XIMPreeditStateNotifyCallbackStruct; typedef unsigned long XIMResetState; typedef unsigned long XIMStringConversionFeedback; # 1291 "/usr/include/X11/Xlib.h" 3 4 typedef struct _XIMStringConversionText { unsigned short length; XIMStringConversionFeedback *feedback; int encoding_is_wchar; union { char *mbs; wchar_t *wcs; } string; } XIMStringConversionText; typedef unsigned short XIMStringConversionPosition; typedef unsigned short XIMStringConversionType; typedef unsigned short XIMStringConversionOperation; typedef enum { XIMForwardChar, XIMBackwardChar, XIMForwardWord, XIMBackwardWord, XIMCaretUp, XIMCaretDown, XIMNextLine, XIMPreviousLine, XIMLineStart, XIMLineEnd, XIMAbsolutePosition, XIMDontChange } XIMCaretDirection; typedef struct _XIMStringConversionCallbackStruct { XIMStringConversionPosition position; XIMCaretDirection direction; XIMStringConversionOperation operation; unsigned short factor; XIMStringConversionText *text; } XIMStringConversionCallbackStruct; typedef struct _XIMPreeditDrawCallbackStruct { int caret; int chg_first; int chg_length; XIMText *text; } XIMPreeditDrawCallbackStruct; typedef enum { XIMIsInvisible, XIMIsPrimary, XIMIsSecondary } XIMCaretStyle; typedef struct _XIMPreeditCaretCallbackStruct { int position; XIMCaretDirection direction; XIMCaretStyle style; } XIMPreeditCaretCallbackStruct; typedef enum { XIMTextType, XIMBitmapType } XIMStatusDataType; typedef struct _XIMStatusDrawCallbackStruct { XIMStatusDataType type; 
union { XIMText *text; Pixmap bitmap; } data; } XIMStatusDrawCallbackStruct; typedef struct _XIMHotKeyTrigger { KeySym keysym; int modifier; int modifier_mask; } XIMHotKeyTrigger; typedef struct _XIMHotKeyTriggers { int num_hot_key; XIMHotKeyTrigger *key; } XIMHotKeyTriggers; typedef unsigned long XIMHotKeyState; typedef struct { unsigned short count_values; char **supported_values; } XIMValuesList; extern int _Xdebug; extern XFontStruct *XLoadQueryFont( Display* , const char* ); extern XFontStruct *XQueryFont( Display* , XID ); extern XTimeCoord *XGetMotionEvents( Display* , Window , Time , Time , int* ); extern XModifierKeymap *XDeleteModifiermapEntry( XModifierKeymap* , KeyCode , int ); extern XModifierKeymap *XGetModifierMapping( Display* ); extern XModifierKeymap *XInsertModifiermapEntry( XModifierKeymap* , KeyCode , int ); extern XModifierKeymap *XNewModifiermap( int ); extern XImage *XCreateImage( Display* , Visual* , unsigned int , int , int , char* , unsigned int , unsigned int , int , int ); extern int XInitImage( XImage* ); extern XImage *XGetImage( Display* , Drawable , int , int , unsigned int , unsigned int , unsigned long , int ); extern XImage *XGetSubImage( Display* , Drawable , int , int , unsigned int , unsigned int , unsigned long , int , XImage* , int , int ); extern Display *XOpenDisplay( const char* ); extern void XrmInitialize( void ); extern char *XFetchBytes( Display* , int* ); extern char *XFetchBuffer( Display* , int* , int ); extern char *XGetAtomName( Display* , Atom ); extern int XGetAtomNames( Display* , Atom* , int , char** ); extern char *XGetDefault( Display* , const char* , const char* ); extern char *XDisplayName( const char* ); extern char *XKeysymToString( KeySym ); extern int (*XSynchronize( Display* , int ))( Display* ); extern int (*XSetAfterFunction( Display* , int (*) ( Display* ) ))( Display* ); extern Atom XInternAtom( Display* , const char* , int ); extern int XInternAtoms( Display* , char** , int , int , Atom* ); 
extern Colormap XCopyColormapAndFree( Display* , Colormap ); extern Colormap XCreateColormap( Display* , Window , Visual* , int ); extern Cursor XCreatePixmapCursor( Display* , Pixmap , Pixmap , XColor* , XColor* , unsigned int , unsigned int ); extern Cursor XCreateGlyphCursor( Display* , Font , Font , unsigned int , unsigned int , XColor const * , XColor const * ); extern Cursor XCreateFontCursor( Display* , unsigned int ); extern Font XLoadFont( Display* , const char* ); extern GC XCreateGC( Display* , Drawable , unsigned long , XGCValues* ); extern GContext XGContextFromGC( GC ); extern void XFlushGC( Display* , GC ); extern Pixmap XCreatePixmap( Display* , Drawable , unsigned int , unsigned int , unsigned int ); extern Pixmap XCreateBitmapFromData( Display* , Drawable , const char* , unsigned int , unsigned int ); extern Pixmap XCreatePixmapFromBitmapData( Display* , Drawable , char* , unsigned int , unsigned int , unsigned long , unsigned long , unsigned int ); extern Window XCreateSimpleWindow( Display* , Window , int , int , unsigned int , unsigned int , unsigned int , unsigned long , unsigned long ); extern Window XGetSelectionOwner( Display* , Atom ); extern Window XCreateWindow( Display* , Window , int , int , unsigned int , unsigned int , unsigned int , int , unsigned int , Visual* , unsigned long , XSetWindowAttributes* ); extern Colormap *XListInstalledColormaps( Display* , Window , int* ); extern char **XListFonts( Display* , const char* , int , int* ); extern char **XListFontsWithInfo( Display* , const char* , int , int* , XFontStruct** ); extern char **XGetFontPath( Display* , int* ); extern char **XListExtensions( Display* , int* ); extern Atom *XListProperties( Display* , Window , int* ); extern XHostAddress *XListHosts( Display* , int* , int* ); __attribute__((deprecated)) extern KeySym XKeycodeToKeysym( Display* , KeyCode , int ); extern KeySym XLookupKeysym( XKeyEvent* , int ); extern KeySym *XGetKeyboardMapping( Display* , KeyCode , int , 
int* ); extern KeySym XStringToKeysym( const char* ); extern long XMaxRequestSize( Display* ); extern long XExtendedMaxRequestSize( Display* ); extern char *XResourceManagerString( Display* ); extern char *XScreenResourceString( Screen* ); extern unsigned long XDisplayMotionBufferSize( Display* ); extern VisualID XVisualIDFromVisual( Visual* ); extern int XInitThreads( void ); extern void XLockDisplay( Display* ); extern void XUnlockDisplay( Display* ); extern XExtCodes *XInitExtension( Display* , const char* ); extern XExtCodes *XAddExtension( Display* ); extern XExtData *XFindOnExtensionList( XExtData** , int ); extern XExtData **XEHeadOfExtensionList( XEDataObject ); extern Window XRootWindow( Display* , int ); extern Window XDefaultRootWindow( Display* ); extern Window XRootWindowOfScreen( Screen* ); extern Visual *XDefaultVisual( Display* , int ); extern Visual *XDefaultVisualOfScreen( Screen* ); extern GC XDefaultGC( Display* , int ); extern GC XDefaultGCOfScreen( Screen* ); extern unsigned long XBlackPixel( Display* , int ); extern unsigned long XWhitePixel( Display* , int ); extern unsigned long XAllPlanes( void ); extern unsigned long XBlackPixelOfScreen( Screen* ); extern unsigned long XWhitePixelOfScreen( Screen* ); extern unsigned long XNextRequest( Display* ); extern unsigned long XLastKnownRequestProcessed( Display* ); extern char *XServerVendor( Display* ); extern char *XDisplayString( Display* ); extern Colormap XDefaultColormap( Display* , int ); extern Colormap XDefaultColormapOfScreen( Screen* ); extern Display *XDisplayOfScreen( Screen* ); extern Screen *XScreenOfDisplay( Display* , int ); extern Screen *XDefaultScreenOfDisplay( Display* ); extern long XEventMaskOfScreen( Screen* ); extern int XScreenNumberOfScreen( Screen* ); typedef int (*XErrorHandler) ( Display* , XErrorEvent* ); extern XErrorHandler XSetErrorHandler ( XErrorHandler ); typedef int (*XIOErrorHandler) ( Display* ); extern XIOErrorHandler XSetIOErrorHandler ( XIOErrorHandler ); 
extern XPixmapFormatValues *XListPixmapFormats( Display* , int* ); extern int *XListDepths( Display* , int , int* ); extern int XReconfigureWMWindow( Display* , Window , int , unsigned int , XWindowChanges* ); extern int XGetWMProtocols( Display* , Window , Atom** , int* ); extern int XSetWMProtocols( Display* , Window , Atom* , int ); extern int XIconifyWindow( Display* , Window , int ); extern int XWithdrawWindow( Display* , Window , int ); extern int XGetCommand( Display* , Window , char*** , int* ); extern int XGetWMColormapWindows( Display* , Window , Window** , int* ); extern int XSetWMColormapWindows( Display* , Window , Window* , int ); extern void XFreeStringList( char** ); extern int XSetTransientForHint( Display* , Window , Window ); extern int XActivateScreenSaver( Display* ); extern int XAddHost( Display* , XHostAddress* ); extern int XAddHosts( Display* , XHostAddress* , int ); extern int XAddToExtensionList( struct _XExtData** , XExtData* ); extern int XAddToSaveSet( Display* , Window ); extern int XAllocColor( Display* , Colormap , XColor* ); extern int XAllocColorCells( Display* , Colormap , int , unsigned long* , unsigned int , unsigned long* , unsigned int ); extern int XAllocColorPlanes( Display* , Colormap , int , unsigned long* , int , int , int , int , unsigned long* , unsigned long* , unsigned long* ); extern int XAllocNamedColor( Display* , Colormap , const char* , XColor* , XColor* ); extern int XAllowEvents( Display* , int , Time ); extern int XAutoRepeatOff( Display* ); extern int XAutoRepeatOn( Display* ); extern int XBell( Display* , int ); extern int XBitmapBitOrder( Display* ); extern int XBitmapPad( Display* ); extern int XBitmapUnit( Display* ); extern int XCellsOfScreen( Screen* ); extern int XChangeActivePointerGrab( Display* , unsigned int , Cursor , Time ); extern int XChangeGC( Display* , GC , unsigned long , XGCValues* ); extern int XChangeKeyboardControl( Display* , unsigned long , XKeyboardControl* ); extern int 
XChangeKeyboardMapping( Display* , int , int , KeySym* , int ); extern int XChangePointerControl( Display* , int , int , int , int , int ); extern int XChangeProperty( Display* , Window , Atom , Atom , int , int , const unsigned char* , int ); extern int XChangeSaveSet( Display* , Window , int ); extern int XChangeWindowAttributes( Display* , Window , unsigned long , XSetWindowAttributes* ); extern int XCheckIfEvent( Display* , XEvent* , int (*) ( Display* , XEvent* , XPointer ) , XPointer ); extern int XCheckMaskEvent( Display* , long , XEvent* ); extern int XCheckTypedEvent( Display* , int , XEvent* ); extern int XCheckTypedWindowEvent( Display* , Window , int , XEvent* ); extern int XCheckWindowEvent( Display* , Window , long , XEvent* ); extern int XCirculateSubwindows( Display* , Window , int ); extern int XCirculateSubwindowsDown( Display* , Window ); extern int XCirculateSubwindowsUp( Display* , Window ); extern int XClearArea( Display* , Window , int , int , unsigned int , unsigned int , int ); extern int XClearWindow( Display* , Window ); extern int XCloseDisplay( Display* ); extern int XConfigureWindow( Display* , Window , unsigned int , XWindowChanges* ); extern int XConnectionNumber( Display* ); extern int XConvertSelection( Display* , Atom , Atom , Atom , Window , Time ); extern int XCopyArea( Display* , Drawable , Drawable , GC , int , int , unsigned int , unsigned int , int , int ); extern int XCopyGC( Display* , GC , unsigned long , GC ); extern int XCopyPlane( Display* , Drawable , Drawable , GC , int , int , unsigned int , unsigned int , int , int , unsigned long ); extern int XDefaultDepth( Display* , int ); extern int XDefaultDepthOfScreen( Screen* ); extern int XDefaultScreen( Display* ); extern int XDefineCursor( Display* , Window , Cursor ); extern int XDeleteProperty( Display* , Window , Atom ); extern int XDestroyWindow( Display* , Window ); extern int XDestroySubwindows( Display* , Window ); extern int XDoesBackingStore( Screen* ); extern 
int XDoesSaveUnders( Screen* ); extern int XDisableAccessControl( Display* ); extern int XDisplayCells( Display* , int ); extern int XDisplayHeight( Display* , int ); extern int XDisplayHeightMM( Display* , int ); extern int XDisplayKeycodes( Display* , int* , int* ); extern int XDisplayPlanes( Display* , int ); extern int XDisplayWidth( Display* , int ); extern int XDisplayWidthMM( Display* , int ); extern int XDrawArc( Display* , Drawable , GC , int , int , unsigned int , unsigned int , int , int ); extern int XDrawArcs( Display* , Drawable , GC , XArc* , int ); extern int XDrawImageString( Display* , Drawable , GC , int , int , const char* , int ); extern int XDrawImageString16( Display* , Drawable , GC , int , int , const XChar2b* , int ); extern int XDrawLine( Display* , Drawable , GC , int , int , int , int ); extern int XDrawLines( Display* , Drawable , GC , XPoint* , int , int ); extern int XDrawPoint( Display* , Drawable , GC , int , int ); extern int XDrawPoints( Display* , Drawable , GC , XPoint* , int , int ); extern int XDrawRectangle( Display* , Drawable , GC , int , int , unsigned int , unsigned int ); extern int XDrawRectangles( Display* , Drawable , GC , XRectangle* , int ); extern int XDrawSegments( Display* , Drawable , GC , XSegment* , int ); extern int XDrawString( Display* , Drawable , GC , int , int , const char* , int ); extern int XDrawString16( Display* , Drawable , GC , int , int , const XChar2b* , int ); extern int XDrawText( Display* , Drawable , GC , int , int , XTextItem* , int ); extern int XDrawText16( Display* , Drawable , GC , int , int , XTextItem16* , int ); extern int XEnableAccessControl( Display* ); extern int XEventsQueued( Display* , int ); extern int XFetchName( Display* , Window , char** ); extern int XFillArc( Display* , Drawable , GC , int , int , unsigned int , unsigned int , int , int ); extern int XFillArcs( Display* , Drawable , GC , XArc* , int ); extern int XFillPolygon( Display* , Drawable , GC , XPoint* , int , 
int , int ); extern int XFillRectangle( Display* , Drawable , GC , int , int , unsigned int , unsigned int ); extern int XFillRectangles( Display* , Drawable , GC , XRectangle* , int ); extern int XFlush( Display* ); extern int XForceScreenSaver( Display* , int ); extern int XFree( void* ); extern int XFreeColormap( Display* , Colormap ); extern int XFreeColors( Display* , Colormap , unsigned long* , int , unsigned long ); extern int XFreeCursor( Display* , Cursor ); extern int XFreeExtensionList( char** ); extern int XFreeFont( Display* , XFontStruct* ); extern int XFreeFontInfo( char** , XFontStruct* , int ); extern int XFreeFontNames( char** ); extern int XFreeFontPath( char** ); extern int XFreeGC( Display* , GC ); extern int XFreeModifiermap( XModifierKeymap* ); extern int XFreePixmap( Display* , Pixmap ); extern int XGeometry( Display* , int , const char* , const char* , unsigned int , unsigned int , unsigned int , int , int , int* , int* , int* , int* ); extern int XGetErrorDatabaseText( Display* , const char* , const char* , const char* , char* , int ); extern int XGetErrorText( Display* , int , char* , int ); extern int XGetFontProperty( XFontStruct* , Atom , unsigned long* ); extern int XGetGCValues( Display* , GC , unsigned long , XGCValues* ); extern int XGetGeometry( Display* , Drawable , Window* , int* , int* , unsigned int* , unsigned int* , unsigned int* , unsigned int* ); extern int XGetIconName( Display* , Window , char** ); extern int XGetInputFocus( Display* , Window* , int* ); extern int XGetKeyboardControl( Display* , XKeyboardState* ); extern int XGetPointerControl( Display* , int* , int* , int* ); extern int XGetPointerMapping( Display* , unsigned char* , int ); extern int XGetScreenSaver( Display* , int* , int* , int* , int* ); extern int XGetTransientForHint( Display* , Window , Window* ); extern int XGetWindowProperty( Display* , Window , Atom , long , long , int , Atom , Atom* , int* , unsigned long* , unsigned long* , unsigned char** ); 
extern int XGetWindowAttributes( Display* , Window , XWindowAttributes* ); extern int XGrabButton( Display* , unsigned int , unsigned int , Window , int , unsigned int , int , int , Window , Cursor ); extern int XGrabKey( Display* , int , unsigned int , Window , int , int , int ); extern int XGrabKeyboard( Display* , Window , int , int , int , Time ); extern int XGrabPointer( Display* , Window , int , unsigned int , int , int , Window , Cursor , Time ); extern int XGrabServer( Display* ); extern int XHeightMMOfScreen( Screen* ); extern int XHeightOfScreen( Screen* ); extern int XIfEvent( Display* , XEvent* , int (*) ( Display* , XEvent* , XPointer ) , XPointer ); extern int XImageByteOrder( Display* ); extern int XInstallColormap( Display* , Colormap ); extern KeyCode XKeysymToKeycode( Display* , KeySym ); extern int XKillClient( Display* , XID ); extern int XLookupColor( Display* , Colormap , const char* , XColor* , XColor* ); extern int XLowerWindow( Display* , Window ); extern int XMapRaised( Display* , Window ); extern int XMapSubwindows( Display* , Window ); extern int XMapWindow( Display* , Window ); extern int XMaskEvent( Display* , long , XEvent* ); extern int XMaxCmapsOfScreen( Screen* ); extern int XMinCmapsOfScreen( Screen* ); extern int XMoveResizeWindow( Display* , Window , int , int , unsigned int , unsigned int ); extern int XMoveWindow( Display* , Window , int , int ); extern int XNextEvent( Display* , XEvent* ); extern int XNoOp( Display* ); extern int XParseColor( Display* , Colormap , const char* , XColor* ); extern int XParseGeometry( const char* , int* , int* , unsigned int* , unsigned int* ); extern int XPeekEvent( Display* , XEvent* ); extern int XPeekIfEvent( Display* , XEvent* , int (*) ( Display* , XEvent* , XPointer ) , XPointer ); extern int XPending( Display* ); extern int XPlanesOfScreen( Screen* ); extern int XProtocolRevision( Display* ); extern int XProtocolVersion( Display* ); extern int XPutBackEvent( Display* , XEvent* ); extern 
int XPutImage( Display* , Drawable , GC , XImage* , int , int , int , int , unsigned int , unsigned int ); extern int XQLength( Display* ); extern int XQueryBestCursor( Display* , Drawable , unsigned int , unsigned int , unsigned int* , unsigned int* ); extern int XQueryBestSize( Display* , int , Drawable , unsigned int , unsigned int , unsigned int* , unsigned int* ); extern int XQueryBestStipple( Display* , Drawable , unsigned int , unsigned int , unsigned int* , unsigned int* ); extern int XQueryBestTile( Display* , Drawable , unsigned int , unsigned int , unsigned int* , unsigned int* ); extern int XQueryColor( Display* , Colormap , XColor* ); extern int XQueryColors( Display* , Colormap , XColor* , int ); extern int XQueryExtension( Display* , const char* , int* , int* , int* ); extern int XQueryKeymap( Display* , char [32] ); extern int XQueryPointer( Display* , Window , Window* , Window* , int* , int* , int* , int* , unsigned int* ); extern int XQueryTextExtents( Display* , XID , const char* , int , int* , int* , int* , XCharStruct* ); extern int XQueryTextExtents16( Display* , XID , const XChar2b* , int , int* , int* , int* , XCharStruct* ); extern int XQueryTree( Display* , Window , Window* , Window* , Window** , unsigned int* ); extern int XRaiseWindow( Display* , Window ); extern int XReadBitmapFile( Display* , Drawable , const char* , unsigned int* , unsigned int* , Pixmap* , int* , int* ); extern int XReadBitmapFileData( const char* , unsigned int* , unsigned int* , unsigned char** , int* , int* ); extern int XRebindKeysym( Display* , KeySym , KeySym* , int , const unsigned char* , int ); extern int XRecolorCursor( Display* , Cursor , XColor* , XColor* ); extern int XRefreshKeyboardMapping( XMappingEvent* ); extern int XRemoveFromSaveSet( Display* , Window ); extern int XRemoveHost( Display* , XHostAddress* ); extern int XRemoveHosts( Display* , XHostAddress* , int ); extern int XReparentWindow( Display* , Window , Window , int , int ); extern int 
XResetScreenSaver( Display* ); extern int XResizeWindow( Display* , Window , unsigned int , unsigned int ); extern int XRestackWindows( Display* , Window* , int ); extern int XRotateBuffers( Display* , int ); extern int XRotateWindowProperties( Display* , Window , Atom* , int , int ); extern int XScreenCount( Display* ); extern int XSelectInput( Display* , Window , long ); extern int XSendEvent( Display* , Window , int , long , XEvent* ); extern int XSetAccessControl( Display* , int ); extern int XSetArcMode( Display* , GC , int ); extern int XSetBackground( Display* , GC , unsigned long ); extern int XSetClipMask( Display* , GC , Pixmap ); extern int XSetClipOrigin( Display* , GC , int , int ); extern int XSetClipRectangles( Display* , GC , int , int , XRectangle* , int , int ); extern int XSetCloseDownMode( Display* , int ); extern int XSetCommand( Display* , Window , char** , int ); extern int XSetDashes( Display* , GC , int , const char* , int ); extern int XSetFillRule( Display* , GC , int ); extern int XSetFillStyle( Display* , GC , int ); extern int XSetFont( Display* , GC , Font ); extern int XSetFontPath( Display* , char** , int ); extern int XSetForeground( Display* , GC , unsigned long ); extern int XSetFunction( Display* , GC , int ); extern int XSetGraphicsExposures( Display* , GC , int ); extern int XSetIconName( Display* , Window , const char* ); extern int XSetInputFocus( Display* , Window , int , Time ); extern int XSetLineAttributes( Display* , GC , unsigned int , int , int , int ); extern int XSetModifierMapping( Display* , XModifierKeymap* ); extern int XSetPlaneMask( Display* , GC , unsigned long ); extern int XSetPointerMapping( Display* , const unsigned char* , int ); extern int XSetScreenSaver( Display* , int , int , int , int ); extern int XSetSelectionOwner( Display* , Atom , Window , Time ); extern int XSetState( Display* , GC , unsigned long , unsigned long , int , unsigned long ); extern int XSetStipple( Display* , GC , Pixmap ); extern 
int XSetSubwindowMode( Display* , GC , int ); extern int XSetTSOrigin( Display* , GC , int , int ); extern int XSetTile( Display* , GC , Pixmap ); extern int XSetWindowBackground( Display* , Window , unsigned long ); extern int XSetWindowBackgroundPixmap( Display* , Window , Pixmap ); extern int XSetWindowBorder( Display* , Window , unsigned long ); extern int XSetWindowBorderPixmap( Display* , Window , Pixmap ); extern int XSetWindowBorderWidth( Display* , Window , unsigned int ); extern int XSetWindowColormap( Display* , Window , Colormap ); extern int XStoreBuffer( Display* , const char* , int , int ); extern int XStoreBytes( Display* , const char* , int ); extern int XStoreColor( Display* , Colormap , XColor* ); extern int XStoreColors( Display* , Colormap , XColor* , int ); extern int XStoreName( Display* , Window , const char* ); extern int XStoreNamedColor( Display* , Colormap , const char* , unsigned long , int ); extern int XSync( Display* , int ); extern int XTextExtents( XFontStruct* , const char* , int , int* , int* , int* , XCharStruct* ); extern int XTextExtents16( XFontStruct* , const XChar2b* , int , int* , int* , int* , XCharStruct* ); extern int XTextWidth( XFontStruct* , const char* , int ); extern int XTextWidth16( XFontStruct* , const XChar2b* , int ); extern int XTranslateCoordinates( Display* , Window , Window , int , int , int* , int* , Window* ); extern int XUndefineCursor( Display* , Window ); extern int XUngrabButton( Display* , unsigned int , unsigned int , Window ); extern int XUngrabKey( Display* , int , unsigned int , Window ); extern int XUngrabKeyboard( Display* , Time ); extern int XUngrabPointer( Display* , Time ); extern int XUngrabServer( Display* ); extern int XUninstallColormap( Display* , Colormap ); extern int XUnloadFont( Display* , Font ); extern int XUnmapSubwindows( Display* , Window ); extern int XUnmapWindow( Display* , Window ); extern int XVendorRelease( Display* ); extern int XWarpPointer( Display* , Window , Window 
, int , int , unsigned int , unsigned int , int , int ); extern int XWidthMMOfScreen( Screen* ); extern int XWidthOfScreen( Screen* ); extern int XWindowEvent( Display* , Window , long , XEvent* ); extern int XWriteBitmapFile( Display* , const char* , Pixmap , unsigned int , unsigned int , int , int ); extern int XSupportsLocale (void); extern char *XSetLocaleModifiers( const char* ); extern XOM XOpenOM( Display* , struct _XrmHashBucketRec* , const char* , const char* ); extern int XCloseOM( XOM ); extern char *XSetOMValues( XOM , ... ) __attribute__ ((__sentinel__(0))); extern char *XGetOMValues( XOM , ... ) __attribute__ ((__sentinel__(0))); extern Display *XDisplayOfOM( XOM ); extern char *XLocaleOfOM( XOM ); extern XOC XCreateOC( XOM , ... ) __attribute__ ((__sentinel__(0))); extern void XDestroyOC( XOC ); extern XOM XOMOfOC( XOC ); extern char *XSetOCValues( XOC , ... ) __attribute__ ((__sentinel__(0))); extern char *XGetOCValues( XOC , ... ) __attribute__ ((__sentinel__(0))); extern XFontSet XCreateFontSet( Display* , const char* , char*** , int* , char** ); extern void XFreeFontSet( Display* , XFontSet ); extern int XFontsOfFontSet( XFontSet , XFontStruct*** , char*** ); extern char *XBaseFontNameListOfFontSet( XFontSet ); extern char *XLocaleOfFontSet( XFontSet ); extern int XContextDependentDrawing( XFontSet ); extern int XDirectionalDependentDrawing( XFontSet ); extern int XContextualDrawing( XFontSet ); extern XFontSetExtents *XExtentsOfFontSet( XFontSet ); extern int XmbTextEscapement( XFontSet , const char* , int ); extern int XwcTextEscapement( XFontSet , const wchar_t* , int ); extern int Xutf8TextEscapement( XFontSet , const char* , int ); extern int XmbTextExtents( XFontSet , const char* , int , XRectangle* , XRectangle* ); extern int XwcTextExtents( XFontSet , const wchar_t* , int , XRectangle* , XRectangle* ); extern int Xutf8TextExtents( XFontSet , const char* , int , XRectangle* , XRectangle* ); extern int XmbTextPerCharExtents( XFontSet , 
const char* , int , XRectangle* , XRectangle* , int , int* , XRectangle* , XRectangle* ); extern int XwcTextPerCharExtents( XFontSet , const wchar_t* , int , XRectangle* , XRectangle* , int , int* , XRectangle* , XRectangle* ); extern int Xutf8TextPerCharExtents( XFontSet , const char* , int , XRectangle* , XRectangle* , int , int* , XRectangle* , XRectangle* ); extern void XmbDrawText( Display* , Drawable , GC , int , int , XmbTextItem* , int ); extern void XwcDrawText( Display* , Drawable , GC , int , int , XwcTextItem* , int ); extern void Xutf8DrawText( Display* , Drawable , GC , int , int , XmbTextItem* , int ); extern void XmbDrawString( Display* , Drawable , XFontSet , GC , int , int , const char* , int ); extern void XwcDrawString( Display* , Drawable , XFontSet , GC , int , int , const wchar_t* , int ); extern void Xutf8DrawString( Display* , Drawable , XFontSet , GC , int , int , const char* , int ); extern void XmbDrawImageString( Display* , Drawable , XFontSet , GC , int , int , const char* , int ); extern void XwcDrawImageString( Display* , Drawable , XFontSet , GC , int , int , const wchar_t* , int ); extern void Xutf8DrawImageString( Display* , Drawable , XFontSet , GC , int , int , const char* , int ); extern XIM XOpenIM( Display* , struct _XrmHashBucketRec* , char* , char* ); extern int XCloseIM( XIM ); extern char *XGetIMValues( XIM , ... ) __attribute__ ((__sentinel__(0))); extern char *XSetIMValues( XIM , ... ) __attribute__ ((__sentinel__(0))); extern Display *XDisplayOfIM( XIM ); extern char *XLocaleOfIM( XIM ); extern XIC XCreateIC( XIM , ... ) __attribute__ ((__sentinel__(0))); extern void XDestroyIC( XIC ); extern void XSetICFocus( XIC ); extern void XUnsetICFocus( XIC ); extern wchar_t *XwcResetIC( XIC ); extern char *XmbResetIC( XIC ); extern char *Xutf8ResetIC( XIC ); extern char *XSetICValues( XIC , ... ) __attribute__ ((__sentinel__(0))); extern char *XGetICValues( XIC , ... 
) __attribute__ ((__sentinel__(0))); extern XIM XIMOfIC( XIC ); extern int XFilterEvent( XEvent* , Window ); extern int XmbLookupString( XIC , XKeyPressedEvent* , char* , int , KeySym* , int* ); extern int XwcLookupString( XIC , XKeyPressedEvent* , wchar_t* , int , KeySym* , int* ); extern int Xutf8LookupString( XIC , XKeyPressedEvent* , char* , int , KeySym* , int* ); extern XVaNestedList XVaCreateNestedList( int , ... ) __attribute__ ((__sentinel__(0))); extern int XRegisterIMInstantiateCallback( Display* , struct _XrmHashBucketRec* , char* , char* , XIDProc , XPointer ); extern int XUnregisterIMInstantiateCallback( Display* , struct _XrmHashBucketRec* , char* , char* , XIDProc , XPointer ); typedef void (*XConnectionWatchProc)( Display* , XPointer , int , int , XPointer* ); extern int XInternalConnectionNumbers( Display* , int** , int* ); extern void XProcessInternalConnection( Display* , int ); extern int XAddConnectionWatch( Display* , XConnectionWatchProc , XPointer ); extern void XRemoveConnectionWatch( Display* , XConnectionWatchProc , XPointer ); extern void XSetAuthorization( char * , int , char * , int ); extern int _Xmbtowc( wchar_t * , char * , int ); extern int _Xwctomb( char * , wchar_t ); extern int XGetEventData( Display* , XGenericEventCookie* ); extern void XFreeEventData( Display* , XGenericEventCookie* ); # 3 "/tmp/petsc-KvGRNM/config.headers/conftest.c" 2 Preprocess stderr before filtering:: Preprocess stderr after filtering:: Found header files ['X11/Xlib.h'] in ['/usr/include', '/usr/lib/openmpi'] Popping language C ================================================================================ TEST checkSharedLibrary from config.packages.X(/home/florian/software/petsc/config/BuildSystem/config/package.py:738) TESTING: checkSharedLibrary from config.packages.X(config/BuildSystem/config/package.py:738) By default we don't care about checking if the library is shared Popping language C 
================================================================================
TEST alternateConfigureLibrary from config.packages.Triangle(/home/florian/software/petsc/config/BuildSystem/config/package.py:742)
TESTING: alternateConfigureLibrary from config.packages.Triangle(config/BuildSystem/config/package.py:742)
Called if --with-packagename=0; does nothing by default
================================================================================
TEST alternateConfigureLibrary from config.packages.PARTY(/home/florian/software/petsc/config/BuildSystem/config/package.py:742)
TESTING: alternateConfigureLibrary from config.packages.PARTY(config/BuildSystem/config/package.py:742)
Called if --with-packagename=0; does nothing by default
================================================================================
TEST alternateConfigureLibrary from config.packages.Numpy(/home/florian/software/petsc/config/BuildSystem/config/package.py:742)
TESTING: alternateConfigureLibrary from config.packages.Numpy(config/BuildSystem/config/package.py:742)
Called if --with-packagename=0; does nothing by default
Pushing language C
================================================================================
TEST configureLibrary from config.packages.petsc4py(/home/florian/software/petsc/config/BuildSystem/config/packages/petsc4py.py:82)
TESTING: configureLibrary from config.packages.petsc4py(config/BuildSystem/config/packages/petsc4py.py:82)
Looking for PETSC4PY at git.petsc4py, hg.petsc4py or a directory starting with ['petsc-petsc4py', 'petsc4py']
Could not locate an existing copy of PETSC4PY: ['git.sowing']
Downloading petsc4py
===============================================================================
Trying to download git://https://bitbucket.org/petsc/petsc4py for PETSC4PY
===============================================================================
Executing: git clone https://bitbucket.org/petsc/petsc4py /home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.petsc4py
Looking for PETSC4PY at git.petsc4py, hg.petsc4py or a directory starting with ['petsc-petsc4py', 'petsc4py']
Found a copy of PETSC4PY in git.petsc4py
Executing: ['git', 'rev-parse', '--git-dir']
stdout: .git
Executing: ['git', 'cat-file', '-e', '026d6fa^{commit}']
Executing: ['git', 'rev-parse', '026d6fa']
stdout: 026d6fae7fb21b1ebe601b2f64c921ae72fd3636
Executing: ['git', 'stash']
stdout: No local changes to save
Executing: ['git', 'clean', '-f', '-d', '-x']
Executing: ['git', 'checkout', '-f', '026d6fae7fb21b1ebe601b2f64c921ae72fd3636']
Executing: uname -s
stdout: Linux
Executing: uname -s
stdout: Linux
Defined make rule "petsc4pybuild" with dependencies "" and code ['@echo "*** Building petsc4py ***"', '@${RM} -f ${PETSC_ARCH}/lib/petsc/conf/petsc4py.errorflg', '@(cd /home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.petsc4py && \\\n MPICC=${PCC} python setup.py clean --all && \\\n MPICC=${PCC} python setup.py build ) > ${PETSC_ARCH}/lib/petsc/conf/petsc4py.log 2>&1 || \\\n (echo "**************************ERROR*************************************" && \\\n echo "Error building petsc4py. Check ${PETSC_ARCH}/lib/petsc/conf/petsc4py.log" && \\\n echo "********************************************************************" && \\\n touch ${PETSC_ARCH}/lib/petsc/conf/petsc4py.errorflg && \\\n exit 1)']
Defined make rule "petsc4pyinstall" with dependencies "" and code ['@echo "*** Installing petsc4py ***"', '@(MPICC=${PCC} && export MPICC && cd /home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.petsc4py && \\\n MPICC=${PCC} python setup.py install --install-lib=/home/florian/software/petsc/arch-linux2-c-debug/lib) >> ${PETSC_ARCH}/lib/petsc/conf/petsc4py.log 2>&1 || \\\n (echo "**************************ERROR*************************************" && \\\n echo "Error building petsc4py. Check ${PETSC_ARCH}/lib/petsc/conf/petsc4py.log" && \\\n echo "********************************************************************" && \\\n exit 1)', '@echo "====================================="', '@echo "To use petsc4py, add /home/florian/software/petsc/arch-linux2-c-debug/lib to PYTHONPATH"', '@echo "====================================="']
Defined make rule "petsc4py-build" with dependencies "petsc4pybuild petsc4pyinstall" and code []
Defined make rule "petsc4py-install" with dependencies "" and code []
Executing: uname -s
stdout: Linux
================================================================================
TEST checkSharedLibrary from config.packages.petsc4py(/home/florian/software/petsc/config/BuildSystem/config/package.py:738)
TESTING: checkSharedLibrary from config.packages.petsc4py(config/BuildSystem/config/package.py:738)
By default we don't care about checking if the library is shared
Popping language C
Pushing language C
================================================================================
TEST configureLibrary from config.packages.mpi4py(/home/florian/software/petsc/config/BuildSystem/config/packages/mpi4py.py:65)
TESTING: configureLibrary from config.packages.mpi4py(config/BuildSystem/config/packages/mpi4py.py:65)
Looking for MPI4PY at git.mpi4py, hg.mpi4py or a directory starting with ['mpi4py']
Could not locate an existing copy of MPI4PY: ['git.sowing', 'git.petsc4py']
Downloading mpi4py
===============================================================================
Trying to download https://mpi4py.googlecode.com/files/mpi4py-1.3.1.tar.gz for MPI4PY
===============================================================================
ERROR: file could not be opened successfully
Downloaded package MPI4PY from: https://mpi4py.googlecode.com/files/mpi4py-1.3.1.tar.gz is not a tarball.
[or installed python cannot process compressed files] * If you are behind a firewall - please fix your proxy and rerun ./configure For example at LANL you may need to set the environmental variable http_proxy (or HTTP_PROXY?) to http://proxyout.lanl.gov * You can run with --with-packages-dir=/adirectory and ./configure will instruct you what packages to download manually * or you can download the above URL manually, to /yourselectedlocation/mpi4py-1.3.1.tar.gz and use the configure option: --download-mpi4py=/yourselectedlocation/mpi4py-1.3.1.tar.gz =============================================================================== Trying to download http://ftp.mcs.anl.gov/pub/petsc/externalpackages/mpi4py-1.3.1.tar.gz for MPI4PY =============================================================================== Downloading https://mpi4py.googlecode.com/files/mpi4py-1.3.1.tar.gz to /home/florian/software/petsc/arch-linux2-c-debug/externalpackages/_d_mpi4py-1.3.1.tar.gz Extracting /home/florian/software/petsc/arch-linux2-c-debug/externalpackages/_d_mpi4py-1.3.1.tar.gz Downloading http://ftp.mcs.anl.gov/pub/petsc/externalpackages/mpi4py-1.3.1.tar.gz to /home/florian/software/petsc/arch-linux2-c-debug/externalpackages/_d_mpi4py-1.3.1.tar.gz Extracting /home/florian/software/petsc/arch-linux2-c-debug/externalpackages/_d_mpi4py-1.3.1.tar.gz Executing: cd /home/florian/software/petsc/arch-linux2-c-debug/externalpackages; chmod -R a+r mpi4py-1.3.1;find mpi4py-1.3.1 -type d -name "*" -exec chmod a+rx {} \; Looking for MPI4PY at git.mpi4py, hg.mpi4py or a directory starting with ['mpi4py'] Found a copy of MPI4PY in mpi4py-1.3.1 Executing: uname -s stdout: Linux Executing: uname -s stdout: Linux Defined make rule "mpi4pybuild" with dependencies "" and code ['@echo "*** Building mpi4py ***"', '@(MPICC=${PCC} && export MPICC && cd /home/florian/software/petsc/arch-linux2-c-debug/externalpackages/mpi4py-1.3.1 && \\\n python setup.py clean --all && \\\n python setup.py build ) > 
${PETSC_ARCH}/lib/petsc/conf/mpi4py.log 2>&1 || \\\n (echo "**************************ERROR*************************************" && \\\n echo "Error building mpi4py. Check ${PETSC_ARCH}/lib/petsc/conf/mpi4py.log" && \\\n echo "********************************************************************" && \\\n exit 1)'] Defined make rule "mpi4pyinstall" with dependencies "" and code ['@echo "*** Installing mpi4py ***"', '@(MPICC=${PCC} && export MPICC && cd /home/florian/software/petsc/arch-linux2-c-debug/externalpackages/mpi4py-1.3.1 && \\\n python setup.py install --install-lib=/home/florian/software/petsc/arch-linux2-c-debug/lib) >> ${PETSC_ARCH}/lib/petsc/conf/mpi4py.log 2>&1 || \\\n (echo "**************************ERROR*************************************" && \\\n echo "Error building mpi4py. Check ${PETSC_ARCH}/lib/petsc/conf/mpi4py.log" && \\\n echo "********************************************************************" && \\\n exit 1)', '@echo "====================================="', '@echo "To use mpi4py, add /home/florian/software/petsc/arch-linux2-c-debug/lib to PYTHONPATH"', '@echo "====================================="'] Defined make rule "mpi4py-build" with dependencies "mpi4pybuild mpi4pyinstall" and code [] Defined make rule "mpi4py-install" with dependencies "" and code [] ================================================================================ TEST checkSharedLibrary from config.packages.mpi4py(/home/florian/software/petsc/config/BuildSystem/config/package.py:738) TESTING: checkSharedLibrary from config.packages.mpi4py(config/BuildSystem/config/package.py:738) By default we don't care about checking if the library is shared Popping language C ================================================================================ TEST alternateConfigureLibrary from config.packages.Matlab(/home/florian/software/petsc/config/BuildSystem/config/packages/Matlab.py:35) TESTING: alternateConfigureLibrary from 
config.packages.Matlab(config/BuildSystem/config/packages/Matlab.py:35) ================================================================================ TEST alternateConfigureLibrary from config.packages.MatlabEngine(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.MatlabEngine(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.Mathematica(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.Mathematica(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.PTScotch(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.PTScotch(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.hdf5(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.hdf5(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.ascem-io(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.ascem-io(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default 
================================================================================ TEST alternateConfigureLibrary from config.packages.pflotran(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.pflotran(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.netcdf(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.netcdf(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.exodusii(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.exodusii(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.MOAB(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.MOAB(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.Chaco(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.Chaco(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST locateCMake from 
config.packages.cmake(/home/florian/software/petsc/config/BuildSystem/config/packages/cmake.py:33)
TESTING: locateCMake from config.packages.cmake(config/BuildSystem/config/packages/cmake.py:33)
  Looking for default CMake executable
Checking for program /home/florian/software/bin/cmake...not found
Checking for program /home/florian/.gem/ruby/2.4.0/bin/cmake...not found
Checking for program /usr/local/sbin/cmake...not found
Checking for program /usr/local/bin/cmake...not found
Checking for program /usr/bin/cmake...found
Defined make macro "CMAKE" to "/usr/bin/cmake"
  Looking for default CTest executable
Checking for program /home/florian/software/bin/ctest...not found
Checking for program /home/florian/.gem/ruby/2.4.0/bin/ctest...not found
Checking for program /usr/local/sbin/ctest...not found
Checking for program /usr/local/bin/ctest...not found
Checking for program /usr/bin/ctest...found
Defined make macro "CTEST" to "/usr/bin/ctest"
================================================================================
TEST alternateConfigureLibrary from config.packages.unittestcpp(/home/florian/software/petsc/config/BuildSystem/config/package.py:742)
TESTING: alternateConfigureLibrary from config.packages.unittestcpp(config/BuildSystem/config/package.py:742)
  Called if --with-packagename=0; does nothing by default
================================================================================
TEST alternateConfigureLibrary from config.packages.alquimia(/home/florian/software/petsc/config/BuildSystem/config/package.py:742)
TESTING: alternateConfigureLibrary from config.packages.alquimia(config/BuildSystem/config/package.py:742)
  Called if --with-packagename=0; does nothing by default
================================================================================
TEST configureScalarType from PETSc.options.scalarTypes(/home/florian/software/petsc/config/PETSc/options/scalarTypes.py:37)
TESTING: configureScalarType from
PETSc.options.scalarTypes(config/PETSc/options/scalarTypes.py:37) Choose between real and complex numbers Defined "USE_SCALAR_REAL" to "1" Scalar type is real Pushing language C All intermediate test results are stored in /tmp/petsc-KvGRNM/PETSc.options.scalarTypes Executing: mpicc -c -o /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.c: In function 'main': /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.c:6:21: warning: unused variable 'a' [-Wunused-variable] double b = 2.0; int a = isnormal(b); ^ Source: #include "confdefs.h" #include "conffix.h" #include int main() { double b = 2.0; int a = isnormal(b); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib 
-L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_ISNORMAL" to "1" Executing: mpicc -c -o /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.c: In function 'main': /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.c:6:21: warning: unused variable 'a' [-Wunused-variable] double b = 2.0; int a = isnan(b); ^ Source: #include "confdefs.h" #include "conffix.h" #include int main() { double b = 2.0; int a = isnan(b); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi 
-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_ISNAN" to "1" Executing: mpicc -c -o /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.c: In function 'main': /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.c:6:21: warning: unused variable 'a' [-Wunused-variable] double b = 2.0; int a = isinf(b); ^ Source: #include "confdefs.h" #include "conffix.h" #include int main() { double b = 2.0; int a = isinf(b); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden 
-g3 /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_ISINF" to "1" Executing: mpicc -c -o /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.c: In function 'main': /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.c:6:24: warning: implicit declaration of function '_isnan' [-Wimplicit-function-declaration] double b = 2.0;int a = _isnan(b); ^~~~~~ /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.c:6:20: warning: unused variable 'a' [-Wunused-variable] double b = 2.0;int a = _isnan(b); ^ Source: #include "confdefs.h" 
#include "conffix.h" #include int main() { double b = 2.0;int a = _isnan(b); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.o: In function `main': /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.c:6: undefined reference to `_isnan' collect2: error: ld returned 1 exit status Executing: mpicc -c -o /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 
/tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.c: In function 'main': /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.c:6:24: warning: implicit declaration of function '_finite' [-Wimplicit-function-declaration] double b = 2.0;int a = _finite(b); ^~~~~~~ /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.c:6:20: warning: unused variable 'a' [-Wunused-variable] double b = 2.0;int a = _finite(b); ^ Source: #include "confdefs.h" #include "conffix.h" #include int main() { double b = 2.0;int a = _finite(b); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.o: In function `main': /tmp/petsc-KvGRNM/PETSc.options.scalarTypes/conftest.c:6: undefined reference to `_finite' collect2: error: ld returned 1 exit status Popping language C ================================================================================ TEST configurePrecision from PETSc.options.scalarTypes(/home/florian/software/petsc/config/PETSc/options/scalarTypes.py:77) TESTING: configurePrecision from 
PETSc.options.scalarTypes(config/PETSc/options/scalarTypes.py:77)
  Set the default real number precision for PETSc objects
Defined "USE_REAL_DOUBLE" to "1"
Defined make macro "PETSC_SCALAR_SIZE" to "64"
  Precision is double
================================================================================
TEST alternateConfigureLibrary from config.packages.opencl(/home/florian/software/petsc/config/BuildSystem/config/package.py:742)
TESTING: alternateConfigureLibrary from config.packages.opencl(config/BuildSystem/config/package.py:742)
  Called if --with-packagename=0; does nothing by default
================================================================================
TEST alternateConfigureLibrary from config.packages.viennacl(/home/florian/software/petsc/config/BuildSystem/config/package.py:742)
TESTING: alternateConfigureLibrary from config.packages.viennacl(config/BuildSystem/config/package.py:742)
  Called if --with-packagename=0; does nothing by default
================================================================================
TEST alternateConfigureLibrary from config.packages.fblaslapack(/home/florian/software/petsc/config/BuildSystem/config/package.py:742)
TESTING: alternateConfigureLibrary from config.packages.fblaslapack(config/BuildSystem/config/package.py:742)
  Called if --with-packagename=0; does nothing by default
================================================================================
TEST alternateConfigureLibrary from config.packages.f2cblaslapack(/home/florian/software/petsc/config/BuildSystem/config/package.py:742)
TESTING: alternateConfigureLibrary from config.packages.f2cblaslapack(config/BuildSystem/config/package.py:742)
  Called if --with-packagename=0; does nothing by default
================================================================================
TEST configureLibrary from config.packages.BlasLapack(/home/florian/software/petsc/config/BuildSystem/config/packages/BlasLapack.py:333)
TESTING: configureLibrary from
config.packages.BlasLapack(config/BuildSystem/config/packages/BlasLapack.py:333) ================================================================================ Checking for a functional BLAS and LAPACK in IRIX Mathematics library ================================================================================ TEST checkLib from config.packages.BlasLapack(/home/florian/software/petsc/config/BuildSystem/config/packages/BlasLapack.py:106) TESTING: checkLib from config.packages.BlasLapack(config/BuildSystem/config/packages/BlasLapack.py:106) Checking for BLAS and LAPACK symbols Checking for functions [ddot_] in library ['libcomplib.sgimath.a'] ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm'] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind 
-I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char ddot_(); static void _check_ddot_() { ddot_(); } int main() { _check_ddot_();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lcomplib.sgimath -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc 
-L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lcomplib.sgimath collect2: error: ld returned 1 exit status Popping language C Checking for no name mangling on BLAS/LAPACK Checking for functions [ddot] in library ['libcomplib.sgimath.a'] ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm'] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 
/tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char ddot(); static void _check_ddot() { ddot(); } int main() { _check_ddot();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lcomplib.sgimath -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lcomplib.sgimath collect2: error: ld returned 1 exit status Popping language C Checking for 
underscore name mangling on BLAS/LAPACK Checking for functions [ddot_] in library ['libcomplib.sgimath.a'] ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm'] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char ddot_(); static void _check_ddot_() { ddot_(); } int main() { _check_ddot_();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lcomplib.sgimath -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lcomplib.sgimath collect2: error: ld returned 1 exit status Popping language C Unknown name mangling in BLAS/LAPACK ================================================================================ Checking for a functional BLAS and LAPACK in Another IRIX Mathematics 
library ================================================================================ TEST checkLib from config.packages.BlasLapack(/home/florian/software/petsc/config/BuildSystem/config/packages/BlasLapack.py:106) TESTING: checkLib from config.packages.BlasLapack(config/BuildSystem/config/packages/BlasLapack.py:106) Checking for BLAS and LAPACK symbols Checking for functions [ddot_] in library ['libscs.a'] ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm'] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas 
-fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char ddot_(); static void _check_ddot_() { ddot_(); } int main() { _check_ddot_();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lscs -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lscs collect2: error: ld returned 1 exit status Popping language C Checking for 
no name mangling on BLAS/LAPACK Checking for functions [ddot] in library ['libscs.a'] ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm'] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char ddot(); static void _check_ddot() { ddot(); } int main() { _check_ddot();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lscs -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lscs collect2: error: ld returned 1 exit status Popping language C Checking for underscore name mangling on BLAS/LAPACK Checking for functions [ddot_] in library ['libscs.a'] ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', 
'-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm'] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char ddot_(); static void _check_ddot_() { ddot_(); } int main() { _check_ddot_();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lscs -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lscs collect2: error: ld returned 1 exit status Popping language C Unknown name mangling in BLAS/LAPACK ================================================================================ Checking for a functional BLAS and LAPACK in Compaq/Alpha Mathematics library 
================================================================================ TEST checkLib from config.packages.BlasLapack(/home/florian/software/petsc/config/BuildSystem/config/packages/BlasLapack.py:106) TESTING: checkLib from config.packages.BlasLapack(config/BuildSystem/config/packages/BlasLapack.py:106) Checking for BLAS and LAPACK symbols Checking for functions [ddot_] in library ['libcxml.a'] ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm'] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 
/tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char ddot_(); static void _check_ddot_() { ddot_(); } int main() { _check_ddot_();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lcxml -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lcxml collect2: error: ld returned 1 exit status Popping language C Checking for no name mangling on 
BLAS/LAPACK Checking for functions [ddot] in library ['libcxml.a'] ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm'] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char ddot(); static void _check_ddot() { ddot(); } int main() { _check_ddot();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lcxml -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lcxml collect2: error: ld returned 1 exit status Popping language C Checking for underscore name mangling on BLAS/LAPACK Checking for functions [ddot_] in library ['libcxml.a'] ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', 
'-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm'] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char ddot_(); static void _check_ddot_() { ddot_(); } int main() { _check_ddot_();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lcxml -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lcxml collect2: error: ld returned 1 exit status Popping language C Unknown name mangling in BLAS/LAPACK ================================================================================ Checking for a functional BLAS and LAPACK in IBM ESSL Mathematics library 
================================================================================ TEST checkLib from config.packages.BlasLapack(/home/florian/software/petsc/config/BuildSystem/config/packages/BlasLapack.py:106) TESTING: checkLib from config.packages.BlasLapack(config/BuildSystem/config/packages/BlasLapack.py:106) Checking for BLAS and LAPACK symbols Checking for functions [ddot_] in library ['libessl.a'] ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm'] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 
/tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char ddot_(); static void _check_ddot_() { ddot_(); } int main() { _check_ddot_();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lessl -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lessl collect2: error: ld returned 1 exit status Popping language C Checking for no name mangling on 
BLAS/LAPACK Checking for functions [ddot] in library ['libessl.a'] ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm'] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char ddot(); static void _check_ddot() { ddot(); } int main() { _check_ddot();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lessl -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lessl collect2: error: ld returned 1 exit status Popping language C Checking for underscore name mangling on BLAS/LAPACK Checking for functions [ddot_] in library ['libessl.a'] ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', 
'-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm'] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char ddot_(); static void _check_ddot_() { ddot_(); } int main() { _check_ddot_();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lessl -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /usr/bin/ld: cannot find -lessl collect2: error: ld returned 1 exit status Popping language C Unknown name mangling in BLAS/LAPACK ================================================================================ Checking for a functional BLAS and LAPACK in IBM ESSL Mathematics library for Blue Gene 
================================================================================ TEST checkLib from config.packages.BlasLapack(/home/florian/software/petsc/config/BuildSystem/config/packages/BlasLapack.py:106) TESTING: checkLib from config.packages.BlasLapack(config/BuildSystem/config/packages/BlasLapack.py:106) Checking for BLAS and LAPACK symbols Checking for functions [ddot_] in library ['libesslbg.a'] ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm'] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 
/tmp/petsc-KvGRNM/config.libraries/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char ddot_();
static void _check_ddot_() { ddot_(); }
int main() {
_check_ddot_();;
  return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lesslbg -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Possible ERROR while running linker: exit code 256
stderr:
/usr/bin/ld: cannot find -lesslbg
collect2: error: ld returned 1 exit status
Popping language C
Checking for no name mangling on BLAS/LAPACK
Checking for functions [ddot] in library ['libesslbg.a'] ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm']
Pushing language C
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char ddot();
static void _check_ddot() { ddot(); }
int main() {
_check_ddot();;
  return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lesslbg -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Possible ERROR while running linker: exit code 256
stderr:
/usr/bin/ld: cannot find -lesslbg
collect2: error: ld returned 1 exit status
Popping language C
Checking for underscore name mangling on BLAS/LAPACK
Checking for functions [ddot_] in library ['libesslbg.a'] ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm']
Pushing language C
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char ddot_();
static void _check_ddot_() { ddot_(); }
int main() {
_check_ddot_();;
  return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lesslbg -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Possible ERROR while running linker: exit code 256
stderr:
/usr/bin/ld: cannot find -lesslbg
collect2: error: ld returned 1 exit status
Popping language C
Unknown name mangling in BLAS/LAPACK
================================================================================
  Checking for a functional BLAS and LAPACK in Default compiler libraries
================================================================================
TEST checkLib from config.packages.BlasLapack(/home/florian/software/petsc/config/BuildSystem/config/packages/BlasLapack.py:106)
TESTING: checkLib from config.packages.BlasLapack(config/BuildSystem/config/packages/BlasLapack.py:106)
  Checking for BLAS and LAPACK symbols
Checking for functions [ddot_] in library [''] ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm']
Pushing language C
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char ddot_();
static void _check_ddot_() { ddot_(); }
int main() {
_check_ddot_();;
  return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Possible ERROR while running linker: exit code 256
stderr:
/tmp/petsc-KvGRNM/config.libraries/conftest.o: In function `_check_ddot_':
/tmp/petsc-KvGRNM/config.libraries/conftest.c:5: undefined reference to `ddot_'
collect2: error: ld returned 1 exit status
Popping language C
Checking for no name mangling on BLAS/LAPACK
Checking for functions [ddot] in library [''] ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm']
Pushing language C
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char ddot();
static void _check_ddot() { ddot(); }
int main() {
_check_ddot();;
  return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Possible ERROR while running linker: exit code 256
stderr:
/tmp/petsc-KvGRNM/config.libraries/conftest.o: In function `_check_ddot':
/tmp/petsc-KvGRNM/config.libraries/conftest.c:5: undefined reference to `ddot'
collect2: error: ld returned 1 exit status
Popping language C
Checking for underscore name mangling on BLAS/LAPACK
Checking for functions [ddot_] in library [''] ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm']
Pushing language C
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char ddot_();
static void _check_ddot_() { ddot_(); }
int main() {
_check_ddot_();;
  return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Possible ERROR while running linker: exit code 256
stderr:
/tmp/petsc-KvGRNM/config.libraries/conftest.o: In function `_check_ddot_':
/tmp/petsc-KvGRNM/config.libraries/conftest.c:5: undefined reference to `ddot_'
collect2: error: ld returned 1 exit status
Popping language C
Unknown name mangling in BLAS/LAPACK
================================================================================
  Checking for a functional BLAS and LAPACK in Default compiler locations
================================================================================
TEST checkLib from config.packages.BlasLapack(/home/florian/software/petsc/config/BuildSystem/config/packages/BlasLapack.py:106)
TESTING: checkLib from config.packages.BlasLapack(config/BuildSystem/config/packages/BlasLapack.py:106)
  Checking for BLAS and LAPACK symbols
Checking for functions [ddot_] in library ['libblas.a'] ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm']
Pushing language C
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char ddot_();
static void _check_ddot_() { ddot_(); }
int main() {
_check_ddot_();;
  return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -lblas -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Defined "HAVE_LIBBLAS" to "1"
Popping language C
Checking for functions [dgetrs_] in library ['liblapack.a'] ['libblas.a', '-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm']
Pushing language C
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char dgetrs_();
static void _check_dgetrs_() { dgetrs_(); }
int main() {
_check_dgetrs_();;
  return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -llapack -lblas -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Defined "HAVE_LIBLAPACK" to "1"
Popping language C
Checking for functions [dgeev_] in library ['liblapack.a'] ['libblas.a', '-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm']
Pushing language C
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char dgeev_();
static void _check_dgeev_() { dgeev_(); }
int main() {
_check_dgeev_();;
  return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -llapack -lblas -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Defined "HAVE_LIBLAPACK" to "1"
Popping language C
Found Fortran mangling on BLAS/LAPACK which is underscore
Defined "BLASLAPACK_UNDERSCORE" to "1"
================================================================================
TEST check64BitBLASIndices from
config.packages.BlasLapack(/home/florian/software/petsc/config/BuildSystem/config/packages/BlasLapack.py:483)
TESTING: check64BitBLASIndices from config.packages.BlasLapack(config/BuildSystem/config/packages/BlasLapack.py:483)
  Check for and use 64bit integer blas
================================================================================
TEST checkESSL from config.packages.BlasLapack(/home/florian/software/petsc/config/BuildSystem/config/packages/BlasLapack.py:400)
TESTING: checkESSL from config.packages.BlasLapack(config/BuildSystem/config/packages/BlasLapack.py:400)
  Check for the IBM ESSL library
Checking for functions [iessl] in library ['liblapack.a'] []
Pushing language C
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char iessl();
static void _check_iessl() { iessl(); }
int main() {
_check_iessl();;
  return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -llapack -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Possible ERROR while running linker: exit code 256
stderr:
/tmp/petsc-KvGRNM/config.libraries/conftest.o: In function `_check_iessl':
/tmp/petsc-KvGRNM/config.libraries/conftest.c:5: undefined reference to `iessl'
collect2: error: ld returned 1 exit status
Popping language C
================================================================================
TEST checkPESSL from config.packages.BlasLapack(/home/florian/software/petsc/config/BuildSystem/config/packages/BlasLapack.py:416)
TESTING: checkPESSL from config.packages.BlasLapack(config/BuildSystem/config/packages/BlasLapack.py:416)
  Check for the IBM PESSL library - and error out - if used instead of ESSL
Checking for functions [ipessl] in library ['liblapack.a'] []
Pushing language C
Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c
Successful compile:
Source:
#include "confdefs.h"
#include "conffix.h"
/* Override any gcc2 internal prototype to avoid an error. */
char ipessl();
static void _check_ipessl() { ipessl(); }
int main() {
_check_ipessl();;
  return 0;
}
Pushing language C
Popping language C
Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -llapack -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl
Possible ERROR while running linker: exit code 256
stderr:
/tmp/petsc-KvGRNM/config.libraries/conftest.o: In function `_check_ipessl':
/tmp/petsc-KvGRNM/config.libraries/conftest.c:5: undefined reference to `ipessl'
collect2: error: ld returned 1 exit status
Popping language C
================================================================================
TEST checkMKL from
config.packages.BlasLapack(/home/florian/software/petsc/config/BuildSystem/config/packages/BlasLapack.py:408) TESTING: checkMKL from config.packages.BlasLapack(config/BuildSystem/config/packages/BlasLapack.py:408) Check for Intel MKL library Checking for functions [mkl_set_num_threads] in library ['liblapack.a'] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char mkl_set_num_threads(); static void _check_mkl_set_num_threads() { mkl_set_num_threads(); } int main() { _check_mkl_set_num_threads();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -llapack -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.o: In function `_check_mkl_set_num_threads': /tmp/petsc-KvGRNM/config.libraries/conftest.c:5: undefined reference to `mkl_set_num_threads' collect2: error: ld returned 1 exit status Popping language C ================================================================================ TEST checkMissing from config.packages.BlasLapack(/home/florian/software/petsc/config/BuildSystem/config/packages/BlasLapack.py:441) TESTING: checkMissing from config.packages.BlasLapack(config/BuildSystem/config/packages/BlasLapack.py:441) Check for missing LAPACK routines Checking for functions [dtrsen_ dgerfs_ dgges_ dtgsen_ dgesvd_ dgetrf_ dgetrs_ dgeev_ dgelss_ dsyev_ dsyevx_ dsygv_ dsygvx_ dpotrf_ dpotrs_ dstebz_ dpttrf_ dpttrs_ dstein_ dorgqr_ dgeqrf_ dgesv_ dhseqr_ dsteqr_] in library ['liblapack.a'] ['libblas.a', '-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', 
'-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm'] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/
char dtrsen_(); static void _check_dtrsen_() { dtrsen_(); }
char dgerfs_(); static void _check_dgerfs_() { dgerfs_(); }
char dgges_(); static void _check_dgges_() { dgges_(); }
char dtgsen_(); static void _check_dtgsen_() { dtgsen_(); }
char dgesvd_(); static void _check_dgesvd_() { dgesvd_(); }
char dgetrf_(); static void _check_dgetrf_() { dgetrf_(); }
char dgetrs_(); static void _check_dgetrs_() { dgetrs_(); }
char dgeev_(); static void _check_dgeev_() { dgeev_(); }
char dgelss_(); static void _check_dgelss_() { dgelss_(); }
char dsyev_(); static void _check_dsyev_() { dsyev_(); }
char dsyevx_(); static void _check_dsyevx_() { dsyevx_(); }
char dsygv_(); static void _check_dsygv_() { dsygv_(); }
char dsygvx_(); static void _check_dsygvx_() { dsygvx_(); }
char dpotrf_(); static void _check_dpotrf_() { dpotrf_(); }
char dpotrs_(); static void _check_dpotrs_() { dpotrs_(); }
char dstebz_(); static void _check_dstebz_() { dstebz_(); }
char dpttrf_(); static void _check_dpttrf_() { dpttrf_(); }
char dpttrs_(); static void _check_dpttrs_() { dpttrs_(); }
char dstein_(); static void _check_dstein_() { dstein_(); }
char dorgqr_(); static void _check_dorgqr_() { dorgqr_(); }
char dgeqrf_(); static void _check_dgeqrf_() { dgeqrf_(); }
char dgesv_(); static void _check_dgesv_() { dgesv_(); }
char dhseqr_(); static void _check_dhseqr_() { dhseqr_(); }
char dsteqr_(); static void _check_dsteqr_() { dsteqr_(); }
int main() { _check_dtrsen_(); _check_dgerfs_(); _check_dgges_(); _check_dtgsen_(); _check_dgesvd_(); _check_dgetrf_(); _check_dgetrs_(); _check_dgeev_(); _check_dgelss_(); _check_dsyev_(); _check_dsyevx_(); _check_dsygv_(); _check_dsygvx_(); _check_dpotrf_(); _check_dpotrs_(); _check_dstebz_(); _check_dpttrf_(); _check_dpttrs_(); _check_dstein_(); _check_dorgqr_(); _check_dgeqrf_(); _check_dgesv_(); _check_dhseqr_(); _check_dsteqr_();; return 0; }
Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall
-Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -llapack -lblas -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_LIBLAPACK" to "1" Popping language C ================================================================================ TEST checklsame from config.packages.BlasLapack(/home/florian/software/petsc/config/BuildSystem/config/packages/BlasLapack.py:456) TESTING: checklsame from config.packages.BlasLapack(config/BuildSystem/config/packages/BlasLapack.py:456) Do the BLAS/LAPACK libraries have a valid lsame() function with correction binding. 
Lion and xcode 4.2 do not Checking for functions [lsame_] in library ['liblapack.a', 'libblas.a', '-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm'] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ char lsame_(); static void _check_lsame_() { lsame_(); } int main() { _check_lsame_();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -llapack -lblas -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_LIBLAPACK" to "1" Defined "HAVE_LIBBLAS" to "1" Defined "HAVE_LIBMPI_USEMPIF08" to "1" Defined "HAVE_LIBMPI_USEMPI_IGNORE_TKR" to "1" Defined "HAVE_LIBMPI_MPIFH" to "1" Defined "HAVE_LIBGFORTRAN" to "1" Defined "HAVE_LIBM" to "1" Defined "HAVE_LIBGFORTRAN" to "1" Defined "HAVE_LIBM" to "1" Defined "HAVE_LIBQUADMATH" to "1" Defined 
"HAVE_LIBM" to "1" Popping language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" char *dgeev_(void); char* testroutine(void){return dgeev_(); }Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.setCompilers/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" char *dgeev_(void); char* testroutine(void){return dgeev_(); } Pushing language 
C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.setCompilers/libconftest.so -shared -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.setCompilers/conftest.o -llapack -lblas -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl ================================================================================ TEST checksdotreturnsdouble from config.packages.BlasLapack(/home/florian/software/petsc/config/BuildSystem/config/packages/BlasLapack.py:518) TESTING: checksdotreturnsdouble from config.packages.BlasLapack(config/BuildSystem/config/packages/BlasLapack.py:518) Determines if BLAS sdot routine returns a float or a double Checking if sdot() returns a float or a double Pushing language C All intermediate test results are stored in /tmp/petsc-KvGRNM/config.packages.BlasLapack Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.BlasLapack/conftest.o 
-I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.BlasLapack/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #if STDC_HEADERS #include #include #include #endif int main() { FILE *output = fopen("runtimetestoutput","w"); extern float sdot_(int*,float*,int *,float*,int*); float x1[1] = {3.0}; int one1 = 1; float sdotresult = sdot_(&one1,x1,&one1,x1,&one1); fprintf(output, " '--known-sdot-returns-double=%d',\n",(sdotresult != 9.0)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.packages.BlasLapack/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.BlasLapack/conftest.o -llapack -lblas -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib 
-Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/config.packages.BlasLapack/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.packages.BlasLapack/conftest Executing: /tmp/petsc-KvGRNM/config.packages.BlasLapack/conftest Popping language C Checking if snrm() returns a float or a double Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.packages.BlasLapack/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.packages.BlasLapack/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #if STDC_HEADERS #include #include #include #endif int main() { FILE *output = fopen("runtimetestoutput","w"); extern float snrm2_(int*,float*,int*); float x2[1] = {3.0}; int one2 = 1; float normresult = snrm2_(&one2,x2,&one2); fprintf(output, " '--known-snrm2-returns-double=%d',\n",(normresult != 3.0)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.packages.BlasLapack/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 
/tmp/petsc-KvGRNM/config.packages.BlasLapack/conftest.o -llapack -lblas -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/config.packages.BlasLapack/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/config.packages.BlasLapack/conftest Executing: /tmp/petsc-KvGRNM/config.packages.BlasLapack/conftest Popping language C ================================================================================ TEST alternateConfigureLibrary from config.packages.sundials(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.sundials(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.spai(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.spai(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.mkl_cpardiso(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.mkl_cpardiso(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; 
does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.fftw(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.fftw(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.ml(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.ml(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.hypre(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.hypre(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.mkl_pardiso(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.mkl_pardiso(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.SuperLU_MT(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.SuperLU_MT(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary 
from config.packages.SuperLU(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.SuperLU(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.SuiteSparse(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.SuiteSparse(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.PaStiX(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.PaStiX(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.scalapack(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.scalapack(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.Chombo(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.Chombo(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.pARMS(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: 
alternateConfigureLibrary from config.packages.pARMS(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.metis(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.metis(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.pragmatic(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.pragmatic(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.parmetis(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.parmetis(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.elemental(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.elemental(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.Zoltan(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.Zoltan(config/BuildSystem/config/package.py:742) Called if 
--with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.SuperLU_DIST(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.SuperLU_DIST(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST configureRegression from PETSc.Regression(/home/florian/software/petsc/config/PETSc/Regression.py:35) TESTING: configureRegression from PETSc.Regression(config/PETSc/Regression.py:35) Output a file listing the jobs that should be run by the PETSc buildtest Defined make macro "TEST_RUNS" to "C C_Info C_NotSingle C_X Fortran Fortran_NotSingle F90_NotSingle Fortran_NoComplex_NotSingle C_NoComplex_NotSingle Cxx F90 F90_NoComplex F2003 Fortran_NoComplex C_NoComplex DOUBLEINT32 Fortran_DOUBLEINT32" ================================================================================ TEST alternateConfigureLibrary from config.packages.MUMPS(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.MUMPS(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.Trilinos(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.Trilinos(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.xSDKTrilinos(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: 
alternateConfigureLibrary from config.packages.xSDKTrilinos(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.mstk(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.mstk(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.amanzi(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.amanzi(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.cuda(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.cuda(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST alternateConfigureLibrary from config.packages.cusp(/home/florian/software/petsc/config/BuildSystem/config/package.py:742) TESTING: alternateConfigureLibrary from config.packages.cusp(config/BuildSystem/config/package.py:742) Called if --with-packagename=0; does nothing by default ================================================================================ TEST configureVecCUDA from config.utilities.veccuda(/home/florian/software/petsc/config/BuildSystem/config/utilities/veccuda.py:15) TESTING: configureVecCUDA from config.utilities.veccuda(config/BuildSystem/config/utilities/veccuda.py:15) Configure VecCUDA as fallback CUDA vector if 
CUSP and VIENNACL are not present ================================================================================ TEST configureRTLDDefault from PETSc.Configure(/home/florian/software/petsc/config/PETSc/Configure.py:839) TESTING: configureRTLDDefault from PETSc.Configure(config/PETSc/Configure.py:839) All intermediate test results are stored in /tmp/petsc-KvGRNM/PETSc.Configure Executing: mpicc -c -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c Possible ERROR while running compiler: exit code 256 stderr: /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c:4:15: error: 'RTLD_DEFAULT' undeclared here (not in a function) void *ptr = RTLD_DEFAULT; ^~~~~~~~~~~~ Source: #include "confdefs.h" #include "conffix.h" #include <dlfcn.h> void *ptr = RTLD_DEFAULT; int main() { ; return 0; } ================================================================================ TEST configurePrefetch from PETSc.Configure(/home/florian/software/petsc/config/PETSc/Configure.py:642) TESTING: configurePrefetch from PETSc.Configure(config/PETSc/Configure.py:642) Sees if there are any prefetch functions supported Executing: uname -s stdout: Linux Pushing language C Executing:
mpicc -c -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <xmmintrin.h> int main() { void *v = 0;_mm_prefetch((const char*)v,_MM_HINT_NTA); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_XMMINTRIN_H" to "1" Defined "Prefetch(a,b,c)" to "_mm_prefetch((const char*)(a),(c))" Defined
"PREFETCH_HINT_NTA" to "_MM_HINT_NTA" Defined "PREFETCH_HINT_T0" to "_MM_HINT_T0" Defined "PREFETCH_HINT_T1" to "_MM_HINT_T1" Defined "PREFETCH_HINT_T2" to "_MM_HINT_T2" Popping language C ================================================================================ TEST configureUnused from PETSc.Configure(/home/florian/software/petsc/config/PETSc/Configure.py:703) TESTING: configureUnused from PETSc.Configure(config/PETSc/Configure.py:703) Sees if __attribute((unused)) is supported Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c: In function 'main': /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c:7:5: warning: unused variable 'j' [-Wunused-variable] int j = myfunc(&i); ^ Source: #include "confdefs.h" #include "conffix.h" __attribute((unused)) static int myfunc(__attribute((unused)) void *name){ return 1;} int main() { int i = 0; int j = myfunc(&i); typedef void* atype; __attribute((unused)) atype a; ; return 0; } Pushing language C Popping language C Executing: mpicc -o 
/tmp/petsc-KvGRNM/PETSc.Configure/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "UNUSED" to "__attribute((unused))" Popping language C ================================================================================ TEST configureDeprecated from PETSc.Configure(/home/florian/software/petsc/config/PETSc/Configure.py:721) TESTING: configureDeprecated from PETSc.Configure(config/PETSc/Configure.py:721) Check if __attribute((deprecated)) is supported Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing 
-Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c:3:38: warning: 'myfunc' defined but not used [-Wunused-function] __attribute((deprecated)) static int myfunc(void) { return 1;} ^~~~~~ Source: #include "confdefs.h" #include "conffix.h" __attribute((deprecated)) static int myfunc(void) { return 1;} int main() { ; return 0; } Defined "DEPRECATED(why)" to "__attribute((deprecated))" Popping language C ================================================================================ TEST configureIsatty from PETSc.Configure(/home/florian/software/petsc/config/PETSc/Configure.py:715) TESTING: configureIsatty from PETSc.Configure(config/PETSc/Configure.py:715) Check if the Unix C function isatty() works correctly Actually just assumes it does not work correctly on batch systems Defined "USE_ISATTY" to "1" ================================================================================ TEST configureExpect from PETSc.Configure(/home/florian/software/petsc/config/PETSc/Configure.py:788) TESTING: configureExpect from PETSc.Configure(config/PETSc/Configure.py:788) Sees if the __builtin_expect directive is supported Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.libraries 
-I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { if (__builtin_expect(0,1)) return 1;; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_BUILTIN_EXPECT" to "1" Popping language C ================================================================================ TEST configureAlign from PETSc.Configure(/home/florian/software/petsc/config/PETSc/Configure.py:741) TESTING: configureAlign from PETSc.Configure(config/PETSc/Configure.py:741) Check if __attribute(align) is supported Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI 
-I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include #if STDC_HEADERS #include #include #include #endif int main() { struct mystruct {int myint;} __attribute((aligned(16))); FILE *f = fopen("conftestalign", "w"); if (!f) exit(1); fprintf(f, "%lu\n", (unsigned long)sizeof(struct mystruct)); ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Testing executable /tmp/petsc-KvGRNM/PETSc.Configure/conftest to see if it can be run Executing: /tmp/petsc-KvGRNM/PETSc.Configure/conftest Executing: /tmp/petsc-KvGRNM/PETSc.Configure/conftest Popping language C Defined "ATTRIBUTEALIGNED(size)" to "__attribute((aligned (size)))" Defined "HAVE_ATTRIBUTEALIGNED" to "1" 
================================================================================ TEST configureFunctionName from PETSc.Configure(/home/florian/software/petsc/config/PETSc/Configure.py:795) TESTING: configureFunctionName from PETSc.Configure(config/PETSc/Configure.py:795) Sees if the compiler supports __func__ or a variant. Falls back on __FUNCT__ which PETSc source defines, but most users do not, thus stack traces through user code are better when the compiler's variant is used. Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { if (__func__[0] != 'm') return 1;; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib 
-L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Popping language C Defined "FUNCTION_NAME_C" to "__func__" Pushing language Cxx Executing: mpicxx -c -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/PETSc.Configure -I/tmp/petsc-KvGRNM/config.libraries -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC /tmp/petsc-KvGRNM/PETSc.Configure/conftest.cc Successful compile: Source: #include "confdefs.h" #include "conffix.h" int main() { if (__func__[0] != 'm') return 1;; return 0; } Pushing language CXX Popping language CXX Executing: mpicxx -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Popping language Cxx Defined "FUNCTION_NAME_CXX" to "__func__" 
================================================================================ TEST configureIntptrt from PETSc.Configure(/home/florian/software/petsc/config/PETSc/Configure.py:817) TESTING: configureIntptrt from PETSc.Configure(config/PETSc/Configure.py:817) Determine what to use for uintptr_t Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c: In function 'main': /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c:6:18: warning: unused variable 'i' [-Wunused-variable] int x; uintptr_t i = (uintptr_t)&x;; ^ Source: #include "confdefs.h" #include "conffix.h" #include <stdint.h> int main() { int x; uintptr_t i = (uintptr_t)&x;; return 0; } Defined "UINTPTR_T" to "uintptr_t" Popping language C ================================================================================ TEST configureSolaris from PETSc.Configure(/home/florian/software/petsc/config/PETSc/Configure.py:844) TESTING: configureSolaris from PETSc.Configure(config/PETSc/Configure.py:844) Solaris specific stuff
================================================================================ TEST configureLinux from PETSc.Configure(/home/florian/software/petsc/config/PETSc/Configure.py:857) TESTING: configureLinux from PETSc.Configure(config/PETSc/Configure.py:857) Linux specific stuff Defined "HAVE_DOUBLE_ALIGN_MALLOC" to "1" ================================================================================ TEST configureWin32 from PETSc.Configure(/home/florian/software/petsc/config/PETSc/Configure.py:863) TESTING: configureWin32 from PETSc.Configure(config/PETSc/Configure.py:863) Win32 non-cygwin specific stuff Checking for functions [GetComputerName] in library ['Kernel32.lib'] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Possible ERROR while running compiler: exit code 256 stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.c:4:21: fatal error: Windows.h: No such file or directory #include <Windows.h> ^ compilation terminated. Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error.
*/ #include <Windows.h> static void _check_GetComputerName() { GetComputerName(NULL,NULL);; } int main() { _check_GetComputerName();; return 0; } Compile failed inside link Popping language C Checking for functions [GetComputerName] in library ['kernel32'] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Possible ERROR while running compiler: exit code 256 stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.c:4:21: fatal error: Windows.h: No such file or directory #include <Windows.h> ^ compilation terminated. Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error.
*/ #include <Windows.h> static void _check_GetComputerName() { GetComputerName(NULL,NULL);; } int main() { _check_GetComputerName();; return 0; } Compile failed inside link Popping language C Checking for functions [GetUserName] in library ['Advapi32.lib'] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Possible ERROR while running compiler: exit code 256 stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.c:4:21: fatal error: Windows.h: No such file or directory #include <Windows.h> ^ compilation terminated. Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error.
*/ #include <Windows.h> static void _check_GetUserName() { GetUserName(NULL,NULL);; } int main() { _check_GetUserName();; return 0; } Compile failed inside link Popping language C Checking for functions [GetUserName] in library ['advapi32'] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Possible ERROR while running compiler: exit code 256 stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.c:4:21: fatal error: Windows.h: No such file or directory #include <Windows.h> ^ compilation terminated. Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error.
*/ #include <Windows.h> static void _check_GetUserName() { GetUserName(NULL,NULL);; } int main() { _check_GetUserName();; return 0; } Compile failed inside link Popping language C Checking for functions [GetDC] in library ['User32.lib'] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Possible ERROR while running compiler: exit code 256 stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.c:4:21: fatal error: Windows.h: No such file or directory #include <Windows.h> ^ compilation terminated. Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error.
*/ #include <Windows.h> static void _check_GetDC() { GetDC(0);; } int main() { _check_GetDC();; return 0; } Compile failed inside link Popping language C Checking for functions [GetDC] in library ['user32'] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Possible ERROR while running compiler: exit code 256 stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.c:4:21: fatal error: Windows.h: No such file or directory #include <Windows.h> ^ compilation terminated. Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error.
*/ #include <Windows.h> static void _check_GetDC() { GetDC(0);; } int main() { _check_GetDC();; return 0; } Compile failed inside link Popping language C Checking for functions [CreateCompatibleDC] in library ['Gdi32.lib'] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Possible ERROR while running compiler: exit code 256 stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.c:4:21: fatal error: Windows.h: No such file or directory #include <Windows.h> ^ compilation terminated. Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ #include <Windows.h> static void _check_CreateCompatibleDC() { CreateCompatibleDC(0);; } int main() { _check_CreateCompatibleDC();; return 0; } Compile failed inside link Popping language C Checking for functions [CreateCompatibleDC] in library ['gdi32'] [] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.types -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -I/tmp/petsc-KvGRNM/config.libraries -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Possible ERROR while running compiler: exit code 256 stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.c:4:21: fatal error: Windows.h: No such file or directory #include <Windows.h> ^ compilation terminated. Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. 
*/ #include <Windows.h> static void _check_CreateCompatibleDC() { CreateCompatibleDC(0);; } int main() { _check_CreateCompatibleDC();; return 0; } Compile failed inside link Popping language C Checking for type: int32_t Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.types/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.types/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/config.types/conftest.c: In function 'main': /tmp/petsc-KvGRNM/config.types/conftest.c:13:9: warning: unused variable 'a' [-Wunused-variable] int32_t a;; ^ Source: #include "confdefs.h" #include "conffix.h" #include <sys/types.h> #if STDC_HEADERS #include <stdlib.h> #include <stddef.h> #endif int main() { int32_t a;; return 0; } int32_t found Executing: mpicc -c -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind 
-I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c: In function 'main': /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c:6:7: warning: unused variable 'u' [-Wunused-variable] uid_t u; ^ Source: #include "confdefs.h" #include "conffix.h" #include int main() { uid_t u; ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c: In function 'main': /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c:8:5: warning: unused variable 'a' [-Wunused-variable] int a=R_OK; ^ Source: #include "confdefs.h" #include "conffix.h" #if defined(PETSC_HAVE_UNISTD_H) #include <unistd.h> #endif int main() { int 
a=R_OK; ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Executing: mpicc -c -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <sys/stat.h> int main() { int a=0; if (S_ISDIR(a)){} ; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest -fPIC -Wall 
-Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Executing: mpicc -c -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c Possible ERROR while running compiler: exit code 256 stderr: /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c:3:21: fatal error: Windows.h: No such file or directory #include <Windows.h> ^ compilation terminated. 
Source: #include "confdefs.h" #include "conffix.h" #include int main() { LARGE_INTEGER a; DWORD b=a.u.HighPart; ; return 0; } Executing: mpicc -c -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c Possible ERROR while running compiler: exit code 256 stderr: /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c:3:21: fatal error: Windows.h: No such file or directory #include ^ compilation terminated. 
Source: #include "confdefs.h" #include "conffix.h" #include #include int main() { int flags = O_BINARY;; return 0; } Defined "PATH_SEPARATOR" to "':'" Defined "REPLACE_DIR_SEPARATOR" to "'\\'" Defined "DIR_SEPARATOR" to "'/'" Defined "DIR" to ""/home/florian/software/petsc"" ================================================================================ TEST configureCygwinBrokenPipe from PETSc.Configure(/home/florian/software/petsc/config/PETSc/Configure.py:933) TESTING: configureCygwinBrokenPipe from PETSc.Configure(config/PETSc/Configure.py:933) Cygwin version <= 1.7.18 had issues with pipes and long commands invoked from gnu-make http://cygwin.com/ml/cygwin/2013-05/msg00340.html Executing: uname -s stdout: Linux ================================================================================ TEST configureDefaultArch from PETSc.Configure(/home/florian/software/petsc/config/PETSc/Configure.py:949) TESTING: configureDefaultArch from PETSc.Configure(config/PETSc/Configure.py:949) ================================================================================ TEST configureScript from PETSc.Configure(/home/florian/software/petsc/config/PETSc/Configure.py:966) TESTING: configureScript from PETSc.Configure(config/PETSc/Configure.py:966) Output a script in the conf directory which will reproduce the configuration ================================================================================ TEST configureInstall from PETSc.Configure(/home/florian/software/petsc/config/PETSc/Configure.py:1003) TESTING: configureInstall from PETSc.Configure(config/PETSc/Configure.py:1003) Setup the directories for installation Defined make rule "shared_install" with dependencies "" and code ['- at echo "Now to check if the libraries are working do:"', '- at echo "make PETSC_DIR=${PETSC_DIR} PETSC_ARCH=${PETSC_ARCH} test"', '- at echo "========================================="'] ================================================================================ TEST configureGCOV from 
PETSc.Configure(/home/florian/software/petsc/config/PETSc/Configure.py:1015) TESTING: configureGCOV from PETSc.Configure(config/PETSc/Configure.py:1015) ================================================================================ TEST configureFortranFlush from PETSc.Configure(/home/florian/software/petsc/config/PETSc/Configure.py:1020) TESTING: configureFortranFlush from PETSc.Configure(config/PETSc/Configure.py:1020) Checking for functions [flush_] in library [''] ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm'] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure 
-I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char flush_(); static void _check_flush_() { flush_(); } int main() { _check_flush_();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while 
running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.o: In function `_check_flush_': /tmp/petsc-KvGRNM/config.libraries/conftest.c:5: undefined reference to `flush_' collect2: error: ld returned 1 exit status Popping language C Checking for functions [flush__] in library [''] ['-Wl,-rpath,/usr/lib/openmpi', '-L/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-L/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-L/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-L/home/florian/software/petsc', '-lmpi_usempif08', '-lmpi_usempi_ignore_tkr', '-lmpi_mpifh', '-lgfortran', '-lm', '-Wl,-rpath,/usr/lib/openmpi', '-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib', '-Wl,-rpath,/home/florian/software/lib', '-Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1', '-Wl,-rpath,/home/florian/software/petsc', '-lgfortran', '-lm', '-lquadmath', '-lm'] Pushing language C Executing: mpicc -c -o /tmp/petsc-KvGRNM/config.libraries/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 
/tmp/petsc-KvGRNM/config.libraries/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" /* Override any gcc2 internal prototype to avoid an error. */ char flush__(); static void _check_flush__() { flush__(); } int main() { _check_flush__();; return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/config.libraries/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/config.libraries/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -Wl,-rpath,/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -lgfortran -lm -lquadmath -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Possible ERROR while running linker: exit code 256 stderr: /tmp/petsc-KvGRNM/config.libraries/conftest.o: In function `_check_flush__': /tmp/petsc-KvGRNM/config.libraries/conftest.c:5: 
undefined reference to `flush__' collect2: error: ld returned 1 exit status Popping language C ================================================================================ TEST configureAtoll from PETSc.Configure(/home/florian/software/petsc/config/PETSc/Configure.py:698) TESTING: configureAtoll from PETSc.Configure(config/PETSc/Configure.py:698) Checks if atoll exists Executing: mpicc -c -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c Possible ERROR while running compiler: stderr: /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c: In function 'main': /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c:7:6: warning: unused variable 'v' [-Wunused-variable] long v = atoll("25"); ^ Source: #include "confdefs.h" #include "conffix.h" #define _POSIX_C_SOURCE 200112L #include <stdlib.h> int main() { long v = atoll("25"); return 0; } Pushing language C Popping language C Executing: mpicc -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi 
-Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl Defined "HAVE_ATOLL" to "1" ================================================================================ TEST configureViewFromOptions from PETSc.Configure(/home/florian/software/petsc/config/PETSc/Configure.py:1027) TESTING: configureViewFromOptions from PETSc.Configure(config/PETSc/Configure.py:1027) Defined make rule "remote" with dependencies "" and code [] Defined make rule "remoteclean" with dependencies "" and code [] Pushing language C Defined make macro "CC_FLAGS" to " -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 " Popping language C Pushing language Cxx Defined make macro "CXX_FLAGS" to " -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC " Popping language Cxx Defined make macro "CPP_FLAGS" to "" Pushing language C Defined make macro "PCC" to "mpicc" Defined make macro "PCC_FLAGS" to " -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 " Popping language C Defined make macro "CC_SUFFIX" to "o" Pushing language C Defined make macro "PCC_LINKER" to "mpicc" Defined make macro "PCC_LINKER_FLAGS" to " -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3" Popping language C Defined make macro "CC_LINKER_SUFFIX" to "" Pushing language FC Defined "HAVE_FORTRAN" to "1" Defined make macro "FPP_FLAGS" to "" Defined make macro "FC_FLAGS" to " -fPIC -Wall 
-ffree-line-length-0 -Wno-unused-dummy-argument -g " Popping language FC Defined make macro "FC_SUFFIX" to "o" Pushing language FC Executing: mpif90 -V Defined make macro "FC_LINKER" to "mpif90" Defined make macro "FC_LINKER_FLAGS" to " -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g " Popping language FC Defined make macro "FC_MODULE_FLAG" to "-I" Defined make macro "FC_MODULE_OUTPUT_FLAG" to "-J" Pushing language C Defined make macro "SL_LINKER" to "mpicc" Defined make macro "SL_LINKER_FLAGS" to "${PCC_LINKER_FLAGS}" Popping language C Defined make macro "SL_LINKER_SUFFIX" to "so" Defined "SLSUFFIX" to ""so"" Defined make macro "SL_LINKER_LIBS" to "${PETSC_EXTERNAL_LIB_BASIC}" Defined make macro "PETSC_LANGUAGE" to "CONLY" Defined make macro "PETSC_SCALAR" to "real" Defined make macro "PETSC_PRECISION" to "double" Executing: CC -VV Defined "USE_SOCKET_VIEWER" to "1" Executing: mpicc -c -o /tmp/petsc-KvGRNM/PETSc.Configure/conftest.o -I/tmp/petsc-KvGRNM/config.compilers -I/tmp/petsc-KvGRNM/config.utilities.closure -I/tmp/petsc-KvGRNM/config.headers -I/tmp/petsc-KvGRNM/config.utilities.cacheDetails -I/tmp/petsc-KvGRNM/config.atomics -I/tmp/petsc-KvGRNM/config.functions -I/tmp/petsc-KvGRNM/config.utilities.featureTestMacros -I/tmp/petsc-KvGRNM/config.utilities.missing -I/tmp/petsc-KvGRNM/config.packages.MPI -I/tmp/petsc-KvGRNM/config.packages.valgrind -I/tmp/petsc-KvGRNM/config.packages.pthread -I/tmp/petsc-KvGRNM/PETSc.options.scalarTypes -I/tmp/petsc-KvGRNM/config.setCompilers -I/tmp/petsc-KvGRNM/config.packages.BlasLapack -I/tmp/petsc-KvGRNM/PETSc.Configure -I/tmp/petsc-KvGRNM/config.libraries -I/tmp/petsc-KvGRNM/config.types -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 /tmp/petsc-KvGRNM/PETSc.Configure/conftest.c Successful compile: Source: #include "confdefs.h" #include "conffix.h" #include <sys/socket.h> int main() { setsockopt(0,SOL_SOCKET,SO_REUSEADDR,0,0); return 0; } Defined "HAVE_SO_REUSEADDR" to "1" 
Defined "HAVE_BLASLAPACK" to "1" Defined make macro "BLASLAPACK_LIB" to "-llapack -lblas" Defined make macro "BLASLAPACK_INCLUDE" to "" Defined "HAVE_X" to "1" Defined make macro "X_LIB" to "-lX11" Defined make macro "X_INCLUDE" to "" Defined "HAVE_HWLOC" to "1" Defined make macro "HWLOC_LIB" to "-lhwloc" Defined make macro "HWLOC_INCLUDE" to "" Defined "HAVE_PTHREAD" to "1" Defined make macro "PTHREAD_LIB" to "" Defined make macro "PTHREAD_INCLUDE" to "" Defined "HAVE_SOWING" to "1" Defined make macro "SOWING_LIB" to "" Defined make macro "SOWING_INCLUDE" to "" Defined "HAVE_SSL" to "1" Defined make macro "SSL_LIB" to "-lssl -lcrypto" Defined make macro "SSL_INCLUDE" to "" Defined "HAVE_VALGRIND" to "1" Defined make macro "VALGRIND_LIB" to "" Defined make macro "VALGRIND_INCLUDE" to "" Defined "HAVE_MPI" to "1" Defined make macro "MPI_LIB" to "" Defined make macro "MPI_INCLUDE" to "" Defined make macro "PETSC_WITH_EXTERNAL_LIB" to "-L/home/florian/software/petsc/arch-linux2-c-debug/lib -lpetsc -llapack -lblas -lX11 -lhwloc -lssl -lcrypto -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc 
-L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl " Defined make macro "PETSC_EXTERNAL_LIB_BASIC" to "-llapack -lblas -lX11 -lhwloc -lssl -lcrypto -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl " Defined make macro "PETSC_CC_INCLUDES" to "-I/home/florian/software/petsc/include -I/home/florian/software/petsc/arch-linux2-c-debug/include" Pushing language FC Popping language FC Pushing language FC Popping language FC Defined make macro "PETSC_FC_INCLUDES" to "-I/home/florian/software/petsc/include -I/home/florian/software/petsc/arch-linux2-c-debug/include" Defined make macro "DESTDIR" to "/home/florian/software/petsc/arch-linux2-c-debug" Defined "LIB_DIR" to ""/home/florian/software/petsc/arch-linux2-c-debug/lib"" Defined make macro "LIBNAME" to "${INSTALL_LIB_DIR}/libpetsc.${AR_LIB_SUFFIX}" Defined make macro "SHLIBS" to "libpetsc" Defined make macro "PETSC_LIB_BASIC" to "-lpetsc" Defined make macro 
"PETSC_KSP_LIB_BASIC" to "-lpetsc" Defined make macro "PETSC_TS_LIB_BASIC" to "-lpetsc" Defined make macro "PETSC_TAO_LIB_BASIC" to "-lpetsc" Defined "USE_SINGLE_LIBRARY" to "1" Defined make macro "PETSC_SYS_LIB" to "${C_SH_LIB_PATH} ${PETSC_WITH_EXTERNAL_LIB}" Defined make macro "PETSC_VEC_LIB" to "${C_SH_LIB_PATH} ${PETSC_WITH_EXTERNAL_LIB}" Defined make macro "PETSC_MAT_LIB" to "${C_SH_LIB_PATH} ${PETSC_WITH_EXTERNAL_LIB}" Defined make macro "PETSC_DM_LIB" to "${C_SH_LIB_PATH} ${PETSC_WITH_EXTERNAL_LIB}" Defined make macro "PETSC_KSP_LIB" to "${C_SH_LIB_PATH} ${PETSC_WITH_EXTERNAL_LIB}" Defined make macro "PETSC_SNES_LIB" to "${C_SH_LIB_PATH} ${PETSC_WITH_EXTERNAL_LIB}" Defined make macro "PETSC_TS_LIB" to "${C_SH_LIB_PATH} ${PETSC_WITH_EXTERNAL_LIB}" Defined make macro "PETSC_TAO_LIB" to "${C_SH_LIB_PATH} ${PETSC_WITH_EXTERNAL_LIB}" Defined make macro "PETSC_CHARACTERISTIC_LIB" to "${C_SH_LIB_PATH} ${PETSC_WITH_EXTERNAL_LIB}" Defined make macro "PETSC_LIB" to "${C_SH_LIB_PATH} ${PETSC_WITH_EXTERNAL_LIB}" Defined make macro "PETSC_CONTRIB" to "${C_SH_LIB_PATH} ${PETSC_WITH_EXTERNAL_LIB}" Defined make macro "CONFIGURE_OPTIONS" to "--download-petsc4py=yes --download-mpi4py=yes --with-mpi4py=yes --with-petsc4py=yes --with-debugging=1" Pushing language C Popping language C Pushing language FC Popping language FC Pushing language C Popping language C Pushing language FC Popping language FC Executing: ['/usr/bin/cmake', '--version'] stdout: cmake version 3.7.2 CMake suite maintained and supported by Kitware (kitware.com/cmake). 
Pushing language C Popping language C Pushing language FC Popping language FC Pushing language Cxx Popping language Cxx
Contents of initial cache file /home/florian/software/petsc/arch-linux2-c-debug/initial_cache_file.cmake :
SET (CMAKE_C_COMPILER mpicc CACHE FILEPATH "Dummy comment" FORCE)
SET (CMAKE_C_FLAGS " -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 " CACHE STRING "Dummy comment" FORCE)
SET (PETSC_CUDA_HOST_FLAGS ,-fPIC,-Wall,-Wwrite-strings,-Wno-strict-aliasing,-Wno-unknown-pragmas,-fvisibility=hidden,-g3 CACHE STRING "Dummy comment" FORCE)
SET (CMAKE_Fortran_COMPILER mpif90 CACHE FILEPATH "Dummy comment" FORCE)
SET (CMAKE_Fortran_FLAGS " -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g " CACHE STRING "Dummy comment" FORCE)
SET (CMAKE_CXX_COMPILER mpicxx CACHE FILEPATH "Dummy comment" FORCE)
SET (CMAKE_CXX_FLAGS " -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC " CACHE STRING "Dummy comment" FORCE)
SET (CMAKE_AR /usr/bin/ar CACHE FILEPATH "Dummy comment" FORCE)
SET (CMAKE_RANLIB /usr/bin/ranlib CACHE FILEPATH "Dummy comment" FORCE)
Removing: /home/florian/software/petsc/arch-linux2-c-debug/CMakeCache.txt
Removing: /home/florian/software/petsc/arch-linux2-c-debug/CMakeFiles/3.7.2
Invoking: ['/usr/bin/cmake', '--trace', '--debug-output', '-C/home/florian/software/petsc/arch-linux2-c-debug/initial_cache_file.cmake', '-DPETSC_CMAKE_ARCH:STRING=arch-linux2-c-debug', '/home/florian/software/petsc']
Executing: ['/usr/bin/cmake', '--trace', '--debug-output', '-C/home/florian/software/petsc/arch-linux2-c-debug/initial_cache_file.cmake', '-DPETSC_CMAKE_ARCH:STRING=arch-linux2-c-debug', '/home/florian/software/petsc']
stdout: Running with trace output on. Running with debug output on.
loading initial cache file /home/florian/software/petsc/arch-linux2-c-debug/initial_cache_file.cmake -- The C compiler identification is GNU 6.3.1 Called from: [3] /usr/share/cmake-3.7/Modules/CMakeDetermineCompilerId.cmake [2] /usr/share/cmake-3.7/Modules/CMakeDetermineCCompiler.cmake [1] /home/florian/software/petsc/CMakeLists.txt -- Check for working C compiler: /usr/bin/mpicc Called from: [3] /usr/share/cmake-3.7/Modules/CMakeTestCompilerCommon.cmake [2] /usr/share/cmake-3.7/Modules/CMakeTestCCompiler.cmake [1] /home/florian/software/petsc/CMakeLists.txt -- Check for working C compiler: /usr/bin/mpicc -- works Called from: [3] /usr/share/cmake-3.7/Modules/CMakeTestCompilerCommon.cmake [2] /usr/share/cmake-3.7/Modules/CMakeTestCCompiler.cmake [1] /home/florian/software/petsc/CMakeLists.txt -- Detecting C compiler ABI info Called from: [3] /usr/share/cmake-3.7/Modules/CMakeDetermineCompilerABI.cmake [2] /usr/share/cmake-3.7/Modules/CMakeTestCCompiler.cmake [1] /home/florian/software/petsc/CMakeLists.txt -- Detecting C compiler ABI info - done Called from: [3] /usr/share/cmake-3.7/Modules/CMakeDetermineCompilerABI.cmake [2] /usr/share/cmake-3.7/Modules/CMakeTestCCompiler.cmake [1] /home/florian/software/petsc/CMakeLists.txt -- Detecting C compile features Called from: [3] /usr/share/cmake-3.7/Modules/CMakeDetermineCompileFeatures.cmake [2] /usr/share/cmake-3.7/Modules/CMakeTestCCompiler.cmake [1] /home/florian/software/petsc/CMakeLists.txt -- Detecting C compile features - done Called from: [3] /usr/share/cmake-3.7/Modules/CMakeDetermineCompileFeatures.cmake [2] /usr/share/cmake-3.7/Modules/CMakeTestCCompiler.cmake [1] /home/florian/software/petsc/CMakeLists.txt -- The Fortran compiler identification is GNU 6.3.1 Called from: [3] /usr/share/cmake-3.7/Modules/CMakeDetermineCompilerId.cmake [2] /usr/share/cmake-3.7/Modules/CMakeDetermineFortranCompiler.cmake [1] /home/florian/software/petsc/CMakeLists.txt -- Check for working Fortran compiler: /usr/bin/mpif90 Called 
from: [3] /usr/share/cmake-3.7/Modules/CMakeTestCompilerCommon.cmake [2] /usr/share/cmake-3.7/Modules/CMakeTestFortranCompiler.cmake [1] /home/florian/software/petsc/CMakeLists.txt -- Check for working Fortran compiler: /usr/bin/mpif90 -- works Called from: [3] /usr/share/cmake-3.7/Modules/CMakeTestCompilerCommon.cmake [2] /usr/share/cmake-3.7/Modules/CMakeTestFortranCompiler.cmake [1] /home/florian/software/petsc/CMakeLists.txt -- Detecting Fortran compiler ABI info Called from: [3] /usr/share/cmake-3.7/Modules/CMakeDetermineCompilerABI.cmake [2] /usr/share/cmake-3.7/Modules/CMakeTestFortranCompiler.cmake [1] /home/florian/software/petsc/CMakeLists.txt -- Detecting Fortran compiler ABI info - done Called from: [3] /usr/share/cmake-3.7/Modules/CMakeDetermineCompilerABI.cmake [2] /usr/share/cmake-3.7/Modules/CMakeTestFortranCompiler.cmake [1] /home/florian/software/petsc/CMakeLists.txt -- Checking whether /usr/bin/mpif90 supports Fortran 90 Called from: [2] /usr/share/cmake-3.7/Modules/CMakeTestFortranCompiler.cmake [1] /home/florian/software/petsc/CMakeLists.txt -- Checking whether /usr/bin/mpif90 supports Fortran 90 -- yes Called from: [2] /usr/share/cmake-3.7/Modules/CMakeTestFortranCompiler.cmake [1] /home/florian/software/petsc/CMakeLists.txt -- The CXX compiler identification is GNU 6.3.1 Called from: [3] /usr/share/cmake-3.7/Modules/CMakeDetermineCompilerId.cmake [2] /usr/share/cmake-3.7/Modules/CMakeDetermineCXXCompiler.cmake [1] /home/florian/software/petsc/CMakeLists.txt -- Check for working CXX compiler: /usr/bin/mpicxx Called from: [3] /usr/share/cmake-3.7/Modules/CMakeTestCompilerCommon.cmake [2] /usr/share/cmake-3.7/Modules/CMakeTestCXXCompiler.cmake [1] /home/florian/software/petsc/CMakeLists.txt -- Check for working CXX compiler: /usr/bin/mpicxx -- works Called from: [3] /usr/share/cmake-3.7/Modules/CMakeTestCompilerCommon.cmake [2] /usr/share/cmake-3.7/Modules/CMakeTestCXXCompiler.cmake [1] /home/florian/software/petsc/CMakeLists.txt -- Detecting 
CXX compiler ABI info Called from: [3] /usr/share/cmake-3.7/Modules/CMakeDetermineCompilerABI.cmake [2] /usr/share/cmake-3.7/Modules/CMakeTestCXXCompiler.cmake [1] /home/florian/software/petsc/CMakeLists.txt -- Detecting CXX compiler ABI info - done Called from: [3] /usr/share/cmake-3.7/Modules/CMakeDetermineCompilerABI.cmake [2] /usr/share/cmake-3.7/Modules/CMakeTestCXXCompiler.cmake [1] /home/florian/software/petsc/CMakeLists.txt -- Detecting CXX compile features Called from: [3] /usr/share/cmake-3.7/Modules/CMakeDetermineCompileFeatures.cmake [2] /usr/share/cmake-3.7/Modules/CMakeTestCXXCompiler.cmake [1] /home/florian/software/petsc/CMakeLists.txt -- Detecting CXX compile features - done Called from: [3] /usr/share/cmake-3.7/Modules/CMakeDetermineCompileFeatures.cmake [2] /usr/share/cmake-3.7/Modules/CMakeTestCXXCompiler.cmake [1] /home/florian/software/petsc/CMakeLists.txt -- Configuring done -- Generating /home/florian/software/petsc/arch-linux2-c-debug Called from: [1] /home/florian/software/petsc/CMakeLists.txt -- Generating done -- Build files have been written to: /home/florian/software/petsc/arch-linux2-c-debug CMake configured successfully, using as default build Defined make macro "PETSC_BUILD_USING_CMAKE" to "1" Pushing language C Popping language C Pushing language FC Popping language FC ================================================================================ **** arch-linux2-c-debug/lib/petsc/conf/petscvariables **** MPICXX_SHOW = g++ -pthread -Wl,-rpath -Wl,/usr/lib/openmpi -Wl,--enable-new-dtags -L/usr/lib/openmpi -lmpi_cxx -lmpi C_DEPFLAGS = -MMD -MP FC_DEFINE_FLAG = -D MPICC_SHOW = gcc -pthread -Wl,-rpath -Wl,/usr/lib/openmpi -Wl,--enable-new-dtags -L/usr/lib/openmpi -lmpi AR_FLAGS = cr CXX_DEPFLAGS = -MMD -MP FC_DEPFLAGS = -MMD -MP MPIFC_SHOW = /usr/bin/gfortran -I/usr/include -pthread -I/usr/lib/openmpi -Wl,-rpath -Wl,/usr/lib/openmpi -Wl,--enable-new-dtags -L/usr/lib/openmpi -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lmpi 
FAST_AR_FLAGS = Scq FC_MODULE_OUTPUT_FLAG = -J PETSC_LANGUAGE = CONLY FC_LINKER_FLAGS = -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g LIBNAME = ${INSTALL_LIB_DIR}/libpetsc.${AR_LIB_SUFFIX} SL_LINKER = mpicc PETSC_BUILD_USING_CMAKE = 1 CC_FLAGS = -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 HWLOC_INCLUDE = PETSC_PRECISION = double PETSC_LIB_BASIC = -lpetsc FC_FLAGS = -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g BLASLAPACK_LIB = -llapack -lblas PETSC_MAT_LIB = ${C_SH_LIB_PATH} ${PETSC_WITH_EXTERNAL_LIB} PCC = mpicc SL_LINKER_LIBS = ${PETSC_EXTERNAL_LIB_BASIC} VALGRIND_INCLUDE = MPI_LIB = SSL_LIB = -lssl -lcrypto PETSC_EXTERNAL_LIB_BASIC = -llapack -lblas -lX11 -lhwloc -lssl -lcrypto -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl SL_LINKER_FLAGS = ${PCC_LINKER_FLAGS} CC_SUFFIX = o PETSC_LIB = ${C_SH_LIB_PATH} ${PETSC_WITH_EXTERNAL_LIB} SHLIBS = libpetsc 
CONFIGURE_OPTIONS = --download-petsc4py=yes --download-mpi4py=yes --with-mpi4py=yes --with-petsc4py=yes --with-debugging=1 PETSC_CHARACTERISTIC_LIB = ${C_SH_LIB_PATH} ${PETSC_WITH_EXTERNAL_LIB} HWLOC_LIB = -lhwloc PTHREAD_LIB = PETSC_SCALAR = real PETSC_FC_INCLUDES = -I/home/florian/software/petsc/include -I/home/florian/software/petsc/arch-linux2-c-debug/include CPP_FLAGS = PETSC_KSP_LIB_BASIC = -lpetsc FPP_FLAGS = SOWING_LIB = FC_LINKER = mpif90 PETSC_KSP_LIB = ${C_SH_LIB_PATH} ${PETSC_WITH_EXTERNAL_LIB} CXX_FLAGS = -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC PCC_LINKER_FLAGS = -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 SSL_INCLUDE = PETSC_CONTRIB = ${C_SH_LIB_PATH} ${PETSC_WITH_EXTERNAL_LIB} PETSC_CC_INCLUDES = -I/home/florian/software/petsc/include -I/home/florian/software/petsc/arch-linux2-c-debug/include PCC_LINKER = mpicc PETSC_SYS_LIB = ${C_SH_LIB_PATH} ${PETSC_WITH_EXTERNAL_LIB} PCC_FLAGS = -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 VALGRIND_LIB = PTHREAD_INCLUDE = PETSC_TS_LIB = ${C_SH_LIB_PATH} ${PETSC_WITH_EXTERNAL_LIB} PETSC_TAO_LIB_BASIC = -lpetsc BLASLAPACK_INCLUDE = PETSC_TS_LIB_BASIC = -lpetsc PETSC_VEC_LIB = ${C_SH_LIB_PATH} ${PETSC_WITH_EXTERNAL_LIB} CC_LINKER_SUFFIX = SL_LINKER_SUFFIX = so PETSC_DM_LIB = ${C_SH_LIB_PATH} ${PETSC_WITH_EXTERNAL_LIB} DESTDIR = /home/florian/software/petsc/arch-linux2-c-debug FC_MODULE_FLAG = -I X_LIB = -lX11 X_INCLUDE = PETSC_WITH_EXTERNAL_LIB = -L/home/florian/software/petsc/arch-linux2-c-debug/lib -lpetsc -llapack -lblas -lX11 -lhwloc -lssl -lcrypto -lm -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc 
-L/home/florian/software/petsc -lmpi_usempif08 -lmpi_usempi_ignore_tkr -lmpi_mpifh -lgfortran -lm -lgfortran -lm -lquadmath -lm -lmpi_cxx -lstdc++ -Wl,-rpath,/usr/lib/openmpi -L/usr/lib/openmpi -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/lib -L/home/florian/software/lib -Wl,-rpath,/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -L/usr/lib/gcc/x86_64-pc-linux-gnu/6.3.1 -Wl,-rpath,/home/florian/software/petsc/arch-linux2-c-debug/lib -L/home/florian/software/petsc/arch-linux2-c-debug/lib -Wl,-rpath,/home/florian/software/petsc -L/home/florian/software/petsc -ldl -Wl,-rpath,/usr/lib/openmpi -lmpi -lgcc_s -lpthread -ldl SOWING_INCLUDE = PETSC_TAO_LIB = ${C_SH_LIB_PATH} ${PETSC_WITH_EXTERNAL_LIB} MPI_INCLUDE = FC_SUFFIX = o PETSC_SNES_LIB = ${C_SH_LIB_PATH} ${PETSC_WITH_EXTERNAL_LIB} SHELL = /usr/bin/sh GREP = /usr/bin/grep MV = /usr/bin/mv PYTHON = /usr/bin/python2 MKDIR = /usr/bin/mkdir -p SEDINPLACE = /usr/bin/sed -i SED = /usr/bin/sed DIFF = /usr/bin/diff -w GZIP = /usr/bin/gzip RM = /usr/bin/rm -f CP = /usr/bin/cp CC_LINKER_SLFLAG = -Wl,-rpath, CC = mpicc RANLIB = /usr/bin/ranlib DYNAMICLINKER = mpicc CXX = mpicxx FC = mpif90 CXXCPP = mpicxx -E FC_LINKER_SLFLAG = -Wl,-rpath, CPP = mpicc -E AR_LIB_SUFFIX = a LD_SHARED = mpicc AR = /usr/bin/ar DIR = /home/florian/software/petsc PETSC_SCALAR_SIZE = 64 MPIEXEC = mpiexec GIT = git HG = hg SL_LINKER_FUNCTION = -shared -Wl,-soname,$(call SONAME_FUNCTION,$(notdir $(1)),$(2)) SONAME_FUNCTION = $(1).so.$(2) BUILDSHAREDLIB = yes GDB = /usr/bin/gdb DSYMUTIL = true MAKE_IS_GNUMAKE = 1 MAKE_NP = 4 NPMAX = 4 OMAKE_PRINTDIR = /usr/bin/make --print-directory MAKE = /usr/bin/make MAKE_PAR_OUT_FLG = --output-sync=recurse OMAKE = /usr/bin/make --no-print-directory PETSC_INDEX_SIZE = 32 CMAKE = /usr/bin/cmake CTEST = /usr/bin/ctest DOCTEXT = /home/florian/software/petsc/arch-linux2-c-debug/bin/doctext BIB2HTML = 
/home/florian/software/petsc/arch-linux2-c-debug/bin/bib2html PDFLATEX = /usr/bin/pdflatex BFORT = /home/florian/software/petsc/arch-linux2-c-debug/bin/bfort MAPNAMES = /home/florian/software/petsc/arch-linux2-c-debug/bin/mapnames TEST_RUNS = C C_Info C_NotSingle C_X Fortran Fortran_NotSingle F90_NotSingle Fortran_NoComplex_NotSingle C_NoComplex_NotSingle Cxx F90 F90_NoComplex F2003 Fortran_NoComplex C_NoComplex DOUBLEINT32 Fortran_DOUBLEINT32 **** arch-linux2-c-debug/lib/petsc/conf/petscrules **** shared_install: - at echo "Now to check if the libraries are working do:" - at echo "make PETSC_DIR=${PETSC_DIR} PETSC_ARCH=${PETSC_ARCH} test" - at echo "=========================================" remoteclean: remote: shared_arch: shared_linux libc: ${LIBNAME}(${OBJSC}) libcxx: ${LIBNAME}(${OBJSCXX}) libcu: ${LIBNAME}(${OBJSCU}) libf: ${OBJSF} -${AR} ${AR_FLAGS} ${LIBNAME} ${OBJSF} .F.a: ${PETSC_MAKE_STOP_ON_ERROR}${FC} -c ${FC_FLAGS} ${FFLAGS} ${FCPPFLAGS} $< -${AR} ${AR_FLAGS} ${LIBNAME} $*.o -${RM} $*.o .f.o .f90.o .f95.o: ${PETSC_MAKE_STOP_ON_ERROR}${FC} -c ${FC_FLAGS} ${FFLAGS} -o $@ $< .f.a: ${PETSC_MAKE_STOP_ON_ERROR}${FC} -c ${FC_FLAGS} ${FFLAGS} $< -${AR} ${AR_FLAGS} ${LIBNAME} $*.o -${RM} $*.o .F.o .F90.o .F95.o: ${PETSC_MAKE_STOP_ON_ERROR}${FC} -c ${FC_FLAGS} ${FFLAGS} ${FCPPFLAGS} -o $@ $< mpi4pybuild: @echo "*** Building mpi4py ***" @(MPICC=${PCC} && export MPICC && cd /home/florian/software/petsc/arch-linux2-c-debug/externalpackages/mpi4py-1.3.1 && \ python setup.py clean --all && \ python setup.py build ) > ${PETSC_ARCH}/lib/petsc/conf/mpi4py.log 2>&1 || \ (echo "**************************ERROR*************************************" && \ echo "Error building mpi4py. 
Check ${PETSC_ARCH}/lib/petsc/conf/mpi4py.log" && \ echo "********************************************************************" && \ exit 1) mpi4py-build: mpi4pybuild mpi4pyinstall mpi4pyinstall: @echo "*** Installing mpi4py ***" @(MPICC=${PCC} && export MPICC && cd /home/florian/software/petsc/arch-linux2-c-debug/externalpackages/mpi4py-1.3.1 && \ python setup.py install --install-lib=/home/florian/software/petsc/arch-linux2-c-debug/lib) >> ${PETSC_ARCH}/lib/petsc/conf/mpi4py.log 2>&1 || \ (echo "**************************ERROR*************************************" && \ echo "Error building mpi4py. Check ${PETSC_ARCH}/lib/petsc/conf/mpi4py.log" && \ echo "********************************************************************" && \ exit 1) @echo "=====================================" @echo "To use mpi4py, add /home/florian/software/petsc/arch-linux2-c-debug/lib to PYTHONPATH" @echo "=====================================" mpi4py-install: petsc4pyinstall: @echo "*** Installing petsc4py ***" @(MPICC=${PCC} && export MPICC && cd /home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.petsc4py && \ MPICC=${PCC} python setup.py install --install-lib=/home/florian/software/petsc/arch-linux2-c-debug/lib) >> ${PETSC_ARCH}/lib/petsc/conf/petsc4py.log 2>&1 || \ (echo "**************************ERROR*************************************" && \ echo "Error building petsc4py. 
Check ${PETSC_ARCH}/lib/petsc/conf/petsc4py.log" && \ echo "********************************************************************" && \ exit 1) @echo "=====================================" @echo "To use petsc4py, add /home/florian/software/petsc/arch-linux2-c-debug/lib to PYTHONPATH" @echo "=====================================" petsc4py-build: petsc4pybuild petsc4pyinstall petsc4py-install: petsc4pybuild: @echo "*** Building petsc4py ***" @${RM} -f ${PETSC_ARCH}/lib/petsc/conf/petsc4py.errorflg @(cd /home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.petsc4py && \ MPICC=${PCC} python setup.py clean --all && \ MPICC=${PCC} python setup.py build ) > ${PETSC_ARCH}/lib/petsc/conf/petsc4py.log 2>&1 || \ (echo "**************************ERROR*************************************" && \ echo "Error building petsc4py. Check ${PETSC_ARCH}/lib/petsc/conf/petsc4py.log" && \ echo "********************************************************************" && \ touch ${PETSC_ARCH}/lib/petsc/conf/petsc4py.errorflg && \ exit 1) **** arch-linux2-c-debug/include/petscconf.h **** #if !defined(INCLUDED_UNKNOWN) #define INCLUDED_UNKNOWN #ifndef IS_COLORING_MAX #define IS_COLORING_MAX 65535 #endif #ifndef STDC_HEADERS #define STDC_HEADERS 1 #endif #ifndef MPIU_COLORING_VALUE #define MPIU_COLORING_VALUE MPI_UNSIGNED_SHORT #endif #ifndef PETSC_UINTPTR_T #define PETSC_UINTPTR_T uintptr_t #endif #ifndef PETSC_HAVE_PTHREAD #define PETSC_HAVE_PTHREAD 1 #endif #ifndef PETSC_HAVE_SSL #define PETSC_HAVE_SSL 1 #endif #ifndef PETSC_DEPRECATED #define PETSC_DEPRECATED(why) __attribute((deprecated)) #endif #ifndef PETSC_REPLACE_DIR_SEPARATOR #define PETSC_REPLACE_DIR_SEPARATOR '\\' #endif #ifndef PETSC_HAVE_SO_REUSEADDR #define PETSC_HAVE_SO_REUSEADDR 1 #endif #ifndef PETSC_HAVE_MPI #define PETSC_HAVE_MPI 1 #endif #ifndef PETSC_PREFETCH_HINT_T2 #define PETSC_PREFETCH_HINT_T2 _MM_HINT_T2 #endif #ifndef PETSC_PREFETCH_HINT_T0 #define PETSC_PREFETCH_HINT_T0 _MM_HINT_T0 #endif #ifndef 
PETSC_PREFETCH_HINT_T1 #define PETSC_PREFETCH_HINT_T1 _MM_HINT_T1 #endif #ifndef PETSC_HAVE_FORTRAN #define PETSC_HAVE_FORTRAN 1 #endif #ifndef PETSC_DIR #define PETSC_DIR "/home/florian/software/petsc" #endif #ifndef PETSC_HAVE_X #define PETSC_HAVE_X 1 #endif #ifndef PETSC_LIB_DIR #define PETSC_LIB_DIR "/home/florian/software/petsc/arch-linux2-c-debug/lib" #endif #ifndef PETSC_USE_SOCKET_VIEWER #define PETSC_USE_SOCKET_VIEWER 1 #endif #ifndef PETSC_USE_ISATTY #define PETSC_USE_ISATTY 1 #endif #ifndef PETSC_HAVE_SOWING #define PETSC_HAVE_SOWING 1 #endif #ifndef PETSC_SLSUFFIX #define PETSC_SLSUFFIX "so" #endif #ifndef PETSC_FUNCTION_NAME_CXX #define PETSC_FUNCTION_NAME_CXX __func__ #endif #ifndef PETSC_HAVE_ATOLL #define PETSC_HAVE_ATOLL 1 #endif #ifndef PETSC_HAVE_ATTRIBUTEALIGNED #define PETSC_HAVE_ATTRIBUTEALIGNED 1 #endif #ifndef PETSC_HAVE_DOUBLE_ALIGN_MALLOC #define PETSC_HAVE_DOUBLE_ALIGN_MALLOC 1 #endif #ifndef PETSC_UNUSED #define PETSC_UNUSED __attribute((unused)) #endif #ifndef PETSC_ATTRIBUTEALIGNED #define PETSC_ATTRIBUTEALIGNED(size) __attribute((aligned (size))) #endif #ifndef PETSC_FUNCTION_NAME_C #define PETSC_FUNCTION_NAME_C __func__ #endif #ifndef PETSC_HAVE_VALGRIND #define PETSC_HAVE_VALGRIND 1 #endif #ifndef PETSC_USE_SINGLE_LIBRARY #define PETSC_USE_SINGLE_LIBRARY 1 #endif #ifndef PETSC_HAVE_BUILTIN_EXPECT #define PETSC_HAVE_BUILTIN_EXPECT 1 #endif #ifndef PETSC_DIR_SEPARATOR #define PETSC_DIR_SEPARATOR '/' #endif #ifndef PETSC_PATH_SEPARATOR #define PETSC_PATH_SEPARATOR ':' #endif #ifndef PETSC_HAVE_XMMINTRIN_H #define PETSC_HAVE_XMMINTRIN_H 1 #endif #ifndef PETSC_PREFETCH_HINT_NTA #define PETSC_PREFETCH_HINT_NTA _MM_HINT_NTA #endif #ifndef PETSC_Prefetch #define PETSC_Prefetch(a,b,c) _mm_prefetch((const char*)(a),(c)) #endif #ifndef PETSC_HAVE_BLASLAPACK #define PETSC_HAVE_BLASLAPACK 1 #endif #ifndef PETSC_HAVE_HWLOC #define PETSC_HAVE_HWLOC 1 #endif #ifndef PETSC_HAVE_GZIP #define PETSC_HAVE_GZIP 1 #endif #ifndef PETSC_HAVE_STRING_H 
#define PETSC_HAVE_STRING_H 1 #endif #ifndef PETSC_HAVE_SYS_TYPES_H #define PETSC_HAVE_SYS_TYPES_H 1 #endif #ifndef PETSC_HAVE_ENDIAN_H #define PETSC_HAVE_ENDIAN_H 1 #endif #ifndef PETSC_HAVE_SYS_PROCFS_H #define PETSC_HAVE_SYS_PROCFS_H 1 #endif #ifndef PETSC_HAVE_DLFCN_H #define PETSC_HAVE_DLFCN_H 1 #endif #ifndef PETSC_HAVE_SCHED_H #define PETSC_HAVE_SCHED_H 1 #endif #ifndef PETSC_HAVE_STDINT_H #define PETSC_HAVE_STDINT_H 1 #endif #ifndef PETSC_HAVE_LINUX_KERNEL_H #define PETSC_HAVE_LINUX_KERNEL_H 1 #endif #ifndef PETSC_HAVE_TIME_H #define PETSC_HAVE_TIME_H 1 #endif #ifndef PETSC_HAVE_MATH_H #define PETSC_HAVE_MATH_H 1 #endif #ifndef PETSC_HAVE_INTTYPES_H #define PETSC_HAVE_INTTYPES_H 1 #endif #ifndef PETSC_TIME_WITH_SYS_TIME #define PETSC_TIME_WITH_SYS_TIME 1 #endif #ifndef PETSC_HAVE_SYS_PARAM_H #define PETSC_HAVE_SYS_PARAM_H 1 #endif #ifndef PETSC_HAVE_PTHREAD_H #define PETSC_HAVE_PTHREAD_H 1 #endif #ifndef PETSC_HAVE_UNISTD_H #define PETSC_HAVE_UNISTD_H 1 #endif #ifndef PETSC_HAVE_STDLIB_H #define PETSC_HAVE_STDLIB_H 1 #endif #ifndef PETSC_HAVE_SYS_WAIT_H #define PETSC_HAVE_SYS_WAIT_H 1 #endif #ifndef PETSC_HAVE_SETJMP_H #define PETSC_HAVE_SETJMP_H 1 #endif #ifndef PETSC_HAVE_LIMITS_H #define PETSC_HAVE_LIMITS_H 1 #endif #ifndef PETSC_HAVE_SYS_UTSNAME_H #define PETSC_HAVE_SYS_UTSNAME_H 1 #endif #ifndef PETSC_HAVE_NETINET_IN_H #define PETSC_HAVE_NETINET_IN_H 1 #endif #ifndef PETSC_HAVE_SYS_SOCKET_H #define PETSC_HAVE_SYS_SOCKET_H 1 #endif #ifndef PETSC_HAVE_FLOAT_H #define PETSC_HAVE_FLOAT_H 1 #endif #ifndef PETSC_HAVE_SEARCH_H #define PETSC_HAVE_SEARCH_H 1 #endif #ifndef PETSC_HAVE_SYS_RESOURCE_H #define PETSC_HAVE_SYS_RESOURCE_H 1 #endif #ifndef PETSC_HAVE_SYS_TIMES_H #define PETSC_HAVE_SYS_TIMES_H 1 #endif #ifndef PETSC_HAVE_NETDB_H #define PETSC_HAVE_NETDB_H 1 #endif #ifndef PETSC_HAVE_MALLOC_H #define PETSC_HAVE_MALLOC_H 1 #endif #ifndef PETSC_HAVE_PWD_H #define PETSC_HAVE_PWD_H 1 #endif #ifndef PETSC_HAVE_FCNTL_H #define PETSC_HAVE_FCNTL_H 1 #endif 
#ifndef PETSC_HAVE_STRINGS_H #define PETSC_HAVE_STRINGS_H 1 #endif #ifndef PETSC_HAVE_SYS_SYSINFO_H #define PETSC_HAVE_SYS_SYSINFO_H 1 #endif #ifndef PETSC_HAVE_SYS_TIME_H #define PETSC_HAVE_SYS_TIME_H 1 #endif #ifndef PETSC_USING_F90 #define PETSC_USING_F90 1 #endif #ifndef PETSC_USING_F2003 #define PETSC_USING_F2003 1 #endif #ifndef PETSC_HAVE_RTLD_NOW #define PETSC_HAVE_RTLD_NOW 1 #endif #ifndef PETSC_HAVE_RTLD_LOCAL #define PETSC_HAVE_RTLD_LOCAL 1 #endif #ifndef PETSC_HAVE_RTLD_LAZY #define PETSC_HAVE_RTLD_LAZY 1 #endif #ifndef PETSC_C_STATIC_INLINE #define PETSC_C_STATIC_INLINE static inline #endif #ifndef PETSC_HAVE_FORTRAN_UNDERSCORE #define PETSC_HAVE_FORTRAN_UNDERSCORE 1 #endif #ifndef PETSC_HAVE_CXX_NAMESPACE #define PETSC_HAVE_CXX_NAMESPACE 1 #endif #ifndef PETSC_HAVE_RTLD_GLOBAL #define PETSC_HAVE_RTLD_GLOBAL 1 #endif #ifndef PETSC_C_RESTRICT #define PETSC_C_RESTRICT restrict #endif #ifndef PETSC_CXX_RESTRICT #define PETSC_CXX_RESTRICT __restrict__ #endif #ifndef PETSC_CXX_STATIC_INLINE #define PETSC_CXX_STATIC_INLINE static inline #endif #ifndef PETSC_HAVE_LIBHWLOC #define PETSC_HAVE_LIBHWLOC 1 #endif #ifndef PETSC_HAVE_LIBZ #define PETSC_HAVE_LIBZ 1 #endif #ifndef PETSC_HAVE_LIBDL #define PETSC_HAVE_LIBDL 1 #endif #ifndef PETSC_HAVE_LIBM #define PETSC_HAVE_LIBM 1 #endif #ifndef PETSC_HAVE_LIBX11 #define PETSC_HAVE_LIBX11 1 #endif #ifndef PETSC_HAVE_LIBLAPACK #define PETSC_HAVE_LIBLAPACK 1 #endif #ifndef PETSC_HAVE_LIBCRYPTO #define PETSC_HAVE_LIBCRYPTO 1 #endif #ifndef PETSC_HAVE_FENV_H #define PETSC_HAVE_FENV_H 1 #endif #ifndef PETSC_HAVE_LOG2 #define PETSC_HAVE_LOG2 1 #endif #ifndef PETSC_HAVE_LIBBLAS #define PETSC_HAVE_LIBBLAS 1 #endif #ifndef PETSC_HAVE_LIBMPI_USEMPI_IGNORE_TKR #define PETSC_HAVE_LIBMPI_USEMPI_IGNORE_TKR 1 #endif #ifndef PETSC_HAVE_LIBMPI_USEMPIF08 #define PETSC_HAVE_LIBMPI_USEMPIF08 1 #endif #ifndef PETSC_HAVE_ERF #define PETSC_HAVE_ERF 1 #endif #ifndef PETSC_HAVE_LIBSSL #define PETSC_HAVE_LIBSSL 1 #endif #ifndef 
PETSC_HAVE_LIBQUADMATH #define PETSC_HAVE_LIBQUADMATH 1 #endif #ifndef PETSC_HAVE_LIBMPI_MPIFH #define PETSC_HAVE_LIBMPI_MPIFH 1 #endif #ifndef PETSC_HAVE_TGAMMA #define PETSC_HAVE_TGAMMA 1 #endif #ifndef PETSC_HAVE_LIBGFORTRAN #define PETSC_HAVE_LIBGFORTRAN 1 #endif #ifndef PETSC_ARCH #define PETSC_ARCH "arch-linux2-c-debug" #endif #ifndef PETSC_USE_SCALAR_REAL #define PETSC_USE_SCALAR_REAL 1 #endif #ifndef PETSC_HAVE_ISINF #define PETSC_HAVE_ISINF 1 #endif #ifndef PETSC_HAVE_ISNAN #define PETSC_HAVE_ISNAN 1 #endif #ifndef PETSC_HAVE_ISNORMAL #define PETSC_HAVE_ISNORMAL 1 #endif #ifndef PETSC_USE_REAL_DOUBLE #define PETSC_USE_REAL_DOUBLE 1 #endif #ifndef PETSC_SIZEOF_MPI_COMM #define PETSC_SIZEOF_MPI_COMM 8 #endif #ifndef PETSC_BITS_PER_BYTE #define PETSC_BITS_PER_BYTE 8 #endif #ifndef PETSC_SIZEOF_MPI_FINT #define PETSC_SIZEOF_MPI_FINT 4 #endif #ifndef PETSC_USE_VISIBILITY_C #define PETSC_USE_VISIBILITY_C 1 #endif #ifndef PETSC_SIZEOF_VOID_P #define PETSC_SIZEOF_VOID_P 8 #endif #ifndef PETSC_RETSIGTYPE #define PETSC_RETSIGTYPE void #endif #ifndef PETSC_HAVE_CXX_COMPLEX #define PETSC_HAVE_CXX_COMPLEX 1 #endif #ifndef PETSC_SIZEOF_LONG #define PETSC_SIZEOF_LONG 8 #endif #ifndef PETSC_USE_FORTRANKIND #define PETSC_USE_FORTRANKIND 1 #endif #ifndef PETSC_USE_VISIBILITY_CXX #define PETSC_USE_VISIBILITY_CXX 1 #endif #ifndef PETSC_SIZEOF_SIZE_T #define PETSC_SIZEOF_SIZE_T 8 #endif #ifndef PETSC_HAVE_SIGINFO_T #define PETSC_HAVE_SIGINFO_T 1 #endif #ifndef PETSC_SIZEOF_CHAR #define PETSC_SIZEOF_CHAR 1 #endif #ifndef PETSC_SIZEOF_DOUBLE #define PETSC_SIZEOF_DOUBLE 8 #endif #ifndef PETSC_SIZEOF_FLOAT #define PETSC_SIZEOF_FLOAT 4 #endif #ifndef PETSC_HAVE_C99_COMPLEX #define PETSC_HAVE_C99_COMPLEX 1 #endif #ifndef PETSC_SIZEOF_INT #define PETSC_SIZEOF_INT 4 #endif #ifndef PETSC_SIZEOF_LONG_LONG #define PETSC_SIZEOF_LONG_LONG 8 #endif #ifndef PETSC_SIZEOF_SHORT #define PETSC_SIZEOF_SHORT 2 #endif #ifndef PETSC_CLANGUAGE_C #define PETSC_CLANGUAGE_C 1 #endif #ifndef 
PETSC_HAVE_MPI_F90MODULE #define PETSC_HAVE_MPI_F90MODULE 1 #endif #ifndef PETSC_HAVE_MPI_IALLREDUCE #define PETSC_HAVE_MPI_IALLREDUCE 1 #endif #ifndef PETSC_HAVE_MPI_REDUCE_SCATTER_BLOCK #define PETSC_HAVE_MPI_REDUCE_SCATTER_BLOCK 1 #endif #ifndef PETSC_HAVE_MPI_IN_PLACE #define PETSC_HAVE_MPI_IN_PLACE 1 #endif #ifndef PETSC_HAVE_MPI_COMM_C2F #define PETSC_HAVE_MPI_COMM_C2F 1 #endif #ifndef PETSC_HAVE_MPI_COMBINER_CONTIGUOUS #define PETSC_HAVE_MPI_COMBINER_CONTIGUOUS 1 #endif #ifndef PETSC_HAVE_MPI_INT64_T #define PETSC_HAVE_MPI_INT64_T 1 #endif #ifndef PETSC_HAVE_MPI_TYPE_GET_EXTENT #define PETSC_HAVE_MPI_TYPE_GET_EXTENT 1 #endif #ifndef PETSC_HAVE_MPI_WIN_CREATE #define PETSC_HAVE_MPI_WIN_CREATE 1 #endif #ifndef PETSC_HAVE_MPI_TYPE_DUP #define PETSC_HAVE_MPI_TYPE_DUP 1 #endif #ifndef PETSC_HAVE_MPI_INIT_THREAD #define PETSC_HAVE_MPI_INIT_THREAD 1 #endif #ifndef PETSC_HAVE_MPI_COMBINER_NAMED #define PETSC_HAVE_MPI_COMBINER_NAMED 1 #endif #ifndef PETSC_HAVE_MPI_LONG_DOUBLE #define PETSC_HAVE_MPI_LONG_DOUBLE 1 #endif #ifndef PETSC_HAVE_MPI_COMM_F2C #define PETSC_HAVE_MPI_COMM_F2C 1 #endif #ifndef PETSC_HAVE_MPI_TYPE_GET_ENVELOPE #define PETSC_HAVE_MPI_TYPE_GET_ENVELOPE 1 #endif #ifndef PETSC_HAVE_MPI_REDUCE_SCATTER #define PETSC_HAVE_MPI_REDUCE_SCATTER 1 #endif #ifndef PETSC_HAVE_MPI_COMBINER_DUP #define PETSC_HAVE_MPI_COMBINER_DUP 1 #endif #ifndef PETSC_HAVE_MPIIO #define PETSC_HAVE_MPIIO 1 #endif #ifndef PETSC_HAVE_MPI_COMM_SPAWN #define PETSC_HAVE_MPI_COMM_SPAWN 1 #endif #ifndef PETSC_HAVE_MPI_FINT #define PETSC_HAVE_MPI_FINT 1 #endif #ifndef PETSC_HAVE_MPI_IBARRIER #define PETSC_HAVE_MPI_IBARRIER 1 #endif #ifndef PETSC_HAVE_MPI_ALLTOALLW #define PETSC_HAVE_MPI_ALLTOALLW 1 #endif #ifndef PETSC_HAVE_MPI_REDUCE_LOCAL #define PETSC_HAVE_MPI_REDUCE_LOCAL 1 #endif #ifndef PETSC_HAVE_MPI_REPLACE #define PETSC_HAVE_MPI_REPLACE 1 #endif #ifndef PETSC_HAVE_MPI_EXSCAN #define PETSC_HAVE_MPI_EXSCAN 1 #endif #ifndef PETSC_HAVE_MPI_C_DOUBLE_COMPLEX #define 
PETSC_HAVE_MPI_C_DOUBLE_COMPLEX 1 #endif #ifndef PETSC_HAVE_MPI_FINALIZED #define PETSC_HAVE_MPI_FINALIZED 1 #endif #ifndef PETSC_HAVE_DYNAMIC_LIBRARIES #define PETSC_HAVE_DYNAMIC_LIBRARIES 1 #endif #ifndef PETSC_HAVE_SHARED_LIBRARIES #define PETSC_HAVE_SHARED_LIBRARIES 1 #endif #ifndef PETSC_USE_SHARED_LIBRARIES #define PETSC_USE_SHARED_LIBRARIES 1 #endif #ifndef PETSC_USE_GDB_DEBUGGER #define PETSC_USE_GDB_DEBUGGER 1 #endif #ifndef PETSC_VERSION_DATE_GIT #define PETSC_VERSION_DATE_GIT "2017-01-19 08:56:29 -0600" #endif #ifndef PETSC_VERSION_BRANCH_GIT #define PETSC_VERSION_BRANCH_GIT "maint" #endif #ifndef PETSC_VERSION_GIT #define PETSC_VERSION_GIT "v3.7.5-10-ga4629e9613" #endif #ifndef PETSC_USE_ERRORCHECKING #define PETSC_USE_ERRORCHECKING 1 #endif #ifndef PETSC_HAVE_STRCASECMP #define PETSC_HAVE_STRCASECMP 1 #endif #ifndef PETSC_HAVE_GET_NPROCS #define PETSC_HAVE_GET_NPROCS 1 #endif #ifndef PETSC_HAVE_POPEN #define PETSC_HAVE_POPEN 1 #endif #ifndef PETSC_HAVE_SIGSET #define PETSC_HAVE_SIGSET 1 #endif #ifndef PETSC_HAVE_GETWD #define PETSC_HAVE_GETWD 1 #endif #ifndef PETSC_HAVE_VSNPRINTF #define PETSC_HAVE_VSNPRINTF 1 #endif #ifndef PETSC_HAVE_TIMES #define PETSC_HAVE_TIMES 1 #endif #ifndef PETSC_HAVE_DLSYM #define PETSC_HAVE_DLSYM 1 #endif #ifndef PETSC_HAVE_SNPRINTF #define PETSC_HAVE_SNPRINTF 1 #endif #ifndef PETSC_HAVE_GETHOSTBYNAME #define PETSC_HAVE_GETHOSTBYNAME 1 #endif #ifndef PETSC_HAVE_GETCWD #define PETSC_HAVE_GETCWD 1 #endif #ifndef PETSC_HAVE_DLERROR #define PETSC_HAVE_DLERROR 1 #endif #ifndef PETSC_HAVE_FORK #define PETSC_HAVE_FORK 1 #endif #ifndef PETSC_HAVE_RAND #define PETSC_HAVE_RAND 1 #endif #ifndef PETSC_HAVE_GETTIMEOFDAY #define PETSC_HAVE_GETTIMEOFDAY 1 #endif #ifndef PETSC_HAVE_DLCLOSE #define PETSC_HAVE_DLCLOSE 1 #endif #ifndef PETSC_HAVE_UNAME #define PETSC_HAVE_UNAME 1 #endif #ifndef PETSC_HAVE_GETHOSTNAME #define PETSC_HAVE_GETHOSTNAME 1 #endif #ifndef PETSC_HAVE_MKSTEMP #define PETSC_HAVE_MKSTEMP 1 #endif #ifndef 
PETSC_HAVE_SIGACTION #define PETSC_HAVE_SIGACTION 1 #endif #ifndef PETSC_HAVE_DRAND48 #define PETSC_HAVE_DRAND48 1 #endif #ifndef PETSC_HAVE_MEMALIGN #define PETSC_HAVE_MEMALIGN 1 #endif #ifndef PETSC_HAVE_VA_COPY #define PETSC_HAVE_VA_COPY 1 #endif #ifndef PETSC_HAVE_CLOCK #define PETSC_HAVE_CLOCK 1 #endif #ifndef PETSC_HAVE_ACCESS #define PETSC_HAVE_ACCESS 1 #endif #ifndef PETSC_HAVE_SIGNAL #define PETSC_HAVE_SIGNAL 1 #endif #ifndef PETSC_HAVE_USLEEP #define PETSC_HAVE_USLEEP 1 #endif #ifndef PETSC_HAVE_GETRUSAGE #define PETSC_HAVE_GETRUSAGE 1 #endif #ifndef PETSC_HAVE_VFPRINTF #define PETSC_HAVE_VFPRINTF 1 #endif #ifndef PETSC_HAVE_NANOSLEEP #define PETSC_HAVE_NANOSLEEP 1 #endif #ifndef PETSC_HAVE_GETDOMAINNAME #define PETSC_HAVE_GETDOMAINNAME 1 #endif #ifndef PETSC_HAVE_TIME #define PETSC_HAVE_TIME 1 #endif #ifndef PETSC_HAVE_LSEEK #define PETSC_HAVE_LSEEK 1 #endif #ifndef PETSC_HAVE_SOCKET #define PETSC_HAVE_SOCKET 1 #endif #ifndef PETSC_HAVE_SYSINFO #define PETSC_HAVE_SYSINFO 1 #endif #ifndef PETSC_HAVE_READLINK #define PETSC_HAVE_READLINK 1 #endif #ifndef PETSC_HAVE_REALPATH #define PETSC_HAVE_REALPATH 1 #endif #ifndef PETSC_HAVE_DLOPEN #define PETSC_HAVE_DLOPEN 1 #endif #ifndef PETSC_HAVE_MEMMOVE #define PETSC_HAVE_MEMMOVE 1 #endif #ifndef PETSC_HAVE__GFORTRAN_IARGC #define PETSC_HAVE__GFORTRAN_IARGC 1 #endif #ifndef PETSC_SIGNAL_CAST #define PETSC_SIGNAL_CAST #endif #ifndef PETSC_HAVE_SLEEP #define PETSC_HAVE_SLEEP 1 #endif #ifndef PETSC_HAVE_VPRINTF #define PETSC_HAVE_VPRINTF 1 #endif #ifndef PETSC_HAVE_BZERO #define PETSC_HAVE_BZERO 1 #endif #ifndef PETSC_HAVE_GETPAGESIZE #define PETSC_HAVE_GETPAGESIZE 1 #endif #ifndef PETSC_WRITE_MEMORY_BARRIER #define PETSC_WRITE_MEMORY_BARRIER() asm volatile("sfence":::"memory") #endif #ifndef PETSC_MEMORY_BARRIER #define PETSC_MEMORY_BARRIER() asm volatile("mfence":::"memory") #endif #ifndef PETSC_READ_MEMORY_BARRIER #define PETSC_READ_MEMORY_BARRIER() asm volatile("lfence":::"memory") #endif #ifndef PETSC_CPU_RELAX 
#define PETSC_CPU_RELAX() asm volatile("rep; nop" ::: "memory") #endif #ifndef PETSC_BLASLAPACK_UNDERSCORE #define PETSC_BLASLAPACK_UNDERSCORE 1 #endif #ifndef PETSC_USE_INFO #define PETSC_USE_INFO 1 #endif #ifndef PETSC_Alignx #define PETSC_Alignx(a,b) #endif #ifndef PETSC_USE_BACKWARD_LOOP #define PETSC_USE_BACKWARD_LOOP 1 #endif #ifndef PETSC_USE_DEBUG #define PETSC_USE_DEBUG 1 #endif #ifndef PETSC_USE_LOG #define PETSC_USE_LOG 1 #endif #ifndef PETSC_IS_COLOR_VALUE_TYPE_F #define PETSC_IS_COLOR_VALUE_TYPE_F integer2 #endif #ifndef PETSC_IS_COLOR_VALUE_TYPE #define PETSC_IS_COLOR_VALUE_TYPE short #endif #ifndef PETSC_USE_CTABLE #define PETSC_USE_CTABLE 1 #endif #ifndef PETSC_MEMALIGN #define PETSC_MEMALIGN 16 #endif #ifndef PETSC_LEVEL1_DCACHE_LINESIZE #define PETSC_LEVEL1_DCACHE_LINESIZE 64 #endif #ifndef PETSC_LEVEL1_DCACHE_SIZE #define PETSC_LEVEL1_DCACHE_SIZE 32768 #endif #ifndef PETSC_LEVEL1_DCACHE_ASSOC #define PETSC_LEVEL1_DCACHE_ASSOC 8 #endif #ifndef PETSC__GNU_SOURCE #define PETSC__GNU_SOURCE 1 #endif #ifndef PETSC__BSD_SOURCE #define PETSC__BSD_SOURCE 1 #endif #ifndef PETSC__DEFAULT_SOURCE #define PETSC__DEFAULT_SOURCE 1 #endif #ifndef PETSC_HAVE_FORTRAN_GET_COMMAND_ARGUMENT #define PETSC_HAVE_FORTRAN_GET_COMMAND_ARGUMENT 1 #endif #ifndef PETSC_HAVE_GFORTRAN_IARGC #define PETSC_HAVE_GFORTRAN_IARGC 1 #endif #ifndef PETSC_USE_PROC_FOR_SIZE #define PETSC_USE_PROC_FOR_SIZE 1 #endif #ifndef PETSC_HAVE_SCHED_CPU_SET_T #define PETSC_HAVE_SCHED_CPU_SET_T 1 #endif #ifndef PETSC_HAVE_PTHREAD_BARRIER_T #define PETSC_HAVE_PTHREAD_BARRIER_T 1 #endif #ifndef PETSC_HAVE_SYS_SYSCTL_H #define PETSC_HAVE_SYS_SYSCTL_H 1 #endif #endif **** arch-linux2-c-debug/include/petscfix.h **** #if !defined(INCLUDED_UNKNOWN) #define INCLUDED_UNKNOWN #if defined(__cplusplus) extern "C" { } #else #endif #endif Configure Actions These are the actions performed by configure on the filesystem ----------------------------------------------------------------- Framework: Directory creation : 
Created the external packages directory: /home/florian/software/petsc/arch-linux2-c-debug/externalpackages RDict update : Substitutions were stored in RDict with parent None File creation : Created makefile configure header arch-linux2-c-debug/lib/petsc/conf/petscvariables File creation : Created makefile configure header arch-linux2-c-debug/lib/petsc/conf/petscrules File creation : Created configure header arch-linux2-c-debug/include/petscconf.h File creation : Created C specific configure header arch-linux2-c-debug/include/petscfix.h SOWING: Download : Downloaded SOWING into /home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.sowing Install : Installed SOWING into /home/florian/software/petsc/arch-linux2-c-debug PETSc: File creation : Generated Fortran stubs Build : Set default architecture to arch-linux2-c-debug in lib/petsc/conf/petscvariables File creation : Created arch-linux2-c-debug/lib/petsc/conf/reconfigure-arch-linux2-c-debug.py for automatic reconfiguration PETSC4PY: Download : Downloaded PETSC4PY into /home/florian/software/petsc/arch-linux2-c-debug/externalpackages/git.petsc4py MPI4PY: Download : Downloaded MPI4PY into /home/florian/software/petsc/arch-linux2-c-debug/externalpackages/mpi4py-1.3.1 Pushing language C Popping language C Pushing language Cxx Popping language Cxx Pushing language FC Popping language FC Compilers: C Compiler: mpicc -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 C++ Compiler: mpicxx -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g -fPIC Fortran Compiler: mpif90 -fPIC -Wall -ffree-line-length-0 -Wno-unused-dummy-argument -g Linkers: Shared linker: mpicc -shared -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 Dynamic linker: mpicc -shared -fPIC -Wall -Wwrite-strings -Wno-strict-aliasing -Wno-unknown-pragmas -fvisibility=hidden -g3 MPI: make: BLAS/LAPACK: -llapack -lblas cmake: 
Arch: X: Library: -lX11 hwloc: Library: -lhwloc pthread: sowing: ssl: Library: -lssl -lcrypto valgrind: PETSc: PETSC_ARCH: arch-linux2-c-debug PETSC_DIR: /home/florian/software/petsc Scalar type: real Precision: double Clanguage: C shared libraries: enabled Integer size: 32 Memory alignment: 16 xxx=========================================================================xxx Configure stage complete. Now build PETSc libraries with (gnumake build): make PETSC_DIR=/home/florian/software/petsc PETSC_ARCH=arch-linux2-c-debug all xxx=========================================================================xxx ================================================================================ Finishing Configure Run at Mon Jan 23 12:51:57 2017 ================================================================================ From knepley at gmail.com Mon Jan 23 06:59:27 2017 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 23 Jan 2017 06:59:27 -0600 Subject: [petsc-users] Building petsc4py / mpi4py In-Reply-To: <67c1ebe7-5c06-2467-8cc5-d143d8037def@xgm.de> References: <67c1ebe7-5c06-2467-8cc5-d143d8037def@xgm.de> Message-ID: On Mon, Jan 23, 2017 at 6:38 AM, Florian Lindner wrote: > Hello, > > I try to build petsc from the maint branch together with petsc4py and > mpi4py > > python2 configure --download-petsc4py=yes --download-mpi4py=yes > --with-mpi4py=yes --with-petsc4py=yes --with-debugging=1 > make > > works without errors, so does make test. 
> > % echo $PYTHONPATH > /home/florian/software/petsc/arch-linux2-c-debug/lib > > % ls $PYTHONPATH > libpetsc.so libpetsc.so.3.7 libpetsc.so.3.7.5 mpi4py > mpi4py-1.3.1-py3.6.egg-info petsc petsc4py > petsc4py-3.7.0-py3.6.egg-info pkgconfig > > > but: > > % python2 RBF_Load.py > Traceback (most recent call last): > File "RBF_Load.py", line 10, in > petsc4py.init(sys.argv) > File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/__init__.py", > line 42, in init > PETSc = petsc4py.lib.ImportPETSc(arch) > File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/lib/__init__.py", > line 29, in ImportPETSc > return Import('petsc4py', 'PETSc', path, arch) > File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/lib/__init__.py", > line 63, in Import > fo, fn, stuff = imp.find_module(name, pathlist) > ImportError: No module named PETSc > > > Anyone having an idea what could be the problem here? > Is your PETSC_ARCH correct? Matt > I have also attached the configure.log > > Best, > Florian > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From mailinglists at xgm.de Mon Jan 23 07:05:05 2017 From: mailinglists at xgm.de (Florian Lindner) Date: Mon, 23 Jan 2017 14:05:05 +0100 Subject: [petsc-users] Building petsc4py / mpi4py In-Reply-To: References: <67c1ebe7-5c06-2467-8cc5-d143d8037def@xgm.de> Message-ID: <5ad2f29d-e89d-5a87-74b2-b7db7b66d06d@xgm.de> Am 23.01.2017 um 13:59 schrieb Matthew Knepley: > On Mon, Jan 23, 2017 at 6:38 AM, Florian Lindner > wrote: > > Hello, > > I try to build petsc from the maint branch together with petsc4py and mpi4py > > python2 configure --download-petsc4py=yes --download-mpi4py=yes --with-mpi4py=yes --with-petsc4py=yes --with-debugging=1 > make > > works without errors, so does make test. 
> > % echo $PYTHONPATH > /home/florian/software/petsc/arch-linux2-c-debug/lib > > % ls $PYTHONPATH > libpetsc.so libpetsc.so.3.7 libpetsc.so.3.7.5 mpi4py mpi4py-1.3.1-py3.6.egg-info petsc petsc4py > petsc4py-3.7.0-py3.6.egg-info pkgconfig > > > but: > > % python2 RBF_Load.py > Traceback (most recent call last): > File "RBF_Load.py", line 10, in > petsc4py.init(sys.argv) > File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/__init__.py", line 42, in init > PETSc = petsc4py.lib.ImportPETSc(arch) > File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/lib/__init__.py", line 29, in ImportPETSc > return Import('petsc4py', 'PETSc', path, arch) > File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/lib/__init__.py", line 63, in Import > fo, fn, stuff = imp.find_module(name, pathlist) > ImportError: No module named PETSc > > > Anyone having an idea what could be the problem here? > > > Is your PETSC_ARCH correct? I think so: % ls $PETSC_DIR/$PETSC_ARCH bin CMakeCache.txt CMakeFiles cmake_install.cmake externalpackages include initial_cache_file.cmake lib Makefile obj share % echo $PETSC_DIR /home/florian/software/petsc % echo $PETSC_ARCH arch-linux2-c-debug Florian From knepley at gmail.com Mon Jan 23 07:24:38 2017 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 23 Jan 2017 07:24:38 -0600 Subject: [petsc-users] Building petsc4py / mpi4py In-Reply-To: <5ad2f29d-e89d-5a87-74b2-b7db7b66d06d@xgm.de> References: <67c1ebe7-5c06-2467-8cc5-d143d8037def@xgm.de> <5ad2f29d-e89d-5a87-74b2-b7db7b66d06d@xgm.de> Message-ID: On Mon, Jan 23, 2017 at 7:05 AM, Florian Lindner wrote: > > > Am 23.01.2017 um 13:59 schrieb Matthew Knepley: > > On Mon, Jan 23, 2017 at 6:38 AM, Florian Lindner > wrote: > > > > Hello, > > > > I try to build petsc from the maint branch together with petsc4py > and mpi4py > > > > python2 configure --download-petsc4py=yes --download-mpi4py=yes > --with-mpi4py=yes --with-petsc4py=yes --with-debugging=1 > > make > > > 
> works without errors, so does make test. > > > > % echo $PYTHONPATH > > /home/florian/software/petsc/arch-linux2-c-debug/lib > > > > % ls $PYTHONPATH > > libpetsc.so libpetsc.so.3.7 libpetsc.so.3.7.5 mpi4py > mpi4py-1.3.1-py3.6.egg-info petsc petsc4py > > petsc4py-3.7.0-py3.6.egg-info pkgconfig > > > > > > but: > > > > % python2 RBF_Load.py > > Traceback (most recent call last): > > File "RBF_Load.py", line 10, in > > petsc4py.init(sys.argv) > > File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/__init__.py", > line 42, in init > > PETSc = petsc4py.lib.ImportPETSc(arch) > > File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/lib/__init__.py", > line 29, in ImportPETSc > > return Import('petsc4py', 'PETSc', path, arch) > > File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/lib/__init__.py", > line 63, in Import > > fo, fn, stuff = imp.find_module(name, pathlist) > > ImportError: No module named PETSc > > > > > > Anyone having an idea what could be the problem here? > > > > > > Is your PETSC_ARCH correct? > > I think so: > > % ls $PETSC_DIR/$PETSC_ARCH > bin CMakeCache.txt CMakeFiles cmake_install.cmake externalpackages > include initial_cache_file.cmake lib Makefile > obj share > > % echo $PETSC_DIR > /home/florian/software/petsc > > % echo $PETSC_ARCH > arch-linux2-c-debug For Python import, it's either the module, or the path. Can you cd down to PETSc.so and import it directly? Matt > > Florian > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed...
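[A side note for readers debugging similar failures: this kind of ImportError often comes from a mismatch between the Python that built petsc4py and the Python importing it, because each CPython release only loads extension modules whose filenames carry its own ABI tag (under Python 2, imp.find_module looked for a plain PETSc.so). The tags an interpreter accepts can be listed from the standard library; a minimal Python 3 sketch:]

```python
# List the extension-module filename suffixes *this* interpreter will import.
# A PETSc.cpython-36m-x86_64-linux-gnu.so built by Python 3.6 matches none of
# the suffixes another interpreter accepts, so the import fails with
# "No module named PETSc".
import importlib.machinery

suffixes = importlib.machinery.EXTENSION_SUFFIXES
print(suffixes)  # e.g. ['.cpython-36m-x86_64-linux-gnu.so', '.abi3.so', '.so']
```

[Comparing this list against the actual filename under $PYTHONPATH/petsc4py/lib/ quickly shows whether the running interpreter matches the build.]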
URL: From mfadams at lbl.gov Mon Jan 23 07:55:51 2017 From: mfadams at lbl.gov (Mark Adams) Date: Mon, 23 Jan 2017 08:55:51 -0500 Subject: [petsc-users] DMPlexCreateExodusFromFile() does not work on macOS Sierra In-Reply-To: References: Message-ID: And I trust you updated all system software (like gcc, NetCDF and ExodusII). OSX upgrades are hell. On Mon, Jan 23, 2017 at 12:36 AM, Matthew Knepley wrote: > On Sun, Jan 22, 2017 at 11:18 PM, Fande Kong wrote: > >> Thanks, Matt, >> >> Clang does not have this issue. The code runs fine with clang. >> > > Okay, it sounds like a gcc bug on Mac 10.6, or at least in the version you > have. > > Matt > > >> Fande, >> >> On Sun, Jan 22, 2017 at 8:03 PM, Matthew Knepley >> wrote: >> >>> On Sun, Jan 22, 2017 at 8:40 PM, Fande Kong wrote: >>> >>>> Thanks, Matt. >>>> >>>> It is a weird bug. >>>> >>>> Do we have an alternative solution to this? I was wondering whether it >>>> is possible to read the ".exo" files without using the ExodusII. For >>>> example, can we read the ".exo" files using the netcdf only? >>>> >>> >>> Well, ExodusII is only a thin layer on NetCDF, just like other wrappers >>> are thin layers on HDF5. It is >>> really NetCDF that is failing. Can you switch compilers and see if that >>> helps? >>> >>> Matt >>> >>> >>>> Fande Kong, >>>> >>>> >>>> >>>> On Sun, Jan 22, 2017 at 6:50 PM, Matthew Knepley >>>> wrote: >>>> >>>>> On Sun, Jan 22, 2017 at 5:28 PM, Fande Kong >>>>> wrote: >>>>>> >>>>>> On Sun, Jan 22, 2017 at 12:35 PM, Matthew Knepley >>>>>> wrote: >>>>>> >>>>>>> On Sun, Jan 22, 2017 at 12:41 PM, Fande Kong >>>>>>> wrote: >>>>>>>> >>>>>>>> On Sat, Jan 21, 2017 at 10:47 PM, Matthew Knepley < >>>>>>>> knepley at gmail.com> wrote: >>>>>>>> >>>>>>>>> On Sat, Jan 21, 2017 at 10:38 PM, Fande Kong >>>>>>>>> wrote: >>>>>>>>> >>>>>>>>>> Hi All, >>>>>>>>>> >>>>>>>>>> I upgraded the OS system to macOS Sierra, and observed that PETSc >>>>>>>>>> cannot read the exodus file any more.
The same code runs fine on macOS >>>>>>>>>> El Capitan. I also tested the function DMPlexCreateExodusFromFile() against >>>>>>>>>> different versions of the GCC compiler such as GCC-5.4 and GCC-6, and >>>>>>>>>> neither of them works. I guess this issue is related to the external package >>>>>>>>>> *exodus*, and PETSc might not pick up the right environment >>>>>>>>>> variables for the *exodus.* >>>>>>>>>> >>>>>>>>>> This issue can be reproduced using the following simple code: >>>>>>>>>> >>>>>>>>> >>>>>>>>> 1) This is just a standard check. Have you reconfigured so that >>>>>>>>> you know ExodusII was built with the same compilers and system libraries? >>>>>>>>> >>>>>>>>> 2) If so, can you get a stack trace with gdb or lldb? >>>>>>>>> >>>>>>>> >>>>>>>> 0 libsystem_kernel.dylib 0x00007fffad8b8dda >>>>>>>> __pthread_kill + 10 >>>>>>>> 1 libsystem_pthread.dylib 0x00007fffad9a4787 pthread_kill >>>>>>>> + 90 >>>>>>>> 2 libsystem_c.dylib 0x00007fffad81e420 abort + 129 >>>>>>>> 3 libpetsc.3.7.dylib 0x00000001100eb9ee >>>>>>>> PetscAbortErrorHandler + 506 (errstop.c:40) >>>>>>>> 4 libpetsc.3.7.dylib 0x00000001100e631d PetscError + >>>>>>>> 916 (err.c:379) >>>>>>>> 5 libpetsc.3.7.dylib 0x00000001100ed830 >>>>>>>> PetscSignalHandlerDefault + 1927 (signal.c:160) >>>>>>>> 6 libpetsc.3.7.dylib 0x00000001100ed088 >>>>>>>> PetscSignalHandler_Private(int) + 630 (signal.c:49) >>>>>>>> 7 libsystem_platform.dylib 0x00007fffad997bba _sigtramp + >>>>>>>> 26 >>>>>>>> 8 ???
0x000000011ea09370 >>>>>>>> initialPoolContent + 19008 >>>>>>>> 9 libnetcdf.7.dylib 0x000000011228fc62 utf8proc_map >>>>>>>> + 210 (dutf8proc.c:543) >>>>>>>> 10 libnetcdf.7.dylib 0x000000011228fd0f utf8proc_NFC >>>>>>>> + 38 (dutf8proc.c:568) >>>>>>>> 11 libnetcdf.7.dylib 0x00000001122a7928 NC_findattr + >>>>>>>> 110 (attr.c:341) >>>>>>>> 12 libnetcdf.7.dylib 0x00000001122a7a4e NC_lookupattr >>>>>>>> + 119 (attr.c:384) >>>>>>>> 13 libnetcdf.7.dylib 0x00000001122a93ef NC3_get_att + >>>>>>>> 47 (attr.c:1138) >>>>>>>> 14 libnetcdf.7.dylib 0x0000000112286126 >>>>>>>> nc_get_att_float + 90 (dattget.c:192) >>>>>>>> 15 libpetsc.3.7.dylib 0x00000001117f3a5b ex_open_int >>>>>>>> + 171 (ex_open.c:259) >>>>>>>> 16 libpetsc.3.7.dylib 0x0000000110c36609 >>>>>>>> DMPlexCreateExodusFromFile + 781 (plexexodusii.c:43) >>>>>>>> 17 DMPlexCreateExodusFromFile 0x000000010fed4cfc main + 397 >>>>>>>> (DMPlexCreateExodusFromFile.cpp:24) >>>>>>>> 18 libdyld.dylib 0x00007fffad78a255 start + 1 >>>>>>>> >>>>>>> >>>>>>> This is a NetCDF error on ex_open_int(). My guess is that your >>>>>>> NetCDF build is old and when it calls the system DLL >>>>>>> you bomb. Can you do a completely new build, meaning either reclone >>>>>>> PETSc somewhere else, or delete the whole >>>>>>> $PETSC_DIR/$PETSC_ARCH/externalpackages directory and >>>>>>> reconfigure/build? >>>>>>> >>>>>>> Thanks, >>>>>>> >>>>>>> Matt >>>>>>> >>>>>>> >>>>>> >>>>>> Hi Matt, >>>>>> >>>>>> Thanks for the reply. I recloned PETSc (the old petsc folder is deleted >>>>>> completely) and reconfigured. It still has the same issue. I also checked >>>>>> if the binary is compiled against any other netcdf. The binary is actually >>>>>> compiled against the right netcdf installed through PETSc.
>>>>>> >>>>> >>>>> You can see that this crash happens on the call to >>>>> >>>>> int CPU_word_size = 0, IO_word_size = 0, exoid = -1; >>>>> float version; >>>>> >>>>> exoid = ex_open(filename, EX_READ, &CPU_word_size, &IO_word_size, >>>>> &version); >>>>> >>>>> which means the fault is not in PETSc, but rather in ExodusII for your >>>>> machine. We could definitely >>>>> confirm this if you made a 5 line program that only called this, but I >>>>> don't see why it should be different. >>>>> I am not sure what to do, since I am not in control of anything about >>>>> ExodusII. Can you report this to >>>>> their dev team? It is strange since Blaise has not reported this, and >>>>> I know he uses it all the time. >>>>> >>>>> Thanks, >>>>> >>>>> Matt >>>>> >>>>> >>>>>> *LiviadeMacBook-Pro:partition livia$ otool -L >>>>>> DMPlexCreateExodusFromFile* >>>>>> *DMPlexCreateExodusFromFile:* >>>>>> * >>>>>> /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libpetsc.3.7.dylib >>>>>> (compatibility version 3.7.0, current version 3.7.5)* >>>>>> * >>>>>> /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libsuperlu_dist.5.dylib >>>>>> (compatibility version 5.0.0, current version 5.1.3)* >>>>>> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libparmetis.dylib >>>>>> (compatibility version 0.0.0, current version 0.0.0)* >>>>>> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libmetis.dylib >>>>>> (compatibility version 0.0.0, current version 0.0.0)* >>>>>> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libnetcdf.7.dylib >>>>>> (compatibility version 10.0.0, current version 10.0.0)* >>>>>> * >>>>>> /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libhdf5_hl.8.dylib >>>>>> (compatibility version 9.0.0, current version 9.1.0)* >>>>>> * /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libhdf5.8.dylib >>>>>> (compatibility version 9.0.0, current version 9.1.0)* >>>>>> * /opt/X11/lib/libX11.6.dylib (compatibility version 10.0.0, current >>>>>> version 10.0.0)* >>>>>> * 
/Users/livia/math/mpich-3.2_install/lib/libmpifort.12.dylib >>>>>> (compatibility version 14.0.0, current version 14.0.0)* >>>>>> * /usr/local/opt/gcc at 5/lib/gcc/5/libgfortran.3.dylib (compatibility >>>>>> version 4.0.0, current version 4.0.0)* >>>>>> * /usr/local/opt/gcc at 5/lib/gcc/5/libquadmath.0.dylib (compatibility >>>>>> version 1.0.0, current version 1.0.0)* >>>>>> * /Users/livia/math/mpich-3.2_install/lib/libmpicxx.12.dylib >>>>>> (compatibility version 14.0.0, current version 14.0.0)* >>>>>> * /usr/local/opt/gcc at 5/lib/gcc/5/libstdc++.6.dylib (compatibility >>>>>> version 7.0.0, current version 7.21.0)* >>>>>> * /usr/local/opt/gcc at 5/lib/gcc/5/libgcc_s.1.dylib (compatibility >>>>>> version 1.0.0, current version 1.0.0)* >>>>>> * /Users/livia/math/mpich-3.2_install/lib/libmpi.12.dylib >>>>>> (compatibility version 14.0.0, current version 14.0.0)* >>>>>> * /Users/livia/math/mpich-3.2_install/lib/libpmpi.12.dylib >>>>>> (compatibility version 14.0.0, current version 14.0.0)* >>>>>> * /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current >>>>>> version 1238.0.0)* >>>>>> * /usr/local/lib/gcc/5/libgcc_s.1.dylib (compatibility version 1.0.0, >>>>>> current version 1.0.0)* >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> >>>>>>> >>>>>>>> >>>>>>> >>>>>>>>> Matt >>>>>>>>> >>>>>>>>> >>>>>>>>>> *static char help[] = " create mesh from exodus.\n\n";* >>>>>>>>>> >>>>>>>>>> *#include * >>>>>>>>>> *#include * >>>>>>>>>> >>>>>>>>>> *#undef __FUNCT__* >>>>>>>>>> *#define __FUNCT__ "main"* >>>>>>>>>> *int main(int argc,char **argv)* >>>>>>>>>> *{* >>>>>>>>>> * char fineMeshFileName[2048];* >>>>>>>>>> * DM dm;* >>>>>>>>>> * MPI_Comm comm;* >>>>>>>>>> * PetscBool flg;* >>>>>>>>>> >>>>>>>>>> * PetscErrorCode ierr;* >>>>>>>>>> >>>>>>>>>> * ierr = PetscInitialize(&argc,&argv,(char >>>>>>>>>> *)0,help);CHKERRQ(ierr);* >>>>>>>>>> * comm = PETSC_COMM_WORLD;* >>>>>>>>>> * ierr = >>>>>>>>>> 
PetscOptionsGetString(NULL,NULL,"-file",fineMeshFileName,sizeof(fineMeshFileName),&flg);CHKERRQ(ierr);* >>>>>>>>>> * if(!flg){* >>>>>>>>>> * SETERRQ(comm,PETSC_ERR_ARG_NULL,"please specify a fine mesh >>>>>>>>>> file \n");* >>>>>>>>>> * }* >>>>>>>>>> * ierr = DMPlexCreateExodusFromFile( comm,fineMeshFileName, >>>>>>>>>> PETSC_FALSE, &dm);CHKERRQ(ierr);* >>>>>>>>>> * ierr = DMDestroy(&dm);CHKERRQ(ierr);* >>>>>>>>>> * ierr = PetscFinalize();CHKERRQ(ierr);* >>>>>>>>>> *}* >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> *LiviadeMacBook-Pro:partition livia$ ./DMPlexCreateExodusFromFile >>>>>>>>>> -file Tri3.exo * >>>>>>>>>> *[0]PETSC ERROR: >>>>>>>>>> ------------------------------------------------------------------------* >>>>>>>>>> *[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation >>>>>>>>>> Violation, probably memory access out of range* >>>>>>>>>> *[0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>> -on_error_attach_debugger* >>>>>>>>>> *[0]PETSC ERROR: or see >>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind >>>>>>>>>> * >>>>>>>>>> *[0]PETSC ERROR: or try http://valgrind.org >>>>>>>>>> on GNU/linux and Apple Mac OS X to find memory corruption errors* >>>>>>>>>> *[0]PETSC ERROR: likely location of problem given in stack below* >>>>>>>>>> *[0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>> ------------------------------------* >>>>>>>>>> *[0]PETSC ERROR: Note: The EXACT line numbers in the stack are >>>>>>>>>> not available,* >>>>>>>>>> *[0]PETSC ERROR: INSTEAD the line number of the start of >>>>>>>>>> the function* >>>>>>>>>> *[0]PETSC ERROR: is given.* >>>>>>>>>> *[0]PETSC ERROR: [0] DMPlexCreateExodusFromFile line 38 >>>>>>>>>> /Users/livia/math/petsc/src/dm/impls/plex/plexexodusii.c* >>>>>>>>>> *[0]PETSC ERROR: --------------------- Error Message >>>>>>>>>> --------------------------------------------------------------* >>>>>>>>>> *[0]PETSC ERROR: Signal received* >>>>>>>>>> *[0]PETSC ERROR: See >>>>>>>>>> 
http://www.mcs.anl.gov/petsc/documentation/faq.html >>>>>>>>>> for trouble shooting.* >>>>>>>>>> *[0]PETSC ERROR: Petsc Release Version 3.7.5, unknown * >>>>>>>>>> *[0]PETSC ERROR: ./DMPlexCreateExodusFromFile on a >>>>>>>>>> arch-darwin-cxx-debug named LiviadeMacBook-Pro.local by livia Sat Jan 21 >>>>>>>>>> 21:04:22 2017* >>>>>>>>>> *[0]PETSC ERROR: Configure options --with-clanguage=cxx >>>>>>>>>> --with-shared-libraries=1 --download-fblaslapack=1 --with-mpi=1 >>>>>>>>>> --download-parmetis=1 --download-metis=1 --download-netcdf=1 >>>>>>>>>> --download-exodusii=1 --download-hdf5=1 --with-debugging=yes >>>>>>>>>> --with-c2html=0 --download-hypre=1 --with-64-bit-indices=1 >>>>>>>>>> --download-superlu_dist=1 PETSC_ARCH=arch-darwin-cxx-debug* >>>>>>>>>> *[0]PETSC ERROR: #1 User provided function() line 0 in unknown >>>>>>>>>> file* >>>>>>>>>> *application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0* >>>>>>>>>> *[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=59* >>>>>>>>>> *:* >>>>>>>>>> *system msg for write_line failure : Bad file descriptor* >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> The log files of make and configuration are also attached. If >>>>>>>>>> you have any idea on this issue, please let me know! >>>>>>>>>> >>>>>>>>>> Fande Kong, >>>>>>>>>> >>>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>>> -- >>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>> experiments lead. >>>>>>>>> -- Norbert Wiener >>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>> >>>>>>> >>>>>>> -- >>>>>>> What most experimenters take for granted before they begin their >>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>> experiments lead. 
>>>>>>> -- Norbert Wiener >>>>>>> >>>>>> >>>>>> >>>>> >>>>> >>>>> -- >>>>> What most experimenters take for granted before they begin their >>>>> experiments is infinitely more interesting than any results to which their >>>>> experiments lead. >>>>> -- Norbert Wiener >>>>> >>>> >>>> >>> >>> >>> -- >>> What most experimenters take for granted before they begin their >>> experiments is infinitely more interesting than any results to which their >>> experiments lead. >>> -- Norbert Wiener >>> >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mailinglists at xgm.de Mon Jan 23 08:10:11 2017 From: mailinglists at xgm.de (Florian Lindner) Date: Mon, 23 Jan 2017 15:10:11 +0100 Subject: [petsc-users] Building petsc4py / mpi4py In-Reply-To: References: <67c1ebe7-5c06-2467-8cc5-d143d8037def@xgm.de> <5ad2f29d-e89d-5a87-74b2-b7db7b66d06d@xgm.de> Message-ID: <0cacf901-ced9-7066-c725-4864017be77f@xgm.de> Am 23.01.2017 um 14:24 schrieb Matthew Knepley: > On Mon, Jan 23, 2017 at 7:05 AM, Florian Lindner > wrote: > > > > Am 23.01.2017 um 13:59 schrieb Matthew Knepley: > > On Mon, Jan 23, 2017 at 6:38 AM, Florian Lindner > >> wrote: > > > > Hello, > > > > I try to build petsc from the maint branch together with petsc4py and mpi4py > > > > python2 configure --download-petsc4py=yes --download-mpi4py=yes --with-mpi4py=yes --with-petsc4py=yes > --with-debugging=1 > > make > > > > works without errors, so does make test. 
> > > > % echo $PYTHONPATH > > /home/florian/software/petsc/arch-linux2-c-debug/lib > > > > % ls $PYTHONPATH > > libpetsc.so libpetsc.so.3.7 libpetsc.so.3.7.5 mpi4py mpi4py-1.3.1-py3.6.egg-info petsc petsc4py > > petsc4py-3.7.0-py3.6.egg-info pkgconfig > > > > > > but: > > > > % python2 RBF_Load.py > > Traceback (most recent call last): > > File "RBF_Load.py", line 10, in > > petsc4py.init(sys.argv) > > File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/__init__.py", line 42, in init > > PETSc = petsc4py.lib.ImportPETSc(arch) > > File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/lib/__init__.py", line 29, in ImportPETSc > > return Import('petsc4py', 'PETSc', path, arch) > > File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/lib/__init__.py", line 63, in Import > > fo, fn, stuff = imp.find_module(name, pathlist) > > ImportError: No module named PETSc > > > > > > Anyone having an idea what could be the problem here? > > > > > > Is your PETSC_ARCH correct? > > I think so: > > % ls $PETSC_DIR/$PETSC_ARCH > bin CMakeCache.txt CMakeFiles cmake_install.cmake externalpackages include initial_cache_file.cmake lib Makefile > obj share > > % echo $PETSC_DIR > /home/florian/software/petsc > > % echo $PETSC_ARCH > arch-linux2-c-debug > > > For Python import, its either the module, or the path. Can you cd down to PETSc.so and import it directly? Not really sure what you mean. I tried to symlink "PETSc.so" to "PETSc.cpython-36m-x86_64-linux-gnu.so" to in ./petsc/arch-linux2-c-debug/lib/petsc4py/lib/arch-linux2-c-debug. 
That changed the error message to: Traceback (most recent call last): File "RBF_Load.py", line 11, in petsc4py.init(sys.argv) File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/__init__.py", line 42, in init PETSc = petsc4py.lib.ImportPETSc(arch) File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/lib/__init__.py", line 29, in ImportPETSc return Import('petsc4py', 'PETSc', path, arch) File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/lib/__init__.py", line 64, in Import module = imp.load_module(fullname, fo, fn, stuff) ImportError: dynamic module does not define init function (initPETSc) Is there another PETSc.so that should exist? Because none exists, just the symlink I created. Best, Florian From balay at mcs.anl.gov Mon Jan 23 09:22:56 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 23 Jan 2017 09:22:56 -0600 Subject: [petsc-users] Building petsc4py / mpi4py In-Reply-To: <67c1ebe7-5c06-2467-8cc5-d143d8037def@xgm.de> References: <67c1ebe7-5c06-2467-8cc5-d143d8037def@xgm.de> Message-ID: Looks like --download-petsc4py [and --download-mpi4py] uses python from PATH - and not the one used by configure. 'mpi4py-1.3.1-py3.6.egg-info' suggests that default 'python' in PATH is python-3.6. So try: python RBF_Load.py Should we use the same python for configure and all external python packages? [petsc configure is limited to python2 - but petsc4py works with python3] How about other externalpackages that use python? [scientificpython.py etc.] Satish On Mon, 23 Jan 2017, Florian Lindner wrote: > Hello, > > I try to build petsc from the maint branch together with petsc4py and mpi4py > > python2 configure --download-petsc4py=yes --download-mpi4py=yes --with-mpi4py=yes --with-petsc4py=yes --with-debugging=1 > make > > works without errors, so does make test.
> > % echo $PYTHONPATH > /home/florian/software/petsc/arch-linux2-c-debug/lib > > % ls $PYTHONPATH > libpetsc.so libpetsc.so.3.7 libpetsc.so.3.7.5 mpi4py mpi4py-1.3.1-py3.6.egg-info petsc petsc4py > petsc4py-3.7.0-py3.6.egg-info pkgconfig > > > but: > > % python2 RBF_Load.py > Traceback (most recent call last): > File "RBF_Load.py", line 10, in > petsc4py.init(sys.argv) > File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/__init__.py", line 42, in init > PETSc = petsc4py.lib.ImportPETSc(arch) > File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/lib/__init__.py", line 29, in ImportPETSc > return Import('petsc4py', 'PETSc', path, arch) > File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/lib/__init__.py", line 63, in Import > fo, fn, stuff = imp.find_module(name, pathlist) > ImportError: No module named PETSc > > > Anyone having an idea what could be the problem here? > I have also attached the configure.log > > Best, > Florian > From fangbowa at buffalo.edu Mon Jan 23 09:56:33 2017 From: fangbowa at buffalo.edu (Fangbo Wang) Date: Mon, 23 Jan 2017 10:56:33 -0500 Subject: [petsc-users] How can I do reduction operation for Petsc vectors in splitted communicators? Message-ID: Hi, Assume I did MPI_Comm_split, the global communicator is splitted into several disjoint communicators. I created a parallel Petsc vector with same size in all the small communicators. How can I do a reduction operation for this Petsc vector among all the small communicators? Thank you very much! Best regards, Fangbo Wang -- Fangbo Wang, PhD student Stochastic Geomechanics Research Group Department of Civil, Structural and Environmental Engineering University at Buffalo Email: *fangbowa at buffalo.edu * -------------- next part -------------- An HTML attachment was scrubbed... 
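[One concrete way to implement the cross-communicator reduction asked about above is sketched below. This is an illustrative, untested sketch: the function name is hypothetical, and it assumes every sub-communicator has the same number of ranks and that each holds a Vec with an identical parallel layout. A second MPI_Comm_split groups ranks that occupy the same position in their sub-communicator into a "peer" communicator, and the local part of the Vec is then summed element-wise across those peers:]

```c
#include <petscvec.h>

/* Sum a sub-communicator Vec element-wise across all sub-communicators.
 * Assumes each subcomm holds an identically distributed Vec (illustrative
 * sketch only). */
PetscErrorCode VecSumAcrossSubcomms(Vec v, MPI_Comm world, MPI_Comm subcomm)
{
  PetscErrorCode ierr;
  PetscMPIInt    subrank;
  PetscInt       nlocal;
  PetscScalar   *a;
  MPI_Comm       peers;

  PetscFunctionBegin;
  ierr = MPI_Comm_rank(subcomm,&subrank);CHKERRQ(ierr);
  /* one peer communicator per position inside a subcomm */
  ierr = MPI_Comm_split(world,subrank,0,&peers);CHKERRQ(ierr);
  ierr = VecGetLocalSize(v,&nlocal);CHKERRQ(ierr);
  ierr = VecGetArray(v,&a);CHKERRQ(ierr);
  ierr = MPI_Allreduce(MPI_IN_PLACE,a,(PetscMPIInt)nlocal,MPIU_SCALAR,MPIU_SUM,peers);CHKERRQ(ierr);
  ierr = VecRestoreArray(v,&a);CHKERRQ(ierr);
  ierr = MPI_Comm_free(&peers);CHKERRQ(ierr);
  PetscFunctionReturn(0);
}
```

[For a plain scalar reduction, of course, one can call MPI_Allreduce on the original communicator directly.]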
URL: From jed at jedbrown.org Mon Jan 23 10:09:54 2017 From: jed at jedbrown.org (Jed Brown) Date: Mon, 23 Jan 2017 10:09:54 -0600 Subject: [petsc-users] How can I do reduction operation for Petsc vectors in splitted communicators? In-Reply-To: References: Message-ID: <878tq1ahbh.fsf@jedbrown.org> Fangbo Wang writes: > Hi, > > Assume I did MPI_Comm_split, the global communicator is splitted into > several disjoint communicators. > > I created a parallel Petsc vector with same size in all the small > communicators. > How can I do a reduction operation for this Petsc vector among all the > small communicators? VecNorm (and related functions) on the Vecs created using a communicator are done exclusively on that communicator. Is that the question? Of course you can always use the original communicator (by creating objects on it or by calling MPI_* functions directly). -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 800 bytes Desc: not available URL: From hgbk2008 at gmail.com Mon Jan 23 10:22:13 2017 From: hgbk2008 at gmail.com (Hoang Giang Bui) Date: Mon, 23 Jan 2017 17:22:13 +0100 Subject: [petsc-users] Nested Fieldsplit for custom index sets In-Reply-To: <5799C3D2.8000407@imperial.ac.uk> References: <0B3B3C93-5C07-4E07-A37E-DEBA9577D3EE@utdallas.edu> <5799C3D2.8000407@imperial.ac.uk> Message-ID: Hello May I know if the nested IS is identified based on relative local index or the global one? It's hard to identify that in the attached example. Giang On Thu, Jul 28, 2016 at 10:35 AM, Lawrence Mitchell < lawrence.mitchell at imperial.ac.uk> wrote: > Dear Artur, > > On 28/07/16 02:20, Safin, Artur wrote: > > Barry, Lawrence, > > > >> I think the SubKSPs (and therefore SubPCs) are not set up until you > call KSPSetUp(ksp) which your code does not do explicitly and is therefore > done in KSPSolve. > > > > I added KSPSetUp(), but unfortunately the issue did not go away. 
> > > > > > > > I have created a MWE that replicates the issue. The program tries to > solve a tridiagonal system, where the first fieldsplit partitions the > global matrix > > > > [ P x ] > > [ x T ], > > > > and the nested fieldsplit partitions P into > > > > [ A x ] > > [ x B ]. > > Two things: > > 1. Always check the return value from all PETSc calls. This will > normally give you a very useful backtrace when something goes wrong. > > That is, annotate all your calls with: > > PetscErrorCode ierr; > > > ierr = SomePetscFunction(...); CHKERRQ(ierr); > > If I do this, I see that the call to KSPSetUp fails: > > [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > [0]PETSC ERROR: Petsc has generated inconsistent data > [0]PETSC ERROR: Unhandled case, must have at least two fields, not 1 > [0]PETSC ERROR: See > http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. > [0]PETSC ERROR: Petsc Development GIT revision: v3.7.2-931-g1e46b98 > GIT Date: 2016-07-06 16:57:50 -0500 > > ... > > [0]PETSC ERROR: #1 PCFieldSplitSetDefaults() line 470 in > /data/lmitche1/src/deps/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c > [0]PETSC ERROR: #2 PCSetUp_FieldSplit() line 487 in > /data/lmitche1/src/deps/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c > [0]PETSC ERROR: #3 PCSetUp() line 968 in > /data/lmitche1/src/deps/petsc/src/ksp/pc/interface/precon.c > [0]PETSC ERROR: #4 KSPSetUp() line 393 in > /data/lmitche1/src/deps/petsc/src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: #5 main() line 65 in /homes/lmitche1/tmp/ex.c > > The reason is you need to call KSPSetUp *after* setting the outermost > fieldsplit ISes. > > If I move the call to KSPSetUp, then things seem to work. I've > attached the working code. 
> > Cheers, > > Lawrence > > $ cat options.txt > -pc_type fieldsplit > -pc_fieldsplit_type multiplicative > -fieldsplit_T_ksp_type bcgs > -fieldsplit_P_ksp_type gmres > -fieldsplit_P_pc_type fieldsplit > -fieldsplit_P_pc_fieldsplit_type multiplicative > -fieldsplit_P_fieldsplit_A_ksp_type gmres > -fieldsplit_P_fieldsplit_B_pc_type lu > -fieldsplit_P_fieldsplit_B_ksp_type preonly > -ksp_converged_reason > -ksp_monitor_true_residual > -ksp_view > > $ ./ex -options_file options.txt > > 0 KSP preconditioned resid norm 5.774607007892e+00 true resid norm > 1.414213562373e+00 ||r(i)||/||b|| 1.000000000000e+00 > 1 KSP preconditioned resid norm 1.921795888956e-01 true resid norm > 4.802975385197e-02 ||r(i)||/||b|| 3.396216464745e-02 > 2 KSP preconditioned resid norm 1.436304589027e-12 true resid norm > 2.435255920058e-13 ||r(i)||/||b|| 1.721985974998e-13 > Linear solve converged due to CONVERGED_RTOL iterations 2 > KSP Object: 1 MPI processes > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: 1 MPI processes > type: fieldsplit > FieldSplit with MULTIPLICATIVE composition: total splits = 2 > Solver info for each split is in the following KSP objects: > Split number 0 Defined by IS > KSP Object: (fieldsplit_P_) 1 MPI processes > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
> left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: (fieldsplit_P_) 1 MPI processes > type: fieldsplit > FieldSplit with MULTIPLICATIVE composition: total splits = 2 > Solver info for each split is in the following KSP objects: > Split number 0 Defined by IS > KSP Object: (fieldsplit_P_fieldsplit_A_) 1 MPI processes > type: gmres > GMRES: restart=30, using Classical (unmodified) > Gram-Schmidt Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: (fieldsplit_P_fieldsplit_A_) 1 MPI processes > type: ilu > ILU: out-of-place factorization > 0 levels of fill > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 1., needed 1. > Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=25, cols=25 > package used to perform factorization: petsc > total: nonzeros=73, allocated nonzeros=73 > total number of mallocs used during MatSetValues > calls =0 > not using I-node routines > linear system matrix = precond matrix: > Mat Object: (fieldsplit_P_fieldsplit_A_) 1 MPI processes > type: seqaij > rows=25, cols=25 > total: nonzeros=73, allocated nonzeros=73 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > Split number 1 Defined by IS > KSP Object: (fieldsplit_P_fieldsplit_B_) 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
> left preconditioning > using NONE norm type for convergence test > PC Object: (fieldsplit_P_fieldsplit_B_) 1 MPI processes > type: lu > LU: out-of-place factorization > tolerance for zero pivot 2.22045e-14 > matrix ordering: nd > factor fill ratio given 5., needed 1.43836 > Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=25, cols=25 > package used to perform factorization: petsc > total: nonzeros=105, allocated nonzeros=105 > total number of mallocs used during MatSetValues > calls =0 > not using I-node routines > linear system matrix = precond matrix: > Mat Object: (fieldsplit_P_fieldsplit_B_) 1 MPI processes > type: seqaij > rows=25, cols=25 > total: nonzeros=73, allocated nonzeros=73 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix = precond matrix: > Mat Object: (fieldsplit_P_) 1 MPI processes > type: seqaij > rows=50, cols=50 > total: nonzeros=148, allocated nonzeros=148 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > Split number 1 Defined by IS > KSP Object: (fieldsplit_T_) 1 MPI processes > type: bcgs > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: (fieldsplit_T_) 1 MPI processes > type: ilu > ILU: out-of-place factorization > 0 levels of fill > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 1., needed 1. 
> Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=50, cols=50 > package used to perform factorization: petsc > total: nonzeros=148, allocated nonzeros=148 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix = precond matrix: > Mat Object: (fieldsplit_T_) 1 MPI processes > type: seqaij > rows=50, cols=50 > total: nonzeros=148, allocated nonzeros=148 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix = precond matrix: > Mat Object: 1 MPI processes > type: seqaij > rows=100, cols=100 > total: nonzeros=298, allocated nonzeros=500 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Mon Jan 23 10:26:54 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 23 Jan 2017 10:26:54 -0600 Subject: [petsc-users] Application context in fortran In-Reply-To: References: Message-ID: <63544712-2277-4495-9FA3-A3234998D0CA@mcs.anl.gov> > On Jan 22, 2017, at 11:16 PM, Praveen C wrote: > > Hello > > With snes/ts, we use an "application context to contain data needed by the application-provided call-back routines, FormJacobian() and FormFunction()". This can be a struct in the C examples. What can I use in case of fortran ? Can I use a module to pass the data needed by the call-back routines ? You can use a module to communicate the information, in that case the ctx you would pass is 0 You can also use a fortran derived type. 
See src/snes/examples/tests/ex12f.F > > Thanks > praveen From bsmith at mcs.anl.gov Mon Jan 23 10:39:12 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 23 Jan 2017 10:39:12 -0600 Subject: [petsc-users] Building petsc4py / mpi4py In-Reply-To: References: <67c1ebe7-5c06-2467-8cc5-d143d8037def@xgm.de> Message-ID: <4AFD50A9-EEA6-41E4-9F72-61C81D068424@mcs.anl.gov> mpi4py.py and petsc4py.py should have a --with-mpi/petsc4py-python=fullpathforpython and use that otherwise use the python in the path ( have a warning box if the python in the path is different from the python being used for configure) Barry > On Jan 23, 2017, at 9:22 AM, Satish Balay wrote: > > Looks like --download-petsc4py [and --download-mpi4py] uses python > from PATH - and not the one used by configure. > > 'mpi4py-1.3.1-py3.6.egg-info' suggests that default 'python' in PATH is python-3.6. So try: > > python RBF_Load.py > > Should we use the same python for configure and all external python packages? > [petsc configure is limited to python2 - but petsc4py works with python3] > How about other externalpackages that use python? [scientificpython.py etc.. > > Satish > > On Mon, 23 Jan 2017, Florian Lindner wrote: > >> Hello, >> >> I try to build petsc from the maint branch together with petsc4py and mpi4py >> >> python2 configure --download-petsc4py=yes --download-mpi4py=yes --with-mpi4py=yes --with-petsc4py=yes --with-debugging=1 >> make >> >> works without errors, so does make test. 
>> >> % echo $PYTHONPATH >> /home/florian/software/petsc/arch-linux2-c-debug/lib >> >> % ls $PYTHONPATH >> libpetsc.so libpetsc.so.3.7 libpetsc.so.3.7.5 mpi4py mpi4py-1.3.1-py3.6.egg-info petsc petsc4py >> petsc4py-3.7.0-py3.6.egg-info pkgconfig >> >> >> but: >> >> % python2 RBF_Load.py >> Traceback (most recent call last): >> File "RBF_Load.py", line 10, in >> petsc4py.init(sys.argv) >> File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/__init__.py", line 42, in init >> PETSc = petsc4py.lib.ImportPETSc(arch) >> File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/lib/__init__.py", line 29, in ImportPETSc >> return Import('petsc4py', 'PETSc', path, arch) >> File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/lib/__init__.py", line 63, in Import >> fo, fn, stuff = imp.find_module(name, pathlist) >> ImportError: No module named PETSc >> >> >> Anyone having an idea what could be the problem here? >> I have also attached the configure.log >> >> Best, >> Florian >> > From bsmith at mcs.anl.gov Mon Jan 23 10:41:15 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Mon, 23 Jan 2017 10:41:15 -0600 Subject: [petsc-users] Nested Fieldsplit for custom index sets In-Reply-To: References: <0B3B3C93-5C07-4E07-A37E-DEBA9577D3EE@utdallas.edu> <5799C3D2.8000407@imperial.ac.uk> Message-ID: <858AB505-65C9-4DB5-8B13-AD2C8F166A93@mcs.anl.gov> It is "relative" to the level in which you are doing the fieldsplit, it does not go all the way up to the original problem. > On Jan 23, 2017, at 10:22 AM, Hoang Giang Bui wrote: > > Hello > > May I know if the nested IS is identified based on relative local index or the global one? It's hard to identify that in the attached example. 
> > Giang > > On Thu, Jul 28, 2016 at 10:35 AM, Lawrence Mitchell wrote: > Dear Artur, > > On 28/07/16 02:20, Safin, Artur wrote: > > Barry, Lawrence, > > > >> I think the SubKSPs (and therefore SubPCs) are not set up until you call KSPSetUp(ksp) which your code does not do explicitly and is therefore done in KSPSolve. > > > > I added KSPSetUp(), but unfortunately the issue did not go away. > > > > > > > > I have created a MWE that replicates the issue. The program tries to solve a tridiagonal system, where the first fieldsplit partitions the global matrix > > > > [ P x ] > > [ x T ], > > > > and the nested fieldsplit partitions P into > > > > [ A x ] > > [ x B ]. > > Two things: > > 1. Always check the return value from all PETSc calls. This will > normally give you a very useful backtrace when something goes wrong. > > That is, annotate all your calls with: > > PetscErrorCode ierr; > > > ierr = SomePetscFunction(...); CHKERRQ(ierr); > > If I do this, I see that the call to KSPSetUp fails: > > [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > [0]PETSC ERROR: Petsc has generated inconsistent data > [0]PETSC ERROR: Unhandled case, must have at least two fields, not 1 > [0]PETSC ERROR: See > http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting. > [0]PETSC ERROR: Petsc Development GIT revision: v3.7.2-931-g1e46b98 > GIT Date: 2016-07-06 16:57:50 -0500 > > ... 
> > [0]PETSC ERROR: #1 PCFieldSplitSetDefaults() line 470 in > /data/lmitche1/src/deps/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c > [0]PETSC ERROR: #2 PCSetUp_FieldSplit() line 487 in > /data/lmitche1/src/deps/petsc/src/ksp/pc/impls/fieldsplit/fieldsplit.c > [0]PETSC ERROR: #3 PCSetUp() line 968 in > /data/lmitche1/src/deps/petsc/src/ksp/pc/interface/precon.c > [0]PETSC ERROR: #4 KSPSetUp() line 393 in > /data/lmitche1/src/deps/petsc/src/ksp/ksp/interface/itfunc.c > [0]PETSC ERROR: #5 main() line 65 in /homes/lmitche1/tmp/ex.c > > The reason is you need to call KSPSetUp *after* setting the outermost > fieldsplit ISes. > > If I move the call to KSPSetUp, then things seem to work. I've > attached the working code. > > Cheers, > > Lawrence > > $ cat options.txt > -pc_type fieldsplit > -pc_fieldsplit_type multiplicative > -fieldsplit_T_ksp_type bcgs > -fieldsplit_P_ksp_type gmres > -fieldsplit_P_pc_type fieldsplit > -fieldsplit_P_pc_fieldsplit_type multiplicative > -fieldsplit_P_fieldsplit_A_ksp_type gmres > -fieldsplit_P_fieldsplit_B_pc_type lu > -fieldsplit_P_fieldsplit_B_ksp_type preonly > -ksp_converged_reason > -ksp_monitor_true_residual > -ksp_view > > $ ./ex -options_file options.txt > > 0 KSP preconditioned resid norm 5.774607007892e+00 true resid norm > 1.414213562373e+00 ||r(i)||/||b|| 1.000000000000e+00 > 1 KSP preconditioned resid norm 1.921795888956e-01 true resid norm > 4.802975385197e-02 ||r(i)||/||b|| 3.396216464745e-02 > 2 KSP preconditioned resid norm 1.436304589027e-12 true resid norm > 2.435255920058e-13 ||r(i)||/||b|| 1.721985974998e-13 > Linear solve converged due to CONVERGED_RTOL iterations 2 > KSP Object: 1 MPI processes > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
> left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: 1 MPI processes > type: fieldsplit > FieldSplit with MULTIPLICATIVE composition: total splits = 2 > Solver info for each split is in the following KSP objects: > Split number 0 Defined by IS > KSP Object: (fieldsplit_P_) 1 MPI processes > type: gmres > GMRES: restart=30, using Classical (unmodified) Gram-Schmidt > Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: (fieldsplit_P_) 1 MPI processes > type: fieldsplit > FieldSplit with MULTIPLICATIVE composition: total splits = 2 > Solver info for each split is in the following KSP objects: > Split number 0 Defined by IS > KSP Object: (fieldsplit_P_fieldsplit_A_) 1 MPI processes > type: gmres > GMRES: restart=30, using Classical (unmodified) > Gram-Schmidt Orthogonalization with no iterative refinement > GMRES: happy breakdown tolerance 1e-30 > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: (fieldsplit_P_fieldsplit_A_) 1 MPI processes > type: ilu > ILU: out-of-place factorization > 0 levels of fill > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 1., needed 1. 
> Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=25, cols=25 > package used to perform factorization: petsc > total: nonzeros=73, allocated nonzeros=73 > total number of mallocs used during MatSetValues > calls =0 > not using I-node routines > linear system matrix = precond matrix: > Mat Object: (fieldsplit_P_fieldsplit_A_) 1 MPI processes > type: seqaij > rows=25, cols=25 > total: nonzeros=73, allocated nonzeros=73 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > Split number 1 Defined by IS > KSP Object: (fieldsplit_P_fieldsplit_B_) 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. > left preconditioning > using NONE norm type for convergence test > PC Object: (fieldsplit_P_fieldsplit_B_) 1 MPI processes > type: lu > LU: out-of-place factorization > tolerance for zero pivot 2.22045e-14 > matrix ordering: nd > factor fill ratio given 5., needed 1.43836 > Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=25, cols=25 > package used to perform factorization: petsc > total: nonzeros=105, allocated nonzeros=105 > total number of mallocs used during MatSetValues > calls =0 > not using I-node routines > linear system matrix = precond matrix: > Mat Object: (fieldsplit_P_fieldsplit_B_) 1 MPI processes > type: seqaij > rows=25, cols=25 > total: nonzeros=73, allocated nonzeros=73 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix = precond matrix: > Mat Object: (fieldsplit_P_) 1 MPI processes > type: seqaij > rows=50, cols=50 > total: nonzeros=148, allocated nonzeros=148 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > Split number 1 Defined by IS > KSP Object: (fieldsplit_T_) 1 MPI processes > type: bcgs > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, 
absolute=1e-50, divergence=10000. > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: (fieldsplit_T_) 1 MPI processes > type: ilu > ILU: out-of-place factorization > 0 levels of fill > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 1., needed 1. > Factored matrix follows: > Mat Object: 1 MPI processes > type: seqaij > rows=50, cols=50 > package used to perform factorization: petsc > total: nonzeros=148, allocated nonzeros=148 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix = precond matrix: > Mat Object: (fieldsplit_T_) 1 MPI processes > type: seqaij > rows=50, cols=50 > total: nonzeros=148, allocated nonzeros=148 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > linear system matrix = precond matrix: > Mat Object: 1 MPI processes > type: seqaij > rows=100, cols=100 > total: nonzeros=298, allocated nonzeros=500 > total number of mallocs used during MatSetValues calls =0 > not using I-node routines > > > From mailinglists at xgm.de Tue Jan 24 02:07:41 2017 From: mailinglists at xgm.de (Florian Lindner) Date: Tue, 24 Jan 2017 09:07:41 +0100 Subject: [petsc-users] Building petsc4py / mpi4py In-Reply-To: References: <67c1ebe7-5c06-2467-8cc5-d143d8037def@xgm.de> Message-ID: Am 23.01.2017 um 16:22 schrieb Satish Balay: > Looks like --download-petsc4py [and --download-mpi4py] uses python > from PATH - and not the one used by configure. > > 'mpi4py-1.3.1-py3.6.egg-info' suggests that default 'python' in PATH is python-3.6. So try: > > python RBF_Load.py Oh, yeah. That works (without any additional symlinking). I was assuming it depends on python 2, since I got to use python2 on the configure script. I actually prefer python3, so it's fine for me. > > Should we use the same python for configure and all external python packages? 
> [petsc configure is limited to python2 - but petsc4py works with python3] > How about other externalpackages that use python? [scientificpython.py etc.. I think it's fine the way it is. Best, Florian > > Satish > > On Mon, 23 Jan 2017, Florian Lindner wrote: > >> Hello, >> >> I try to build petsc from the maint branch together with petsc4py and mpi4py >> >> python2 configure --download-petsc4py=yes --download-mpi4py=yes --with-mpi4py=yes --with-petsc4py=yes --with-debugging=1 >> make >> >> works without errors, so does make test. >> >> % echo $PYTHONPATH >> /home/florian/software/petsc/arch-linux2-c-debug/lib >> >> % ls $PYTHONPATH >> libpetsc.so libpetsc.so.3.7 libpetsc.so.3.7.5 mpi4py mpi4py-1.3.1-py3.6.egg-info petsc petsc4py >> petsc4py-3.7.0-py3.6.egg-info pkgconfig >> >> >> but: >> >> % python2 RBF_Load.py >> Traceback (most recent call last): >> File "RBF_Load.py", line 10, in >> petsc4py.init(sys.argv) >> File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/__init__.py", line 42, in init >> PETSc = petsc4py.lib.ImportPETSc(arch) >> File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/lib/__init__.py", line 29, in ImportPETSc >> return Import('petsc4py', 'PETSc', path, arch) >> File "/home/florian/software/petsc/arch-linux2-c-debug/lib/petsc4py/lib/__init__.py", line 63, in Import >> fo, fn, stuff = imp.find_module(name, pathlist) >> ImportError: No module named PETSc >> >> >> Anyone having an idea what could be the problem here? >> I have also attached the configure.log >> >> Best, >> Florian >> > From bourdin at lsu.edu Tue Jan 24 08:57:18 2017 From: bourdin at lsu.edu (Blaise A Bourdin) Date: Tue, 24 Jan 2017 14:57:18 +0000 Subject: [petsc-users] DMPlexCreateExodusFromFile() does not work on macOS Sierra In-Reply-To: References: Message-ID: <59EC0C25-D6E8-459E-B143-148A36A0C81A@lsu.edu> Hi, I was able to build petsc master with home-brew gcc 6.3 and intel 17.0 under macOS sierra. 
As Mark said, it is important to reinstall the entire home-brew compiler + libs after upgrading macOS (and often Xcode). I am able to read / write exodus files. I am attaching my configure command lines: with intel 17.0: ./configure ./configure \ --download-chaco=1 \ --download-exodusii=1 \ --download-hdf5=1 \ --download-hypre=1 \ --download-metis=1 \ --download-ml=1 \ --download-netcdf=1 \ --download-parmetis=1 \ --download-triangle=1 \ --with-blas-lapack-dir=$MKLROOT \ --with-debugging=1 \ --with-mpi-dir=$MPI_HOME \ --with-pic \ --with-shared-libraries=1 \ --with-vendor-compilers=intel \ --with-x11=1 with gcc: ./configure \ --download-exodusii=1 \ --download-chaco=1 \ --download-ctetgen=1 \ --download-hdf5=1 \ --download-hypre=1 \ --download-metis=1 \ --download-ml=1 \ --download-netcdf=1 \ --download-parmetis=1 \ --download-triangle=1 \ --download-yaml=1 \ --with-debugging=1 \ --with-shared-libraries=1 \ --with-x11=1 Blaise On Jan 23, 2017, at 7:55 AM, Mark Adams > wrote: And I trust you updated all system software (like gcc, NetCDF and ExodusII). OSX upgrades are hell. On Mon, Jan 23, 2017 at 12:36 AM, Matthew Knepley > wrote: On Sun, Jan 22, 2017 at 11:18 PM, Fande Kong > wrote: Thanks, Matt, Clang does not have this issue. The code runs fine with clang. Okay, it sounds like a gcc bug on Mac 10.6, or at least in the version you have. Matt Fande, On Sun, Jan 22, 2017 at 8:03 PM, Matthew Knepley > wrote: On Sun, Jan 22, 2017 at 8:40 PM, Fande Kong > wrote: Thanks, Matt. It is a weird bug. Do we have an alternative solution to this? I was wondering whether it is possible to read the ".exo" files without using the ExodusII. For example, can we read the ".exo" files using the netcdf only? Well, ExodusII is only a think layer on NetCDF, just like other wrappers are thin layers on HDF5. It is really NetCDF that is failing. Can you switch compilers and see if that helps? 
Matt Fande Kong, On Sun, Jan 22, 2017 at 6:50 PM, Matthew Knepley > wrote: On Sun, Jan 22, 2017 at 5:28 PM, Fande Kong > wrote: On Sun, Jan 22, 2017 at 12:35 PM, Matthew Knepley > wrote: On Sun, Jan 22, 2017 at 12:41 PM, Fande Kong > wrote: On Sat, Jan 21, 2017 at 10:47 PM, Matthew Knepley > wrote: On Sat, Jan 21, 2017 at 10:38 PM, Fande Kong > wrote: Hi All, I upgraded the OS system to macOS Sierra, and observed that PETSc can not read the exodus file any more. The same code runs fine on macOS Capitan. I also tested the function DMPlexCreateExodusFromFile() against different versions of the GCC compiler such as GCC-5.4 and GCC-6, and neither of them work. I guess this issue is related to the external package exodus, and PETSc might not pick up the right enveriment variables for the exodus. This issue can be reproduced using the following simple code: 1) This is just a standard check. Have you reconfigured so that you know ExodusII was built with the same compilers and system libraries? 2) If so, can you get a stack trace with gdb or lldb? 0 libsystem_kernel.dylib 0x00007fffad8b8dda __pthread_kill + 10 1 libsystem_pthread.dylib 0x00007fffad9a4787 pthread_kill + 90 2 libsystem_c.dylib 0x00007fffad81e420 abort + 129 3 libpetsc.3.7.dylib 0x00000001100eb9ee PetscAbortErrorHandler + 506 (errstop.c:40) 4 libpetsc.3.7.dylib 0x00000001100e631d PetscError + 916 (err.c:379) 5 libpetsc.3.7.dylib 0x00000001100ed830 PetscSignalHandlerDefault + 1927 (signal.c:160) 6 libpetsc.3.7.dylib 0x00000001100ed088 PetscSignalHandler_Private(int) + 630 (signal.c:49) 7 libsystem_platform.dylib 0x00007fffad997bba _sigtramp + 26 8 ??? 
0x000000011ea09370 initialPoolContent + 19008 9 libnetcdf.7.dylib 0x000000011228fc62 utf8proc_map + 210 (dutf8proc.c:543) 10 libnetcdf.7.dylib 0x000000011228fd0f utf8proc_NFC + 38 (dutf8proc.c:568) 11 libnetcdf.7.dylib 0x00000001122a7928 NC_findattr + 110 (attr.c:341) 12 libnetcdf.7.dylib 0x00000001122a7a4e NC_lookupattr + 119 (attr.c:384) 13 libnetcdf.7.dylib 0x00000001122a93ef NC3_get_att + 47 (attr.c:1138) 14 libnetcdf.7.dylib 0x0000000112286126 nc_get_att_float + 90 (dattget.c:192) 15 libpetsc.3.7.dylib 0x00000001117f3a5b ex_open_int + 171 (ex_open.c:259) 16 libpetsc.3.7.dylib 0x0000000110c36609 DMPlexCreateExodusFromFile + 781 (plexexodusii.c:43) 17 DMPlexCreateExodusFromFile 0x000000010fed4cfc main + 397 (DMPlexCreateExodusFromFile.cpp:24) 18 libdyld.dylib 0x00007fffad78a255 start + 1 This is a NetCDF error on ex_open_int(). My guess is that your NetCDF build is old and when it calls the system DLL you bomb. Can you do a completely new build, meaning either reclone PETSc somewhere else, or delete the whole $PETSC_DIR/$PETSC_ARCH/externalpackage directory and reconfigure/build? Thanks, Matt Hi Matt, Thanks for reply. I recloned PETSc (the old petsc folder is deleted completely) and reconfigure. And still has the same issue. I also checked if the binary is complied against any other netcdf. The binary is actually complied against the right netcdf installed through PETSc. You can see that this crash happens on the call to int CPU_word_size = 0, IO_word_size = 0, exoid = -1; float version; exoid = ex_open(filename, EX_READ, &CPU_word_size, &IO_word_size, &version); which means the fault is not in PETSc, but rather in ExodusII for your machine. We could definitely confirm this if you made a 5 line program that only called this, but I don't see why it should be different. I am not sure what to do, since I am not in control of anything about ExodusII. Can you report this to their dev team? 
It is strange since Blaise has not reported this, and I know he uses it all the time. Thanks, Matt LiviadeMacBook-Pro:partition livia$ otool -L DMPlexCreateExodusFromFile DMPlexCreateExodusFromFile: /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libpetsc.3.7.dylib (compatibility version 3.7.0, current version 3.7.5) /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libsuperlu_dist.5.dylib (compatibility version 5.0.0, current version 5.1.3) /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libparmetis.dylib (compatibility version 0.0.0, current version 0.0.0) /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libmetis.dylib (compatibility version 0.0.0, current version 0.0.0) /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libnetcdf.7.dylib (compatibility version 10.0.0, current version 10.0.0) /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libhdf5_hl.8.dylib (compatibility version 9.0.0, current version 9.1.0) /Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libhdf5.8.dylib (compatibility version 9.0.0, current version 9.1.0) /opt/X11/lib/libX11.6.dylib (compatibility version 10.0.0, current version 10.0.0) /Users/livia/math/mpich-3.2_install/lib/libmpifort.12.dylib (compatibility version 14.0.0, current version 14.0.0) /usr/local/opt/gcc at 5/lib/gcc/5/libgfortran.3.dylib (compatibility version 4.0.0, current version 4.0.0) /usr/local/opt/gcc at 5/lib/gcc/5/libquadmath.0.dylib (compatibility version 1.0.0, current version 1.0.0) /Users/livia/math/mpich-3.2_install/lib/libmpicxx.12.dylib (compatibility version 14.0.0, current version 14.0.0) /usr/local/opt/gcc at 5/lib/gcc/5/libstdc++.6.dylib (compatibility version 7.0.0, current version 7.21.0) /usr/local/opt/gcc at 5/lib/gcc/5/libgcc_s.1.dylib (compatibility version 1.0.0, current version 1.0.0) /Users/livia/math/mpich-3.2_install/lib/libmpi.12.dylib (compatibility version 14.0.0, current version 14.0.0) /Users/livia/math/mpich-3.2_install/lib/libpmpi.12.dylib (compatibility version 14.0.0, current 
version 14.0.0) /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1238.0.0) /usr/local/lib/gcc/5/libgcc_s.1.dylib (compatibility version 1.0.0, current version 1.0.0) Matt static char help[] = " create mesh from exodus.\n\n"; #include #include #undef __FUNCT__ #define __FUNCT__ "main" int main(int argc,char **argv) { char fineMeshFileName[2048]; DM dm; MPI_Comm comm; PetscBool flg; PetscErrorCode ierr; ierr = PetscInitialize(&argc,&argv,(char *)0,help);CHKERRQ(ierr); comm = PETSC_COMM_WORLD; ierr = PetscOptionsGetString(NULL,NULL,"-file",fineMeshFileName,sizeof(fineMeshFileName),&flg);CHKERRQ(ierr); if(!flg){ SETERRQ(comm,PETSC_ERR_ARG_NULL,"please specify a fine mesh file \n"); } ierr = DMPlexCreateExodusFromFile( comm,fineMeshFileName, PETSC_FALSE, &dm);CHKERRQ(ierr); ierr = DMDestroy(&dm);CHKERRQ(ierr); ierr = PetscFinalize();CHKERRQ(ierr); } LiviadeMacBook-Pro:partition livia$ ./DMPlexCreateExodusFromFile -file Tri3.exo [0]PETSC ERROR: ------------------------------------------------------------------------ [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors [0]PETSC ERROR: likely location of problem given in stack below [0]PETSC ERROR: --------------------- Stack Frames ------------------------------------ [0]PETSC ERROR: Note: The EXACT line numbers in the stack are not available, [0]PETSC ERROR: INSTEAD the line number of the start of the function [0]PETSC ERROR: is given. 
[0]PETSC ERROR: [0] DMPlexCreateExodusFromFile line 38 /Users/livia/math/petsc/src/dm/impls/plex/plexexodusii.c
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Signal received
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.7.5, unknown
[0]PETSC ERROR: ./DMPlexCreateExodusFromFile on a arch-darwin-cxx-debug named LiviadeMacBook-Pro.local by livia Sat Jan 21 21:04:22 2017
[0]PETSC ERROR: Configure options --with-clanguage=cxx --with-shared-libraries=1 --download-fblaslapack=1 --with-mpi=1 --download-parmetis=1 --download-metis=1 --download-netcdf=1 --download-exodusii=1 --download-hdf5=1 --with-debugging=yes --with-c2html=0 --download-hypre=1 --with-64-bit-indices=1 --download-superlu_dist=1 PETSC_ARCH=arch-darwin-cxx-debug
[0]PETSC ERROR: #1 User provided function() line 0 in unknown file
application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=59 :
system msg for write_line failure : Bad file descriptor

The log files of make and configuration are also attached. If you have any
idea on this issue, please let me know!

Fande Kong,

-- 
What most experimenters take for granted before they begin their experiments
is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

-- 
Department of Mathematics and Center for Computation & Technology
Louisiana State University, Baton Rouge, LA 70803, USA
Tel. +1 (225) 578 1612, Fax +1 (225) 578 4276 http://www.math.lsu.edu/~bourdin

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From knepley at gmail.com Tue Jan 24 10:26:09 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Tue, 24 Jan 2017 10:26:09 -0600
Subject: [petsc-users] DMPlexCreateExodusFromFile() does not work on macOS Sierra
In-Reply-To: <59EC0C25-D6E8-459E-B143-148A36A0C81A@lsu.edu>
References: <59EC0C25-D6E8-459E-B143-148A36A0C81A@lsu.edu>
Message-ID: 

Thanks!

   Matt

On Tue, Jan 24, 2017 at 8:57 AM, Blaise A Bourdin wrote:

> Hi,
>
> I was able to build petsc master with home-brew gcc 6.3 and intel 17.0
> under macOS sierra. As Mark said, it is important to reinstall the entire
> home-brew compiler + libs after upgrading macOS (and often Xcode).
> I am able to read / write exodus files.
> I am attaching my configure command lines:
>
> with intel 17.0:
> ./configure \
>   --download-chaco=1 \
>   --download-exodusii=1 \
>   --download-hdf5=1 \
>   --download-hypre=1 \
>   --download-metis=1 \
>   --download-ml=1 \
>   --download-netcdf=1 \
>   --download-parmetis=1 \
>   --download-triangle=1 \
>   --with-blas-lapack-dir=$MKLROOT \
>   --with-debugging=1 \
>   --with-mpi-dir=$MPI_HOME \
>   --with-pic \
>   --with-shared-libraries=1 \
>   --with-vendor-compilers=intel \
>   --with-x11=1
>
> with gcc:
> ./configure \
>   --download-exodusii=1 \
>   --download-chaco=1 \
>   --download-ctetgen=1 \
>   --download-hdf5=1 \
>   --download-hypre=1 \
>   --download-metis=1 \
>   --download-ml=1 \
>   --download-netcdf=1 \
>   --download-parmetis=1 \
>   --download-triangle=1 \
>   --download-yaml=1 \
>   --with-debugging=1 \
>   --with-shared-libraries=1 \
>   --with-x11=1
>
> Blaise
>
> On Jan 23, 2017, at 7:55 AM, Mark Adams wrote:
>
> And I trust you updated all system software (like gcc, NetCDF and
> ExodusII). OSX upgrades are hell.
>
> On Mon, Jan 23, 2017 at 12:36 AM, Matthew Knepley wrote:
>
>> On Sun, Jan 22, 2017 at 11:18 PM, Fande Kong wrote:
>>
>>> Thanks, Matt,
>>>
>>> Clang does not have this issue. The code runs fine with clang.
>>
>> Okay, it sounds like a gcc bug on Mac 10.6, or at least in the version
>> you have.
>>
>>    Matt
>>
>>> Fande,
>>>
>>> On Sun, Jan 22, 2017 at 8:03 PM, Matthew Knepley wrote:
>>>
>>>> On Sun, Jan 22, 2017 at 8:40 PM, Fande Kong wrote:
>>>>
>>>>> Thanks, Matt.
>>>>>
>>>>> It is a weird bug.
>>>>>
>>>>> Do we have an alternative solution to this? I was wondering whether it
>>>>> is possible to read the ".exo" files without using the ExodusII. For
>>>>> example, can we read the ".exo" files using the netcdf only?
>>>>
>>>> Well, ExodusII is only a thin layer on NetCDF, just like other
>>>> wrappers are thin layers on HDF5. It is
>>>> really NetCDF that is failing. Can you switch compilers and see if that
>>>> helps?
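[Editorial note: switching compilers, as suggested above, is done at PETSc
configure time. A hedged sketch — the compiler paths below are assumptions
for a typical Homebrew layout, not taken from this machine's logs:]

```shell
# Point configure explicitly at the freshly reinstalled compilers so a
# stale toolchain left over from the macOS upgrade is not picked up.
# /usr/local/bin/gcc-6 etc. are assumed Homebrew paths; adjust as needed.
./configure \
  --with-cc=/usr/local/bin/gcc-6 \
  --with-cxx=/usr/local/bin/g++-6 \
  --with-fc=/usr/local/bin/gfortran-6 \
  --download-exodusii=1 --download-netcdf=1 --download-hdf5=1 \
  --with-debugging=1

# Or, to try Apple clang instead (no Fortran):
./configure --with-cc=clang --with-cxx=clang++ --with-fc=0 \
  --download-exodusii=1 --download-netcdf=1 --download-hdf5=1
```

Rebuilding the `--download-*` packages with the new compilers is the point of
the exercise, so the old $PETSC_ARCH should not be reused.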
>>>>
>>>>    Matt
>>>>
>>>>> Fande Kong,
>>>>>
>>>>> On Sun, Jan 22, 2017 at 6:50 PM, Matthew Knepley wrote:
>>>>>
>>>>>> On Sun, Jan 22, 2017 at 5:28 PM, Fande Kong wrote:
>>>>>>
>>>>>>> On Sun, Jan 22, 2017 at 12:35 PM, Matthew Knepley wrote:
>>>>>>>
>>>>>>>> On Sun, Jan 22, 2017 at 12:41 PM, Fande Kong wrote:
>>>>>>>>
>>>>>>>>> On Sat, Jan 21, 2017 at 10:47 PM, Matthew Knepley <knepley at gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> On Sat, Jan 21, 2017 at 10:38 PM, Fande Kong wrote:
>>>>>>>>>>
>>>>>>>>>>> Hi All,
>>>>>>>>>>>
>>>>>>>>>>> I upgraded the OS to macOS Sierra, and observed that
>>>>>>>>>>> PETSc cannot read the exodus file any more. The same code runs fine
>>>>>>>>>>> on OS X El Capitan. I also tested the function DMPlexCreateExodusFromFile()
>>>>>>>>>>> against different versions of the GCC compiler such as GCC-5.4 and GCC-6,
>>>>>>>>>>> and neither of them works. I guess this issue is related to the external
>>>>>>>>>>> package *exodus*, and PETSc might not pick up the right
>>>>>>>>>>> environment variables for the *exodus*.
>>>>>>>>>>>
>>>>>>>>>>> This issue can be reproduced using the following simple code:
>>>>>>>>>>
>>>>>>>>>> 1) This is just a standard check. Have you reconfigured so that
>>>>>>>>>> you know ExodusII was built with the same compilers and system libraries?
>>>>>>>>>>
>>>>>>>>>> 2) If so, can you get a stack trace with gdb or lldb?
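[Editorial note: one way to capture such a trace with lldb — a sketch of an
interactive session, using the binary and mesh names from the report above:]

```shell
# Everything after "--" is the target program and its arguments.
lldb -- ./DMPlexCreateExodusFromFile -file Tri3.exo
# (lldb) run      # reproduce the crash
# (lldb) bt       # print the backtrace at the point of the SEGV
```

PETSc can also launch the debugger for you via its own runtime option
`-start_in_debugger`, as the error banner in the log suggests.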
>>>>>>>>>>
>>>>>>>>>
>>>>>>>>> 0   libsystem_kernel.dylib      0x00007fffad8b8dda __pthread_kill + 10
>>>>>>>>> 1   libsystem_pthread.dylib     0x00007fffad9a4787 pthread_kill + 90
>>>>>>>>> 2   libsystem_c.dylib           0x00007fffad81e420 abort + 129
>>>>>>>>> 3   libpetsc.3.7.dylib          0x00000001100eb9ee PetscAbortErrorHandler + 506 (errstop.c:40)
>>>>>>>>> 4   libpetsc.3.7.dylib          0x00000001100e631d PetscError + 916 (err.c:379)
>>>>>>>>> 5   libpetsc.3.7.dylib          0x00000001100ed830 PetscSignalHandlerDefault + 1927 (signal.c:160)
>>>>>>>>> 6   libpetsc.3.7.dylib          0x00000001100ed088 PetscSignalHandler_Private(int) + 630 (signal.c:49)
>>>>>>>>> 7   libsystem_platform.dylib    0x00007fffad997bba _sigtramp + 26
>>>>>>>>> 8   ???                         0x000000011ea09370 initialPoolContent + 19008
>>>>>>>>> 9   libnetcdf.7.dylib           0x000000011228fc62 utf8proc_map + 210 (dutf8proc.c:543)
>>>>>>>>> 10  libnetcdf.7.dylib           0x000000011228fd0f utf8proc_NFC + 38 (dutf8proc.c:568)
>>>>>>>>> 11  libnetcdf.7.dylib           0x00000001122a7928 NC_findattr + 110 (attr.c:341)
>>>>>>>>> 12  libnetcdf.7.dylib           0x00000001122a7a4e NC_lookupattr + 119 (attr.c:384)
>>>>>>>>> 13  libnetcdf.7.dylib           0x00000001122a93ef NC3_get_att + 47 (attr.c:1138)
>>>>>>>>> 14  libnetcdf.7.dylib           0x0000000112286126 nc_get_att_float + 90 (dattget.c:192)
>>>>>>>>> 15  libpetsc.3.7.dylib          0x00000001117f3a5b ex_open_int + 171 (ex_open.c:259)
>>>>>>>>> 16  libpetsc.3.7.dylib          0x0000000110c36609 DMPlexCreateExodusFromFile + 781 (plexexodusii.c:43)
>>>>>>>>> 17  DMPlexCreateExodusFromFile  0x000000010fed4cfc main + 397 (DMPlexCreateExodusFromFile.cpp:24)
>>>>>>>>> 18  libdyld.dylib               0x00007fffad78a255 start + 1
>>>>>>>>
>>>>>>>> This is a NetCDF error on ex_open_int(). My guess is that your
>>>>>>>> NetCDF build is old and when it calls the system DLL
>>>>>>>> you bomb.
>>>>>>>> Can you do a completely new build, meaning either reclone
>>>>>>>> PETSc somewhere else, or delete the whole
>>>>>>>> $PETSC_DIR/$PETSC_ARCH/externalpackage directory and
>>>>>>>> reconfigure/build?
>>>>>>>>
>>>>>>>> Thanks,
>>>>>>>>
>>>>>>>>    Matt
>>>>>>>
>>>>>>> Hi Matt,
>>>>>>>
>>>>>>> Thanks for the reply. I recloned PETSc (the old petsc folder was
>>>>>>> deleted completely) and reconfigured, and it still has the same issue. I also
>>>>>>> checked whether the binary is compiled against any other netcdf. The binary is
>>>>>>> actually compiled against the right netcdf installed through PETSc.
>>>>>>
>>>>>> You can see that this crash happens on the call to
>>>>>>
>>>>>>   int CPU_word_size = 0, IO_word_size = 0, exoid = -1;
>>>>>>   float version;
>>>>>>
>>>>>>   exoid = ex_open(filename, EX_READ, &CPU_word_size, &IO_word_size, &version);
>>>>>>
>>>>>> which means the fault is not in PETSc, but rather in ExodusII for
>>>>>> your machine. We could definitely
>>>>>> confirm this if you made a 5 line program that only called this, but
>>>>>> I don't see why it should be different.
>>>>>> I am not sure what to do, since I am not in control of anything about
>>>>>> ExodusII. Can you report this to
>>>>>> their dev team? It is strange since Blaise has not reported this, and
>>>>>> I know he uses it all the time.
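[Editorial note: the "5 line program" suggested above might look like the
sketch below. It exercises only ExodusII's ex_open(), with no PETSc involved;
the default file name Tri3.exo is taken from the original report, and an
ExodusII installation is assumed (it is not runnable without one).]

```c
/* Minimal isolation test: call ex_open() directly, outside PETSc.
   Assumes the ExodusII headers and library are available. */
#include <stdio.h>
#include "exodusII.h"

int main(int argc, char **argv)
{
  int   CPU_word_size = 0, IO_word_size = 0, exoid = -1;
  float version;

  exoid = ex_open(argc > 1 ? argv[1] : "Tri3.exo", EX_READ,
                  &CPU_word_size, &IO_word_size, &version);
  printf("ex_open returned id %d, version %f\n", exoid, version);
  if (exoid >= 0) ex_close(exoid);
  return 0;
}
```

If this crashes the same way, the problem is confirmed to be in the
ExodusII/NetCDF stack rather than in PETSc.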
>>>>>> >>>>>> Thanks, >>>>>> >>>>>> Matt >>>>>> >>>>>> >>>>>>> *LiviadeMacBook-Pro:partition livia$ otool -L >>>>>>> DMPlexCreateExodusFromFile* >>>>>>> *DMPlexCreateExodusFromFile:* >>>>>>> */Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libpetsc.3.7.dylib >>>>>>> (compatibility version 3.7.0, current version 3.7.5)* >>>>>>> */Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libsuperlu_dist.5.dylib >>>>>>> (compatibility version 5.0.0, current version 5.1.3)* >>>>>>> */Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libparmetis.dylib >>>>>>> (compatibility version 0.0.0, current version 0.0.0)* >>>>>>> */Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libmetis.dylib >>>>>>> (compatibility version 0.0.0, current version 0.0.0)* >>>>>>> */Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libnetcdf.7.dylib >>>>>>> (compatibility version 10.0.0, current version 10.0.0)* >>>>>>> */Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libhdf5_hl.8.dylib >>>>>>> (compatibility version 9.0.0, current version 9.1.0)* >>>>>>> */Users/livia/math/petsc/arch-darwin-cxx-debug/lib/libhdf5.8.dylib >>>>>>> (compatibility version 9.0.0, current version 9.1.0)* >>>>>>> */opt/X11/lib/libX11.6.dylib (compatibility version 10.0.0, current >>>>>>> version 10.0.0)* >>>>>>> */Users/livia/math/mpich-3.2_install/lib/libmpifort.12.dylib >>>>>>> (compatibility version 14.0.0, current version 14.0.0)* >>>>>>> */usr/local/opt/gcc at 5/lib/gcc/5/libgfortran.3.dylib (compatibility >>>>>>> version 4.0.0, current version 4.0.0)* >>>>>>> */usr/local/opt/gcc at 5/lib/gcc/5/libquadmath.0.dylib (compatibility >>>>>>> version 1.0.0, current version 1.0.0)* >>>>>>> */Users/livia/math/mpich-3.2_install/lib/libmpicxx.12.dylib >>>>>>> (compatibility version 14.0.0, current version 14.0.0)* >>>>>>> */usr/local/opt/gcc at 5/lib/gcc/5/libstdc++.6.dylib (compatibility >>>>>>> version 7.0.0, current version 7.21.0)* >>>>>>> */usr/local/opt/gcc at 5/lib/gcc/5/libgcc_s.1.dylib (compatibility >>>>>>> version 1.0.0, 
current version 1.0.0)* >>>>>>> */Users/livia/math/mpich-3.2_install/lib/libmpi.12.dylib >>>>>>> (compatibility version 14.0.0, current version 14.0.0)* >>>>>>> */Users/livia/math/mpich-3.2_install/lib/libpmpi.12.dylib >>>>>>> (compatibility version 14.0.0, current version 14.0.0)* >>>>>>> */usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current >>>>>>> version 1238.0.0)* >>>>>>> */usr/local/lib/gcc/5/libgcc_s.1.dylib (compatibility version 1.0.0, >>>>>>> current version 1.0.0)* >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>>>>> Matt >>>>>>>>>> >>>>>>>>>> >>>>>>>>>>> *static char help[] = " create mesh from exodus.\n\n";* >>>>>>>>>>> >>>>>>>>>>> *#include * >>>>>>>>>>> *#include * >>>>>>>>>>> >>>>>>>>>>> *#undef __FUNCT__* >>>>>>>>>>> *#define __FUNCT__ "main"* >>>>>>>>>>> *int main(int argc,char **argv)* >>>>>>>>>>> *{* >>>>>>>>>>> * char fineMeshFileName[2048];* >>>>>>>>>>> * DM dm;* >>>>>>>>>>> * MPI_Comm comm;* >>>>>>>>>>> * PetscBool flg;* >>>>>>>>>>> >>>>>>>>>>> * PetscErrorCode ierr;* >>>>>>>>>>> >>>>>>>>>>> * ierr = PetscInitialize(&argc,&argv,(char >>>>>>>>>>> *)0,help);CHKERRQ(ierr);* >>>>>>>>>>> * comm = PETSC_COMM_WORLD;* >>>>>>>>>>> * ierr = >>>>>>>>>>> PetscOptionsGetString(NULL,NULL,"-file",fineMeshFileName,sizeof(fineMeshFileName),&flg);CHKERRQ(ierr);* >>>>>>>>>>> * if(!flg){* >>>>>>>>>>> *SETERRQ(comm,PETSC_ERR_ARG_NULL,"please specify a fine mesh >>>>>>>>>>> file \n");* >>>>>>>>>>> * }* >>>>>>>>>>> * ierr = DMPlexCreateExodusFromFile( comm,fineMeshFileName, >>>>>>>>>>> PETSC_FALSE, &dm);CHKERRQ(ierr);* >>>>>>>>>>> * ierr = DMDestroy(&dm);CHKERRQ(ierr);* >>>>>>>>>>> * ierr = PetscFinalize();CHKERRQ(ierr);* >>>>>>>>>>> *}* >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> *LiviadeMacBook-Pro:partition livia$ >>>>>>>>>>> ./DMPlexCreateExodusFromFile -file Tri3.exo * >>>>>>>>>>> *[0]PETSC ERROR: >>>>>>>>>>> ------------------------------------------------------------------------* >>>>>>>>>>> *[0]PETSC ERROR: Caught signal number 
11 SEGV: Segmentation >>>>>>>>>>> Violation, probably memory access out of range* >>>>>>>>>>> *[0]PETSC ERROR: Try option -start_in_debugger or >>>>>>>>>>> -on_error_attach_debugger* >>>>>>>>>>> *[0]PETSC ERROR: or see >>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind >>>>>>>>>>> * >>>>>>>>>>> *[0]PETSC ERROR: or try http://valgrind.org >>>>>>>>>>> on GNU/linux and Apple Mac OS X to find memory >>>>>>>>>>> corruption errors* >>>>>>>>>>> *[0]PETSC ERROR: likely location of problem given in stack below* >>>>>>>>>>> *[0]PETSC ERROR: --------------------- Stack Frames >>>>>>>>>>> ------------------------------------* >>>>>>>>>>> *[0]PETSC ERROR: Note: The EXACT line numbers in the stack are >>>>>>>>>>> not available,* >>>>>>>>>>> *[0]PETSC ERROR: INSTEAD the line number of the start of >>>>>>>>>>> the function* >>>>>>>>>>> *[0]PETSC ERROR: is given.* >>>>>>>>>>> *[0]PETSC ERROR: [0] DMPlexCreateExodusFromFile line 38 >>>>>>>>>>> /Users/livia/math/petsc/src/dm/impls/plex/plexexodusii.c* >>>>>>>>>>> *[0]PETSC ERROR: --------------------- Error Message >>>>>>>>>>> --------------------------------------------------------------* >>>>>>>>>>> *[0]PETSC ERROR: Signal received* >>>>>>>>>>> *[0]PETSC ERROR: See >>>>>>>>>>> http://www.mcs.anl.gov/petsc/documentation/faq.html >>>>>>>>>>> for trouble shooting.* >>>>>>>>>>> *[0]PETSC ERROR: Petsc Release Version 3.7.5, unknown * >>>>>>>>>>> *[0]PETSC ERROR: ./DMPlexCreateExodusFromFile on a >>>>>>>>>>> arch-darwin-cxx-debug named LiviadeMacBook-Pro.local by livia Sat Jan 21 >>>>>>>>>>> 21:04:22 2017* >>>>>>>>>>> *[0]PETSC ERROR: Configure options --with-clanguage=cxx >>>>>>>>>>> --with-shared-libraries=1 --download-fblaslapack=1 --with-mpi=1 >>>>>>>>>>> --download-parmetis=1 --download-metis=1 --download-netcdf=1 >>>>>>>>>>> --download-exodusii=1 --download-hdf5=1 --with-debugging=yes >>>>>>>>>>> --with-c2html=0 --download-hypre=1 --with-64-bit-indices=1 >>>>>>>>>>> --download-superlu_dist=1 
PETSC_ARCH=arch-darwin-cxx-debug* >>>>>>>>>>> *[0]PETSC ERROR: #1 User provided function() line 0 in unknown >>>>>>>>>>> file* >>>>>>>>>>> *application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0* >>>>>>>>>>> *[unset]: write_line error; fd=-1 buf=:cmd=abort exitcode=59* >>>>>>>>>>> *:* >>>>>>>>>>> *system msg for write_line failure : Bad file descriptor* >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>>> The log files of make and configuration are also attached. If >>>>>>>>>>> you have any idea on this issue, please let me know! >>>>>>>>>>> >>>>>>>>>>> Fande Kong, >>>>>>>>>>> >>>>>>>>>>> >>>>>>>>>> >>>>>>>>>> >>>>>>>>>> -- >>>>>>>>>> What most experimenters take for granted before they begin their >>>>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>>>> experiments lead. >>>>>>>>>> -- Norbert Wiener >>>>>>>>>> >>>>>>>>> >>>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> -- >>>>>>>> What most experimenters take for granted before they begin their >>>>>>>> experiments is infinitely more interesting than any results to which their >>>>>>>> experiments lead. >>>>>>>> -- Norbert Wiener >>>>>>>> >>>>>>> >>>>>>> >>>>>> >>>>>> >>>>>> -- >>>>>> What most experimenters take for granted before they begin their >>>>>> experiments is infinitely more interesting than any results to which their >>>>>> experiments lead. >>>>>> -- Norbert Wiener >>>>>> >>>>> >>>>> >>>> >>>> >>>> -- >>>> What most experimenters take for granted before they begin their >>>> experiments is infinitely more interesting than any results to which their >>>> experiments lead. >>>> -- Norbert Wiener >>>> >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > > -- > Department of Mathematics and Center for Computation & Technology > Louisiana State University, Baton Rouge, LA 70803, USA > Tel. 
+1 (225) 578 1612, Fax +1 (225) 578 4276 http://www.math.lsu.edu/~bourdin

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From fande.kong at inl.gov Tue Jan 24 10:39:35 2017
From: fande.kong at inl.gov (Kong, Fande)
Date: Tue, 24 Jan 2017 09:39:35 -0700
Subject: [petsc-users] DMPlexCreateExodusFromFile() does not work on macOS Sierra
In-Reply-To: 
References: <59EC0C25-D6E8-459E-B143-148A36A0C81A@lsu.edu>
Message-ID: 

Thanks, Blaise and Matt,

I possibly messed up the brew installation somehow. I will clean up
everything and start over, and report back.

Fande,

On Tue, Jan 24, 2017 at 9:26 AM, Matthew Knepley wrote:

> Thanks!
>
>    Matt
>
> On Tue, Jan 24, 2017 at 8:57 AM, Blaise A Bourdin wrote:
>
>> Hi,
>>
>> I was able to build petsc master with home-brew gcc 6.3 and intel 17.0
>> under macOS sierra. As Mark said, it is important to reinstall the entire
>> home-brew compiler + libs after upgrading macOS (and often Xcode).
>> I am able to read / write exodus files.
>>>>> -- Norbert Wiener

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From epscodes at gmail.com Wed Jan 25 09:17:16 2017
From: epscodes at gmail.com (Xiangdong)
Date: Wed, 25 Jan 2017 10:17:16 -0500
Subject: [petsc-users] MatSetValuesStencil for cols not in the dmda stencil width
Message-ID: 

Hello everyone,

I have a question on setting matrix entries which are not in the stencil
width. Take ksp ex45.c as an example:
http://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/examples/tutorials/ex45.c.html

Instead of setting up the standard 7-point stencil, now for each cell the
matrix also has an additional dependency on the cell (Mx, My, Mz). Namely,
for each row, the col corresponding to (Mx, My, Mz) is always nonzero.
I modify the example code to add these entries:

+  MatSetOption(B,MAT_NEW_NONZERO_LOCATIONS,PETSC_TRUE);
+  MatSetOption(jac,MAT_NEW_NONZERO_LOCATIONS,PETSC_TRUE);

+  v[7] = 100; col[7].i = mx-1; col[7].j = my-1; col[7].k = mz-1;
+  ierr = MatSetValuesStencil(B,1,&row,8,col,v,INSERT_VALUES);CHKERRQ(ierr);

It works for np=1, but crashes for np>=2 with the error message:

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Argument out of range
[0]PETSC ERROR: Local index 342 too large 244 (max) at 7
[0]PETSC ERROR: #1 ISLocalToGlobalMappingApply() line 423 in petsc-3.7.4/src/vec/is/utils/isltog.c
[0]PETSC ERROR: #2 MatSetValuesLocal() line 2052 in petsc-3.7.4/src/mat/interface/matrix.c
[0]PETSC ERROR: #3 MatSetValuesStencil() line 1447 in petsc-3.7.4/src/mat/interface/matrix.c
[0]PETSC ERROR: #4 ComputeMatrix() line 151 in extest.c

Can I add new entries at columns outside the stencil width to the DMDA matrix
or Jacobian?

Attached please find the modified ex45 example, the diff file, as well as the
run log.

Thanks for your help.

Best,
Xiangdong
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
diff --git a/ex45.c b/ex45.c
index b4bb565..6bcb074 100644
--- a/ex45.c
+++ b/ex45.c
@@ -117,8 +117,8 @@ PetscErrorCode ComputeMatrix(KSP ksp,Mat jac,Mat B,void *ctx)
   DM             da;
   PetscErrorCode ierr;
   PetscInt       i,j,k,mx,my,mz,xm,ym,zm,xs,ys,zs;
-  PetscScalar    v[7],Hx,Hy,Hz,HxHydHz,HyHzdHx,HxHzdHy;
-  MatStencil     row,col[7];
+  PetscScalar    v[8],Hx,Hy,Hz,HxHydHz,HyHzdHx,HxHzdHy;
+  MatStencil     row,col[8];
 
   PetscFunctionBeginUser;
   ierr = KSPGetDM(ksp,&da);CHKERRQ(ierr);
@@ -127,6 +127,9 @@ PetscErrorCode ComputeMatrix(KSP ksp,Mat jac,Mat B,void *ctx)
   HxHydHz = Hx*Hy/Hz; HxHzdHy = Hx*Hz/Hy; HyHzdHx = Hy*Hz/Hx;
   ierr = DMDAGetCorners(da,&xs,&ys,&zs,&xm,&ym,&zm);CHKERRQ(ierr);
+  MatSetOption(B,MAT_NEW_NONZERO_LOCATIONS,PETSC_TRUE);
+  MatSetOption(jac,MAT_NEW_NONZERO_LOCATIONS,PETSC_TRUE);
+
   for (k=zs; k
-------------- next part --------------
DM Object: 2 MPI processes
  type: da
Processor [0] M 7 N 7 P 7 m 1 n 1 p 2 w 1 s 1
X range of indices: 0 7, Y range of indices: 0 7, Z range of indices: 0 4
Processor [1] M 7 N 7 P 7 m 1 n 1 p 2 w 1 s 1
X range of indices: 0 7, Y range of indices: 0 7, Z range of indices: 4 7
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Argument out of range
[0]PETSC ERROR: Local index 342 too large 244 (max) at 7
[0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.7.4, Oct, 02, 2016 [0]PETSC ERROR: #1 ISLocalToGlobalMappingApply() line 423 in petsc-3.7.4/src/vec/is/utils/isltog.c [0]PETSC ERROR: #2 MatSetValuesLocal() line 2052 in petsc-3.7.4/src/mat/interface/matrix.c [0]PETSC ERROR: #3 MatSetValuesStencil() line 1447 in petsc-3.7.4/src/mat/interface/matrix.c [0]PETSC ERROR: #4 ComputeMatrix() line 151 in extest.c [0]PETSC ERROR: #5 KSPSetUp() line 341 in petsc-3.7.4/src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: #6 KSPSolve() line 599 in petsc-3.7.4/src/ksp/ksp/interface/itfunc.c [0]PETSC ERROR: #7 main() line 51 in extest.c [0]PETSC ERROR: PETSc Option Table entries: [0]PETSC ERROR: -dm_view [0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-maint at mcs.anl.gov---------- From knepley at gmail.com Wed Jan 25 09:53:19 2017 From: knepley at gmail.com (Matthew Knepley) Date: Wed, 25 Jan 2017 09:53:19 -0600 Subject: [petsc-users] MatSetValuesStencil for cols not in the dmda stencil width In-Reply-To: References: Message-ID: On Wed, Jan 25, 2017 at 9:17 AM, Xiangdong wrote: > Hello everyone, > > I have a question on setting matrix entries which are not in the stencil > width. Take ksp ex45.c as an example, > http://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/ > examples/tutorials/ex45.c.html > > Instead of setting up the standard 7-point stencil, now for each cell, the > matrix also has a additional dependency on the cell (Mx, My, Mz). Namely, > for each row, the col corresponding to (Mx, My, Mz) is always nonzero. 
I > modify the example code to add this entries: > > + MatSetOption(B,MAT_NEW_NONZERO_LOCATIONS,PETSC_TRUE); > + MatSetOption(jac,MAT_NEW_NONZERO_LOCATIONS,PETSC_TRUE); > > + v[7] = 100; col[7].i = mx-1; col[7].j = my-1; col[7].k = mz-1; > + ierr = MatSetValuesStencil(B,1,&row,8,col,v,INSERT_VALUES); > CHKERRQ(ierr); > > It is okay to for np=1, but crash for np>=2 with the error message: > You can do this, but 1) You cannot use MatSetStencil, since your entry is not actually in your stencil. You will have to make a separate call to MatSetValues() using the global index. 2) The nonzero pattern we have allocated will be wrong. You will have to set the MatOption which gives an error on new nonzeros to PETSC_FALSE. 3) You will have a dense row in your Jacobian, which is hard to make perform well, and also stymies most preconditioners. Thanks, Matt > [0]PETSC ERROR: --------------------- Error Message > -------------------------------------------------------------- > [0]PETSC ERROR: Argument out of range > [0]PETSC ERROR: Local index 342 too large 244 (max) at 7 > > [0]PETSC ERROR: #1 ISLocalToGlobalMappingApply() line 423 in > petsc-3.7.4/src/vec/is/utils/isltog.c > [0]PETSC ERROR: #2 MatSetValuesLocal() line 2052 in > petsc-3.7.4/src/mat/interface/matrix.c > [0]PETSC ERROR: #3 MatSetValuesStencil() line 1447 in > petsc-3.7.4/src/mat/interface/matrix.c > [0]PETSC ERROR: #4 ComputeMatrix() line 151 in extest.c > > Can I add new entries to the cols not in the stencil width into the dmda > matrix or Jacobian? > > Attached please find the modifed ex45 example, the diff file as well as > the run log. > > Thanks for your help. > > Best, > Xiangdong > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
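To make the error message above concrete: the numbers 342 and 244 can be reproduced from the -dm_view output in the attached run log. The sketch below is only a back-of-envelope reconstruction of the ghosted-index arithmetic (the actual MatSetValuesStencil internals may differ); it assumes rank 0's ghosted region is 7 x 7 x 5 points, i.e. z = 0..3 owned plus one ghost layer for stencil width s = 1:

```python
# Rank 0's ghosted (local) region, read off the -dm_view output above:
# X and Y cover 0..6 (7 points each); Z covers the owned range 0..3 plus
# one ghost layer, i.e. 0..4 (5 points), since the stencil width s = 1.
mx, my = 7, 7          # global X/Y extents
gzm = 5                # ghosted Z extent on rank 0 (4 owned + 1 ghost)

def local_index(i, j, k):
    """Lexicographic index of (i,j,k) within rank 0's ghosted box."""
    return k * (my * mx) + j * mx + i

# Largest valid local index on rank 0:
print(my * mx * gzm - 1)        # -> 244, the "max" in the error message

# The extra column (mx-1, my-1, mz-1) = (6,6,6) lies outside the ghosted
# box; indexed as if it were inside, it would be:
print(local_index(6, 6, 6))     # -> 342, the out-of-range local index
```

In other words, the far-corner entry simply has no local (ghosted) index on rank 0, which is why the suggestion above is to bypass the stencil path and call MatSetValues() with global indices instead.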
URL: From A.T.T.McRae at bath.ac.uk Wed Jan 25 13:13:03 2017 From: A.T.T.McRae at bath.ac.uk (Andrew McRae) Date: Wed, 25 Jan 2017 19:13:03 +0000 Subject: [petsc-users] how to stop SNES linesearch (l^2 minimization) from choosing obviously suboptimal lambda? Message-ID: I have a nonlinear problem in which the line search procedure is making 'obviously wrong' choices for lambda. My nonlinear solver options (going via petsc4py) include {"snes_linesearch_type": "l2", "snes_linesearch_max_it": 3}. After monotonically decreasing the residual by about 4 orders of magnitude, I get the following... 15 SNES Function norm 9.211230243067e-06 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.13039e-05, 3.14838e-05, 9.21123e-06] Line search: lambdas = [1.25615, 1.12808, 1.], fnorms = [3.14183e-05, 3.13437e-05, 3.13039e-05] Line search: lambdas = [0.91881, 1.08748, 1.25615], fnorms = [3.12969e-05, 3.13273e-05, 3.14183e-05] Line search terminated: lambda = 0.918811, fnorms = 3.12969e-05 16 SNES Function norm 3.129688997145e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.09357e-05, 1.58135e-05, 3.12969e-05] Line search: lambdas = [0.503912, 0.751956, 1.], fnorms = [1.59287e-05, 2.33645e-05, 3.09357e-05] Line search: lambdas = [0.0186202, 0.261266, 0.503912], fnorms = [3.07204e-05, 9.11e-06, 1.59287e-05] Line search terminated: lambda = 0.342426, fnorms = 1.12885e-05 17 SNES Function norm 1.128846081676e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.09448e-05, 5.94789e-06, 1.12885e-05] Line search: lambdas = [0.295379, 0.64769, 1.], fnorms = [8.09996e-06, 4.46782e-06, 3.09448e-05] Line search: lambdas = [0.48789, 0.391635, 0.295379], fnorms = [6.07286e-06, 7.07842e-06, 8.09996e-06] Line search terminated: lambda = 0.997854, fnorms = 3.09222e-05 18 SNES Function norm 3.092215965860e-05 So, in iteration 16, the lambda chosen is 0.91..., even though we see that lambda close to 0 would give a smaller residual. 
In iteration 18, we see that some lambda around 0.65 gives a far smaller residual (approx 4e-6) than the 0.997... value that gets used (which gives approx 3e-5). The nonlinear iterations then get caught in some kind of cycle until my snes_max_it is reached [full log attached].

I guess this is an artifact of (if I understand correctly) trying to minimize some polynomial fitted to the evaluated values of lambda? But it's frustrating that it leads to 'obviously wrong' results!

For background information, this comes from an FE discretisation of a Monge-Ampère equation (and also from several timesteps into a time-varying problem). For various reasons (related to Monge-Ampère convexity requirements), I use a partial Jacobian that omits a term from the linearisation of the residual, and so the nonlinear convergence is not expected to be quadratic.

Andrew
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
0 SNES Function norm 8.029428739596e-02 Line search: lambdas = [1., 0.5, 0.], fnorms = [0.0361527, 0.0477994, 0.0802943] Line search: lambdas = [0.903514, 0.951757, 1.], fnorms = [0.0352394, 0.0354822, 0.0361527] Line search: lambdas = [0.90078, 0.902147, 0.903514], fnorms = [0.0352386, 0.0352388, 0.0352394] Line search terminated: lambda = 0.900703, fnorms = 0.0352386 1 SNES Function norm 3.523861076534e-02 Line search: lambdas = [1., 0.5, 0.], fnorms = [0.0172942, 0.0221407, 0.0352386] Line search: lambdas = [0.920512, 0.960256, 1.], fnorms = [0.0170826, 0.0171367, 0.0172942] Line search: lambdas = [0.919798, 0.920155, 0.920512], fnorms
= [0.0105751, 0.0105751, 0.0105751] Line search terminated: lambda = 0.805802, fnorms = 0.0105755 3 SNES Function norm 1.057551738195e-02 Line search: lambdas = [1., 0.5, 0.], fnorms = [0.00621427, 0.00740836, 0.0105755] Line search: lambdas = [0.949881, 0.97494, 1.], fnorms = [0.00619869, 0.0062024, 0.00621427] Line search: lambdas = [0.951028, 0.950454, 0.949881], fnorms = [0.00619868, 0.00619868, 0.00619869] Line search terminated: lambda = 0.951032, fnorms = 0.00619868 4 SNES Function norm 6.198678560523e-03 Line search: lambdas = [1., 0.5, 0.], fnorms = [0.00325715, 0.00397891, 0.00619868] Line search: lambdas = [0.900345, 0.950172, 1.], fnorms = [0.00320353, 0.00321705, 0.00325715] Line search: lambdas = [0.90023, 0.900287, 0.900345], fnorms = [0.00320353, 0.00320353, 0.00320353] Line search terminated: lambda = 0.900225, fnorms = 0.00320353 5 SNES Function norm 3.203532630368e-03 Line search: lambdas = [1., 0.5, 0.], fnorms = [0.00198382, 0.00218876, 0.00320353] Line search: lambdas = [0.842609, 0.921305, 1.], fnorms = [0.00192443, 0.00193951, 0.00198382] Line search: lambdas = [0.842274, 0.842442, 0.842609], fnorms = [0.00192443, 0.00192443, 0.00192443] Line search terminated: lambda = 0.842222, fnorms = 0.00192443 6 SNES Function norm 1.924427560912e-03 Line search: lambdas = [1., 0.5, 0.], fnorms = [0.00108014, 0.00131485, 0.00192443] Line search: lambdas = [0.948983, 0.974492, 1.], fnorms = [0.0010767, 0.00107757, 0.00108014] Line search: lambdas = [0.948765, 0.948874, 0.948983], fnorms = [0.0010767, 0.0010767, 0.0010767] Line search terminated: lambda = 0.948764, fnorms = 0.0010767 7 SNES Function norm 1.076698506228e-03 Line search: lambdas = [1., 0.5, 0.], fnorms = [0.000571456, 0.0006976, 0.0010767] Line search: lambdas = [0.906164, 0.953082, 1.], fnorms = [0.000563516, 0.000565509, 0.000571456] Line search: lambdas = [0.906231, 0.906197, 0.906164], fnorms = [0.000563516, 0.000563516, 0.000563516] Line search terminated: lambda = 0.906232, fnorms = 
0.000563516 8 SNES Function norm 5.635162562152e-04 Line search: lambdas = [1., 0.5, 0.], fnorms = [0.000339155, 0.000383369, 0.000563516] Line search: lambdas = [0.865217, 0.932609, 1.], fnorms = [0.000330868, 0.000333796, 0.000339155] Line search: lambdas = [0.819914, 0.842566, 0.865217], fnorms = [0.000331865, 0.000331154, 0.000330868] Line search terminated: lambda = 0.869063, fnorms = 0.000330862 9 SNES Function norm 3.308616258520e-04 Line search: lambdas = [1., 0.5, 0.], fnorms = [0.00018679, 0.000227276, 0.000330862] Line search: lambdas = [0.954185, 0.977092, 1.], fnorms = [0.000186236, 0.000186395, 0.00018679] Line search: lambdas = [0.950139, 0.952162, 0.954185], fnorms = [0.000186232, 0.000186233, 0.000186236] Line search terminated: lambda = 0.950139, fnorms = 0.000186232 10 SNES Function norm 1.862321617129e-04 Line search: lambdas = [1., 0.5, 0.], fnorms = [0.000102589, 0.000121554, 0.000186232] Line search: lambdas = [0.885752, 0.942876, 1.], fnorms = [9.73359e-05, 9.71282e-05, 0.000102589] Line search: lambdas = [0.916354, 0.901053, 0.885752], fnorms = [9.71018e-05, 9.71834e-05, 9.73359e-05] Line search terminated: lambda = 0.926324, fnorms = 9.70867e-05 11 SNES Function norm 9.708668367121e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [6.38156e-05, 7.19222e-05, 9.70867e-05] Line search: lambdas = [0.924516, 0.962258, 1.], fnorms = [6.31271e-05, 6.33834e-05, 6.38156e-05] Line search: lambdas = [0.889094, 0.906805, 0.924516], fnorms = [6.30484e-05, 6.30681e-05, 6.31271e-05] Line search terminated: lambda = 0.889093, fnorms = 6.30484e-05 12 SNES Function norm 6.304843182372e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.09866e-05, 4.05459e-05, 6.30484e-05] Line search: lambdas = [0.957547, 0.978774, 1.], fnorms = [3.08906e-05, 3.09146e-05, 3.09866e-05] Line search: lambdas = [0.957541, 0.957544, 0.957547], fnorms = [3.08906e-05, 3.08906e-05, 3.08906e-05] Line search terminated: lambda = 0.95754, fnorms = 3.08906e-05 13 SNES Function norm 
3.089060391537e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [1.67849e-05, 2.06894e-05, 3.08906e-05] Line search: lambdas = [0.942596, 0.971298, 1.], fnorms = [1.67101e-05, 1.67288e-05, 1.67849e-05] Line search: lambdas = [0.942596, 0.942596, 0.942596], fnorms = [1.67101e-05, 1.67101e-05, 1.67101e-05] Line search terminated: lambda = 0.942596, fnorms = 1.67101e-05 14 SNES Function norm 1.671012156294e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.22057e-05, 1.12535e-05, 1.67101e-05] Line search: lambdas = [0.321761, 0.660881, 1.], fnorms = [1.29696e-05, 1.00659e-05, 3.22057e-05] Line search: lambdas = [0.513941, 0.417851, 0.321761], fnorms = [1.11351e-05, 1.20015e-05, 1.29696e-05] Line search terminated: lambda = 0.932312, fnorms = 9.21123e-06 15 SNES Function norm 9.211230243067e-06 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.13039e-05, 3.14838e-05, 9.21123e-06] Line search: lambdas = [1.25615, 1.12808, 1.], fnorms = [3.14183e-05, 3.13437e-05, 3.13039e-05] Line search: lambdas = [0.91881, 1.08748, 1.25615], fnorms = [3.12969e-05, 3.13273e-05, 3.14183e-05] Line search terminated: lambda = 0.918811, fnorms = 3.12969e-05 16 SNES Function norm 3.129688997145e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.09357e-05, 1.58135e-05, 3.12969e-05] Line search: lambdas = [0.503912, 0.751956, 1.], fnorms = [1.59287e-05, 2.33645e-05, 3.09357e-05] Line search: lambdas = [0.0186202, 0.261266, 0.503912], fnorms = [3.07204e-05, 9.11e-06, 1.59287e-05] Line search terminated: lambda = 0.342426, fnorms = 1.12885e-05 17 SNES Function norm 1.128846081676e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.09448e-05, 5.94789e-06, 1.12885e-05] Line search: lambdas = [0.295379, 0.64769, 1.], fnorms = [8.09996e-06, 4.46782e-06, 3.09448e-05] Line search: lambdas = [0.48789, 0.391635, 0.295379], fnorms = [6.07286e-06, 7.07842e-06, 8.09996e-06] Line search terminated: lambda = 0.997854, fnorms = 3.09222e-05 18 SNES Function norm 3.092215965860e-05 Line search: 
lambdas = [1., 0.5, 0.], fnorms = [3.08118e-05, 1.54926e-05, 3.09222e-05] Line search: lambdas = [0.501195, 0.750598, 1.], fnorms = [1.55291e-05, 2.31575e-05, 3.08118e-05] Line search: lambdas = [0.00203641, 0.251616, 0.501195], fnorms = [3.08595e-05, 7.99583e-06, 1.55291e-05] Line search terminated: lambda = 0.334898, fnorms = 1.0485e-05 19 SNES Function norm 1.048497478694e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.08828e-05, 5.31221e-06, 1.0485e-05] Line search: lambdas = [0.290564, 0.645282, 1.], fnorms = [7.47161e-06, 3.83131e-06, 3.08828e-05] Line search: lambdas = [0.482813, 0.386688, 0.290564], fnorms = [5.48866e-06, 6.47839e-06, 7.47161e-06] Line search terminated: lambda = 1.00084, fnorms = 3.08914e-05 20 SNES Function norm 3.089143204407e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.07985e-05, 1.5419e-05, 3.08914e-05] Line search: lambdas = [0.501004, 0.750502, 1.], fnorms = [1.54499e-05, 2.31217e-05, 3.07985e-05] Line search: lambdas = [0.000160764, 0.250582, 0.501004], fnorms = [3.08865e-05, 7.76926e-06, 1.54499e-05] Line search terminated: lambda = 0.334131, fnorms = 1.03269e-05 21 SNES Function norm 1.032685941104e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.08661e-05, 5.19297e-06, 1.03269e-05] Line search: lambdas = [0.289623, 0.644812, 1.], fnorms = [7.35129e-06, 3.71148e-06, 3.08661e-05] Line search: lambdas = [0.481823, 0.385723, 0.289623], fnorms = [5.37926e-06, 6.36485e-06, 7.35129e-06] Line search terminated: lambda = 1.00269, fnorms = 3.08938e-05 22 SNES Function norm 3.089378442296e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.0798e-05, 1.54031e-05, 3.08938e-05] Line search: lambdas = [0.501034, 0.750517, 1.], fnorms = [1.5435e-05, 2.31159e-05, 3.0798e-05] Line search: lambdas = [6.42473e-05, 0.250549, 0.501034], fnorms = [3.08918e-05, 7.72781e-06, 1.5435e-05] Line search terminated: lambda = 0.334122, fnorms = 1.02981e-05 23 SNES Function norm 1.029807313639e-05 Line search: lambdas = [1., 0.5, 0.], 
fnorms = [3.08623e-05, 5.16944e-06, 1.02981e-05] Line search: lambdas = [0.289463, 0.644731, 1.], fnorms = [7.32849e-06, 3.68639e-06, 3.08623e-05] Line search: lambdas = [0.481655, 0.385559, 0.289463], fnorms = [5.35752e-06, 6.34289e-06, 7.32849e-06] Line search terminated: lambda = 1.00314, fnorms = 3.08945e-05 24 SNES Function norm 3.089454671948e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.0797e-05, 1.53995e-05, 3.08945e-05] Line search: lambdas = [0.501053, 0.750526, 1.], fnorms = [1.54319e-05, 2.31143e-05, 3.0797e-05] Line search: lambdas = [1.90629e-05, 0.250536, 0.501053], fnorms = [3.0894e-05, 7.71862e-06, 1.54319e-05] Line search terminated: lambda = 0.33412, fnorms = 1.02918e-05 25 SNES Function norm 1.029182711068e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.08627e-05, 5.16398e-06, 1.02918e-05] Line search: lambdas = [0.289427, 0.644713, 1.], fnorms = [7.32328e-06, 3.68069e-06, 3.08627e-05] Line search: lambdas = [0.481616, 0.385521, 0.289427], fnorms = [5.35247e-06, 6.33781e-06, 7.32328e-06] Line search terminated: lambda = 1.00305, fnorms = 3.08939e-05 26 SNES Function norm 3.089393766930e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.07962e-05, 1.5399e-05, 3.08939e-05] Line search: lambdas = [0.501055, 0.750528, 1.], fnorms = [1.54315e-05, 2.31138e-05, 3.07962e-05] Line search: lambdas = [0.750528, 0.625791, 0.501055], fnorms = [2.31138e-05, 1.92726e-05, 1.54315e-05] Line search terminated: lambda = 0.625791, fnorms = 1.92726e-05 27 SNES Function norm 1.927257601023e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.08333e-05, 9.6684e-06, 1.92726e-05] Line search: lambdas = [0.372429, 0.686214, 1.], fnorms = [1.21184e-05, 6.0938e-06, 3.08333e-05] Line search: lambdas = [0.562967, 0.467698, 0.372429], fnorms = [8.45935e-06, 1.02887e-05, 1.21184e-05] Line search terminated: lambda = 1.0029, fnorms = 3.0889e-05 28 SNES Function norm 3.088902536578e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.07953e-05, 1.54009e-05, 
3.0889e-05] Line search: lambdas = [0.501012, 0.750506, 1.], fnorms = [1.5432e-05, 2.31134e-05, 3.07953e-05] Line search: lambdas = [0.750506, 0.625759, 0.501012], fnorms = [2.31134e-05, 1.92726e-05, 1.5432e-05] Line search terminated: lambda = 0.625759, fnorms = 1.92726e-05 29 SNES Function norm 1.927259393778e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.08335e-05, 9.6688e-06, 1.92726e-05] Line search: lambdas = [0.372426, 0.686213, 1.], fnorms = [1.21188e-05, 6.09436e-06, 3.08335e-05] Line search: lambdas = [0.562966, 0.467696, 0.372426], fnorms = [8.45983e-06, 1.02891e-05, 1.21188e-05] Line search terminated: lambda = 1.00294, fnorms = 3.08901e-05 30 SNES Function norm 3.089006071232e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.07954e-05, 1.54004e-05, 3.08901e-05] Line search: lambdas = [0.501022, 0.750511, 1.], fnorms = [1.54319e-05, 2.31134e-05, 3.07954e-05] Line search: lambdas = [0.750511, 0.625767, 0.501022], fnorms = [2.31134e-05, 1.92725e-05, 1.54319e-05] Line search terminated: lambda = 0.625767, fnorms = 1.92725e-05 31 SNES Function norm 1.927249100532e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.08334e-05, 9.66877e-06, 1.92725e-05] Line search: lambdas = [0.372425, 0.686213, 1.], fnorms = [1.21187e-05, 6.09434e-06, 3.08334e-05] Line search: lambdas = [0.562965, 0.467695, 0.372425], fnorms = [8.45981e-06, 1.02891e-05, 1.21187e-05] Line search terminated: lambda = 1.00295, fnorms = 3.08901e-05 32 SNES Function norm 3.089009716315e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.07954e-05, 1.54004e-05, 3.08901e-05] Line search: lambdas = [0.501022, 0.750511, 1.], fnorms = [1.54318e-05, 2.31134e-05, 3.07954e-05] Line search: lambdas = [0.750511, 0.625767, 0.501022], fnorms = [2.31134e-05, 1.92725e-05, 1.54318e-05] Line search terminated: lambda = 0.625767, fnorms = 1.92725e-05 33 SNES Function norm 1.927248677224e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.08334e-05, 9.66875e-06, 1.92725e-05] Line search: lambdas = 
[0.372425, 0.686213, 1.], fnorms = [1.21187e-05, 6.09431e-06, 3.08334e-05] Line search: lambdas = [0.562965, 0.467695, 0.372425], fnorms = [8.45979e-06, 1.02891e-05, 1.21187e-05] Line search terminated: lambda = 1.00295, fnorms = 3.08901e-05 34 SNES Function norm 3.089008816676e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.07954e-05, 1.54004e-05, 3.08901e-05] Line search: lambdas = [0.501023, 0.750511, 1.], fnorms = [1.54318e-05, 2.31133e-05, 3.07954e-05] Line search: lambdas = [0.750511, 0.625767, 0.501023], fnorms = [2.31133e-05, 1.92725e-05, 1.54318e-05] Line search terminated: lambda = 0.625767, fnorms = 1.92725e-05 35 SNES Function norm 1.927248252619e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.08334e-05, 9.66874e-06, 1.92725e-05] Line search: lambdas = [0.372425, 0.686212, 1.], fnorms = [1.21187e-05, 6.09431e-06, 3.08334e-05] Line search: lambdas = [0.562965, 0.467695, 0.372425], fnorms = [8.45979e-06, 1.02891e-05, 1.21187e-05] Line search terminated: lambda = 1.00295, fnorms = 3.08901e-05 36 SNES Function norm 3.089008385404e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.07954e-05, 1.54004e-05, 3.08901e-05] Line search: lambdas = [0.501023, 0.750511, 1.], fnorms = [1.54318e-05, 2.31133e-05, 3.07954e-05] Line search: lambdas = [0.750511, 0.625767, 0.501023], fnorms = [2.31133e-05, 1.92725e-05, 1.54318e-05] Line search terminated: lambda = 0.625767, fnorms = 1.92725e-05 37 SNES Function norm 1.927248268661e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.08334e-05, 9.66874e-06, 1.92725e-05] Line search: lambdas = [0.372425, 0.686212, 1.], fnorms = [1.21187e-05, 6.09431e-06, 3.08334e-05] Line search: lambdas = [0.562965, 0.467695, 0.372425], fnorms = [8.45979e-06, 1.02891e-05, 1.21187e-05] Line search terminated: lambda = 1.00295, fnorms = 3.08901e-05 38 SNES Function norm 3.089008466314e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.07954e-05, 1.54004e-05, 3.08901e-05] Line search: lambdas = [0.501023, 0.750511, 1.], fnorms 
= [1.54318e-05, 2.31133e-05, 3.07954e-05] Line search: lambdas = [0.750511, 0.625767, 0.501023], fnorms = [2.31133e-05, 1.92725e-05, 1.54318e-05] Line search terminated: lambda = 0.625767, fnorms = 1.92725e-05 39 SNES Function norm 1.927248319694e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.08334e-05, 9.66874e-06, 1.92725e-05] Line search: lambdas = [0.372425, 0.686212, 1.], fnorms = [1.21187e-05, 6.09431e-06, 3.08334e-05] Line search: lambdas = [0.562965, 0.467695, 0.372425], fnorms = [8.45979e-06, 1.02891e-05, 1.21187e-05] Line search terminated: lambda = 1.00295, fnorms = 3.08901e-05 40 SNES Function norm 3.089008515441e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.07954e-05, 1.54004e-05, 3.08901e-05] Line search: lambdas = [0.501023, 0.750511, 1.], fnorms = [1.54318e-05, 2.31133e-05, 3.07954e-05] Line search: lambdas = [0.750511, 0.625767, 0.501023], fnorms = [2.31133e-05, 1.92725e-05, 1.54318e-05] Line search terminated: lambda = 0.625767, fnorms = 1.92725e-05 41 SNES Function norm 1.927248317403e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.08334e-05, 9.66874e-06, 1.92725e-05] Line search: lambdas = [0.372425, 0.686212, 1.], fnorms = [1.21187e-05, 6.09431e-06, 3.08334e-05] Line search: lambdas = [0.562965, 0.467695, 0.372425], fnorms = [8.45979e-06, 1.02891e-05, 1.21187e-05] Line search terminated: lambda = 1.00295, fnorms = 3.08901e-05 42 SNES Function norm 3.089008504499e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.07954e-05, 1.54004e-05, 3.08901e-05] Line search: lambdas = [0.501023, 0.750511, 1.], fnorms = [1.54318e-05, 2.31133e-05, 3.07954e-05] Line search: lambdas = [0.750511, 0.625767, 0.501023], fnorms = [2.31133e-05, 1.92725e-05, 1.54318e-05] Line search terminated: lambda = 0.625767, fnorms = 1.92725e-05 43 SNES Function norm 1.927248310466e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.08334e-05, 9.66874e-06, 1.92725e-05] Line search: lambdas = [0.372425, 0.686212, 1.], fnorms = [1.21187e-05, 6.09431e-06, 
3.08334e-05] Line search: lambdas = [0.562965, 0.467695, 0.372425], fnorms = [8.45979e-06, 1.02891e-05, 1.21187e-05] Line search terminated: lambda = 1.00295, fnorms = 3.08901e-05 44 SNES Function norm 3.089008496293e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.07954e-05, 1.54004e-05, 3.08901e-05] Line search: lambdas = [0.501023, 0.750511, 1.], fnorms = [1.54318e-05, 2.31133e-05, 3.07954e-05] Line search: lambdas = [0.750511, 0.625767, 0.501023], fnorms = [2.31133e-05, 1.92725e-05, 1.54318e-05] Line search terminated: lambda = 0.625767, fnorms = 1.92725e-05 45 SNES Function norm 1.927248310274e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.08334e-05, 9.66874e-06, 1.92725e-05] Line search: lambdas = [0.372425, 0.686212, 1.], fnorms = [1.21187e-05, 6.09431e-06, 3.08334e-05] Line search: lambdas = [0.562965, 0.467695, 0.372425], fnorms = [8.45979e-06, 1.02891e-05, 1.21187e-05] Line search terminated: lambda = 1.00295, fnorms = 3.08901e-05 46 SNES Function norm 3.089008495811e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.07954e-05, 1.54004e-05, 3.08901e-05] Line search: lambdas = [0.501023, 0.750511, 1.], fnorms = [1.54318e-05, 2.31133e-05, 3.07954e-05] Line search: lambdas = [0.750511, 0.625767, 0.501023], fnorms = [2.31133e-05, 1.92725e-05, 1.54318e-05] Line search terminated: lambda = 0.625767, fnorms = 1.92725e-05 47 SNES Function norm 1.927248310735e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.08334e-05, 9.66874e-06, 1.92725e-05] Line search: lambdas = [0.372425, 0.686212, 1.], fnorms = [1.21187e-05, 6.09431e-06, 3.08334e-05] Line search: lambdas = [0.562965, 0.467695, 0.372425], fnorms = [8.45979e-06, 1.02891e-05, 1.21187e-05] Line search terminated: lambda = 1.00295, fnorms = 3.08901e-05 48 SNES Function norm 3.089008495408e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.07954e-05, 1.54004e-05, 3.08901e-05] Line search: lambdas = [0.501023, 0.750511, 1.], fnorms = [1.54318e-05, 2.31133e-05, 3.07954e-05] Line search: 
lambdas = [0.750511, 0.625767, 0.501023], fnorms = [2.31133e-05, 1.92725e-05, 1.54318e-05] Line search terminated: lambda = 0.625767, fnorms = 1.92725e-05 49 SNES Function norm 1.927248310443e-05 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.08334e-05, 9.66874e-06, 1.92725e-05] Line search: lambdas = [0.372425, 0.686212, 1.], fnorms = [1.21187e-05, 6.09431e-06, 3.08334e-05] Line search: lambdas = [0.562965, 0.467695, 0.372425], fnorms = [8.45979e-06, 1.02891e-05, 1.21187e-05] Line search terminated: lambda = 1.00295, fnorms = 3.08901e-05 50 SNES Function norm 3.089008494267e-05 From bsmith at mcs.anl.gov Wed Jan 25 13:43:02 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Wed, 25 Jan 2017 13:43:02 -0600 Subject: [petsc-users] how to stop SNES linesearch (l^2 minimization) from choosing obviously suboptimal lambda? In-Reply-To: References: Message-ID: <7FA42DE6-0CFD-4FDC-840B-964B5BFFB3D4@mcs.anl.gov> Have you tried bt ? > On Jan 25, 2017, at 1:13 PM, Andrew McRae wrote: > > I have a nonlinear problem in which the line search procedure is making 'obviously wrong' choices for lambda. My nonlinear solver options (going via petsc4py) include {"snes_linesearch_type": "l2", "snes_linesearch_max_it": 3}. > > After monotonically decreasing the residual by about 4 orders of magnitude, I get the following... 
> > 15 SNES Function norm 9.211230243067e-06 > Line search: lambdas = [1., 0.5, 0.], fnorms = [3.13039e-05, 3.14838e-05, 9.21123e-06] > Line search: lambdas = [1.25615, 1.12808, 1.], fnorms = [3.14183e-05, 3.13437e-05, 3.13039e-05] > Line search: lambdas = [0.91881, 1.08748, 1.25615], fnorms = [3.12969e-05, 3.13273e-05, 3.14183e-05] > Line search terminated: lambda = 0.918811, fnorms = 3.12969e-05 > 16 SNES Function norm 3.129688997145e-05 > Line search: lambdas = [1., 0.5, 0.], fnorms = [3.09357e-05, 1.58135e-05, 3.12969e-05] > Line search: lambdas = [0.503912, 0.751956, 1.], fnorms = [1.59287e-05, 2.33645e-05, 3.09357e-05] > Line search: lambdas = [0.0186202, 0.261266, 0.503912], fnorms = [3.07204e-05, 9.11e-06, 1.59287e-05] > Line search terminated: lambda = 0.342426, fnorms = 1.12885e-05 > 17 SNES Function norm 1.128846081676e-05 > Line search: lambdas = [1., 0.5, 0.], fnorms = [3.09448e-05, 5.94789e-06, 1.12885e-05] > Line search: lambdas = [0.295379, 0.64769, 1.], fnorms = [8.09996e-06, 4.46782e-06, 3.09448e-05] > Line search: lambdas = [0.48789, 0.391635, 0.295379], fnorms = [6.07286e-06, 7.07842e-06, 8.09996e-06] > Line search terminated: lambda = 0.997854, fnorms = 3.09222e-05 > 18 SNES Function norm 3.092215965860e-05 > > So, in iteration 16, the lambda chosen is 0.91..., even though we see that lambda close to 0 would give a smaller residual. In iteration 18, we see that some lambda around 0.65 gives a far smaller residual (approx 4e-6) than the 0.997... value that gets used (which gives approx 3e-5). The nonlinear iterations then get caught in some kind of cycle until my snes_max_it is reached [full log attached]. > > I guess this is an artifact of (if I understand correctly) trying to minimize some polynomial fitted to the evaluated values of lambda? But it's frustrating that it leads to 'obviously wrong' results! 
> For background information, this comes from an FE discretisation of a
> Monge-Ampère equation (and also from several timesteps into a time-varying
> problem). For various reasons (related to Monge-Ampère convexity
> requirements), I use a partial Jacobian that omits a term from the
> linearisation of the residual, and so the nonlinear convergence is not
> expected to be quadratic.
>
> Andrew

From knepley at gmail.com Wed Jan 25 13:57:02 2017
From: knepley at gmail.com (Matthew Knepley)
Date: Wed, 25 Jan 2017 13:57:02 -0600
Subject: [petsc-users] how to stop SNES linesearch (l^2 minimization) from choosing obviously suboptimal lambda?
In-Reply-To: 
References: 
Message-ID: 

On Wed, Jan 25, 2017 at 1:13 PM, Andrew McRae wrote:
> I have a nonlinear problem in which the line search procedure is making
> 'obviously wrong' choices for lambda. My nonlinear solver options (going
> via petsc4py) include {"snes_linesearch_type": "l2",
> "snes_linesearch_max_it": 3}.
>
> After monotonically decreasing the residual by about 4 orders of
> magnitude, I get the following...
> > 15 SNES Function norm 9.211230243067e-06 > Line search: lambdas = [1., 0.5, 0.], fnorms = [3.13039e-05, > 3.14838e-05, 9.21123e-06] > Line search: lambdas = [1.25615, 1.12808, 1.], fnorms = > [3.14183e-05, 3.13437e-05, 3.13039e-05] > Line search: lambdas = [0.91881, 1.08748, 1.25615], fnorms = > [3.12969e-05, 3.13273e-05, 3.14183e-05] > Line search terminated: lambda = 0.918811, fnorms = 3.12969e-05 > 16 SNES Function norm 3.129688997145e-05 > Line search: lambdas = [1., 0.5, 0.], fnorms = [3.09357e-05, > 1.58135e-05, 3.12969e-05] > Line search: lambdas = [0.503912, 0.751956, 1.], fnorms = > [1.59287e-05, 2.33645e-05, 3.09357e-05] > Line search: lambdas = [0.0186202, 0.261266, 0.503912], fnorms = > [3.07204e-05, 9.11e-06, 1.59287e-05] > Line search terminated: lambda = 0.342426, fnorms = 1.12885e-05 > 17 SNES Function norm 1.128846081676e-05 > Line search: lambdas = [1., 0.5, 0.], fnorms = [3.09448e-05, > 5.94789e-06, 1.12885e-05] > Line search: lambdas = [0.295379, 0.64769, 1.], fnorms = > [8.09996e-06, 4.46782e-06, 3.09448e-05] > Line search: lambdas = [0.48789, 0.391635, 0.295379], fnorms = > [6.07286e-06, 7.07842e-06, 8.09996e-06] > Line search terminated: lambda = 0.997854, fnorms = 3.09222e-05 > 18 SNES Function norm 3.092215965860e-05 > > So, in iteration 16, the lambda chosen is 0.91..., even though we see that > lambda close to 0 would give a smaller residual. In iteration 18, we see > that some lambda around 0.65 gives a far smaller residual (approx 4e-6) > than the 0.997... value that gets used (which gives approx 3e-5). The > nonlinear iterations then get caught in some kind of cycle until my > snes_max_it is reached [full log attached]. > > I guess this is an artifact of (if I understand correctly) trying to > minimize some polynomial fitted to the evaluated values of lambda? But > it's frustrating that it leads to 'obviously wrong' results! > There might be better line searches for this problem. 
For example, 'bt' should be more robust than 'l2', and 'cp' tries really hard to find a minimum. The 'nleqerr' is Deuflhard's search that should also be more robust. I would try them out to see if it's better. Matt > For background information, this comes from an FE discretisation of a > Monge-Ampère equation (and also from several timesteps into a time-varying > problem). For various reasons (related to Monge-Ampère convexity > requirements), I use a partial Jacobian that omits a term from the > linearisation of the residual, and so the nonlinear convergence is not > expected to be quadratic. > > Andrew > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From A.T.T.McRae at bath.ac.uk Thu Jan 26 02:20:09 2017 From: A.T.T.McRae at bath.ac.uk (Andrew McRae) Date: Thu, 26 Jan 2017 08:20:09 +0000 Subject: [petsc-users] how to stop SNES linesearch (l^2 minimization) from choosing obviously suboptimal lambda? In-Reply-To: <12f8eaa2ea984ac88757dc7c03d6e6dd@exch07.campus.bath.ac.uk> References: <12f8eaa2ea984ac88757dc7c03d6e6dd@exch07.campus.bath.ac.uk> Message-ID: Okay. I discarded bt quite early since I have no reason to think the default step size (lambda = 1) is 'good', due to the partial Jacobian. But I can try it again. cp sometimes behaves well, but other times I've seen it do something crazy like take lambda = 2.5 on the first step. Due to the MA convexity reqs, the linear system at the second step is then malformed and the solver dies. I also briefly tried nleqerr in the past and found it to take a huge number of iterations, but I can try that again.
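Andrew's guess earlier in the thread -- that the 'l2' search minimizes a polynomial fitted to the sampled lambdas -- points at the basic mechanism. Below is a plain-Python sketch (illustrative only; this is not petsc4py and not PETSc's actual 'l2' implementation) of fitting a parabola through three (lambda, fnorm) samples and taking its stationary point:

```python
def parabola_vertex(lams, fnorms):
    """Stationary point of the quadratic interpolating three
    (lambda, fnorm) samples, via the closed-form Lagrange fit."""
    (x0, x1, x2), (y0, y1, y2) = lams, fnorms
    num = y0 * (x1**2 - x2**2) + y1 * (x2**2 - x0**2) + y2 * (x0**2 - x1**2)
    den = 2.0 * (y0 * (x1 - x2) + y1 * (x2 - x0) + y2 * (x0 - x1))
    if den == 0.0:
        # Samples are collinear: no curvature, fall back to best sample.
        return min(zip(fnorms, lams))[1]
    # Note: if the fitted parabola opens downward this "vertex" is a
    # maximum -- a real line search must safeguard against that.
    return num / den

# A residual curve that really is a parabola is recovered exactly:
f = lambda lam: (lam - 0.3) ** 2 + 1.0
lams = [0.0, 0.5, 1.0]
lam_star = parabola_vertex(lams, [f(l) for l in lams])
print(round(lam_star, 6))  # -> 0.3
```

If the true residual curve has several local minima between the samples, the fitted parabola's vertex can land far from the best sampled lambda -- consistent with the behaviour in iterations 16 and 18 of the log above.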
On 25 January 2017 at 19:57, Matthew Knepley wrote: > On Wed, Jan 25, 2017 at 1:13 PM, Andrew McRae > wrote: > >> I have a nonlinear problem in which the line search procedure is making >> 'obviously wrong' choices for lambda. My nonlinear solver options (going >> via petsc4py) include {"snes_linesearch_type": "l2", >> "snes_linesearch_max_it": 3}. >> >> After monotonically decreasing the residual by about 4 orders of >> magnitude, I get the following... >> >> 15 SNES Function norm 9.211230243067e-06 >> Line search: lambdas = [1., 0.5, 0.], fnorms = [3.13039e-05, >> 3.14838e-05, 9.21123e-06] >> Line search: lambdas = [1.25615, 1.12808, 1.], fnorms = >> [3.14183e-05, 3.13437e-05, 3.13039e-05] >> Line search: lambdas = [0.91881, 1.08748, 1.25615], fnorms = >> [3.12969e-05, 3.13273e-05, 3.14183e-05] >> Line search terminated: lambda = 0.918811, fnorms = 3.12969e-05 >> 16 SNES Function norm 3.129688997145e-05 >> Line search: lambdas = [1., 0.5, 0.], fnorms = [3.09357e-05, >> 1.58135e-05, 3.12969e-05] >> Line search: lambdas = [0.503912, 0.751956, 1.], fnorms = >> [1.59287e-05, 2.33645e-05, 3.09357e-05] >> Line search: lambdas = [0.0186202, 0.261266, 0.503912], fnorms = >> [3.07204e-05, 9.11e-06, 1.59287e-05] >> Line search terminated: lambda = 0.342426, fnorms = 1.12885e-05 >> 17 SNES Function norm 1.128846081676e-05 >> Line search: lambdas = [1., 0.5, 0.], fnorms = [3.09448e-05, >> 5.94789e-06, 1.12885e-05] >> Line search: lambdas = [0.295379, 0.64769, 1.], fnorms = >> [8.09996e-06, 4.46782e-06, 3.09448e-05] >> Line search: lambdas = [0.48789, 0.391635, 0.295379], fnorms = >> [6.07286e-06, 7.07842e-06, 8.09996e-06] >> Line search terminated: lambda = 0.997854, fnorms = 3.09222e-05 >> 18 SNES Function norm 3.092215965860e-05 >> >> So, in iteration 16, the lambda chosen is 0.91..., even though we see >> that lambda close to 0 would give a smaller residual. 
In iteration 18, we >> see that some lambda around 0.65 gives a far smaller residual (approx 4e-6) >> than the 0.997... value that gets used (which gives approx 3e-5). The >> nonlinear iterations then get caught in some kind of cycle until my >> snes_max_it is reached [full log attached]. >> >> I guess this is an artifact of (if I understand correctly) trying to >> minimize some polynomial fitted to the evaluated values of lambda? But >> it's frustrating that it leads to 'obviously wrong' results! >> > > There might be better line searches for this problem. For example, 'bt' > should be more robust than 'l2', and 'cp' > tries really hard to find a minimum. The 'nleqerr' is Deuflhard's search > that should also be more robust. I would > try them out to see if it's better. > > Matt > > >> For background information, this comes from an FE discretisation of a >> Monge-Ampère equation (and also from several timesteps into a time-varying >> problem). For various reasons (related to Monge-Ampère convexity >> requirements), I use a partial Jacobian that omits a term from the >> linearisation of the residual, and so the nonlinear convergence is not >> expected to be quadratic. >> >> Andrew >> > > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From cpraveen at gmail.com Thu Jan 26 06:29:09 2017 From: cpraveen at gmail.com (Praveen C) Date: Thu, 26 Jan 2017 17:59:09 +0530 Subject: [petsc-users] MPI_Wtime in fortran Message-ID: Dear all In my petsc fortran code, I am using MPI_Wtime to measure times. I include #include When compiling, I get warning for MPI_Wtime *Warning:* Possible change of value in conversion from REAL(16) to REAL(8) at (1) [*-Wconversion*] But MPI_Wtime is supposed to return double precision.
The values I get from this function also seem to be wrong, always zero. Thanks praveen -------------- next part -------------- An HTML attachment was scrubbed... URL: From cpraveen at gmail.com Thu Jan 26 06:41:40 2017 From: cpraveen at gmail.com (Praveen C) Date: Thu, 26 Jan 2017 18:11:40 +0530 Subject: [petsc-users] MPI_Wtime in fortran In-Reply-To: References: Message-ID: To be more precise, my code looks something like this implicit none #include double precision time time = MPI_Wtime() I get the warning of conversion from REAL(16) to REAL(8) and the value of time is always zero. Thanks praveen On Thu, Jan 26, 2017 at 5:59 PM, Praveen C wrote: > Dear all > > In my petsc fortran code, I am using MPI_Wtime to measure times. I include > > #include > > When compiling, I get warning for MPI_Wtime > > *Warning:* Possible change of value in conversion from REAL(16) to > REAL(8) at (1) [*-Wconversion*] > > > But MPI_Wtime is supposed to return double precision. The values I get > from this function also seem to be wrong, always zero. > > Thanks > praveen > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Jan 26 07:57:42 2017 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 26 Jan 2017 07:57:42 -0600 Subject: [petsc-users] how to stop SNES linesearch (l^2 minimization) from choosing obviously suboptimal lambda? In-Reply-To: References: <12f8eaa2ea984ac88757dc7c03d6e6dd@exch07.campus.bath.ac.uk> Message-ID: On Thu, Jan 26, 2017 at 2:20 AM, Andrew McRae wrote: > Okay. I discarded bt quite early since I have no reason to think the > default step size (lambda = 1) is 'good', due to the partial Jacobian. But > I can try it again. > > cp sometimes behaves well, but other times I've seen it do something crazy > like take lambda = 2.5 on the first step. Due to the MA convexity reqs, > the linear system at the second step is then malformed and the solver dies. 
> > I also briefly tried nleqerr in the past and found it to take a huge > number of iterations, but I can try that again. > Line search is not good at all for functions that wiggle on the scale of your step size. You could try trust region, although I am not sure that is better. Lots of people use "annealing" for this kind of thing, but that is a lot of work. Thanks, Matt > On 25 January 2017 at 19:57, Matthew Knepley wrote: > >> On Wed, Jan 25, 2017 at 1:13 PM, Andrew McRae >> wrote: >> >>> I have a nonlinear problem in which the line search procedure is making >>> 'obviously wrong' choices for lambda. My nonlinear solver options (going >>> via petsc4py) include {"snes_linesearch_type": "l2", >>> "snes_linesearch_max_it": 3}. >>> >>> After monotonically decreasing the residual by about 4 orders of >>> magnitude, I get the following... >>> >>> 15 SNES Function norm 9.211230243067e-06 >>> Line search: lambdas = [1., 0.5, 0.], fnorms = [3.13039e-05, >>> 3.14838e-05, 9.21123e-06] >>> Line search: lambdas = [1.25615, 1.12808, 1.], fnorms = >>> [3.14183e-05, 3.13437e-05, 3.13039e-05] >>> Line search: lambdas = [0.91881, 1.08748, 1.25615], fnorms = >>> [3.12969e-05, 3.13273e-05, 3.14183e-05] >>> Line search terminated: lambda = 0.918811, fnorms = 3.12969e-05 >>> 16 SNES Function norm 3.129688997145e-05 >>> Line search: lambdas = [1., 0.5, 0.], fnorms = [3.09357e-05, >>> 1.58135e-05, 3.12969e-05] >>> Line search: lambdas = [0.503912, 0.751956, 1.], fnorms = >>> [1.59287e-05, 2.33645e-05, 3.09357e-05] >>> Line search: lambdas = [0.0186202, 0.261266, 0.503912], fnorms = >>> [3.07204e-05, 9.11e-06, 1.59287e-05] >>> Line search terminated: lambda = 0.342426, fnorms = 1.12885e-05 >>> 17 SNES Function norm 1.128846081676e-05 >>> Line search: lambdas = [1., 0.5, 0.], fnorms = [3.09448e-05, >>> 5.94789e-06, 1.12885e-05] >>> Line search: lambdas = [0.295379, 0.64769, 1.], fnorms = >>> [8.09996e-06, 4.46782e-06, 3.09448e-05] >>> Line search: lambdas = [0.48789, 0.391635, 
0.295379], fnorms = >>> [6.07286e-06, 7.07842e-06, 8.09996e-06] >>> Line search terminated: lambda = 0.997854, fnorms = 3.09222e-05 >>> 18 SNES Function norm 3.092215965860e-05 >>> >>> So, in iteration 16, the lambda chosen is 0.91..., even though we see >>> that lambda close to 0 would give a smaller residual. In iteration 18, we >>> see that some lambda around 0.65 gives a far smaller residual (approx 4e-6) >>> than the 0.997... value that gets used (which gives approx 3e-5). The >>> nonlinear iterations then get caught in some kind of cycle until my >>> snes_max_it is reached [full log attached]. >>> >>> I guess this is an artifact of (if I understand correctly) trying to >>> minimize some polynomial fitted to the evaluated values of lambda? But >>> it's frustrating that it leads to 'obviously wrong' results! >>> >> >> There might be better line searches for this problem. For example, 'bt' >> should be more robust than 'l2', and 'cp' >> tries really hard to find a minimum. The 'nleqerr' is Deuflhard's search >> that should also be more robust. I would >> try them out to see if it's better. >> >> Matt >> >> >>> For background information, this comes from an FE discretisation of a >>> Monge-Ampère equation (and also from several timesteps into a time-varying >>> problem). For various reasons (related to Monge-Ampère convexity >>> requirements), I use a partial Jacobian that omits a term from the >>> linearisation of the residual, and so the nonlinear convergence is not >>> expected to be quadratic. >>> >>> Andrew >>> >> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Jan 26 07:58:40 2017 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 26 Jan 2017 07:58:40 -0600 Subject: [petsc-users] MPI_Wtime in fortran In-Reply-To: References: Message-ID: Just use PETSc events, since we will call MPI_Wtime() for you. Matt On Thu, Jan 26, 2017 at 6:41 AM, Praveen C wrote: > To be more precise, my code looks something like this > > implicit none > #include > double precision time > > time = MPI_Wtime() > > I get the warning of conversion from REAL(16) to REAL(8) and the value of > time is always zero. > > Thanks > praveen > > On Thu, Jan 26, 2017 at 5:59 PM, Praveen C wrote: > >> Dear all >> >> In my petsc fortran code, I am using MPI_Wtime to measure times. I include >> >> #include >> >> When compiling, I get warning for MPI_Wtime >> >> *Warning:* Possible change of value in conversion from REAL(16) to >> REAL(8) at (1) [*-Wconversion*] >> >> >> But MPI_Wtime is supposed to return double precision. The values I get >> from this function also seem to be wrong, always zero. >> >> Thanks >> praveen >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From cpraveen at gmail.com Thu Jan 26 08:34:27 2017 From: cpraveen at gmail.com (Praveen C) Date: Thu, 26 Jan 2017 20:04:27 +0530 Subject: [petsc-users] MPI_Wtime in fortran In-Reply-To: References: Message-ID: Thank you Matt. I am able to use events. I used -log_view to see the event logs, but this prints a lot of other stuff. Is it possible to only print event logs ? Best praveen -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Thu Jan 26 08:44:53 2017 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 26 Jan 2017 08:44:53 -0600 Subject: [petsc-users] MPI_Wtime in fortran In-Reply-To: References: Message-ID: On Thu, Jan 26, 2017 at 8:34 AM, Praveen C wrote: > Thank you Matt. I am able to use events. > > I used -log_view to see the event logs, but this prints a lot of other > stuff. Is it possible to only print event logs ? > -log_view ::ascii_info_detail prints a Python module with the data -log_view ::ascii_xml prints an XML file which can be looked at with a web browser Thanks, Matt > Best > praveen > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Thu Jan 26 09:58:13 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 26 Jan 2017 09:58:13 -0600 Subject: [petsc-users] MPI_Wtime in fortran In-Reply-To: References: Message-ID: <31C2D51C-AA11-4D42-A02A-FC321E1D2661@mcs.anl.gov> MPI_Wtime() most definitely is supposed to return double precision. So you should not be getting this. Send the complete compiler warning message, cut and paste. Make a pure MPI program (no PETSc) that includes mpif.h and calls MPI_Init() and MPI_Wtime(); do you get a compiler warning message and incorrect values? Are you using some Fortran compiler flag to promote all single to double? This also promotes all double to quad so we don't recommend it. Look in your mpif.h file: does it indicate MPI_Wtime() returns double? Barry > On Jan 26, 2017, at 6:41 AM, Praveen C wrote: > > To be more precise, my code looks something like this > > implicit none > #include > double precision time > > time = MPI_Wtime() > > I get the warning of conversion from REAL(16) to REAL(8) and the value of time is always zero.
> > Thanks > praveen > > On Thu, Jan 26, 2017 at 5:59 PM, Praveen C wrote: > Dear all > > In my petsc fortran code, I am using MPI_Wtime to measure times. I include > > #include > > When compiling, I get warning for MPI_Wtime > > Warning: Possible change of value in conversion from REAL(16) to REAL(8) at (1) [-Wconversion] > > But MPI_Wtime is supposed to return double precision. The values I get from this function also seem to be wrong, always zero. > > Thanks > praveen > From cpraveen at gmail.com Thu Jan 26 10:01:49 2017 From: cpraveen at gmail.com (Praveen C) Date: Thu, 26 Jan 2017 21:31:49 +0530 Subject: [petsc-users] MPI_Wtime in fortran In-Reply-To: <31C2D51C-AA11-4D42-A02A-FC321E1D2661@mcs.anl.gov> References: <31C2D51C-AA11-4D42-A02A-FC321E1D2661@mcs.anl.gov> Message-ID: >> Are you using some Fortran compiler flag to promote all single to double? I am doing this :-( Time to fix my reals. Thank you for pointing this out. Best praveen -------------- next part -------------- An HTML attachment was scrubbed... URL: From epscodes at gmail.com Thu Jan 26 10:20:10 2017 From: epscodes at gmail.com (Xiangdong) Date: Thu, 26 Jan 2017 11:20:10 -0500 Subject: [petsc-users] MatSetValuesStencil for cols not in the dmda stencil width In-Reply-To: References: Message-ID: Thanks, Matt. For a cell in DMDA 3d with global id (ix,iy,iz), what is the global row id of that cell corresponding to the matrix generated by DMCreateMatrix? It is not always ix + iy*Nx + iz*Ny*Nx for different da_processors_xyz, is it? How can I obtain that global row id? Thanks. Xiangdong On Wed, Jan 25, 2017 at 10:53 AM, Matthew Knepley wrote: > On Wed, Jan 25, 2017 at 9:17 AM, Xiangdong wrote: > >> Hello everyone, >> >> I have a question on setting matrix entries which are not in the stencil >> width. 
Take ksp ex45.c as an example, >> http://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/examp >> les/tutorials/ex45.c.html >> >> Instead of setting up the standard 7-point stencil, now for each cell, >> the matrix also has an additional dependency on the cell (Mx, My, Mz). >> Namely, for each row, the col corresponding to (Mx, My, Mz) is always >> nonzero. I modify the example code to add these entries: >> >> + MatSetOption(B,MAT_NEW_NONZERO_LOCATIONS,PETSC_TRUE); >> + MatSetOption(jac,MAT_NEW_NONZERO_LOCATIONS,PETSC_TRUE); >> >> + v[7] = 100; col[7].i = mx-1; col[7].j = my-1; col[7].k = mz-1; >> + ierr = MatSetValuesStencil(B,1,&row,8,col,v,INSERT_VALUES);CHKERRQ( >> ierr); >> >> It is okay for np=1, but crashes for np>=2 with the error message: >> > > You can do this, but > > 1) You cannot use MatSetValuesStencil, since your entry is not actually in your > stencil. You will have to make a separate call to MatSetValues() using the > global index. > > 2) The nonzero pattern we have allocated will be wrong. You will have to > set the MatOption which gives an error on new nonzeros to PETSC_FALSE. > > 3) You will have a dense row in your Jacobian, which is hard to make > perform well, and also stymies most preconditioners. > > Thanks, > > Matt > > >> [0]PETSC ERROR: --------------------- Error Message >> -------------------------------------------------------------- >> [0]PETSC ERROR: Argument out of range >> [0]PETSC ERROR: Local index 342 too large 244 (max) at 7 >> >> [0]PETSC ERROR: #1 ISLocalToGlobalMappingApply() line 423 in >> petsc-3.7.4/src/vec/is/utils/isltog.c >> [0]PETSC ERROR: #2 MatSetValuesLocal() line 2052 in >> petsc-3.7.4/src/mat/interface/matrix.c >> [0]PETSC ERROR: #3 MatSetValuesStencil() line 1447 in >> petsc-3.7.4/src/mat/interface/matrix.c >> [0]PETSC ERROR: #4 ComputeMatrix() line 151 in extest.c >> >> Can I add new entries to the cols not in the stencil width into the dmda >> matrix or Jacobian?
>> >> Attached please find the modified ex45 example, the diff file as well as >> the run log. >> >> Thanks for your help. >> >> Best, >> Xiangdong >> >> > > > -- > What most experimenters take for granted before they begin their > experiments is infinitely more interesting than any results to which their > experiments lead. > -- Norbert Wiener > -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Jan 26 10:29:22 2017 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 26 Jan 2017 10:29:22 -0600 Subject: [petsc-users] MatSetValuesStencil for cols not in the dmda stencil width In-Reply-To: References: Message-ID: On Thu, Jan 26, 2017 at 10:20 AM, Xiangdong wrote: > Thanks, Matt. For a cell in DMDA 3d with global id (ix,iy,iz), what is > the global row id of that cell corresponding to the matrix generated by > DMCreateMatrix? It is not always ix + iy*Nx + iz*Ny*Nx for > different da_processors_xyz, is it? How can I obtain that global row id? > It is not simple: 1) Figure out the process division using http://www.mcs.anl.gov/petsc/petsc-master/docs/manualpages/DMDA/DMDAGetInfo.html 2) Figure out which process p your i,j,k is on You have to march through the processes, knowing how big the chunk of indices is on each, which you get from 1) and http://www.mcs.anl.gov/petsc/petsc-master/docs/manualpages/DMDA/DMDAGetOwnershipRanges.html 3) Add up the sizes from all processes less than p, and then the offset on process p. This is the global offset This is not a scalable thing to do, which is why we do not include it in the API. Matt > Thanks. > > Xiangdong > > > > On Wed, Jan 25, 2017 at 10:53 AM, Matthew Knepley > wrote: > >> On Wed, Jan 25, 2017 at 9:17 AM, Xiangdong wrote: >> >>> Hello everyone, >>> >>> I have a question on setting matrix entries which are not in the stencil >>> width.
Take ksp ex45.c as an example, >>> http://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/examp >>> les/tutorials/ex45.c.html >>> >>> Instead of setting up the standard 7-point stencil, now for each cell, >>> the matrix also has an additional dependency on the cell (Mx, My, Mz). >>> Namely, for each row, the col corresponding to (Mx, My, Mz) is always >>> nonzero. I modify the example code to add these entries: >>> >>> + MatSetOption(B,MAT_NEW_NONZERO_LOCATIONS,PETSC_TRUE); >>> + MatSetOption(jac,MAT_NEW_NONZERO_LOCATIONS,PETSC_TRUE); >>> >>> + v[7] = 100; col[7].i = mx-1; col[7].j = my-1; col[7].k = mz-1; >>> + ierr = MatSetValuesStencil(B,1,&row,8,col,v,INSERT_VALUES);CHKERRQ( >>> ierr); >>> >>> It is okay for np=1, but crashes for np>=2 with the error message: >>> >> >> You can do this, but >> >> 1) You cannot use MatSetValuesStencil, since your entry is not actually in your >> stencil. You will have to make a separate call to MatSetValues() using the >> global index. >> >> 2) The nonzero pattern we have allocated will be wrong. You will have to >> set the MatOption which gives an error on new nonzeros to PETSC_FALSE. >> >> 3) You will have a dense row in your Jacobian, which is hard to make >> perform well, and also stymies most preconditioners. >> >> Thanks, >> >> Matt >> >> >>> [0]PETSC ERROR: --------------------- Error Message >>> -------------------------------------------------------------- >>> [0]PETSC ERROR: Argument out of range >>> [0]PETSC ERROR: Local index 342 too large 244 (max) at 7 >>> >>> [0]PETSC ERROR: #1 ISLocalToGlobalMappingApply() line 423 in >>> petsc-3.7.4/src/vec/is/utils/isltog.c >>> [0]PETSC ERROR: #2 MatSetValuesLocal() line 2052 in >>> petsc-3.7.4/src/mat/interface/matrix.c >>> [0]PETSC ERROR: #3 MatSetValuesStencil() line 1447 in >>> petsc-3.7.4/src/mat/interface/matrix.c >>> [0]PETSC ERROR: #4 ComputeMatrix() line 151 in extest.c >>> >>> Can I add new entries to the cols not in the stencil width into the dmda >>> matrix or Jacobian?
>>> >>> Attached please find the modified ex45 example, the diff file as well as >>> the run log. >>> >>> Thanks for your help. >>> >>> Best, >>> Xiangdong >>> >>> >> >> >> -- >> What most experimenters take for granted before they begin their >> experiments is infinitely more interesting than any results to which their >> experiments lead. >> -- Norbert Wiener >> > > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From cpraveen at gmail.com Thu Jan 26 10:36:42 2017 From: cpraveen at gmail.com (Praveen C) Date: Thu, 26 Jan 2017 22:06:42 +0530 Subject: [petsc-users] VecGhostUpdateBegin/End in -log_view Message-ID: Dear all in -log_view, I see timing info for VecScatterBegin/End. Does this include time info for VecGhostUpdateBegin/End also ? Does the time shown for VecScatterEnd indicate the time the process is waiting for scatter to finish ? Thanks praveen -------------- next part -------------- An HTML attachment was scrubbed... URL: From knepley at gmail.com Thu Jan 26 10:43:50 2017 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 26 Jan 2017 10:43:50 -0600 Subject: [petsc-users] VecGhostUpdateBegin/End in -log_view In-Reply-To: References: Message-ID: On Thu, Jan 26, 2017 at 10:36 AM, Praveen C wrote: > Dear all > > in -log_view, I see timing info for VecScatterBegin/End. Does this include > time info for VecGhostUpdateBegin/End also ? > Yes > Does the time shown for VecScatterEnd indicate the time the process is waiting > for scatter to finish ? > Yes. Matt > Thanks > praveen > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed...
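Matt's two answers above have a practical profiling consequence: since VecScatterEnd time is essentially wait time, local work overlapped between Begin and End shrinks it. Here is a toy plain-Python illustration of that split-phase pattern -- a thread stands in for the asynchronous communication; nothing here calls PETSc or MPI:

```python
import threading
import time

def scatter_begin(latency):
    """Start a fake asynchronous 'communication' and return a handle."""
    done = threading.Event()
    threading.Thread(target=lambda: (time.sleep(latency), done.set()),
                     daemon=True).start()
    return done

def scatter_end(handle):
    """Block until the 'communication' finishes; return the wait time."""
    t0 = time.perf_counter()
    handle.wait()
    return time.perf_counter() - t0

latency = 0.2

# No overlap: End waits for roughly the full communication latency.
wait_idle = scatter_end(scatter_begin(latency))

# With overlapped local work, End has (almost) nothing left to wait for.
handle = scatter_begin(latency)
time.sleep(0.3)                 # stand-in for local computation
wait_overlap = scatter_end(handle)

print(wait_idle > wait_overlap)  # -> True
```

The same logic is why a large VecScatterEnd entry in -log_view usually points at load imbalance or network latency rather than at the scatter itself.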
URL: From bsmith at mcs.anl.gov Thu Jan 26 11:38:14 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 26 Jan 2017 11:38:14 -0600 Subject: [petsc-users] MatSetValuesStencil for cols not in the dmda stencil width In-Reply-To: References: Message-ID: <009ED760-8194-4D55-AB6C-55872FED2FF2@mcs.anl.gov> In the natural numbering on the mesh it is I = ix + iy*Nx + iz*Ny*Nx but PETSc does not use the natural numbering. You can use DMDAGetAO() then AOApplicationToPetsc() to map the value from the natural numbering to the PETSc numbering, then this number can be used with MatSetValues(), VecSetValues(), etc. What Matt describes below is a more cumbersome way to achieve the same result. Barry > On Jan 26, 2017, at 10:29 AM, Matthew Knepley wrote: > > On Thu, Jan 26, 2017 at 10:20 AM, Xiangdong wrote: > Thanks, Matt. For a cell in DMDA 3d with global id (ix,iy,iz), what is the global row id of that cell corresponding to the matrix generated by DMCreateMatrix? It is not always ix + iy*Nx + iz*Ny*Nx for different da_processors_xyz, is it? How can I obtain that global row id? > > It is not simple: > > 1) Figure out the process division using > > http://www.mcs.anl.gov/petsc/petsc-master/docs/manualpages/DMDA/DMDAGetInfo.html > > 2) Figure out which process p your i,j,k is on > > You have to march through the processes, knowing how big the chunk of indices is on each, > which you get from 1) and > > http://www.mcs.anl.gov/petsc/petsc-master/docs/manualpages/DMDA/DMDAGetOwnershipRanges.html > > 3) Add up the sizes from all processes less than p, and then the offset on process p. This is the global offset > > This is not a scalable thing to do, which is why we do not include it in the API. > > Matt > > Thanks. > > Xiangdong > > > > On Wed, Jan 25, 2017 at 10:53 AM, Matthew Knepley wrote: > On Wed, Jan 25, 2017 at 9:17 AM, Xiangdong wrote: > Hello everyone, > > I have a question on setting matrix entries which are not in the stencil width.
Take ksp ex45.c as an example, > http://www.mcs.anl.gov/petsc/petsc-current/src/ksp/ksp/examples/tutorials/ex45.c.html > > Instead of setting up the standard 7-point stencil, now for each cell, the matrix also has an additional dependency on the cell (Mx, My, Mz). Namely, for each row, the col corresponding to (Mx, My, Mz) is always nonzero. I modify the example code to add these entries: > > + MatSetOption(B,MAT_NEW_NONZERO_LOCATIONS,PETSC_TRUE); > + MatSetOption(jac,MAT_NEW_NONZERO_LOCATIONS,PETSC_TRUE); > > + v[7] = 100; col[7].i = mx-1; col[7].j = my-1; col[7].k = mz-1; > + ierr = MatSetValuesStencil(B,1,&row,8,col,v,INSERT_VALUES);CHKERRQ(ierr); > > It is okay for np=1, but crashes for np>=2 with the error message: > > You can do this, but > > 1) You cannot use MatSetValuesStencil, since your entry is not actually in your stencil. You will have to make a separate call to MatSetValues() using the global index. > > 2) The nonzero pattern we have allocated will be wrong. You will have to set the MatOption which gives an error on new nonzeros to PETSC_FALSE. > > 3) You will have a dense row in your Jacobian, which is hard to make perform well, and also stymies most preconditioners. > > Thanks, > > Matt > > [0]PETSC ERROR: --------------------- Error Message -------------------------------------------------------------- > [0]PETSC ERROR: Argument out of range > [0]PETSC ERROR: Local index 342 too large 244 (max) at 7 > > [0]PETSC ERROR: #1 ISLocalToGlobalMappingApply() line 423 in petsc-3.7.4/src/vec/is/utils/isltog.c > [0]PETSC ERROR: #2 MatSetValuesLocal() line 2052 in petsc-3.7.4/src/mat/interface/matrix.c > [0]PETSC ERROR: #3 MatSetValuesStencil() line 1447 in petsc-3.7.4/src/mat/interface/matrix.c > [0]PETSC ERROR: #4 ComputeMatrix() line 151 in extest.c > > Can I add new entries to the cols not in the stencil width into the dmda matrix or Jacobian? > > Attached please find the modified ex45 example, the diff file as well as the run log. > > Thanks for your help.
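Barry's AO answer and Matt's manual recipe above can be made concrete with a plain-Python sketch for a 2D grid. It assumes processes are ranked row-major over the process grid and points are numbered x-fastest within each subdomain; in real code, query the mapping with DMDAGetAO()/AOApplicationToPetsc() instead of recomputing it:

```python
from bisect import bisect_right

def petsc_global_index(ix, iy, owners_x, owners_y):
    """Map grid point (ix, iy) to its PETSc-ordering global index, given
    ownership-range boundaries per direction (DMDAGetOwnershipRanges-style):
    owners_x = [0, x1, ..., Nx] means process column q owns
    ix in [owners_x[q], owners_x[q+1])."""
    lx = [owners_x[q + 1] - owners_x[q] for q in range(len(owners_x) - 1)]
    ly = [owners_y[q + 1] - owners_y[q] for q in range(len(owners_y) - 1)]
    px = bisect_right(owners_x, ix) - 1   # process column owning ix
    py = bisect_right(owners_y, iy) - 1   # process row owning iy
    # Offset = total points owned by all lower-ranked processes
    # (assuming row-major ranks over the process grid).
    rank = py * len(lx) + px
    offset = sum(lx[r % len(lx)] * ly[r // len(lx)] for r in range(rank))
    # Local index inside the owning subdomain, x varying fastest.
    local = (iy - owners_y[py]) * lx[px] + (ix - owners_x[px])
    return offset + local

# 4x4 grid split into two processes along x: ranks own columns {0,1} and {2,3}.
ox, oy = [0, 2, 4], [0, 4]
print(petsc_global_index(1, 3, ox, oy))  # -> 7   (natural index would be 13)
print(petsc_global_index(2, 0, ox, oy))  # -> 8   (natural index would be 2)
```

When the grid is split only along y, the PETSc and natural orderings coincide, which is why a 1-process run can hide this renumbering bug entirely.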
> > Best, > Xiangdong > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener From bsmith at mcs.anl.gov Thu Jan 26 13:04:07 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 26 Jan 2017 13:04:07 -0600 Subject: [petsc-users] how to stop SNES linesearch (l^2 minimization) from choosing obviously suboptimal lambda? In-Reply-To: References: <12f8eaa2ea984ac88757dc7c03d6e6dd@exch07.campus.bath.ac.uk> Message-ID: <457BBF92-7638-4399-B732-9558F8D59ABB@mcs.anl.gov> 15 SNES Function norm 9.211230243067e-06 Line search: lambdas = [1., 0.5, 0.], fnorms = [3.13039e-05, 3.14838e-05, 9.21123e-06] Line search: lambdas = [1.25615, 1.12808, 1.], fnorms = [3.14183e-05, 3.13437e-05, 3.13039e-05] Line search: lambdas = [0.91881, 1.08748, 1.25615], fnorms = [3.12969e-05, 3.13273e-05, 3.14183e-05] Line search terminated: lambda = 0.918811, fnorms = 3.12969e-05 In this case it could be that the computed direction is not a descent direction and hence no line search is going to help you. You need a better Jacobian. You can use -snes_mf_operator to use the "true" Jacobian but have the preconditioner built using the "partial Jacobian" that you provide. This makes it more likely you actually end up with a descent direction and is IMHO better than trying to do Newton with an incorrect "partial Jacobian". Barry > On Jan 26, 2017, at 2:20 AM, Andrew McRae wrote: > > Okay. I discarded bt quite early since I have no reason to think the default step size (lambda = 1) is 'good', due to the partial Jacobian. But I can try it again. > > cp sometimes behaves well, but other times I've seen it do something crazy like take lambda = 2.5 on the first step.
Due to the MA convexity reqs, the linear system at the second step is then malformed and the solver dies. > > I also briefly tried nleqerr in the past and found it to take a huge number of iterations, but I can try that again. > > On 25 January 2017 at 19:57, Matthew Knepley wrote: > On Wed, Jan 25, 2017 at 1:13 PM, Andrew McRae wrote: > I have a nonlinear problem in which the line search procedure is making 'obviously wrong' choices for lambda. My nonlinear solver options (going via petsc4py) include {"snes_linesearch_type": "l2", "snes_linesearch_max_it": 3}. > > After monotonically decreasing the residual by about 4 orders of magnitude, I get the following... > > 15 SNES Function norm 9.211230243067e-06 > Line search: lambdas = [1., 0.5, 0.], fnorms = [3.13039e-05, 3.14838e-05, 9.21123e-06] > Line search: lambdas = [1.25615, 1.12808, 1.], fnorms = [3.14183e-05, 3.13437e-05, 3.13039e-05] > Line search: lambdas = [0.91881, 1.08748, 1.25615], fnorms = [3.12969e-05, 3.13273e-05, 3.14183e-05] > Line search terminated: lambda = 0.918811, fnorms = 3.12969e-05 > 16 SNES Function norm 3.129688997145e-05 > Line search: lambdas = [1., 0.5, 0.], fnorms = [3.09357e-05, 1.58135e-05, 3.12969e-05] > Line search: lambdas = [0.503912, 0.751956, 1.], fnorms = [1.59287e-05, 2.33645e-05, 3.09357e-05] > Line search: lambdas = [0.0186202, 0.261266, 0.503912], fnorms = [3.07204e-05, 9.11e-06, 1.59287e-05] > Line search terminated: lambda = 0.342426, fnorms = 1.12885e-05 > 17 SNES Function norm 1.128846081676e-05 > Line search: lambdas = [1., 0.5, 0.], fnorms = [3.09448e-05, 5.94789e-06, 1.12885e-05] > Line search: lambdas = [0.295379, 0.64769, 1.], fnorms = [8.09996e-06, 4.46782e-06, 3.09448e-05] > Line search: lambdas = [0.48789, 0.391635, 0.295379], fnorms = [6.07286e-06, 7.07842e-06, 8.09996e-06] > Line search terminated: lambda = 0.997854, fnorms = 3.09222e-05 > 18 SNES Function norm 3.092215965860e-05 > > So, in iteration 16, the lambda chosen is 0.91..., even though we see that 
lambda close to 0 would give a smaller residual. In iteration 18, we see that some lambda around 0.65 gives a far smaller residual (approx 4e-6) than the 0.997... value that gets used (which gives approx 3e-5). The nonlinear iterations then get caught in some kind of cycle until my snes_max_it is reached [full log attached]. > > I guess this is an artifact of (if I understand correctly) trying to minimize some polynomial fitted to the evaluated values of lambda? But it's frustrating that it leads to 'obviously wrong' results! > > There might be better line searches for this problem. For example, 'bt' should be more robust than 'l2', and 'cp' > tries really hard to find a minimum. 'nleqerr' is Deuflhard's search, which should also be more robust. I would > try them out to see if it's better. > > Matt > > For background information, this comes from an FE discretisation of a Monge-Ampère equation (and also from several timesteps into a time-varying problem). For various reasons (related to Monge-Ampère convexity requirements), I use a partial Jacobian that omits a term from the linearisation of the residual, and so the nonlinear convergence is not expected to be quadratic. > > Andrew > > > > -- > What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. > -- Norbert Wiener > From david.knezevic at akselos.com Thu Jan 26 13:26:04 2017 From: david.knezevic at akselos.com (David Knezevic) Date: Thu, 26 Jan 2017 13:26:04 -0600 Subject: [petsc-users] Understanding inner vs outer fieldsplit convergence Message-ID: I'm exploring fieldsplit with Schur (this continues from some emails I sent a few weeks ago about this topic), and I had a quick question about the inner vs outer convergence.
I've pasted the output below from "-ksp_monitor -fieldsplit_FE_split_ksp_monitor", and I'm just wondering about why the second outer iteration has two inner iteration loops, whereas all the other outer iterations have one inner iteration loop? I assume it is something to do with a convergence tolerance, but it's not clear to me which tolerance would control that. Thanks, David ------------------------------------------------------------------ Residual norms for fieldsplit_FE_split_ solve. 0 KSP Residual norm 4.742303891408e+01 1 KSP Residual norm 2.909253505630e-01 2 KSP Residual norm 9.891933795059e-02 3 KSP Residual norm 7.147789520745e-02 4 KSP Residual norm 1.668752967907e-02 5 KSP Residual norm 5.019869896662e-03 6 KSP Residual norm 2.848579237244e-03 7 KSP Residual norm 2.847897269641e-03 8 KSP Residual norm 2.840502392022e-03 9 KSP Residual norm 2.831875522381e-03 10 KSP Residual norm 2.688309287993e-03 11 KSP Residual norm 1.351494303229e-03 12 KSP Residual norm 1.350874246297e-03 13 KSP Residual norm 9.154691604943e-06 0 KSP Residual norm 2.254632353893e+02 Residual norms for fieldsplit_FE_split_ solve. 0 KSP Residual norm 4.742303891408e+01 1 KSP Residual norm 2.909253505630e-01 2 KSP Residual norm 9.891933795059e-02 3 KSP Residual norm 7.147789520745e-02 4 KSP Residual norm 1.668752967907e-02 5 KSP Residual norm 5.019869896662e-03 6 KSP Residual norm 2.848579237244e-03 7 KSP Residual norm 2.847897269641e-03 8 KSP Residual norm 2.840502392022e-03 9 KSP Residual norm 2.831875522381e-03 10 KSP Residual norm 2.688309287993e-03 11 KSP Residual norm 1.351494303229e-03 12 KSP Residual norm 1.350874246297e-03 13 KSP Residual norm 9.154691604943e-06 Residual norms for fieldsplit_FE_split_ solve. 
0 KSP Residual norm 1.554697370480e-05 1 KSP Residual norm 1.554471967929e-05 2 KSP Residual norm 1.551293889691e-05 3 KSP Residual norm 8.031337431574e-06 4 KSP Residual norm 4.137185786243e-06 5 KSP Residual norm 4.066606123330e-06 6 KSP Residual norm 4.051107282928e-06 7 KSP Residual norm 4.047442850256e-06 8 KSP Residual norm 4.047129984657e-06 9 KSP Residual norm 4.030697964677e-06 10 KSP Residual norm 2.882383190940e-06 11 KSP Residual norm 3.325005138484e-07 12 KSP Residual norm 2.107354774516e-07 13 KSP Residual norm 2.107005548204e-07 14 KSP Residual norm 4.399320792736e-08 15 KSP Residual norm 4.236902403786e-08 16 KSP Residual norm 2.932877082709e-08 17 KSP Residual norm 3.881909203171e-09 18 KSP Residual norm 1.107791399514e-09 19 KSP Residual norm 2.645048006100e-11 1 KSP Residual norm 8.266776463696e-01 Residual norms for fieldsplit_FE_split_ solve. 0 KSP Residual norm 9.262528453386e-08 1 KSP Residual norm 5.683232925010e-10 2 KSP Residual norm 1.915223168286e-10 3 KSP Residual norm 1.397893184942e-10 4 KSP Residual norm 1.691441435404e-11 5 KSP Residual norm 6.138315243419e-12 6 KSP Residual norm 5.576043830003e-12 7 KSP Residual norm 5.574440028225e-12 8 KSP Residual norm 5.559544964428e-12 9 KSP Residual norm 5.539862581746e-12 10 KSP Residual norm 5.258329460152e-12 11 KSP Residual norm 2.643581511791e-12 12 KSP Residual norm 2.641293392449e-12 13 KSP Residual norm 2.354608977643e-14 2 KSP Residual norm 4.450925351013e-07 Residual norms for fieldsplit_FE_split_ solve. 
0 KSP Residual norm 6.653681330477e-14 1 KSP Residual norm 6.650750698147e-14 2 KSP Residual norm 6.111123464526e-14 3 KSP Residual norm 2.026817941567e-14 4 KSP Residual norm 9.604999144183e-15 5 KSP Residual norm 9.208296307424e-15 6 KSP Residual norm 9.196769686859e-15 7 KSP Residual norm 9.185058975459e-15 8 KSP Residual norm 9.180207477303e-15 9 KSP Residual norm 8.991574890909e-15 10 KSP Residual norm 8.032736869820e-15 11 KSP Residual norm 1.536409278928e-15 12 KSP Residual norm 1.177374264280e-15 13 KSP Residual norm 1.175712092044e-15 14 KSP Residual norm 2.572275406087e-16 15 KSP Residual norm 2.548423809711e-16 16 KSP Residual norm 8.616505207588e-17 17 KSP Residual norm 7.563053994201e-18 18 KSP Residual norm 6.807636198601e-18 19 KSP Residual norm 9.747028518744e-19 20 KSP Residual norm 2.419807103570e-21 3 KSP Residual norm 2.986369469883e-09 Residual norms for fieldsplit_FE_split_ solve. 0 KSP Residual norm 7.813223137340e-16 1 KSP Residual norm 4.793103235095e-18 2 KSP Residual norm 1.615526128222e-18 3 KSP Residual norm 1.179102504397e-18 4 KSP Residual norm 1.427467627551e-19 5 KSP Residual norm 5.177440470993e-20 6 KSP Residual norm 4.703763659148e-20 7 KSP Residual norm 4.701953228322e-20 8 KSP Residual norm 4.689269668869e-20 9 KSP Residual norm 4.672625361251e-20 10 KSP Residual norm 4.435174006113e-20 11 KSP Residual norm 2.229156843383e-20 12 KSP Residual norm 2.228887211080e-20 13 KSP Residual norm 3.492936921635e-22 4 KSP Residual norm 3.753341263086e-15 -------------- next part -------------- An HTML attachment was scrubbed... URL: From bsmith at mcs.anl.gov Thu Jan 26 17:50:48 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Thu, 26 Jan 2017 17:50:48 -0600 Subject: [petsc-users] Understanding inner vs outer fieldsplit convergence In-Reply-To: References: Message-ID: <93603F02-2D76-438F-9DB8-65DF77174E64@mcs.anl.gov> David, with Schur complement preconditioning the nesting can be rather complicated and hard to track. 
We need to know exactly what monitors you have turned on and the output from -ksp_view in order to understand why you are seeing this (seemingly) strange effect. Send all command line arguments and if you are running a PETSc example. > On Jan 26, 2017, at 1:26 PM, David Knezevic wrote: > > I'm exploring fieldsplit with Schur (this continues from some emails I sent a few weeks ago about this topic), and I had a quick question about the inner vs outer convergence. > > I've pasted the output below from "-ksp_monitor -fieldsplit_FE_split_ksp_monitor", and I'm just wondering about why the second outer iteration has two inner iteration loops, whereas all the other outer iterations have one inner iteration loop? I assume it is something to do with a convergence tolerance, but it's not clear to me which tolerance would control that. > > Thanks, > David > > ------------------------------------------------------------------ > > Residual norms for fieldsplit_FE_split_ solve. > 0 KSP Residual norm 4.742303891408e+01 > 1 KSP Residual norm 2.909253505630e-01 > 2 KSP Residual norm 9.891933795059e-02 > 3 KSP Residual norm 7.147789520745e-02 > 4 KSP Residual norm 1.668752967907e-02 > 5 KSP Residual norm 5.019869896662e-03 > 6 KSP Residual norm 2.848579237244e-03 > 7 KSP Residual norm 2.847897269641e-03 > 8 KSP Residual norm 2.840502392022e-03 > 9 KSP Residual norm 2.831875522381e-03 > 10 KSP Residual norm 2.688309287993e-03 > 11 KSP Residual norm 1.351494303229e-03 > 12 KSP Residual norm 1.350874246297e-03 > 13 KSP Residual norm 9.154691604943e-06 > 0 KSP Residual norm 2.254632353893e+02 > Residual norms for fieldsplit_FE_split_ solve. 
> 0 KSP Residual norm 4.742303891408e+01 > 1 KSP Residual norm 2.909253505630e-01 > 2 KSP Residual norm 9.891933795059e-02 > 3 KSP Residual norm 7.147789520745e-02 > 4 KSP Residual norm 1.668752967907e-02 > 5 KSP Residual norm 5.019869896662e-03 > 6 KSP Residual norm 2.848579237244e-03 > 7 KSP Residual norm 2.847897269641e-03 > 8 KSP Residual norm 2.840502392022e-03 > 9 KSP Residual norm 2.831875522381e-03 > 10 KSP Residual norm 2.688309287993e-03 > 11 KSP Residual norm 1.351494303229e-03 > 12 KSP Residual norm 1.350874246297e-03 > 13 KSP Residual norm 9.154691604943e-06 > Residual norms for fieldsplit_FE_split_ solve. > 0 KSP Residual norm 1.554697370480e-05 > 1 KSP Residual norm 1.554471967929e-05 > 2 KSP Residual norm 1.551293889691e-05 > 3 KSP Residual norm 8.031337431574e-06 > 4 KSP Residual norm 4.137185786243e-06 > 5 KSP Residual norm 4.066606123330e-06 > 6 KSP Residual norm 4.051107282928e-06 > 7 KSP Residual norm 4.047442850256e-06 > 8 KSP Residual norm 4.047129984657e-06 > 9 KSP Residual norm 4.030697964677e-06 > 10 KSP Residual norm 2.882383190940e-06 > 11 KSP Residual norm 3.325005138484e-07 > 12 KSP Residual norm 2.107354774516e-07 > 13 KSP Residual norm 2.107005548204e-07 > 14 KSP Residual norm 4.399320792736e-08 > 15 KSP Residual norm 4.236902403786e-08 > 16 KSP Residual norm 2.932877082709e-08 > 17 KSP Residual norm 3.881909203171e-09 > 18 KSP Residual norm 1.107791399514e-09 > 19 KSP Residual norm 2.645048006100e-11 > 1 KSP Residual norm 8.266776463696e-01 > Residual norms for fieldsplit_FE_split_ solve. 
> 0 KSP Residual norm 9.262528453386e-08 > 1 KSP Residual norm 5.683232925010e-10 > 2 KSP Residual norm 1.915223168286e-10 > 3 KSP Residual norm 1.397893184942e-10 > 4 KSP Residual norm 1.691441435404e-11 > 5 KSP Residual norm 6.138315243419e-12 > 6 KSP Residual norm 5.576043830003e-12 > 7 KSP Residual norm 5.574440028225e-12 > 8 KSP Residual norm 5.559544964428e-12 > 9 KSP Residual norm 5.539862581746e-12 > 10 KSP Residual norm 5.258329460152e-12 > 11 KSP Residual norm 2.643581511791e-12 > 12 KSP Residual norm 2.641293392449e-12 > 13 KSP Residual norm 2.354608977643e-14 > 2 KSP Residual norm 4.450925351013e-07 > Residual norms for fieldsplit_FE_split_ solve. > 0 KSP Residual norm 6.653681330477e-14 > 1 KSP Residual norm 6.650750698147e-14 > 2 KSP Residual norm 6.111123464526e-14 > 3 KSP Residual norm 2.026817941567e-14 > 4 KSP Residual norm 9.604999144183e-15 > 5 KSP Residual norm 9.208296307424e-15 > 6 KSP Residual norm 9.196769686859e-15 > 7 KSP Residual norm 9.185058975459e-15 > 8 KSP Residual norm 9.180207477303e-15 > 9 KSP Residual norm 8.991574890909e-15 > 10 KSP Residual norm 8.032736869820e-15 > 11 KSP Residual norm 1.536409278928e-15 > 12 KSP Residual norm 1.177374264280e-15 > 13 KSP Residual norm 1.175712092044e-15 > 14 KSP Residual norm 2.572275406087e-16 > 15 KSP Residual norm 2.548423809711e-16 > 16 KSP Residual norm 8.616505207588e-17 > 17 KSP Residual norm 7.563053994201e-18 > 18 KSP Residual norm 6.807636198601e-18 > 19 KSP Residual norm 9.747028518744e-19 > 20 KSP Residual norm 2.419807103570e-21 > 3 KSP Residual norm 2.986369469883e-09 > Residual norms for fieldsplit_FE_split_ solve. 
> 0 KSP Residual norm 7.813223137340e-16 > 1 KSP Residual norm 4.793103235095e-18 > 2 KSP Residual norm 1.615526128222e-18 > 3 KSP Residual norm 1.179102504397e-18 > 4 KSP Residual norm 1.427467627551e-19 > 5 KSP Residual norm 5.177440470993e-20 > 6 KSP Residual norm 4.703763659148e-20 > 7 KSP Residual norm 4.701953228322e-20 > 8 KSP Residual norm 4.689269668869e-20 > 9 KSP Residual norm 4.672625361251e-20 > 10 KSP Residual norm 4.435174006113e-20 > 11 KSP Residual norm 2.229156843383e-20 > 12 KSP Residual norm 2.228887211080e-20 > 13 KSP Residual norm 3.492936921635e-22 > 4 KSP Residual norm 3.753341263086e-15 From david.knezevic at akselos.com Thu Jan 26 21:36:08 2017 From: david.knezevic at akselos.com (David Knezevic) Date: Thu, 26 Jan 2017 22:36:08 -0500 Subject: [petsc-users] Understanding inner vs outer fieldsplit convergence In-Reply-To: <93603F02-2D76-438F-9DB8-65DF77174E64@mcs.anl.gov> References: <93603F02-2D76-438F-9DB8-65DF77174E64@mcs.anl.gov> Message-ID: On Thu, Jan 26, 2017 at 6:50 PM, Barry Smith wrote: > > David, with Schur complement preconditioning the nesting can be rather > complicated and hard to track. We need to know exactly what monitors you > have turned on and the output from -ksp_view in order to understand why you > are seeing this (seemingly) strange effect. Send all command line arguments > and if you are running a PETSc example. > Hi Barry, Thanks for your comments. 
The command line arguments I'm using are: -ksp_view -ksp_monitor -ksp_type cg -pc_type fieldsplit -pc_fieldsplit_type schur -fieldsplit_block_2_ksp_monitor -pc_fieldsplit_type schur -fieldsplit_block_1_pc_type cholesky -fieldsplit_block_1_ksp_type preonly -fieldsplit_block_2_pc_type cholesky -fieldsplit_block_2_ksp_type minres Note that I cleaned up my code a bit (I was doing some things via code previously, but now I'm doing it all via command line arguments) and now I can't replicate the strange effect that I reported in my previous email, so I guess that must have been a bug at my end, sorry about that! (I'm using my own example here, not a PETSc example.) I've pasted the output that I get below, and it doesn't have any repeated inner iterations anymore, so it looks good to me now. Thanks, David Residual norms for fieldsplit_block_2_ solve. 0 KSP Residual norm 4.742303891404e+01 1 KSP Residual norm 2.909253505627e-01 2 KSP Residual norm 9.891933795043e-02 3 KSP Residual norm 7.147789520726e-02 4 KSP Residual norm 1.668752975827e-02 5 KSP Residual norm 5.020016959860e-03 6 KSP Residual norm 2.939648855177e-03 7 KSP Residual norm 2.847937179284e-03 8 KSP Residual norm 2.840502402597e-03 9 KSP Residual norm 2.831875522390e-03 10 KSP Residual norm 2.688309294147e-03 11 KSP Residual norm 1.364866662320e-03 12 KSP Residual norm 1.351460340317e-03 13 KSP Residual norm 9.154713357796e-06 0 KSP Residual norm 2.254632359666e+02 Residual norms for fieldsplit_block_2_ solve. 
0 KSP Residual norm 1.562796164027e-05 1 KSP Residual norm 1.562568942204e-05 2 KSP Residual norm 1.559644955712e-05 3 KSP Residual norm 4.739731314562e-06 4 KSP Residual norm 4.099203626623e-06 5 KSP Residual norm 4.077340563505e-06 6 KSP Residual norm 4.069934161136e-06 7 KSP Residual norm 4.069081720832e-06 8 KSP Residual norm 4.056610946411e-06 9 KSP Residual norm 4.049977739107e-06 10 KSP Residual norm 6.930727940912e-07 11 KSP Residual norm 2.525753277129e-07 12 KSP Residual norm 2.490106038182e-07 13 KSP Residual norm 2.117790443270e-07 14 KSP Residual norm 4.278506164384e-08 15 KSP Residual norm 4.258945198001e-08 16 KSP Residual norm 4.229334447981e-08 17 KSP Residual norm 4.411718412575e-09 18 KSP Residual norm 2.538786960344e-10 19 KSP Residual norm 1.938707294127e-11 1 KSP Residual norm 8.266789276240e-01 Residual norms for fieldsplit_block_2_ solve. 0 KSP Residual norm 9.262098141749e-08 1 KSP Residual norm 5.654510775654e-10 2 KSP Residual norm 1.850940554946e-10 3 KSP Residual norm 1.215872907413e-10 4 KSP Residual norm 3.606362942651e-11 5 KSP Residual norm 9.320062914875e-12 6 KSP Residual norm 5.745183182943e-12 7 KSP Residual norm 5.567033341218e-12 8 KSP Residual norm 5.549943035966e-12 9 KSP Residual norm 5.535981844425e-12 10 KSP Residual norm 5.145409580427e-12 11 KSP Residual norm 2.666485042347e-12 12 KSP Residual norm 2.642698026095e-12 13 KSP Residual norm 2.351888597601e-14 2 KSP Residual norm 4.408818087377e-07 Residual norms for fieldsplit_block_2_ solve. 
0 KSP Residual norm 6.995890260994e-14 1 KSP Residual norm 6.992908711883e-14 2 KSP Residual norm 6.975805821489e-14 3 KSP Residual norm 3.895698269070e-14 4 KSP Residual norm 1.087381803170e-14 5 KSP Residual norm 9.649779038703e-15 6 KSP Residual norm 9.639503278208e-15 7 KSP Residual norm 9.632063248075e-15 8 KSP Residual norm 9.628902454482e-15 9 KSP Residual norm 9.422064141948e-15 10 KSP Residual norm 7.978243292438e-15 11 KSP Residual norm 1.840364998277e-15 12 KSP Residual norm 1.313440209393e-15 13 KSP Residual norm 1.237317810391e-15 14 KSP Residual norm 3.450624715175e-16 15 KSP Residual norm 2.677943917771e-16 16 KSP Residual norm 2.477309550239e-16 17 KSP Residual norm 2.379077634512e-16 18 KSP Residual norm 2.195084093320e-17 19 KSP Residual norm 3.440267981045e-18 20 KSP Residual norm 1.678278636525e-19 3 KSP Residual norm 2.982422976816e-09 Residual norms for fieldsplit_block_2_ solve. 0 KSP Residual norm 7.797636927781e-16 1 KSP Residual norm 4.773690049123e-18 2 KSP Residual norm 1.562856784196e-18 3 KSP Residual norm 1.029604947226e-18 4 KSP Residual norm 3.077276285355e-19 5 KSP Residual norm 7.977399411645e-20 6 KSP Residual norm 4.916849936508e-20 7 KSP Residual norm 4.764440575250e-20 8 KSP Residual norm 4.678272883449e-20 9 KSP Residual norm 4.672106002328e-20 10 KSP Residual norm 4.395115290861e-20 11 KSP Residual norm 2.248590969592e-20 12 KSP Residual norm 2.226836892279e-20 13 KSP Residual norm 3.483134646974e-22 4 KSP Residual norm 3.713946450555e-15 KSP Object: 1 MPI processes type: cg maximum iterations=1000, initial guess is zero tolerances: relative=1e-12, absolute=1e-50, divergence=10000. 
left preconditioning using PRECONDITIONED norm type for convergence test PC Object: 1 MPI processes type: fieldsplit FieldSplit with Schur preconditioner, factorization FULL Preconditioner for the Schur complement formed from A11 Split info: Split number 0 Defined by IS Split number 1 Defined by IS KSP solver for A00 block KSP Object: (fieldsplit_block_1_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_block_1_) 1 MPI processes type: cholesky Cholesky: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 5., needed 1.14458 Factored matrix follows: Mat Object: 1 MPI processes type: seqsbaij rows=30, cols=30 package used to perform factorization: petsc total: nonzeros=285, allocated nonzeros=285 total number of mallocs used during MatSetValues calls =0 block size is 1 linear system matrix = precond matrix: Mat Object: (fieldsplit_block_1_) 1 MPI processes type: seqaij rows=30, cols=30 total: nonzeros=468, allocated nonzeros=468 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 10 nodes, limit used is 5 KSP solver for S = A11 - A10 inv(A00) A01 KSP Object: (fieldsplit_block_2_) 1 MPI processes type: minres maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
left preconditioning using PRECONDITIONED norm type for convergence test PC Object: (fieldsplit_block_2_) 1 MPI processes type: cholesky Cholesky: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 5., needed 23.1555 Factored matrix follows: Mat Object: 1 MPI processes type: seqsbaij rows=10512, cols=10512 package used to perform factorization: petsc total: nonzeros=4435992, allocated nonzeros=4435992 total number of mallocs used during MatSetValues calls =0 block size is 1 linear system matrix followed by preconditioner matrix: Mat Object: (fieldsplit_block_2_) 1 MPI processes type: schurcomplement rows=10512, cols=10512 Schur complement A11 - A10 inv(A00) A01 A11 Mat Object: (fieldsplit_block_2_) 1 MPI processes type: seqaij rows=10512, cols=10512 total: nonzeros=372636, allocated nonzeros=372636 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 3504 nodes, limit used is 5 A10 Mat Object: 1 MPI processes type: seqaij rows=10512, cols=30 total: nonzeros=1872, allocated nonzeros=1872 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 2143 nodes, limit used is 5 KSP of A00 KSP Object: (fieldsplit_block_1_) 1 MPI processes type: preonly maximum iterations=10000, initial guess is zero tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
left preconditioning using NONE norm type for convergence test PC Object: (fieldsplit_block_1_) 1 MPI processes type: cholesky Cholesky: out-of-place factorization tolerance for zero pivot 2.22045e-14 matrix ordering: natural factor fill ratio given 5., needed 1.14458 Factored matrix follows: Mat Object: 1 MPI processes type: seqsbaij rows=30, cols=30 package used to perform factorization: petsc total: nonzeros=285, allocated nonzeros=285 total number of mallocs used during MatSetValues calls =0 block size is 1 linear system matrix = precond matrix: Mat Object: (fieldsplit_block_1_) 1 MPI processes type: seqaij rows=30, cols=30 total: nonzeros=468, allocated nonzeros=468 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 10 nodes, limit used is 5 A01 Mat Object: 1 MPI processes type: seqaij rows=30, cols=10512 total: nonzeros=1872, allocated nonzeros=1872 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 9 nodes, limit used is 5 Mat Object: (fieldsplit_block_2_) 1 MPI processes type: seqaij rows=10512, cols=10512 total: nonzeros=372636, allocated nonzeros=372636 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 3504 nodes, limit used is 5 linear system matrix = precond matrix: Mat Object: () 1 MPI processes type: seqaij rows=10542, cols=10542 total: nonzeros=376848, allocated nonzeros=377064 total number of mallocs used during MatSetValues calls =0 using I-node routines: found 3514 nodes, limit used is 5 > > > > > On Jan 26, 2017, at 1:26 PM, David Knezevic > wrote: > > > > I'm exploring fieldsplit with Schur (this continues from some emails I > sent a few weeks ago about this topic), and I had a quick question about > the inner vs outer convergence. 
> > > > I've pasted the output below from "-ksp_monitor -fieldsplit_FE_split_ksp_monitor", > and I'm just wondering about why the second outer iteration has two inner > iteration loops, whereas all the other outer iterations have one inner > iteration loop? I assume it is something to do with a convergence > tolerance, but it's not clear to me which tolerance would control that. > > > > Thanks, > > David > > > > ------------------------------------------------------------------ > > > > Residual norms for fieldsplit_FE_split_ solve. > > 0 KSP Residual norm 4.742303891408e+01 > > 1 KSP Residual norm 2.909253505630e-01 > > 2 KSP Residual norm 9.891933795059e-02 > > 3 KSP Residual norm 7.147789520745e-02 > > 4 KSP Residual norm 1.668752967907e-02 > > 5 KSP Residual norm 5.019869896662e-03 > > 6 KSP Residual norm 2.848579237244e-03 > > 7 KSP Residual norm 2.847897269641e-03 > > 8 KSP Residual norm 2.840502392022e-03 > > 9 KSP Residual norm 2.831875522381e-03 > > 10 KSP Residual norm 2.688309287993e-03 > > 11 KSP Residual norm 1.351494303229e-03 > > 12 KSP Residual norm 1.350874246297e-03 > > 13 KSP Residual norm 9.154691604943e-06 > > 0 KSP Residual norm 2.254632353893e+02 > > Residual norms for fieldsplit_FE_split_ solve. > > 0 KSP Residual norm 4.742303891408e+01 > > 1 KSP Residual norm 2.909253505630e-01 > > 2 KSP Residual norm 9.891933795059e-02 > > 3 KSP Residual norm 7.147789520745e-02 > > 4 KSP Residual norm 1.668752967907e-02 > > 5 KSP Residual norm 5.019869896662e-03 > > 6 KSP Residual norm 2.848579237244e-03 > > 7 KSP Residual norm 2.847897269641e-03 > > 8 KSP Residual norm 2.840502392022e-03 > > 9 KSP Residual norm 2.831875522381e-03 > > 10 KSP Residual norm 2.688309287993e-03 > > 11 KSP Residual norm 1.351494303229e-03 > > 12 KSP Residual norm 1.350874246297e-03 > > 13 KSP Residual norm 9.154691604943e-06 > > Residual norms for fieldsplit_FE_split_ solve. 
> > 0 KSP Residual norm 1.554697370480e-05 > > 1 KSP Residual norm 1.554471967929e-05 > > 2 KSP Residual norm 1.551293889691e-05 > > 3 KSP Residual norm 8.031337431574e-06 > > 4 KSP Residual norm 4.137185786243e-06 > > 5 KSP Residual norm 4.066606123330e-06 > > 6 KSP Residual norm 4.051107282928e-06 > > 7 KSP Residual norm 4.047442850256e-06 > > 8 KSP Residual norm 4.047129984657e-06 > > 9 KSP Residual norm 4.030697964677e-06 > > 10 KSP Residual norm 2.882383190940e-06 > > 11 KSP Residual norm 3.325005138484e-07 > > 12 KSP Residual norm 2.107354774516e-07 > > 13 KSP Residual norm 2.107005548204e-07 > > 14 KSP Residual norm 4.399320792736e-08 > > 15 KSP Residual norm 4.236902403786e-08 > > 16 KSP Residual norm 2.932877082709e-08 > > 17 KSP Residual norm 3.881909203171e-09 > > 18 KSP Residual norm 1.107791399514e-09 > > 19 KSP Residual norm 2.645048006100e-11 > > 1 KSP Residual norm 8.266776463696e-01 > > Residual norms for fieldsplit_FE_split_ solve. > > 0 KSP Residual norm 9.262528453386e-08 > > 1 KSP Residual norm 5.683232925010e-10 > > 2 KSP Residual norm 1.915223168286e-10 > > 3 KSP Residual norm 1.397893184942e-10 > > 4 KSP Residual norm 1.691441435404e-11 > > 5 KSP Residual norm 6.138315243419e-12 > > 6 KSP Residual norm 5.576043830003e-12 > > 7 KSP Residual norm 5.574440028225e-12 > > 8 KSP Residual norm 5.559544964428e-12 > > 9 KSP Residual norm 5.539862581746e-12 > > 10 KSP Residual norm 5.258329460152e-12 > > 11 KSP Residual norm 2.643581511791e-12 > > 12 KSP Residual norm 2.641293392449e-12 > > 13 KSP Residual norm 2.354608977643e-14 > > 2 KSP Residual norm 4.450925351013e-07 > > Residual norms for fieldsplit_FE_split_ solve. 
> > 0 KSP Residual norm 6.653681330477e-14 > > 1 KSP Residual norm 6.650750698147e-14 > > 2 KSP Residual norm 6.111123464526e-14 > > 3 KSP Residual norm 2.026817941567e-14 > > 4 KSP Residual norm 9.604999144183e-15 > > 5 KSP Residual norm 9.208296307424e-15 > > 6 KSP Residual norm 9.196769686859e-15 > > 7 KSP Residual norm 9.185058975459e-15 > > 8 KSP Residual norm 9.180207477303e-15 > > 9 KSP Residual norm 8.991574890909e-15 > > 10 KSP Residual norm 8.032736869820e-15 > > 11 KSP Residual norm 1.536409278928e-15 > > 12 KSP Residual norm 1.177374264280e-15 > > 13 KSP Residual norm 1.175712092044e-15 > > 14 KSP Residual norm 2.572275406087e-16 > > 15 KSP Residual norm 2.548423809711e-16 > > 16 KSP Residual norm 8.616505207588e-17 > > 17 KSP Residual norm 7.563053994201e-18 > > 18 KSP Residual norm 6.807636198601e-18 > > 19 KSP Residual norm 9.747028518744e-19 > > 20 KSP Residual norm 2.419807103570e-21 > > 3 KSP Residual norm 2.986369469883e-09 > > Residual norms for fieldsplit_FE_split_ solve. > > 0 KSP Residual norm 7.813223137340e-16 > > 1 KSP Residual norm 4.793103235095e-18 > > 2 KSP Residual norm 1.615526128222e-18 > > 3 KSP Residual norm 1.179102504397e-18 > > 4 KSP Residual norm 1.427467627551e-19 > > 5 KSP Residual norm 5.177440470993e-20 > > 6 KSP Residual norm 4.703763659148e-20 > > 7 KSP Residual norm 4.701953228322e-20 > > 8 KSP Residual norm 4.689269668869e-20 > > 9 KSP Residual norm 4.672625361251e-20 > > 10 KSP Residual norm 4.435174006113e-20 > > 11 KSP Residual norm 2.229156843383e-20 > > 12 KSP Residual norm 2.228887211080e-20 > > 13 KSP Residual norm 3.492936921635e-22 > > 4 KSP Residual norm 3.753341263086e-15 > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From knepley at gmail.com Thu Jan 26 21:50:07 2017 From: knepley at gmail.com (Matthew Knepley) Date: Thu, 26 Jan 2017 21:50:07 -0600 Subject: [petsc-users] Understanding inner vs outer fieldsplit convergence In-Reply-To: References: <93603F02-2D76-438F-9DB8-65DF77174E64@mcs.anl.gov> Message-ID: On Thu, Jan 26, 2017 at 9:36 PM, David Knezevic wrote: > On Thu, Jan 26, 2017 at 6:50 PM, Barry Smith wrote: > >> >> David, with Schur complement preconditioning the nesting can be rather >> complicated and hard to track. We need to know exactly what monitors you >> have turned on and the output from -ksp_view in order to understand why you >> are seeing this (seemingly) strange effect. Send all command line arguments >> and if you are running a PETSc example. >> > > Hi Barry, > > Thanks for your comments. The command line arguments I'm using are: > > -ksp_view -ksp_monitor -ksp_type cg -pc_type fieldsplit > -pc_fieldsplit_type schur -fieldsplit_block_2_ksp_monitor > -pc_fieldsplit_type schur -fieldsplit_block_1_pc_type cholesky > -fieldsplit_block_1_ksp_type preonly -fieldsplit_block_2_pc_type cholesky > -fieldsplit_block_2_ksp_type minres > preonly messes with monitors because "nothing is supposed to happen" so ksp monitors don't fire. I would start with -ksp_type richardson -ksp_max_it 1 Matt > Note that I cleaned up my code a bit (I was doing some things via code > previously, but now I'm doing it all via command line arguments) and now I > can't replicate the strange effect that I reported in my previous email, so > I guess that must have been a bug at my end, sorry about that! (I'm using > my own example here, not a PETSc example.) > > I've pasted the output that I get below, and it doesn't have any repeated > inner iterations anymore, so it looks good to me now. > > Thanks, > David > > Residual norms for fieldsplit_block_2_ solve. 
> 0 KSP Residual norm 4.742303891404e+01 > 1 KSP Residual norm 2.909253505627e-01 > 2 KSP Residual norm 9.891933795043e-02 > 3 KSP Residual norm 7.147789520726e-02 > 4 KSP Residual norm 1.668752975827e-02 > 5 KSP Residual norm 5.020016959860e-03 > 6 KSP Residual norm 2.939648855177e-03 > 7 KSP Residual norm 2.847937179284e-03 > 8 KSP Residual norm 2.840502402597e-03 > 9 KSP Residual norm 2.831875522390e-03 > 10 KSP Residual norm 2.688309294147e-03 > 11 KSP Residual norm 1.364866662320e-03 > 12 KSP Residual norm 1.351460340317e-03 > 13 KSP Residual norm 9.154713357796e-06 > 0 KSP Residual norm 2.254632359666e+02 > Residual norms for fieldsplit_block_2_ solve. > 0 KSP Residual norm 1.562796164027e-05 > 1 KSP Residual norm 1.562568942204e-05 > 2 KSP Residual norm 1.559644955712e-05 > 3 KSP Residual norm 4.739731314562e-06 > 4 KSP Residual norm 4.099203626623e-06 > 5 KSP Residual norm 4.077340563505e-06 > 6 KSP Residual norm 4.069934161136e-06 > 7 KSP Residual norm 4.069081720832e-06 > 8 KSP Residual norm 4.056610946411e-06 > 9 KSP Residual norm 4.049977739107e-06 > 10 KSP Residual norm 6.930727940912e-07 > 11 KSP Residual norm 2.525753277129e-07 > 12 KSP Residual norm 2.490106038182e-07 > 13 KSP Residual norm 2.117790443270e-07 > 14 KSP Residual norm 4.278506164384e-08 > 15 KSP Residual norm 4.258945198001e-08 > 16 KSP Residual norm 4.229334447981e-08 > 17 KSP Residual norm 4.411718412575e-09 > 18 KSP Residual norm 2.538786960344e-10 > 19 KSP Residual norm 1.938707294127e-11 > 1 KSP Residual norm 8.266789276240e-01 > Residual norms for fieldsplit_block_2_ solve. 
> 0 KSP Residual norm 9.262098141749e-08 > 1 KSP Residual norm 5.654510775654e-10 > 2 KSP Residual norm 1.850940554946e-10 > 3 KSP Residual norm 1.215872907413e-10 > 4 KSP Residual norm 3.606362942651e-11 > 5 KSP Residual norm 9.320062914875e-12 > 6 KSP Residual norm 5.745183182943e-12 > 7 KSP Residual norm 5.567033341218e-12 > 8 KSP Residual norm 5.549943035966e-12 > 9 KSP Residual norm 5.535981844425e-12 > 10 KSP Residual norm 5.145409580427e-12 > 11 KSP Residual norm 2.666485042347e-12 > 12 KSP Residual norm 2.642698026095e-12 > 13 KSP Residual norm 2.351888597601e-14 > 2 KSP Residual norm 4.408818087377e-07 > Residual norms for fieldsplit_block_2_ solve. > 0 KSP Residual norm 6.995890260994e-14 > 1 KSP Residual norm 6.992908711883e-14 > 2 KSP Residual norm 6.975805821489e-14 > 3 KSP Residual norm 3.895698269070e-14 > 4 KSP Residual norm 1.087381803170e-14 > 5 KSP Residual norm 9.649779038703e-15 > 6 KSP Residual norm 9.639503278208e-15 > 7 KSP Residual norm 9.632063248075e-15 > 8 KSP Residual norm 9.628902454482e-15 > 9 KSP Residual norm 9.422064141948e-15 > 10 KSP Residual norm 7.978243292438e-15 > 11 KSP Residual norm 1.840364998277e-15 > 12 KSP Residual norm 1.313440209393e-15 > 13 KSP Residual norm 1.237317810391e-15 > 14 KSP Residual norm 3.450624715175e-16 > 15 KSP Residual norm 2.677943917771e-16 > 16 KSP Residual norm 2.477309550239e-16 > 17 KSP Residual norm 2.379077634512e-16 > 18 KSP Residual norm 2.195084093320e-17 > 19 KSP Residual norm 3.440267981045e-18 > 20 KSP Residual norm 1.678278636525e-19 > 3 KSP Residual norm 2.982422976816e-09 > Residual norms for fieldsplit_block_2_ solve. 
> 0 KSP Residual norm 7.797636927781e-16 > 1 KSP Residual norm 4.773690049123e-18 > 2 KSP Residual norm 1.562856784196e-18 > 3 KSP Residual norm 1.029604947226e-18 > 4 KSP Residual norm 3.077276285355e-19 > 5 KSP Residual norm 7.977399411645e-20 > 6 KSP Residual norm 4.916849936508e-20 > 7 KSP Residual norm 4.764440575250e-20 > 8 KSP Residual norm 4.678272883449e-20 > 9 KSP Residual norm 4.672106002328e-20 > 10 KSP Residual norm 4.395115290861e-20 > 11 KSP Residual norm 2.248590969592e-20 > 12 KSP Residual norm 2.226836892279e-20 > 13 KSP Residual norm 3.483134646974e-22 > 4 KSP Residual norm 3.713946450555e-15 > KSP Object: 1 MPI processes > type: cg > maximum iterations=1000, initial guess is zero > tolerances: relative=1e-12, absolute=1e-50, divergence=10000. > left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: 1 MPI processes > type: fieldsplit > FieldSplit with Schur preconditioner, factorization FULL > Preconditioner for the Schur complement formed from A11 > Split info: > Split number 0 Defined by IS > Split number 1 Defined by IS > KSP solver for A00 block > KSP Object: (fieldsplit_block_1_) 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
> left preconditioning > using NONE norm type for convergence test > PC Object: (fieldsplit_block_1_) 1 MPI processes > type: cholesky > Cholesky: out-of-place factorization > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 5., needed 1.14458 > Factored matrix follows: > Mat Object: 1 MPI processes > type: seqsbaij > rows=30, cols=30 > package used to perform factorization: petsc > total: nonzeros=285, allocated nonzeros=285 > total number of mallocs used during MatSetValues calls =0 > block size is 1 > linear system matrix = precond matrix: > Mat Object: (fieldsplit_block_1_) 1 MPI processes > type: seqaij > rows=30, cols=30 > total: nonzeros=468, allocated nonzeros=468 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 10 nodes, limit used is 5 > KSP solver for S = A11 - A10 inv(A00) A01 > KSP Object: (fieldsplit_block_2_) 1 MPI processes > type: minres > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, divergence=10000. 
> left preconditioning > using PRECONDITIONED norm type for convergence test > PC Object: (fieldsplit_block_2_) 1 MPI processes > type: cholesky > Cholesky: out-of-place factorization > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 5., needed 23.1555 > Factored matrix follows: > Mat Object: 1 MPI processes > type: seqsbaij > rows=10512, cols=10512 > package used to perform factorization: petsc > total: nonzeros=4435992, allocated nonzeros=4435992 > total number of mallocs used during MatSetValues calls =0 > block size is 1 > linear system matrix followed by preconditioner matrix: > Mat Object: (fieldsplit_block_2_) 1 MPI processes > type: schurcomplement > rows=10512, cols=10512 > Schur complement A11 - A10 inv(A00) A01 > A11 > Mat Object: (fieldsplit_block_2_) > 1 MPI processes > type: seqaij > rows=10512, cols=10512 > total: nonzeros=372636, allocated nonzeros=372636 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 3504 nodes, limit used is 5 > A10 > Mat Object: 1 MPI processes > type: seqaij > rows=10512, cols=30 > total: nonzeros=1872, allocated nonzeros=1872 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 2143 nodes, limit used is 5 > KSP of A00 > KSP Object: (fieldsplit_block_1_) > 1 MPI processes > type: preonly > maximum iterations=10000, initial guess is zero > tolerances: relative=1e-05, absolute=1e-50, > divergence=10000. 
> left preconditioning > using NONE norm type for convergence test > PC Object: (fieldsplit_block_1_) > 1 MPI processes > type: cholesky > Cholesky: out-of-place factorization > tolerance for zero pivot 2.22045e-14 > matrix ordering: natural > factor fill ratio given 5., needed 1.14458 > Factored matrix follows: > Mat Object: 1 MPI processes > type: seqsbaij > rows=30, cols=30 > package used to perform factorization: petsc > total: nonzeros=285, allocated nonzeros=285 > total number of mallocs used during MatSetValues > calls =0 > block size is 1 > linear system matrix = precond matrix: > Mat Object: (fieldsplit_block_1_) > 1 MPI processes > type: seqaij > rows=30, cols=30 > total: nonzeros=468, allocated nonzeros=468 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 10 nodes, limit used is 5 > A01 > Mat Object: 1 MPI processes > type: seqaij > rows=30, cols=10512 > total: nonzeros=1872, allocated nonzeros=1872 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 9 nodes, limit used is 5 > Mat Object: (fieldsplit_block_2_) 1 MPI processes > type: seqaij > rows=10512, cols=10512 > total: nonzeros=372636, allocated nonzeros=372636 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 3504 nodes, limit used is 5 > linear system matrix = precond matrix: > Mat Object: () 1 MPI processes > type: seqaij > rows=10542, cols=10542 > total: nonzeros=376848, allocated nonzeros=377064 > total number of mallocs used during MatSetValues calls =0 > using I-node routines: found 3514 nodes, limit used is 5 > > > > > >> >> >> >> > On Jan 26, 2017, at 1:26 PM, David Knezevic >> wrote: >> > >> > I'm exploring fieldsplit with Schur (this continues from some emails I >> sent a few weeks ago about this topic), and I had a quick question about >> the inner vs outer convergence. 
>> > >> > I've pasted the output below from "-ksp_monitor >> -fieldsplit_FE_split_ksp_monitor", and I'm just wondering about why the >> second outer iteration has two inner iteration loops, whereas all the other >> outer iterations have one inner iteration loop? I assume it is something to >> do with a convergence tolerance, but it's not clear to me which tolerance >> would control that. >> > >> > Thanks, >> > David >> > >> > ------------------------------------------------------------------ >> > >> > Residual norms for fieldsplit_FE_split_ solve. >> > 0 KSP Residual norm 4.742303891408e+01 >> > 1 KSP Residual norm 2.909253505630e-01 >> > 2 KSP Residual norm 9.891933795059e-02 >> > 3 KSP Residual norm 7.147789520745e-02 >> > 4 KSP Residual norm 1.668752967907e-02 >> > 5 KSP Residual norm 5.019869896662e-03 >> > 6 KSP Residual norm 2.848579237244e-03 >> > 7 KSP Residual norm 2.847897269641e-03 >> > 8 KSP Residual norm 2.840502392022e-03 >> > 9 KSP Residual norm 2.831875522381e-03 >> > 10 KSP Residual norm 2.688309287993e-03 >> > 11 KSP Residual norm 1.351494303229e-03 >> > 12 KSP Residual norm 1.350874246297e-03 >> > 13 KSP Residual norm 9.154691604943e-06 >> > 0 KSP Residual norm 2.254632353893e+02 >> > Residual norms for fieldsplit_FE_split_ solve. >> > 0 KSP Residual norm 4.742303891408e+01 >> > 1 KSP Residual norm 2.909253505630e-01 >> > 2 KSP Residual norm 9.891933795059e-02 >> > 3 KSP Residual norm 7.147789520745e-02 >> > 4 KSP Residual norm 1.668752967907e-02 >> > 5 KSP Residual norm 5.019869896662e-03 >> > 6 KSP Residual norm 2.848579237244e-03 >> > 7 KSP Residual norm 2.847897269641e-03 >> > 8 KSP Residual norm 2.840502392022e-03 >> > 9 KSP Residual norm 2.831875522381e-03 >> > 10 KSP Residual norm 2.688309287993e-03 >> > 11 KSP Residual norm 1.351494303229e-03 >> > 12 KSP Residual norm 1.350874246297e-03 >> > 13 KSP Residual norm 9.154691604943e-06 >> > Residual norms for fieldsplit_FE_split_ solve. 
>> > 0 KSP Residual norm 1.554697370480e-05 >> > 1 KSP Residual norm 1.554471967929e-05 >> > 2 KSP Residual norm 1.551293889691e-05 >> > 3 KSP Residual norm 8.031337431574e-06 >> > 4 KSP Residual norm 4.137185786243e-06 >> > 5 KSP Residual norm 4.066606123330e-06 >> > 6 KSP Residual norm 4.051107282928e-06 >> > 7 KSP Residual norm 4.047442850256e-06 >> > 8 KSP Residual norm 4.047129984657e-06 >> > 9 KSP Residual norm 4.030697964677e-06 >> > 10 KSP Residual norm 2.882383190940e-06 >> > 11 KSP Residual norm 3.325005138484e-07 >> > 12 KSP Residual norm 2.107354774516e-07 >> > 13 KSP Residual norm 2.107005548204e-07 >> > 14 KSP Residual norm 4.399320792736e-08 >> > 15 KSP Residual norm 4.236902403786e-08 >> > 16 KSP Residual norm 2.932877082709e-08 >> > 17 KSP Residual norm 3.881909203171e-09 >> > 18 KSP Residual norm 1.107791399514e-09 >> > 19 KSP Residual norm 2.645048006100e-11 >> > 1 KSP Residual norm 8.266776463696e-01 >> > Residual norms for fieldsplit_FE_split_ solve. >> > 0 KSP Residual norm 9.262528453386e-08 >> > 1 KSP Residual norm 5.683232925010e-10 >> > 2 KSP Residual norm 1.915223168286e-10 >> > 3 KSP Residual norm 1.397893184942e-10 >> > 4 KSP Residual norm 1.691441435404e-11 >> > 5 KSP Residual norm 6.138315243419e-12 >> > 6 KSP Residual norm 5.576043830003e-12 >> > 7 KSP Residual norm 5.574440028225e-12 >> > 8 KSP Residual norm 5.559544964428e-12 >> > 9 KSP Residual norm 5.539862581746e-12 >> > 10 KSP Residual norm 5.258329460152e-12 >> > 11 KSP Residual norm 2.643581511791e-12 >> > 12 KSP Residual norm 2.641293392449e-12 >> > 13 KSP Residual norm 2.354608977643e-14 >> > 2 KSP Residual norm 4.450925351013e-07 >> > Residual norms for fieldsplit_FE_split_ solve. 
>> > 0 KSP Residual norm 6.653681330477e-14 >> > 1 KSP Residual norm 6.650750698147e-14 >> > 2 KSP Residual norm 6.111123464526e-14 >> > 3 KSP Residual norm 2.026817941567e-14 >> > 4 KSP Residual norm 9.604999144183e-15 >> > 5 KSP Residual norm 9.208296307424e-15 >> > 6 KSP Residual norm 9.196769686859e-15 >> > 7 KSP Residual norm 9.185058975459e-15 >> > 8 KSP Residual norm 9.180207477303e-15 >> > 9 KSP Residual norm 8.991574890909e-15 >> > 10 KSP Residual norm 8.032736869820e-15 >> > 11 KSP Residual norm 1.536409278928e-15 >> > 12 KSP Residual norm 1.177374264280e-15 >> > 13 KSP Residual norm 1.175712092044e-15 >> > 14 KSP Residual norm 2.572275406087e-16 >> > 15 KSP Residual norm 2.548423809711e-16 >> > 16 KSP Residual norm 8.616505207588e-17 >> > 17 KSP Residual norm 7.563053994201e-18 >> > 18 KSP Residual norm 6.807636198601e-18 >> > 19 KSP Residual norm 9.747028518744e-19 >> > 20 KSP Residual norm 2.419807103570e-21 >> > 3 KSP Residual norm 2.986369469883e-09 >> > Residual norms for fieldsplit_FE_split_ solve. >> > 0 KSP Residual norm 7.813223137340e-16 >> > 1 KSP Residual norm 4.793103235095e-18 >> > 2 KSP Residual norm 1.615526128222e-18 >> > 3 KSP Residual norm 1.179102504397e-18 >> > 4 KSP Residual norm 1.427467627551e-19 >> > 5 KSP Residual norm 5.177440470993e-20 >> > 6 KSP Residual norm 4.703763659148e-20 >> > 7 KSP Residual norm 4.701953228322e-20 >> > 8 KSP Residual norm 4.689269668869e-20 >> > 9 KSP Residual norm 4.672625361251e-20 >> > 10 KSP Residual norm 4.435174006113e-20 >> > 11 KSP Residual norm 2.229156843383e-20 >> > 12 KSP Residual norm 2.228887211080e-20 >> > 13 KSP Residual norm 3.492936921635e-22 >> > 4 KSP Residual norm 3.753341263086e-15 >> >> > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... 
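For readers following the thread above: the full Schur factorization that the KSP view reports ("Schur complement A11 - A10 inv(A00) A01", factorization FULL) can be sketched in a few lines of plain Python. This is only a toy dense illustration of the algebra, not PETSc code; the names `solve` and `schur_solve` are made up for this example, and PETSc never forms inv(A00) explicitly (the inner fieldsplit_block_1_ KSP plays that role).

```python
# Toy dense illustration of the full Schur-complement factorization that
# -pc_fieldsplit_type schur applies to the 2x2 block system
#   [A00 A01; A10 A11] [x0; x1] = [b0; b1]:
#   1. y  = inv(A00) b0
#   2. solve  S x1 = b1 - A10 y,  with  S = A11 - A10 inv(A00) A01
#   3. x0 = inv(A00) (b0 - A01 x1)

def solve(A, b):
    """Dense Gaussian elimination with partial pivoting (small systems only)."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def schur_solve(A00, A01, A10, A11, b0, b1):
    n0, n1 = len(A00), len(A11)
    # columns of inv(A00) A01, one solve of A00 per column
    cols = [solve(A00, [A01[i][j] for i in range(n0)]) for j in range(n1)]
    S = [[A11[i][j] - sum(A10[i][k] * cols[j][k] for k in range(n0))
          for j in range(n1)] for i in range(n1)]
    y = solve(A00, b0)
    rhs = [b1[i] - sum(A10[i][k] * y[k] for k in range(n0)) for i in range(n1)]
    x1 = solve(S, rhs)                       # the Schur-complement solve
    x0 = solve(A00, [b0[i] - sum(A01[i][j] * x1[j] for j in range(n1))
                     for i in range(n0)])
    return x0, x1

# 3x3 SPD example split 2+1: exact solution x0 = [17/19, 26/19], x1 = [20/19]
x0, x1 = schur_solve([[4.0, 1.0], [1.0, 3.0]], [[1.0], [0.0]],
                     [[1.0, 0.0]], [[2.0]], [6.0, 5.0], [3.0])
```

In PETSc the two inner solves correspond to the fieldsplit_block_1_ (A00) and fieldsplit_block_2_ (S) KSPs, which is why each outer iteration above prints one inner residual history per Schur solve.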
URL: From kandanovian at gmail.com Fri Jan 27 10:48:56 2017 From: kandanovian at gmail.com (Tim Steinhoff) Date: Fri, 27 Jan 2017 17:48:56 +0100 Subject: [petsc-users] MatColoring for non square matrices Message-ID: Hi, are the MatColoring routines supposed to work with non square matrices? It depends on the matrix and the coloring algorithm: sometimes they seem to give a valid coloring, sometimes totally wrong results, and sometimes a crash. Here is a small Fortran sample code snippet that crashes: Mat :: c MatColoring :: color ISColoring :: isColor call MatCreate(PETSC_COMM_WORLD, c, ierr) call MatSetSizes(c,PETSC_DECIDE,PETSC_DECIDE,3,5,ierr) call MatSetFromOptions(c,ierr) call MatSetUp(c, ierr) if (myrank .eq. 0) then call MatSetValue(c, 0, 0, 1d0, INSERT_VALUES, ierr) call MatSetValue(c, 0, 1, 0d0, INSERT_VALUES, ierr) call MatSetValue(c, 0, 2, 2d0, INSERT_VALUES, ierr) call MatSetValue(c, 0, 4, 0d0, INSERT_VALUES, ierr) call MatSetValue(c, 1, 0, 0d0, INSERT_VALUES, ierr) call MatSetValue(c, 1, 1, 0d0, INSERT_VALUES, ierr) call MatSetValue(c, 1, 2, 0d0, INSERT_VALUES, ierr) call MatSetValue(c, 1, 3, 0d0, INSERT_VALUES, ierr) call MatSetValue(c, 2, 0, 3d0, INSERT_VALUES, ierr) call MatSetValue(c, 2, 2, 4d0, INSERT_VALUES, ierr) call MatSetValue(c, 2, 3, 1d0, INSERT_VALUES, ierr) end if call MatAssemblyBegin(c,MAT_FINAL_ASSEMBLY,ierr) call MatAssemblyEnd(c,MAT_FINAL_ASSEMBLY,ierr) call MatColoringCreate(c, color, ierr) call MatColoringSetType(color, MATCOLORINGGREEDY, ierr) call MatColoringSetFromOptions(color, ierr) call MatColoringApply(color, isColor, ierr) [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range #9 0x00007fb1b242b407 in GreedyColoringLocalDistanceTwo_Private (mc=0x97da50, wts=0x97f640, lperm=0x982980, colors=0x982fe0) at /fsgarwinhpc/133/petsc/sources/petsc-3.7.5/src/mat/color/impls/greedy/greedy.c:349 #10 0x00007fb1b242ebdc in MatColoringApply_Greedy (mc=0x97da50, 
iscoloring=0x7ffcedf22698) at /fsgarwinhpc/133/petsc/sources/petsc-3.7.5/src/mat/color/impls/greedy/greedy.c:580 #11 0x00007fb1b240e3de in MatColoringApply (mc=0x97da50, coloring=0x7ffcedf22698) at /fsgarwinhpc/133/petsc/sources/petsc-3.7.5/src/mat/color/interface/matcoloring.c:382 #12 0x00007fb1b2410f37 in matcoloringapply_ (mc=0x7ffcedf22690, coloring=0x7ffcedf22698, __ierr=0x6475e8 ) at /fsgarwinhpc/133/petsc/sources/petsc-3.7.5/src/mat/color/interface/ftn-auto/matcoloringf.c:115 Thanks and kind regards, Volker From bsmith at mcs.anl.gov Fri Jan 27 13:21:10 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 27 Jan 2017 13:21:10 -0600 Subject: [petsc-users] MatColoring for non square matrices In-Reply-To: References: Message-ID: <70A71B41-5824-4E3A-A626-0D0511086A69@mcs.anl.gov> Definitely not supposed to work. We never thought about supporting nonsquare matrices. I do not know whether this is just due to bugs in the code, like assuming the number of rows is the same as the number of columns, or due to deeper issues, like the algorithms requiring the matrices to be square. You need to look directly at the implementations to see if they can be "fixed" for nonsquare matrices. Barry > On Jan 27, 2017, at 10:48 AM, Tim Steinhoff wrote: > > Hi, > > are the MatColoring routines supposed to work with non square > matrices? It depends on the matrix and the coloring algorithms, that > sometimes seem to give a valid coloring, sometimes give totally wrong > results and sometimes crash. > Here is a small fortran sample code snippet that crashes: > > Mat :: c > MatColoring :: color > ISColoring :: isColor > call MatCreate(PETSC_COMM_WORLD, c, ierr) > call MatSetSizes(c,PETSC_DECIDE,PETSC_DECIDE,3,5,ierr) > call MatSetFromOptions(c,ierr) > call MatSetUp(c, ierr) > if (myrank .eq. 
0) then > call MatSetValue(c, 0, 0, 1d0, INSERT_VALUES, ierr) > call MatSetValue(c, 0, 1, 0d0, INSERT_VALUES, ierr) > call MatSetValue(c, 0, 2, 2d0, INSERT_VALUES, ierr) > call MatSetValue(c, 0, 4, 0d0, INSERT_VALUES, ierr) > call MatSetValue(c, 1, 0, 0d0, INSERT_VALUES, ierr) > call MatSetValue(c, 1, 1, 0d0, INSERT_VALUES, ierr) > call MatSetValue(c, 1, 2, 0d0, INSERT_VALUES, ierr) > call MatSetValue(c, 1, 3, 0d0, INSERT_VALUES, ierr) > call MatSetValue(c, 2, 0, 3d0, INSERT_VALUES, ierr) > call MatSetValue(c, 2, 2, 4d0, INSERT_VALUES, ierr) > call MatSetValue(c, 2, 3, 1d0, INSERT_VALUES, ierr) > end if > call MatAssemblyBegin(c,MAT_FINAL_ASSEMBLY,ierr) > call MatAssemblyEnd(c,MAT_FINAL_ASSEMBLY,ierr) > > call MatColoringCreate(c, color, ierr) > call MatColoringSetType(color, MATCOLORINGGREEDY, ierr) > call MatColoringSetFromOptions(color, ierr) > call MatColoringApply(color, isColor, ierr) > > > > [1]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, > probably memory access out of range > #9 0x00007fb1b242b407 in GreedyColoringLocalDistanceTwo_Private > (mc=0x97da50, wts=0x97f640, lperm=0x982980, colors=0x982fe0) at > /fsgarwinhpc/133/petsc/sources/petsc-3.7.5/src/mat/color/impls/greedy/greedy.c:349 > #10 0x00007fb1b242ebdc in MatColoringApply_Greedy (mc=0x97da50, > iscoloring=0x7ffcedf22698) at > /fsgarwinhpc/133/petsc/sources/petsc-3.7.5/src/mat/color/impls/greedy/greedy.c:580 > #11 0x00007fb1b240e3de in MatColoringApply (mc=0x97da50, > coloring=0x7ffcedf22698) at > /fsgarwinhpc/133/petsc/sources/petsc-3.7.5/src/mat/color/interface/matcoloring.c:382 > #12 0x00007fb1b2410f37 in matcoloringapply_ (mc=0x7ffcedf22690, > coloring=0x7ffcedf22698, __ierr=0x6475e8 ) > at /fsgarwinhpc/133/petsc/sources/petsc-3.7.5/src/mat/color/interface/ftn-auto/matcoloringf.c:115 > > > > Thanks and kind regards, > > Volker From vukmanh at googlemail.com Fri Jan 27 14:21:34 2017 From: vukmanh at googlemail.com (H. 
Vukman) Date: Fri, 27 Jan 2017 21:21:34 +0100 Subject: [petsc-users] OpenMPI 2.0.1 and PETSC 3.7.5 Message-ID: Hello! When I try to compile PETSC after running ./configure with make PETSC_DIR=/home/guntah/Downloads/petsc-3.7.5 PETSC_ARCH=arch-linux2-c-debug all I get the error: /home/guntah/Downloads/petsc-3.7.5/include/petscsys.h:150:6: error: #error "PETSc was configured with OpenMPI but now appears to be compiling using a non-OpenMPI mpi.h" # error "PETSc was configured with OpenMPI but now appears to be compiling using a non-OpenMPI mpi.h" I have used the newest OpenMpi 2.0.1 and PETSC 3.7.5 The platform is Linux Mint 17. Attached is the make.log Thanks -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: make.log Type: text/x-log Size: 13591 bytes Desc: not available URL: From bsmith at mcs.anl.gov Fri Jan 27 14:28:09 2017 From: bsmith at mcs.anl.gov (Barry Smith) Date: Fri, 27 Jan 2017 14:28:09 -0600 Subject: [petsc-users] OpenMPI 2.0.1 and PETSC 3.7.5 In-Reply-To: References: Message-ID: Perhaps you previously did a ./configure --download-mpich? So delete ${PETSC_DIR}/${PETSC_ARCH} and rerun your new configure. If the same problem occurs send configure.log and make.log Barry > On Jan 27, 2017, at 2:21 PM, H. Vukman wrote: > > Hello! > > When I try to compile PETSC after running ./configure with > make PETSC_DIR=/home/guntah/Downloads/petsc-3.7.5 PETSC_ARCH=arch-linux2-c-debug all > > I get the error: > > /home/guntah/Downloads/petsc-3.7.5/include/petscsys.h:150:6: error: #error "PETSc was configured with OpenMPI but now appears to be compiling using a non-OpenMPI mpi.h" > # error "PETSc was configured with OpenMPI but now appears to be compiling using a non-OpenMPI mpi.h" > > > I have used the newest OpenMpi 2.0.1 and PETSC 3.7.5 > The platform is Linux Mint 17. 
Attached is the make.log > > Thanks > > From balay at mcs.anl.gov Fri Jan 27 14:29:59 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Fri, 27 Jan 2017 14:29:59 -0600 Subject: [petsc-users] OpenMPI 2.0.1 and PETSC 3.7.5 In-Reply-To: References: Message-ID: Can you do a clean build - and see if this problem persists? rm -rf arch-linux2-c-debug [and rebuild] If the problem persists - please run the following - and send us the generated verboseinfo.i file make -f gmakefile V=1 CFLAGS=-save-temps arch-linux2-c-debug/obj/src/sys/info/verboseinfo.o Satish On Fri, 27 Jan 2017, H. Vukman wrote: > Hello! > > When I try to compile PETSC after running ./configure with > make PETSC_DIR=/home/guntah/Downloads/petsc-3.7.5 > PETSC_ARCH=arch-linux2-c-debug all > > I get the error: > > /home/guntah/Downloads/petsc-3.7.5/include/petscsys.h:150:6: error: #error > "PETSc was configured with OpenMPI but now appears to be compiling using a > non-OpenMPI mpi.h" > # error "PETSc was configured with OpenMPI but now appears to be > compiling using a > non-OpenMPI mpi.h" > > > I have used the newest OpenMpi 2.0.1 and PETSC 3.7.5 > The platform is Linux Mint 17. Attached is the make.log > > Thanks > From vukmanh at googlemail.com Fri Jan 27 15:14:41 2017 From: vukmanh at googlemail.com (H. Vukman) Date: Fri, 27 Jan 2017 22:14:41 +0100 Subject: [petsc-users] OpenMPI 2.0.1 and PETSC 3.7.5 In-Reply-To: References: Message-ID: Yes this did it. Thanks. 2017-01-27 21:29 GMT+01:00 Satish Balay : > Can you do a clean build - and see if this problem persists? > > rm -rf arch-linux2-c-debug > > [and rebuild] > > > If the problem persists - please run the following - and send us the > generated verboseinfo.i file > > make -f gmakefile V=1 CFLAGS=-save-temps arch-linux2-c-debug/obj/src/ > sys/info/verboseinfo.o > > Satish > > On Fri, 27 Jan 2017, H. Vukman wrote: > > > Hello! 
> > > > When I try to compile PETSC after running ./configure with > > make PETSC_DIR=/home/guntah/Downloads/petsc-3.7.5 > > PETSC_ARCH=arch-linux2-c-debug all > > > > I get the error: > > > > /home/guntah/Downloads/petsc-3.7.5/include/petscsys.h:150:6: error: > #error > > "PETSc was configured with OpenMPI but now appears to be compiling using > a > > non-OpenMPI mpi.h" > > # error "PETSc was configured with OpenMPI but now appears to be > > compiling using a non-OpenMPI mpi.h" > > > > > > I have used the newest OpenMpi 2.0.1 and PETSC 3.7.5 > > The platform is Linux Mint 17. Attached is the make.log > > > > Thanks > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From cpraveen at gmail.com Sun Jan 29 03:45:03 2017 From: cpraveen at gmail.com (Praveen C) Date: Sun, 29 Jan 2017 15:15:03 +0530 Subject: [petsc-users] Getting a local vector with ghosts knowing ghost indices Message-ID: Dear all In my problem, I know the ghost vertices in each partition since I have used metis to partition my unstructured grid. I want to get a vector ul with ghosts, from a global vector ug. I will use ul to assemble my non-linear rhs vector. Is the following approach optimal for this purpose ? 
PetscInt nvar; // number of variables at each vertex PetscInt nvl; // number of local vertices in this partition PetscInt nvg; // number of ghost vertices for this partition PetscInt *vghost; // global index of ghost vertices for this partition PetscReal **u; // size u[nvl+nvg][nvar] Vec ul; // vector with ghosts Vec ug; // global vector without ghosts VecCreate(PETSC_COMM_WORLD, &ug); VecSetSizes(ug, nvar*nvl, PETSC_DECIDE); VecSetFromOptions(ug); VecCreateGhostBlockWithArray(PETSC_COMM_WORLD, nvar, nvar*nvl, PETSC_DECIDE, nvg, vghost, &u, ul); // Following would be inside RHSFunction double **array; VecGetArray2d(ug, nvl, nvar, 0, 0, &array); for(unsigned int i=0; i From cpraveen at gmail.com Mon Jan 30 11:04:49 2017 From: cpraveen at gmail.com (Praveen C) Date: Mon, 30 Jan 2017 22:34:49 +0530 Subject: [petsc-users] Debugging a petsc code Message-ID: Dear all I am trying to find a possible bug in my fortran petsc code. Running valgrind I see messages like this ==28499== 1,596 (1,512 direct, 84 indirect) bytes in 1 blocks are definitely lost in loss record 174 of 194 ==28499== at 0x4C2D636: memalign (in /usr/lib64/valgrind/vgpreload_memcheck-amd64-linux.so) ==28499== by 0x4F0F178: PetscMallocAlign (mal.c:28) ==28499== by 0x4FF7E82: VecCreate (veccreate.c:37) ==28499== by 0x4FDF198: VecCreateSeqWithArray (bvec2.c:946) ==28499== by 0x4FE442E: veccreateseqwitharray_ (zbvec2f.c:12) ==28499== by 0x406921: initpetsc_ (all.f95:2066) ==28499== by 0x4035B1: run_ (all.f95:2817) ==28499== by 0x41760C: MAIN__ (all.f95:1383) ==28499== by 0x417D08: main (all.f95:1330) Does this indicate some bug in my code? Thanks praveen -------------- next part -------------- An HTML attachment was scrubbed... 
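The local-plus-ghost layout from Praveen's earlier ghost-vector question can be mimicked in a few lines of plain Python. This is a sketch with made-up sizes, not PETSc code; it imitates only the data movement that VecGhostUpdateBegin/End (with INSERT_VALUES, SCATTER_FORWARD) performs, not the actual MPI scatter.

```python
# Single-process sketch of the gather behind a ghosted-vector update:
# the local form of a ghosted vector holds the rank's nvl owned blocks
# first, then its nvg ghost blocks copied from blocks owned by other
# ranks. All sizes and names here are hypothetical, for illustration.

def ghost_update(ug, owned, vghost, nvar):
    """Build the local (owned + ghost) array of one rank.

    ug     -- flat "global" vector, blocked by nvar
    owned  -- (first, last) half-open range of block indices this rank owns
    vghost -- global block indices of this rank's ghost vertices
    nvar   -- number of variables stored per vertex
    """
    first, last = owned
    local = [ug[b * nvar + c] for b in range(first, last) for c in range(nvar)]
    for b in vghost:  # ghost blocks are appended after the owned part
        local.extend(ug[b * nvar + c] for c in range(nvar))
    return local

# 6 vertices with nvar=2; "rank 0" owns blocks 0..2 and ghosts block 4,
# so its local array is ug[0:6] followed by ug[8:10]
ug = [float(i) for i in range(12)]
ul = ghost_update(ug, (0, 3), [4], nvar=2)
```

In the real code the ghost entries live at local positions nvar*nvl onward, which is exactly where VecGhostGetLocalForm exposes them after the update.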
URL: From knepley at gmail.com Mon Jan 30 11:34:36 2017 From: knepley at gmail.com (Matthew Knepley) Date: Mon, 30 Jan 2017 11:34:36 -0600 Subject: [petsc-users] Debugging a petsc code In-Reply-To: References: Message-ID: On Mon, Jan 30, 2017 at 11:04 AM, Praveen C wrote: > Dear all > > I am trying to find a possible bug in my fortran petsc code. Running > valgrind I see messages like this > > ==28499== 1,596 (1,512 direct, 84 indirect) bytes in 1 blocks are > definitely lost in loss record 174 of 194 > > ==28499== at 0x4C2D636: memalign (in /usr/lib64/valgrind/vgpreload_ > memcheck-amd64-linux.so) > > ==28499== by 0x4F0F178: PetscMallocAlign (mal.c:28) > > ==28499== by 0x4FF7E82: VecCreate (veccreate.c:37) > > ==28499== by 0x4FDF198: VecCreateSeqWithArray (bvec2.c:946) > > ==28499== by 0x4FE442E: veccreateseqwitharray_ (zbvec2f.c:12) > > ==28499== by 0x406921: initpetsc_ (all.f95:2066) > > ==28499== by 0x4035B1: run_ (all.f95:2817) > > ==28499== by 0x41760C: MAIN__ (all.f95:1383) > > ==28499== by 0x417D08: main (all.f95:1330) > > > Does this indicate some bug in my code ? > If you run with -malloc_test, and it does not report unfreed objects, then everything is fine. Matt > Thanks > praveen > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From stefano.zampini at gmail.com Mon Jan 30 12:48:12 2017 From: stefano.zampini at gmail.com (Stefano Zampini) Date: Mon, 30 Jan 2017 21:48:12 +0300 Subject: [petsc-users] Debugging a petsc code In-Reply-To: References: Message-ID: <9CFBEED0-2DE3-41D0-90AD-3861AD9E7BCE@gmail.com> It just reports that you have a memory leak. Probably you did not call VecDestroy on the Vec created at initpetsc_ in line 2066 of all.f95. 
> On Jan 30, 2017, at 8:04 PM, Praveen C wrote: > > Dear all > > I am trying to find a possible bug in my fortran petsc code. Running valgrind I see messages like this > > ==28499== 1,596 (1,512 direct, 84 indirect) bytes in 1 blocks are definitely lost in loss record 174 of 194 > ==28499== at 0x4C2D636: memalign (in /usr/lib64/valgrind/vgpreload_memcheck-amd64-linux.so) > ==28499== by 0x4F0F178: PetscMallocAlign (mal.c:28) > ==28499== by 0x4FF7E82: VecCreate (veccreate.c:37) > ==28499== by 0x4FDF198: VecCreateSeqWithArray (bvec2.c:946) > ==28499== by 0x4FE442E: veccreateseqwitharray_ (zbvec2f.c:12) > ==28499== by 0x406921: initpetsc_ (all.f95:2066) > ==28499== by 0x4035B1: run_ (all.f95:2817) > ==28499== by 0x41760C: MAIN__ (all.f95:1383) > ==28499== by 0x417D08: main (all.f95:1330) > > Does this indicate some bug in my code ? > > Thanks > praveen -------------- next part -------------- An HTML attachment was scrubbed... URL: From cpraveen at gmail.com Mon Jan 30 21:58:18 2017 From: cpraveen at gmail.com (Praveen C) Date: Tue, 31 Jan 2017 09:28:18 +0530 Subject: [petsc-users] Debugging a petsc code In-Reply-To: <9CFBEED0-2DE3-41D0-90AD-3861AD9E7BCE@gmail.com> References: <9CFBEED0-2DE3-41D0-90AD-3861AD9E7BCE@gmail.com> Message-ID: -malloc_test does not report anything. Freeing all petsc vectors got rid of those errors. 
Now I see only MPI-related errors like this ==33686== 376 (232 direct, 144 indirect) bytes in 1 blocks are definitely lost in loss record 148 of 159 ==33686== at 0x4C2B0AF: malloc (in /usr/lib64/valgrind/vgpreload_memcheck-amd64-linux.so) ==33686== by 0x660D7EF: mca_bml_r2_add_procs (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi.so.20.0.1) ==33686== by 0x66D11CA: mca_pml_ob1_add_procs (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi.so.20.0.1) ==33686== by 0x65CE906: ompi_mpi_init (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi.so.20.0.1) ==33686== by 0x65ED082: PMPI_Init (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi.so.20.0.1) ==33686== by 0x6352D97: MPI_INIT (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi_mpifh.so.20.0.0) ==33686== by 0x4F393C6: petscinitialize_ (zstart.c:320) ==33686== by 0x417718: MAIN__ (all.f95:1385) ==33686== by 0x4184B2: main (all.f95:1366) Does this indicate some error in my code or my MPI? This is the valgrind summary: ==33686== LEAK SUMMARY: ==33686== definitely lost: 1,378 bytes in 14 blocks ==33686== indirectly lost: 64,882 bytes in 88 blocks ==33686== possibly lost: 0 bytes in 0 blocks ==33686== still reachable: 32,984 bytes in 139 blocks ==33686== suppressed: 0 bytes in 0 blocks I have attached the full valgrind output. Thanks praveen On Tue, Jan 31, 2017 at 12:18 AM, Stefano Zampini wrote: > It just reports that you have a memory leak. Probably you did not call > VecDestroy on the Vec created at initpetsc_ in line 2066 of all.f95. > > On Jan 30, 2017, at 8:04 PM, Praveen C wrote: > > Dear all > > I am trying to find a possible bug in my fortran petsc code. 
Running > valgrid I see messages like this > > ==28499== 1,596 (1,512 direct, 84 indirect) bytes in 1 blocks are > definitely lost in loss record 174 of 194 > ==28499== at 0x4C2D636: memalign (in /usr/lib64/valgrind/vgpreload_ > memcheck-amd64-linux.so) > ==28499== by 0x4F0F178: PetscMallocAlign (mal.c:28) > ==28499== by 0x4FF7E82: VecCreate (veccreate.c:37) > ==28499== by 0x4FDF198: VecCreateSeqWithArray (bvec2.c:946) > ==28499== by 0x4FE442E: veccreateseqwitharray_ (zbvec2f.c:12) > ==28499== by 0x406921: initpetsc_ (all.f95:2066) > ==28499== by 0x4035B1: run_ (all.f95:2817) > ==28499== by 0x41760C: MAIN__ (all.f95:1383) > ==28499== by 0x417D08: main (all.f95:1330) > > Does this indicate some bug in my code ? > > Thanks > praveen > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- ==33706== Memcheck, a memory error detector ==33706== Copyright (C) 2002-2015, and GNU GPL'd, by Julian Seward et al. ==33706== Using Valgrind-3.12.0 and LibVEX; rerun with -h for copyright info ==33706== Command: ./ug3 ==33706== ==33706== ==33706== HEAP SUMMARY: ==33706== in use at exit: 99,244 bytes in 241 blocks ==33706== total heap usage: 82,707 allocs, 82,466 frees, 23,824,048 bytes allocated ==33706== ==33706== 10 bytes in 1 blocks are definitely lost in loss record 6 of 159 ==33706== at 0x4C2B0AF: malloc (in /usr/lib64/valgrind/vgpreload_memcheck-amd64-linux.so) ==33706== by 0xC717EEE: opal_pmix_pmix112_pmix_bfrop_unpack_string (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0xC715FFB: opal_pmix_pmix112_pmix_bfrop_unpack (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0xC6F74CF: job_data (in 
/home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0xC723831: opal_pmix_pmix112_pmix_usock_process_msg (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0xC6C1ED7: event_process_active_single_queue (event.c:1370) ==33706== by 0xC6C1ED7: event_process_active (event.c:1440) ==33706== by 0xC6C1ED7: opal_libevent2022_event_base_loop (event.c:1644) ==33706== by 0xC713A6C: progress_engine (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0x73F4453: start_thread (in /lib64/libpthread-2.24.so) ==33706== by 0x76F337E: clone (in /lib64/libc-2.24.so) ==33706== ==33706== 25 bytes in 1 blocks are definitely lost in loss record 21 of 159 ==33706== at 0x4C2B0AF: malloc (in /usr/lib64/valgrind/vgpreload_memcheck-amd64-linux.so) ==33706== by 0x767B137: vasprintf (in /lib64/libc-2.24.so) ==33706== by 0x765A936: asprintf (in /lib64/libc-2.24.so) ==33706== by 0xC3A5DD4: rte_init (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-rte.so.20.0.0) ==33706== by 0xC3683E4: orte_init (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-rte.so.20.0.0) ==33706== by 0x65CE425: ompi_mpi_init (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi.so.20.0.1) ==33706== by 0x65ED082: PMPI_Init (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi.so.20.0.1) ==33706== by 0x6352D97: MPI_INIT (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi_mpifh.so.20.0.0) ==33706== 
by 0x4F393C6: petscinitialize_ (zstart.c:320) ==33706== by 0x417718: MAIN__ (all.f95:1385) ==33706== by 0x4184B2: main (all.f95:1366) ==33706== ==33706== 30 bytes in 1 blocks are definitely lost in loss record 24 of 159 ==33706== at 0x4C2B0AF: malloc (in /usr/lib64/valgrind/vgpreload_memcheck-amd64-linux.so) ==33706== by 0x767B137: vasprintf (in /lib64/libc-2.24.so) ==33706== by 0x765A936: asprintf (in /lib64/libc-2.24.so) ==33706== by 0xC3A5E01: rte_init (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-rte.so.20.0.0) ==33706== by 0xC3683E4: orte_init (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-rte.so.20.0.0) ==33706== by 0x65CE425: ompi_mpi_init (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi.so.20.0.1) ==33706== by 0x65ED082: PMPI_Init (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi.so.20.0.1) ==33706== by 0x6352D97: MPI_INIT (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi_mpifh.so.20.0.0) ==33706== by 0x4F393C6: petscinitialize_ (zstart.c:320) ==33706== by 0x417718: MAIN__ (all.f95:1385) ==33706== by 0x4184B2: main (all.f95:1366) ==33706== ==33706== 48 bytes in 1 blocks are definitely lost in loss record 118 of 159 ==33706== at 0x4C2B0AF: malloc (in /usr/lib64/valgrind/vgpreload_memcheck-amd64-linux.so) ==33706== by 0xC68FFFD: mca_base_component_find (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0xC69A099: mca_base_framework_components_register (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== 
by 0xC69A483: mca_base_framework_register (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0xC69A4F0: mca_base_framework_open (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0xC6D4371: patcher_query (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0xC6D4255: opal_memory_base_open (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0xC69A560: mca_base_framework_open (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0xC675386: opal_init (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0xC36827A: orte_init (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-rte.so.20.0.0) ==33706== by 0x65CE425: ompi_mpi_init (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi.so.20.0.1) ==33706== by 0x65ED082: PMPI_Init (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi.so.20.0.1) ==33706== ==33706== 72 bytes in 1 blocks are definitely lost in loss record 129 of 159 ==33706== at 0x4C2D2CF: realloc (in /usr/lib64/valgrind/vgpreload_memcheck-amd64-linux.so) ==33706== by 0x767B10D: vasprintf (in /lib64/libc-2.24.so) ==33706== by 0x765A936: asprintf (in /lib64/libc-2.24.so) ==33706== by 0xC3A5D5D: rte_init (in 
/home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-rte.so.20.0.0) ==33706== by 0xC3683E4: orte_init (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-rte.so.20.0.0) ==33706== by 0x65CE425: ompi_mpi_init (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi.so.20.0.1) ==33706== by 0x65ED082: PMPI_Init (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi.so.20.0.1) ==33706== by 0x6352D97: MPI_INIT (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi_mpifh.so.20.0.0) ==33706== by 0x4F393C6: petscinitialize_ (zstart.c:320) ==33706== by 0x417718: MAIN__ (all.f95:1385) ==33706== by 0x4184B2: main (all.f95:1366) ==33706== ==33706== 113 bytes in 4 blocks are definitely lost in loss record 136 of 159 ==33706== at 0x4C2B0AF: malloc (in /usr/lib64/valgrind/vgpreload_memcheck-amd64-linux.so) ==33706== by 0x768B689: strdup (in /lib64/libc-2.24.so) ==33706== by 0xC3A5904: rte_init (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-rte.so.20.0.0) ==33706== by 0xC3683E4: orte_init (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-rte.so.20.0.0) ==33706== by 0x65CE425: ompi_mpi_init (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi.so.20.0.1) ==33706== by 0x65ED082: PMPI_Init (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi.so.20.0.1) ==33706== by 0x6352D97: MPI_INIT (in 
/home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi_mpifh.so.20.0.0) ==33706== by 0x4F393C6: petscinitialize_ (zstart.c:320) ==33706== by 0x417718: MAIN__ (all.f95:1385) ==33706== by 0x4184B2: main (all.f95:1366) ==33706== ==33706== 317 (56 direct, 261 indirect) bytes in 1 blocks are definitely lost in loss record 146 of 159 ==33706== at 0x4C2B0AF: malloc (in /usr/lib64/valgrind/vgpreload_memcheck-amd64-linux.so) ==33706== by 0xC6F4470: _putfn (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0xC6C1ED7: event_process_active_single_queue (event.c:1370) ==33706== by 0xC6C1ED7: event_process_active (event.c:1440) ==33706== by 0xC6C1ED7: opal_libevent2022_event_base_loop (event.c:1644) ==33706== by 0xC713A6C: progress_engine (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0x73F4453: start_thread (in /lib64/libpthread-2.24.so) ==33706== by 0x76F337E: clone (in /lib64/libc-2.24.so) ==33706== ==33706== 376 (232 direct, 144 indirect) bytes in 1 blocks are definitely lost in loss record 148 of 159 ==33706== at 0x4C2B0AF: malloc (in /usr/lib64/valgrind/vgpreload_memcheck-amd64-linux.so) ==33706== by 0x660D7EF: mca_bml_r2_add_procs (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi.so.20.0.1) ==33706== by 0x66D11CA: mca_pml_ob1_add_procs (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi.so.20.0.1) ==33706== by 0x65CE906: ompi_mpi_init (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi.so.20.0.1) ==33706== by 0x65ED082: PMPI_Init (in 
/home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi.so.20.0.1) ==33706== by 0x6352D97: MPI_INIT (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi_mpifh.so.20.0.0) ==33706== by 0x4F393C6: petscinitialize_ (zstart.c:320) ==33706== by 0x417718: MAIN__ (all.f95:1385) ==33706== by 0x4184B2: main (all.f95:1366) ==33706== ==33706== 632 bytes in 1 blocks are definitely lost in loss record 150 of 159 ==33706== at 0x4C2B0AF: malloc (in /usr/lib64/valgrind/vgpreload_memcheck-amd64-linux.so) ==33706== by 0xC6F7A4F: pmix_client_deregister_errhandler (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0xC6FDA34: OPAL_PMIX_PMIX112_PMIx_Deregister_errhandler (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0xC6DC62F: pmix1_client_finalize (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0xC3A4EBD: rte_finalize (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-rte.so.20.0.0) ==33706== by 0xC3680A1: orte_finalize (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-rte.so.20.0.0) ==33706== by 0x65CF2BD: ompi_mpi_finalize (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi.so.20.0.1) ==33706== by 0x4F2D000: PetscFinalize (pinit.c:1428) ==33706== by 0x4F39972: petscfinalize_ (zstart.c:495) ==33706== by 0x418451: MAIN__ (all.f95:1438) ==33706== by 0x4184B2: main (all.f95:1366) ==33706== ==33706== 1,116 (104 direct, 1,012 indirect) bytes in 1 
blocks are definitely lost in loss record 155 of 159 ==33706== at 0x4C2B0AF: malloc (in /usr/lib64/valgrind/vgpreload_memcheck-amd64-linux.so) ==33706== by 0xC6E8F5A: opal_pmix_pmix112_pmix_hash_store (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0xC6F6F2A: opal_pmix_pmix112_pmix_client_process_nspace_blob (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0xC6F7522: job_data (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0xC723831: opal_pmix_pmix112_pmix_usock_process_msg (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0xC6C1ED7: event_process_active_single_queue (event.c:1370) ==33706== by 0xC6C1ED7: event_process_active (event.c:1440) ==33706== by 0xC6C1ED7: opal_libevent2022_event_base_loop (event.c:1644) ==33706== by 0xC713A6C: progress_engine (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0x73F4453: start_thread (in /lib64/libpthread-2.24.so) ==33706== by 0x76F337E: clone (in /lib64/libc-2.24.so) ==33706== ==33706== 63,521 (56 direct, 63,465 indirect) bytes in 1 blocks are definitely lost in loss record 159 of 159 ==33706== at 0x4C2B0AF: malloc (in /usr/lib64/valgrind/vgpreload_memcheck-amd64-linux.so) ==33706== by 0xC6F681E: opal_pmix_pmix112_pmix_client_process_nspace_blob (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0xC6F7522: job_data (in 
/home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0xC723831: opal_pmix_pmix112_pmix_usock_process_msg (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0xC6C1ED7: event_process_active_single_queue (event.c:1370) ==33706== by 0xC6C1ED7: event_process_active (event.c:1440) ==33706== by 0xC6C1ED7: opal_libevent2022_event_base_loop (event.c:1644) ==33706== by 0xC713A6C: progress_engine (in /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libopen-pal.so.20.1.0) ==33706== by 0x73F4453: start_thread (in /lib64/libpthread-2.24.so) ==33706== by 0x76F337E: clone (in /lib64/libc-2.24.so) ==33706== ==33706== LEAK SUMMARY: ==33706== definitely lost: 1,378 bytes in 14 blocks ==33706== indirectly lost: 64,882 bytes in 88 blocks ==33706== possibly lost: 0 bytes in 0 blocks ==33706== still reachable: 32,984 bytes in 139 blocks ==33706== suppressed: 0 bytes in 0 blocks ==33706== Reachable blocks (those to which a pointer was found) are not shown. ==33706== To see them, rerun with: --leak-check=full --show-leak-kinds=all ==33706== ==33706== For counts of detected and suppressed errors, rerun with: -v ==33706== ERROR SUMMARY: 11 errors from 11 contexts (suppressed: 0 from 0) From balay at mcs.anl.gov Mon Jan 30 22:03:10 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Mon, 30 Jan 2017 22:03:10 -0600 Subject: [petsc-users] Debugging a petsc code In-Reply-To: References: <9CFBEED0-2DE3-41D0-90AD-3861AD9E7BCE@gmail.com> Message-ID: This is memory leak in openmpi - you can ignore it. For a valgrind clean MPI - you can build PETSc with --download-mpich Satish On Mon, 30 Jan 2017, Praveen C wrote: > -malloc_test does not report anything. > > Freeing all petsc vectors got rid of those error. 
> > Now I see only MPI related errors like this > > ==33686== 376 (232 direct, 144 indirect) bytes in 1 blocks are definitely > lost in loss record 148 of 159 > > ==33686== at 0x4C2B0AF: malloc (in > /usr/lib64/valgrind/vgpreload_memcheck-amd64-linux.so) > > ==33686== by 0x660D7EF: mca_bml_r2_add_procs (in > /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi.so.20.0.1) > > ==33686== by 0x66D11CA: mca_pml_ob1_add_procs (in > /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi.so.20.0.1) > > ==33686== by 0x65CE906: ompi_mpi_init (in > /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi.so.20.0.1) > > ==33686== by 0x65ED082: PMPI_Init (in > /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi.so.20.0.1) > > ==33686== by 0x6352D97: MPI_INIT (in > /home/spack/opt/spack/linux-opensuse20161217-x86_64/gcc-6/openmpi-2.0.1-asdjmd22cnyktv2athcx3ouhrozknk22/lib64/libmpi_mpifh.so.20.0.0) > > ==33686== by 0x4F393C6: petscinitialize_ (zstart.c:320) > > ==33686== by 0x417718: MAIN__ (all.f95:1385) > > ==33686== by 0x4184B2: main (all.f95:1366) > > > Does this indicate some error in my code or my MPI. > > This is the valgrind summary > > ==33686== LEAK SUMMARY: > > ==33686== definitely lost: 1,378 bytes in 14 blocks > > ==33686== indirectly lost: 64,882 bytes in 88 blocks > > ==33686== possibly lost: 0 bytes in 0 blocks > > ==33686== still reachable: 32,984 bytes in 139 blocks > > ==33686== suppressed: 0 bytes in 0 blocks > > > I have attached the full valgrid output. > > > Thanks > praveen > > On Tue, Jan 31, 2017 at 12:18 AM, Stefano Zampini > wrote: > > > It just reports that you have a memory leak. Probably you did not call > > VecDestroy on the Vec created at at initpetsc_ in line 2066 of all.f95. 
> > > > On Jan 30, 2017, at 8:04 PM, Praveen C wrote: > > > > Dear all > > > > I am trying to find a possible bug in my Fortran PETSc code. Running > > valgrind I see messages like this > > > > ==28499== 1,596 (1,512 direct, 84 indirect) bytes in 1 blocks are > > definitely lost in loss record 174 of 194 > > ==28499== at 0x4C2D636: memalign (in /usr/lib64/valgrind/vgpreload_ > > memcheck-amd64-linux.so) > > ==28499== by 0x4F0F178: PetscMallocAlign (mal.c:28) > > ==28499== by 0x4FF7E82: VecCreate (veccreate.c:37) > > ==28499== by 0x4FDF198: VecCreateSeqWithArray (bvec2.c:946) > > ==28499== by 0x4FE442E: veccreateseqwitharray_ (zbvec2f.c:12) > > ==28499== by 0x406921: initpetsc_ (all.f95:2066) > > ==28499== by 0x4035B1: run_ (all.f95:2817) > > ==28499== by 0x41760C: MAIN__ (all.f95:1383) > > ==28499== by 0x417D08: main (all.f95:1330) > > > > Does this indicate some bug in my code? > > > > Thanks > > praveen > > > > > > > From knepley at gmail.com Tue Jan 31 09:26:21 2017 From: knepley at gmail.com (Matthew Knepley) Date: Tue, 31 Jan 2017 09:26:21 -0600 Subject: [petsc-users] Getting a local vector with ghosts knowing ghost indices In-Reply-To: References: Message-ID: On Sun, Jan 29, 2017 at 3:45 AM, Praveen C wrote: > Dear all > > In my problem, I know the ghost vertices in each partition since I have > used METIS to partition my unstructured grid. I want to get a vector ul > with ghosts, from a global vector ug. > > I will use ul to assemble my non-linear rhs vector. > > Is the following approach optimal for this purpose?
> This should work and be scalable Matt > PetscInt nvar; // number of variables at each vertex > PetscInt nvl; // number of local vertices in this partition > PetscInt nvg; // number of ghost vertices for this partition > PetscInt *vghost; // global index of ghost vertices for this partition > PetscReal **u; // size u[nvl+nvg][nvar] > Vec ul; // vector with ghosts > Vec ug; // global vector without ghosts > > VecCreate(PETSC_COMM_WORLD, &ug); > VecSetSizes(ug, nvar*nvl, PETSC_DECIDE); > VecSetFromOptions(ug); > > VecCreateGhostBlockWithArray(PETSC_COMM_WORLD, nvar, nvar*nvl, > PETSC_DECIDE, nvg, vghost, > &u, ul); > > // Following would be inside RHSFunction > double **array; > VecGetArray2d(ug, nvl, nvar, 0, 0, &array); > for(unsigned int i=0; i for(unsigned int j=0; j u[i][j] = array[i][j]; > VecRestoreArray2d(ug, nvl, nvar, 0, 0, &array); > > // Fill ghost values > VecGhostUpdateBegin(ul, INSERT_VALUES, SCATTER_FORWARD); > VecGhostUpdateEnd (ul, INSERT_VALUES, SCATTER_FORWARD); > > // now use u[][] to compute local part of rhs vector > > Thanks > praveen > -- What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead. -- Norbert Wiener -------------- next part -------------- An HTML attachment was scrubbed... URL: From timothee.nicolas at gmail.com Tue Jan 31 10:08:06 2017 From: timothee.nicolas at gmail.com (=?UTF-8?Q?Timoth=C3=A9e_Nicolas?=) Date: Tue, 31 Jan 2017 17:08:06 +0100 Subject: [petsc-users] conflict between petsc.h90 and MPI_INT, MPI_SUM etc... Message-ID: Dear all, I am a bit confused as to how to link PETSc correctly in a code that already uses MPI natively. I wish to define a DMDA for my problem, but when I try to include petsc.h90, the compiler complains that MPI_XXXXX variables are already defined: /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(9): error #6401: The attributes of this name conflict with those made accessible by a USE statement. 
[MPI_SOURCE] INTEGER MPI_SOURCE, MPI_TAG, MPI_ERROR ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(9): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_TAG] INTEGER MPI_SOURCE, MPI_TAG, MPI_ERROR ---------------------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(9): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_ERROR] INTEGER MPI_SOURCE, MPI_TAG, MPI_ERROR ------------------------------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(11): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_STATUS_SIZE] INTEGER MPI_STATUS_SIZE ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(13): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_STATUS_IGNORE] INTEGER MPI_STATUS_IGNORE(MPI_STATUS_SIZE) ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(14): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_STATUSES_IGNORE] INTEGER MPI_STATUSES_IGNORE(MPI_STATUS_SIZE,1) ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(15): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_ERRCODES_IGNORE] INTEGER MPI_ERRCODES_IGNORE(1) ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(16): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_ARGVS_NULL] CHARACTER*1 MPI_ARGVS_NULL(1,1) -------------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(17): error #6401: The attributes of this name conflict with those made accessible by a USE statement. 
[MPI_ARGV_NULL] CHARACTER*1 MPI_ARGV_NULL(1) -------------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(18): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_SUCCESS] INTEGER MPI_SUCCESS ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(20): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_ERR_SIZE] INTEGER MPI_ERR_SIZE ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(22): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_ERR_INFO_KEY] INTEGER MPI_ERR_INFO_KEY ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(24): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_ERR_FILE] INTEGER MPI_ERR_FILE ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(26): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_ERR_AMODE] INTEGER MPI_ERR_AMODE ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(28): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_ERR_TRUNCATE] INTEGER MPI_ERR_TRUNCATE ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(30): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_ERR_BASE] INTEGER MPI_ERR_BASE ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(32): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_ERR_OP] INTEGER MPI_ERR_OP ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(34): error #6401: The attributes of this name conflict with those made accessible by a USE statement. 
[MPI_ERR_ASSERT] INTEGER MPI_ERR_ASSERT ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(36): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_ERR_NAME] INTEGER MPI_ERR_NAME ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(38): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_ERR_NO_MEM] INTEGER MPI_ERR_NO_MEM ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(40): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_ERR_INFO] INTEGER MPI_ERR_INFO ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(42): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_ERR_COUNT] INTEGER MPI_ERR_COUNT ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(44): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_ERR_SPAWN] INTEGER MPI_ERR_SPAWN ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(46): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_ERR_CONVERSION] INTEGER MPI_ERR_CONVERSION ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(48): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_ERR_GROUP] INTEGER MPI_ERR_GROUP ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(50): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_ERR_RMA_SYNC] INTEGER MPI_ERR_RMA_SYNC ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(52): error #6401: The attributes of this name conflict with those made accessible by a USE statement. 
[MPI_ERR_NOT_SAME] INTEGER MPI_ERR_NOT_SAME ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(54): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_ERR_KEYVAL] INTEGER MPI_ERR_KEYVAL ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(56): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_ERR_ACCESS] INTEGER MPI_ERR_ACCESS ---------------^ /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(58): error #6401: The attributes of this name conflict with those made accessible by a USE statement. [MPI_ERR_DIMS] INTEGER MPI_ERR_DIMS ---------------^ I usually write everything consistently within PETSc, but for the first time, I have to use some PETSc routines in a code that is mostly standard FORTRAN 90. Is including PETSc via #include the wrong method? Best Timothee -------------- next part -------------- An HTML attachment was scrubbed... URL: From balay at mcs.anl.gov Tue Jan 31 10:21:13 2017 From: balay at mcs.anl.gov (Satish Balay) Date: Tue, 31 Jan 2017 10:21:13 -0600 Subject: [petsc-users] conflict between petsc.h90 and MPI_INT, MPI_SUM etc... In-Reply-To: References: Message-ID: This is petsc-3.7? Looks like mpif.h is getting included in some module that your code is currently using. Is this module your code - or petsc code? Is using mpi via module [perhaps mpi.mod is preferable over mpif.h] - then you can do something like: #define PETSC_AVOID_MPIF_H #include Satish On Tue, 31 Jan 2017, Timoth?e Nicolas wrote: > Dear all, > > I am a bit confused as to how to link PETSc correctly in a code that > already uses MPI natively. I wish to define a DMDA for my problem, but when > I try to include petsc.h90, the compiler complains that MPI_XXXXX variables > are already defined: > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(9): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. 
[MPI_SOURCE] > > INTEGER MPI_SOURCE, MPI_TAG, MPI_ERROR > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(9): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_TAG] > > INTEGER MPI_SOURCE, MPI_TAG, MPI_ERROR > > ---------------------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(9): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_ERROR] > > INTEGER MPI_SOURCE, MPI_TAG, MPI_ERROR > > ------------------------------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(11): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_STATUS_SIZE] > > INTEGER MPI_STATUS_SIZE > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(13): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_STATUS_IGNORE] > > INTEGER MPI_STATUS_IGNORE(MPI_STATUS_SIZE) > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(14): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_STATUSES_IGNORE] > > INTEGER MPI_STATUSES_IGNORE(MPI_STATUS_SIZE,1) > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(15): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_ERRCODES_IGNORE] > > INTEGER MPI_ERRCODES_IGNORE(1) > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(16): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_ARGVS_NULL] > > CHARACTER*1 MPI_ARGVS_NULL(1,1) > > -------------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(17): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. 
[MPI_ARGV_NULL] > > CHARACTER*1 MPI_ARGV_NULL(1) > > -------------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(18): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_SUCCESS] > > INTEGER MPI_SUCCESS > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(20): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_ERR_SIZE] > > INTEGER MPI_ERR_SIZE > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(22): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_ERR_INFO_KEY] > > INTEGER MPI_ERR_INFO_KEY > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(24): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_ERR_FILE] > > INTEGER MPI_ERR_FILE > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(26): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_ERR_AMODE] > > INTEGER MPI_ERR_AMODE > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(28): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_ERR_TRUNCATE] > > INTEGER MPI_ERR_TRUNCATE > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(30): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_ERR_BASE] > > INTEGER MPI_ERR_BASE > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(32): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. 
[MPI_ERR_OP] > > INTEGER MPI_ERR_OP > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(34): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_ERR_ASSERT] > > INTEGER MPI_ERR_ASSERT > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(36): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_ERR_NAME] > > INTEGER MPI_ERR_NAME > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(38): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_ERR_NO_MEM] > > INTEGER MPI_ERR_NO_MEM > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(40): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_ERR_INFO] > > INTEGER MPI_ERR_INFO > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(42): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_ERR_COUNT] > > INTEGER MPI_ERR_COUNT > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(44): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_ERR_SPAWN] > > INTEGER MPI_ERR_SPAWN > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(46): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_ERR_CONVERSION] > > INTEGER MPI_ERR_CONVERSION > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(48): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. 
[MPI_ERR_GROUP] > > INTEGER MPI_ERR_GROUP > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(50): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_ERR_RMA_SYNC] > > INTEGER MPI_ERR_RMA_SYNC > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(52): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_ERR_NOT_SAME] > > INTEGER MPI_ERR_NOT_SAME > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(54): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_ERR_KEYVAL] > > INTEGER MPI_ERR_KEYVAL > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(56): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_ERR_ACCESS] > > INTEGER MPI_ERR_ACCESS > > ---------------^ > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(58): error #6401: The > attributes of this name conflict with those made accessible by a USE > statement. [MPI_ERR_DIMS] > > INTEGER MPI_ERR_DIMS > > ---------------^ > > I usually write everything consistently within PETSc, but for the first > time, I have to use some PETSc routines in a code that is mostly standard > FORTRAN 90. > > Is including PETSc via #include the wrong > method? > > Best > > Timothee > From timothee.nicolas at gmail.com Tue Jan 31 10:26:34 2017 From: timothee.nicolas at gmail.com (=?UTF-8?Q?Timoth=C3=A9e_Nicolas?=) Date: Tue, 31 Jan 2017 17:26:34 +0100 Subject: [petsc-users] conflict between petsc.h90 and MPI_INT, MPI_SUM etc... In-Reply-To: References: Message-ID: Hi, Thank you! Yes it is 3.7 Indeed I found that there was a 'use mpi' somewhere in the tree of modules called. I could break the module apart without great difficulty, so that I don't call mpi anymore in the affected portion of the code. 
It looks like it indeed fixes the issue. But it's also nice that you gave me the other solution with PETSC_AVOID_MPIF_H, which could have been very useful if I hadn't been able to break the module apart easily. Cheers Timothee 2017-01-31 17:21 GMT+01:00 Satish Balay : > This is petsc-3.7? > > Looks like mpif.h is getting included in some module that your code > is currently using. > > Is this module your code - or petsc code? > > If you are using mpi via a module [perhaps mpi.mod is preferable over mpif.h] - > then you can do something like: > > #define PETSC_AVOID_MPIF_H > #include > > Satish > > On Tue, 31 Jan 2017, Timothée Nicolas wrote: > > > Dear all, > > > > I am a bit confused as to how to link PETSc correctly in a code that > > already uses MPI natively. I wish to define a DMDA for my problem, but > > when I try to include petsc.h90, the compiler complains that MPI_XXXXX > > variables are already defined: > > > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(9): error #6401: The > > attributes of this name conflict with those made accessible by a USE > > statement. [MPI_SOURCE] > > > > INTEGER MPI_SOURCE, MPI_TAG, MPI_ERROR > > > > ---------------^ > > > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(9): error #6401: The > > attributes of this name conflict with those made accessible by a USE > > statement. [MPI_TAG] > > > > INTEGER MPI_SOURCE, MPI_TAG, MPI_ERROR > > > > ---------------------------^ > > > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(9): error #6401: The > > attributes of this name conflict with those made accessible by a USE > > statement. [MPI_ERROR] > > > > INTEGER MPI_SOURCE, MPI_TAG, MPI_ERROR > > > > ------------------------------------^ > > > > /cm/shared/apps/mpich/ge/intel/3.2/include/mpif.h(11): error #6401: The > > attributes of this name conflict with those made accessible by a USE > > statement.
[MPI_STATUS_SIZE] > > [...] > > > > I usually write everything consistently within PETSc, but for the first > > time, I have to use some PETSc routines in a code that is mostly standard > > FORTRAN 90. > > > > Is including PETSc via #include the wrong > > method? > > > > Best > > > > Timothee > > > -------------- next part -------------- An HTML attachment was scrubbed... URL:
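[As a footnote to the thread above: the archive scrubbed the include line that followed Satish's `#define PETSC_AVOID_MPIF_H`, so the sketch below reconstructs it. The header path `petsc/finclude/petsc.h90`, the module name, and the DMDA call are assumptions based on the petsc-3.7-era layout, not taken from the messages; the point is only that defining PETSC_AVOID_MPIF_H before the PETSc Fortran include keeps mpif.h from re-declaring the MPI_* names that `use mpi` (mpi.mod) already provides, which is what triggers the #6401 conflicts quoted above.]

```fortran
! Hypothetical sketch of the PETSC_AVOID_MPIF_H workaround, assuming
! petsc-3.7. The file must go through the preprocessor (e.g. a .F90
! file) so that the #define and #include lines are honored.
module my_solver
  use mpi            ! mpi.mod already supplies MPI_SOURCE, MPI_SUM, etc.
  implicit none

! Tell PETSc's Fortran headers to skip mpif.h, which would otherwise
! re-declare the MPI_* names and trigger the error #6401 conflicts.
#define PETSC_AVOID_MPIF_H
#include <petsc/finclude/petsc.h90>

contains

  ! Minimal DMDA creation, as in the original poster's use case.
  subroutine create_grid(ierr)
    PetscErrorCode ierr
    DM da
    call DMDACreate1d(PETSC_COMM_WORLD, DM_BOUNDARY_NONE, 128, 1, 1, &
                      PETSC_NULL_INTEGER, da, ierr)
    call DMDestroy(da, ierr)
  end subroutine create_grid

end module my_solver
```

[This compiles only against a PETSc 3.7 installation, so it is a sketch rather than a verified build; newer PETSc releases changed the Fortran include scheme again.]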